Is it possible to run onnx runtime with cuda/tensorRT EP on GPU tensors? #14397
Unanswered
bpodrygajlo asked this question in Other Q&A
Replies: 1 comment
-
Can't help with first-hand knowledge, as I'm using the DML EP, but did you check [the sample tests (C# API)](https://github.com/microsoft/onnxruntime/tree/main/csharp/test/Microsoft.ML.OnnxRuntime.Tests.Common)?
-
Hi, I haven't found a good sample for this: my data is already in GPU memory, but onnxruntime seems to assume it has to copy data from host to GPU for the CUDA/TensorRT EPs. Is there a way to configure onnxruntime or the provider so this copy does not happen?
Currently I am using `MemoryInfo::CreateCpu` to create the tensors.
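For reference, ONNX Runtime's I/O binding API can wrap an existing device buffer so no host-to-device copy is made. A minimal C++ sketch follows; the input/output names `"input"`/`"output"`, the device pointer `d_input`, and the 2-D shape are assumptions for illustration (query the real names via `session.GetInputNameAllocated`), and it presumes the CUDA or TensorRT EP is already registered on the session:

```cpp
#include <onnxruntime_cxx_api.h>
#include <array>
#include <cstddef>
#include <cstdint>

// Sketch: bind a pre-existing CUDA buffer as a session input, assuming
// d_input points to num_elements floats already resident on GPU device 0.
void run_on_gpu_buffer(Ort::Session& session, float* d_input,
                       size_t num_elements) {
  // Describe CUDA device memory instead of using MemoryInfo::CreateCpu.
  Ort::MemoryInfo cuda_info("Cuda", OrtDeviceAllocator, /*device_id=*/0,
                            OrtMemTypeDefault);

  // Assumed shape for illustration; use the model's real input shape.
  std::array<int64_t, 2> shape{1, static_cast<int64_t>(num_elements)};

  // Wraps the existing device pointer; no host-to-device copy happens here.
  Ort::Value input = Ort::Value::CreateTensor<float>(
      cuda_info, d_input, num_elements, shape.data(), shape.size());

  Ort::IoBinding binding(session);
  binding.BindInput("input", input);    // assumed input name
  // Binding the output to CUDA memory keeps the result on the GPU too.
  binding.BindOutput("output", cuda_info);  // assumed output name

  session.Run(Ort::RunOptions{}, binding);
}
```

After `Run`, `binding.GetOutputValues()` returns tensors whose data pointers reference device memory, so downstream GPU code can consume them without a round trip through the host.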