multithread run inference on the same model using 2 sessions(c++) #14073
-
Is it thread-safe for multiple threads to call Run() on the same session?
-
Calling an inference session from different threads is safe, and yes, the weights are shared. See the `testRunModelMultipleThreads` test in `onnxruntime/onnxruntime/test/python/onnxruntime_test_python.py`. For a C++ example, see the `RunWithOneSessionMultiThreadsInference` test in `onnxruntime/onnxruntime/test/providers/tensorrt/tensorrt_basic_test.cc`.