Multithreaded inference on the same model using 2 sessions (C++) #14073

Calling an inference session's Run() from different threads is safe. Yes, the weights are shared between calls. For a Python example, see the testRunModelMultipleThreads test in onnxruntime/onnxruntime/test/python/onnxruntime_test_python.py. For a C++ example, see the RunWithOneSessionMultiThreadsInference test in onnxruntime/onnxruntime/test/providers/tensorrt/tensorrt_basic_test.cc.

Answer selected by shiqingzhangCSU