How do I set the batch size when exporting the engine model? #210

Open

tianlongyang-bot opened this issue Apr 23, 2024 · 7 comments

@tianlongyang-bot

When converting to ONNX and then to an engine, the default batch size is 1. How can I customize it? With batch=1, multi-stream inference is very slow.

@XhHello

XhHello commented Apr 23, 2024

> When converting to ONNX and then to an engine, the default batch size is 1. How can I customize it? With batch=1, multi-stream inference is very slow.

Maybe you can set the batch value when converting the .pt model to ONNX. By the way, can you run inference on a video file or a CSI camera stream?
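For reference, the batch dimension is indeed decided at ONNX export time. Below is a minimal, generic PyTorch sketch (it is not this repo's export-det.py; the stand-in network and the tensor names images/outputs are illustrative assumptions), showing how to mark axis 0 as a dynamic batch so the engine builder can later cover more than one batch size:

    import torch
    import torch.nn as nn

    # Stand-in network; replace with your own loaded .pt model in eval() mode.
    model = nn.Sequential(nn.Conv2d(3, 16, 3, 2, 1), nn.SiLU()).eval()
    dummy = torch.zeros(1, 3, 640, 640)  # dummy batch size is not binding below

    torch.onnx.export(
        model, dummy, 'model.onnx',
        opset_version=11,
        input_names=['images'],    # assumed tensor names, adjust to your model
        output_names=['outputs'],
        # mark axis 0 as a dynamic batch dimension in the exported graph
        dynamic_axes={'images': {0: 'batch'}, 'outputs': {0: 'batch'}})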

@tianlongyang-bot
Author

It seems the batch is set when converting to ONNX. For a Jetson TX1/NX board, what batch size is appropriate? Video inference works: I tried the example and the author's code runs, but only at batch=1.

@XhHello

XhHello commented Apr 23, 2024

> It seems the batch is set when converting to ONNX. For a Jetson TX1/NX board, what batch size is appropriate? Video inference works: I tried the example and the author's code runs, but only at batch=1.

It depends on your GPU memory: with more memory you can set it higher, though I'm not very sure either. Which file did you use for video inference, and did you have to modify the code?

@tianlongyang-bot
Author

I ran inference with deepstream-app, using the file csrc/deepstream/deepstream_app_config.txt; I was just walking through the workflow.
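For context, when going through deepstream-app the batch size is also governed by the app config, not only by the engine. A hedged sketch of the relevant keys in the standard deepstream-app config format (values and the nvinfer config filename are illustrative, not taken from this repo):

    # Sketch in the style of deepstream_app_config.txt (illustrative values)
    [streammux]
    # should match the engine's (max) batch size
    batch-size=8
    batched-push-timeout=40000

    [primary-gie]
    enable=1
    # hypothetical nvinfer config; its [property] section also has a batch-size
    config-file=config_infer.txt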

@triple-Mu
Owner

I haven't implemented anything for batch > 1 at all. DeepStream should support dynamic batching; could you explore it and open a PR for the repo afterwards?
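On the dynamic-batch route: once the ONNX has a dynamic batch axis, trtexec can build an engine with an optimization profile spanning, say, batch 1 to 8. A hedged sketch, assuming the network input tensor is named images:

    trtexec --onnx=yolov8s.onnx \
            --saveEngine=yolov8s.engine \
            --fp16 \
            --minShapes=images:1x3x640x640 \
            --optShapes=images:4x3x640x640 \
            --maxShapes=images:8x3x640x640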

@tianlongyang-bot
Author

In export-det.py there is:

    parser.add_argument('--input-shape',
                        nargs='+',
                        type=int,
                        default=[8, 3, 640, 640],
                        help='Model input shape only for api builder')

If I change default=[1, 3, 640, 640] to default=[8, 3, 640, 640], will the result be static 8-batch inference, i.e. able to process 8 images in one call? Is that right?

> I haven't implemented anything for batch > 1 at all. DeepStream should support dynamic batching; could you explore it and open a PR for the repo afterwards?
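A side note: since --input-shape is already an argparse flag, the same effect can presumably be had without editing the source, by passing the shape on the command line. A hedged sketch (the --weights flag name is an assumption based on the repo's README):

    python3 export-det.py --weights yolov8s.pt --input-shape 8 3 640 640

Because the shape is baked into the ONNX, the engine built from it should indeed be static: each inference call is bound to exactly 8 images, so a partially filled batch has to be padded.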

@ChenJian7578

> When converting to ONNX and then to an engine, the default batch size is 1. How can I customize it? With batch=1, multi-stream inference is very slow.

Just use Triton and spin up multiple instances of the model.
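For the Triton route, multiple execution instances of one model are declared via instance_group in that model's config.pbtxt, and Triton schedules incoming requests across them. A minimal illustrative sketch (model name, platform, and counts are assumptions, not taken from this issue):

    # models/yolov8_trt/config.pbtxt
    name: "yolov8_trt"
    platform: "tensorrt_plan"
    max_batch_size: 8
    instance_group [
      {
        count: 2
        kind: KIND_GPU
        gpus: [ 0 ]
      }
    ]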
