Implement frame skipping #19
Comments
Hi @TheAxelBoy, this looks like a method for sequential processing of frames. Since the WebcamVideoStream class in helper.py is updated with the latest frame inside its own thread, that kind of skipping is effectively already done.
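For reference, the latest-frame pattern described here can be sketched as follows. This is a minimal illustration of the idea, not the actual helper.py code; the stub source in the usage example stands in for a real camera:

```python
import threading


class LatestFrameStream:
    """Capture thread that always overwrites with the newest frame,
    so a slow consumer implicitly skips the frames it missed."""

    def __init__(self, source):
        self.source = source          # anything with a read() -> frame method
        self.frame = None
        self.running = False
        self.lock = threading.Lock()

    def start(self):
        self.running = True
        threading.Thread(target=self._update, daemon=True).start()
        return self

    def _update(self):
        while self.running:
            frame = self.source.read()
            if frame is None:         # source exhausted / camera closed
                self.running = False
                break
            with self.lock:
                self.frame = frame    # the old frame is simply dropped

    def read(self):
        with self.lock:
            return self.frame

    def stop(self):
        self.running = False
```

The consumer calls `read()` at its own pace and always gets the most recent frame; everything in between was overwritten by the capture thread.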
Hi, so I just changed the code a little bit to apply processing only on every x-th frame.
After that I added the new parameter in every call of
It is quite a naive approach; I used it in combination with MedianFlow to track cars on a road, where not every frame needs to be run through the detector. I get about 50 fps (Jetson TX2) when skipping every second frame.
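The "process every x-th frame" idea amounts to a modulo counter around the detector call. A hedged sketch of that logic (the names `skip_frames` and `run_detector` are illustrative, not the actual patch):

```python
def process_stream(frames, run_detector, skip_frames=2):
    """Run the (expensive) detector only on every `skip_frames`-th frame;
    for the frames in between, reuse the last detection result."""
    last_result = None
    results = []
    for i, frame in enumerate(frames):
        if i % skip_frames == 0:
            last_result = run_detector(frame)  # expensive detector call
        results.append(last_result)            # cheap reuse otherwise
    return results
```

With `skip_frames=2` the detector runs on frames 0, 2, 4, …, which is what gives the roughly doubled throughput mentioned above.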
Hi @TheAxelBoy, that sounds interesting! I looked at the OpenCV Tracking API; it seems easy to use.
The OpenCV Tracking API is crap. I implemented it and it is slow as hell.
Hi @naisy, although I don't like the OpenCV Tracking API, I built a multitracker using the MedianFlow single tracker as a base, which works quite well and reaches about 50 fps on its own. A better approach would probably be to write a Python wrapper for the NVIDIA VisionWorks MedianFlow tracker. I will fork the code and post the frame-skipping code on my profile.
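A multitracker built on single trackers, as described above, is essentially a thin loop over per-object tracker instances. A sketch of that structure (the `tracker_factory` would be something like OpenCV's `cv2.TrackerMedianFlow_create`, or `cv2.legacy.TrackerMedianFlow_create` in newer OpenCV versions; it is kept abstract here so the wrapper stays version-agnostic):

```python
class MultiTracker:
    """Wrap several single-object trackers behind one update() call."""

    def __init__(self, tracker_factory):
        self.tracker_factory = tracker_factory  # callable returning a new tracker
        self.trackers = []

    def add(self, frame, box):
        """Start tracking one object; OpenCV trackers use init(frame, bbox)."""
        t = self.tracker_factory()
        t.init(frame, box)
        self.trackers.append(t)

    def update(self, frame):
        """Advance all trackers one frame; return (ok_flags, boxes).
        Failed trackers are reported, not dropped -- the caller decides."""
        results = [t.update(frame) for t in self.trackers]
        oks = [ok for ok, _ in results]
        boxes = [box for _, box in results]
        return oks, boxes
```

This mirrors what OpenCV's own `MultiTracker` does internally, but keeps full control over when a failed track is re-initialized by the detector.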
Implementing frame skipping would increase performance by a lot, since for many tasks not every frame per second needs to be processed. I tried to do it like this:
This approach leads to a lot of problems, where the split_model part uses and displays older images. As I am not familiar with Python threading, multiprocessing, and queues, someone with more experience could probably implement this or give hints in the right direction.
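One common way to avoid the stale-image problem between pipeline stages is to drain the queue and keep only the newest item before processing. This is a sketch under the assumption that the stages communicate through a standard `queue.Queue`; the helper name is illustrative:

```python
import queue


def latest(q, timeout=1.0):
    """Block for at least one item, then discard any backlog and
    return only the newest item, so a slow downstream stage
    never works on outdated frames."""
    item = q.get(timeout=timeout)  # wait for at least one frame
    while True:
        try:
            item = q.get_nowait()  # drain older frames, keep the last
        except queue.Empty:
            return item
```

Calling `latest(q)` instead of `q.get()` in the slow stage makes the queue behave like the single-slot "latest frame" buffer that WebcamVideoStream already uses, at the cost of dropping intermediate frames.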