Back | Next | Contents
Object Detection

Object Tracking on Video

Although modern detection DNNs are accurate enough to essentially do "tracking by detection," some degree of temporal filtering is beneficial for smoothing over blips in the detections and temporary occlusions in the video. jetson-inference includes basic (but fast) multi-object tracking using frame-to-frame IOU (intersection-over-union) bounding box comparisons from High-Speed Tracking-by-Detection Without Using Image Information (DeepStream has more comprehensive tracking implementations available from here).
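The IOU metric the tracker matches on is a simple geometric computation. Here is a minimal plain-Python sketch of it (for illustration only, not the library's C++ implementation):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (left, top, right, bottom)."""
    # intersection rectangle
    left   = max(box_a[0], box_b[0])
    top    = max(box_a[1], box_b[1])
    right  = min(box_a[2], box_b[2])
    bottom = min(box_a[3], box_b[3])

    inter = max(0.0, right - left) * max(0.0, bottom - top)
    if inter == 0.0:
        return 0.0

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)  # intersection / union

print(iou((0, 0, 10, 10), (0, 0, 10, 10)))    # identical boxes -> 1.0
print(iou((0, 0, 10, 10), (20, 20, 30, 30)))  # disjoint boxes -> 0.0
```

A detection in the new frame is matched to an existing track when this score exceeds the overlap threshold (0.5 by default, per the `--tracker-overlap` option below).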

To enable tracking with detectnet/detectnet.py, run it with the --tracking flag.

```shell
# Download test video
$ wget https://nvidia.box.com/shared/static/veuuimq6pwvd62p9fresqhrrmfqz0e2f.mp4 -O pedestrians.mp4

# C++
$ detectnet --model=peoplenet --tracking pedestrians.mp4 pedestrians_tracking.mp4

# Python
$ detectnet.py --model=peoplenet --tracking pedestrians.mp4 pedestrians_tracking.mp4
```

There are additional tracking settings that you can change with the following command-line options:

```
objectTracker arguments:
  --tracking               flag to enable the default tracker (IOU)
  --tracker-min-frames=N   number of re-identified frames for a track to be considered valid (default: 3)
  --tracker-lost-frames=N  number of consecutive lost frames before a track is removed (default: 15)
  --tracker-overlap=N      how much IOU overlap is required for a bounding box to be matched (default: 0.5)
```
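For example, the options above can be combined to require stricter overlap and tolerate longer occlusions (the values here are illustrative, not recommendations):

```shell
# require 60% IOU for a match, 5 frames to confirm a track,
# and 30 consecutive missed frames before a track is dropped
$ detectnet.py --model=peoplenet --tracking \
               --tracker-min-frames=5 \
               --tracker-lost-frames=30 \
               --tracker-overlap=0.6 \
               pedestrians.mp4 pedestrians_tracking.mp4
```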

These settings can also be configured from Python with the detectNet.SetTrackingParams() function, or in C++ through the objectTracker API:

Python

```python
from jetson_inference import detectNet

net = detectNet()

net.SetTrackingEnabled(True)
net.SetTrackingParams(minFrames=3, dropFrames=15, overlapThreshold=0.5)
```
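Putting the pieces together, a full video pipeline with tracking enabled might look like the following sketch (this assumes a Jetson device with jetson-inference installed and uses the pedestrians.mp4 test video downloaded earlier; the loop structure follows the standard jetson-inference videoSource/videoOutput pattern):

```python
from jetson_inference import detectNet
from jetson_utils import videoSource, videoOutput

net = detectNet("peoplenet")
net.SetTrackingEnabled(True)
net.SetTrackingParams(minFrames=3, dropFrames=15, overlapThreshold=0.5)

input = videoSource("pedestrians.mp4")
output = videoOutput("pedestrians_tracking.mp4")

while True:
    img = input.Capture()
    if img is None:             # capture timeout
        continue
    detections = net.Detect(img)
    for det in detections:
        if det.TrackID >= 0:    # -1 means not yet a confirmed track
            print(f"track {det.TrackID}: class {det.ClassID}")
    output.Render(img)
    if not input.IsStreaming() or not output.IsStreaming():
        break
```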

C++

```cpp
#include <jetson-inference/detectNet.h>
#include <jetson-inference/objectTrackerIOU.h>

detectNet* net = detectNet::Create();

net->SetTracker(objectTrackerIOU::Create(3, 15, 0.5f));
```

To experiment with these settings interactively, you can use the Flask webapp from your browser.

Next | Semantic Segmentation
Back | Using TAO Detection Models

© 2016-2023 NVIDIA | Table of Contents