OpenMMLab's Next Generation Action Understanding Toolbox and Benchmark

Introduction


MMAction2 is an open-source toolbox for action understanding based on PyTorch. It is a part of the OpenMMLab project.

The master branch works with PyTorch 1.3+.

Major Features

  • Modular design

    We decompose the action understanding framework into separate components, so a customized action understanding pipeline can be built by combining different modules (see the config sketch after this list).

  • Support for various datasets

    The toolbox directly supports multiple datasets, including UCF101, Kinetics-400, Something-Something V1 & V2, Moments in Time, Multi-Moments in Time, THUMOS14, etc.

  • Support for multiple action understanding frameworks

    MMAction2 implements popular frameworks for action understanding:

    • For action recognition, various algorithms are implemented, including TSN, TSM, R(2+1)D, I3D, SlowOnly, SlowFast, Non-local.

    • For temporal action localization, we implement BSN and BMN.

  • Well tested and documented

    We provide detailed documentation and API reference, as well as unit tests.
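
To illustrate the modular design mentioned above, here is a minimal, hedged config sketch: a recognizer is assembled from interchangeable backbone, head, and data-pipeline modules. The specific type names and fields (`Recognizer2D`, `ResNet`, `TSNHead`, the pipeline transforms) are assumptions and may differ between MMAction2 versions; treat this as a sketch, not the exact shipped config.

```python
# Minimal, hypothetical config sketch illustrating the modular design.
# Type names and fields are assumptions and may differ across MMAction2 versions.
model = dict(
    type='Recognizer2D',              # recognizer wrapper for 2D (TSN-style) models
    backbone=dict(                    # swappable feature extractor
        type='ResNet',
        depth=50,
        pretrained='torchvision://resnet50'),
    cls_head=dict(                    # swappable classification head
        type='TSNHead',
        num_classes=400,
        in_channels=2048))

# Data pipelines are composed the same way: a list of interchangeable transforms.
train_pipeline = [
    dict(type='SampleFrames', clip_len=1, frame_interval=1, num_clips=3),
    dict(type='RawFrameDecode'),
    dict(type='Resize', scale=(-1, 256)),
    dict(type='RandomResizedCrop'),
    dict(type='Flip', flip_ratio=0.5),
    dict(type='FormatShape', input_format='NCHW'),
]
```

Swapping `backbone` or `cls_head` for another registered module (or editing the pipeline list) is how a customized framework is put together without touching the training code.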

License

This project is released under the Apache 2.0 license.

Benchmark

| Model | Input | IO backend | Batch size x GPUs | MMAction2 (s/iter) | MMAction (s/iter) | Temporal-Shift-Module (s/iter) | PySlowFast (s/iter) |
| :--- | :--- | :--- | :--- | :--- | :--- | :--- | :--- |
| TSN | 256p rawframes | Memcached | 32x8 | 0.32 | 0.38 | 0.42 | x |
| TSN | 256p dense-encoded video | Disk | 32x8 | 0.61 | x | x | TODO |
| I3D heavy | 256p videos | Disk | 8x8 | 0.34 | x | x | 0.44 |
| I3D | 256p rawframes | Memcached | 8x8 | 0.43 | 0.56 | x | x |
| TSM | 256p rawframes | Memcached | 8x8 | 0.31 | x | 0.41 | x |
| SlowOnly | 256p videos | Disk | 8x8 | 0.32 | TODO | x | 0.34 |
| SlowFast | 256p videos | Disk | 8x8 | 0.69 | x | x | 1.04 |
| R(2+1)D | 256p videos | Disk | 8x8 | 0.45 | x | x | x |

Details can be found in benchmark.

ModelZoo

Supported methods for action recognition:

  • TSN
  • TSM
  • R(2+1)D
  • I3D
  • SlowOnly
  • SlowFast
  • Non-local

Supported methods for action localization:

  • BSN
  • BMN

Results and models are available in the README.md of each method's config directory. A summary can be found on the model zoo page.

Installation

Please refer to install.md for installation.

Data Preparation

Please refer to data_preparation.md for an overview of data preparation.

Get Started

Please see getting_started.md for the basic usage of MMAction2. There are also tutorials for finetuning models, adding new datasets, designing data pipelines, and adding new modules.

A Colab tutorial is also provided; you may preview the notebook or run it directly on Colab.
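
For a quick feel of the high-level Python API, the snippet below sketches single-video inference. The config and checkpoint paths are placeholders, and the exact signature of `inference_recognizer` has changed across versions, so treat this as a hedged example rather than the canonical usage from getting_started.md.

```python
# Hedged sketch of high-level inference with MMAction2.
# Paths are placeholders; the inference_recognizer signature varies by version.
from mmaction.apis import init_recognizer, inference_recognizer

config_file = 'configs/recognition/tsn/tsn_r50_video_inference_1x1x3_100e_kinetics400_rgb.py'
checkpoint_file = 'checkpoints/tsn_r50_kinetics400.pth'  # e.g. downloaded from the model zoo

# Build the model from the config and load pretrained weights.
model = init_recognizer(config_file, checkpoint_file, device='cuda:0')

# Run inference on a single video; recent versions return (label, score) pairs.
results = inference_recognizer(model, 'demo/demo.mp4')
print(results)
```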

Contributing

We appreciate all contributions to improve MMAction2. Please refer to CONTRIBUTING.md for the contributing guideline.

Acknowledgement

MMAction2 is an open-source project contributed to by researchers and engineers from various colleges and companies. We appreciate all the contributors who implement their methods or add new features, as well as users who give valuable feedback. We hope that the toolbox and benchmark serve the growing research community by providing a flexible toolkit to reimplement existing methods and develop their own new models.
