SLAC
A Sparsely Labeled ACtions Dataset
200 classes, 520K untrimmed videos, 1.75M clip annotations

About

This project presents SLAC (Sparsely Labeled ACtions), a new video dataset for action recognition and localization. It consists of over 520K untrimmed videos and 1.75M clip annotations spanning 200 action categories. With our proposed framework, annotating a video clip takes merely 8.8 seconds on average, cutting labeling time by over 95% compared to the traditional procedure of manually trimming and localizing actions. We show that this large-scale dataset can be used to effectively pretrain action recognition and detection models, significantly improving final metrics after fine-tuning on smaller-scale benchmarks, e.g., HMDB-51, UCF-101, ActivityNet, and Kinetics.

Large-scale Dataset

The SLAC dataset includes:

  • 200 classes
  • 520K untrimmed videos
  • 1.75M clip annotations

Efficient Annotations

SLAC annotation costs:

  • 8.8 seconds per clip
  • 30.6 seconds per video
  • Time savings: >95%

Benchmarking

SLAC-pretrained models improve:

  • Action recognition:
    Kinetics, UCF-101 and HMDB-51
  • Action localization:
    THUMOS, ActivityNet

Explore


Each row shows the clips sampled from one untrimmed video, their corresponding start and end times (start, end), and their annotations (Positive or Negative).
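The per-video annotations described above are simple (start, end, label) records. A minimal sketch of how such records could be loaded and filtered, assuming a hypothetical JSON layout and field names (the official release format may differ):

```python
import json

# Hypothetical annotation record for one untrimmed video: each sampled
# clip carries a (start, end) time span in seconds and a Positive/Negative
# label for the video's action class. Field names are illustrative only.
sample = """
{
  "video_id": "abc123",
  "action": "juggling",
  "clips": [
    {"start": 1.5, "end": 3.5, "label": "Positive"},
    {"start": 10.0, "end": 12.0, "label": "Negative"}
  ]
}
"""

record = json.loads(sample)

# Keep only clips annotated as containing the action.
positives = [c for c in record["clips"] if c["label"] == "Positive"]
print(record["action"], len(positives))  # juggling 1
```

Positive clips can serve as trimmed training examples for recognition, while the (start, end) spans of both labels supply supervision for localization.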

Download

Download our paper, dataset, code, and pretrained models.

Paper

  • Paper released on arXiv.
  • Supplementary materials released.

Dataset

  • Coming soon.

Code and Models

  • Coming soon.

If you find our work helpful, please cite the following paper:

    @article{zhao2017slac,
      title={SLAC: A Sparsely Labeled Dataset for Action Classification and Localization},
      author={Zhao, Hang and Yan, Zhicheng and Wang, Heng and Torresani, Lorenzo and Torralba, Antonio},
      journal={arXiv preprint arXiv:1712.09374},
      year={2017}
    }

Our Team

This work is a joint effort of several researchers.

  • Hang Zhao (MIT)
  • Zhicheng Yan (Facebook Research)
  • Heng Wang (Facebook Research)
  • Lorenzo Torresani (Dartmouth College)
  • Manohar Paluri (Facebook Research)
  • Antonio Torralba (MIT)