UCF101: A Dataset of 101 Human Actions Classes From Videos in The Wild
2012-12-03
Khurram Soomro, Amir Roshan Zamir, Mubarak Shah
- github.com/mcgridles/LENS (PyTorch) ★ 9
- github.com/rlaengud123/CMC_LRCN (PyTorch) ★ 0
- github.com/ryanchesler/comma-speed-challenge (TensorFlow) ★ 0
- github.com/doronharitan/human_activity_recognition_LRCN (PyTorch) ★ 0
- github.com/wushidonguc/two-stream-action-recognition-keras ★ 0
- github.com/niveditarahurkar/CS231N-ActionRecognition (PyTorch) ★ 0
- github.com/Alexyuda/action_recognition (PyTorch) ★ 0
Abstract
We introduce UCF101, currently the largest dataset of human actions. It consists of 101 action classes, over 13k clips, and 27 hours of video data. The database consists of realistic, user-uploaded videos containing camera motion and cluttered backgrounds. Additionally, we provide baseline action recognition results on this new dataset using the standard bag-of-words approach, with an overall performance of 44.5%. To the best of our knowledge, UCF101 is currently the most challenging dataset of actions due to its large number of classes, large number of clips, and the unconstrained nature of those clips.
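The bag-of-words baseline mentioned in the abstract can be sketched roughly as: cluster local descriptors into a visual vocabulary, encode each clip as a histogram of vocabulary assignments, and train a classifier on those histograms. The sketch below uses random vectors in place of real video descriptors, and the vocabulary size, descriptor dimension, and classifier settings are placeholders, not the paper's actual configuration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Synthetic stand-in for local spatio-temporal descriptors; a real pipeline
# would extract these from video frames (e.g. interest-point features).
def random_descriptors(n, dim=16):
    return rng.normal(size=(n, dim))

train_desc = [random_descriptors(int(rng.integers(20, 40))) for _ in range(10)]

# 1. Build a visual vocabulary by clustering all training descriptors.
vocab = KMeans(n_clusters=8, n_init=10, random_state=0).fit(np.vstack(train_desc))

# 2. Encode each clip as a normalized histogram over vocabulary words.
def bow_histogram(descriptors):
    words = vocab.predict(descriptors)
    hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
    return hist / hist.sum()

X = np.array([bow_histogram(d) for d in train_desc])
y = np.array([0, 1] * 5)  # toy binary action labels; UCF101 has 101 classes

# 3. Train a linear classifier on the histograms (SVMs are the usual choice).
clf = LinearSVC().fit(X, y)
print(clf.score(X, y))
```

In the actual benchmark, accuracy is averaged over the dataset's three predefined train/test splits rather than measured on the training set as in this toy example.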
Benchmark Results
| Dataset | Model | Metric | Claimed (%) | Verified | Status |
|---|---|---|---|---|---|
| UCF101 | Baseline UCF101 | 3-fold Accuracy | 43.9 | — | Unverified |