Recognition of Activities from Eye Gaze and Egocentric Video
2018-05-18
Anjith George, Aurobinda Routray
Abstract
This paper presents a framework for recognizing human activity from egocentric video and eye-tracking data obtained with a head-mounted eye tracker. Three channels of information (eye movement, ego-motion, and visual features) are combined to classify activities. Image features are extracted using a pre-trained convolutional neural network, while eye movement and ego-motion are quantized and represented as windowed histograms. The combined feature set achieves higher activity-classification accuracy than any individual channel alone.
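The windowed-histogram step can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the number of quantization bins, the window length, and the step size are assumed values chosen for the example, since the abstract does not specify them.

```python
import numpy as np

def windowed_histograms(labels, n_bins, win, step):
    """Sliding-window histograms over a sequence of quantized event labels.

    labels : 1-D integer array with values in [0, n_bins), e.g. quantized
             gaze-direction or ego-motion codes (assumed encoding).
    Returns an array of shape (n_windows, n_bins), one normalized
    histogram per window, usable as a classifier feature vector.
    """
    feats = []
    for start in range(0, len(labels) - win + 1, step):
        window = labels[start:start + win]
        hist = np.bincount(window, minlength=n_bins).astype(float)
        feats.append(hist / hist.sum())  # normalize to a distribution
    return np.array(feats)

# Toy example: 8 hypothetical quantization bins over 100 gaze samples.
rng = np.random.default_rng(0)
gaze_bins = rng.integers(0, 8, size=100)
X = windowed_histograms(gaze_bins, n_bins=8, win=30, step=10)
print(X.shape)  # (8, 8): 8 windows, each an 8-bin histogram
```

In a combined pipeline, histograms from each motion channel would be concatenated with the CNN image features before classification; the concatenation order and classifier choice are left open here because the abstract does not state them.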