
Recognizing Activities of Daily Living from Egocentric Images

2017-04-13

Alejandro Cartas, Juan Marín, Petia Radeva, Mariella Dimiccoli


Abstract

Recognizing Activities of Daily Living (ADLs) has a large number of health applications, such as characterizing lifestyle for habit improvement, nursing, and rehabilitation services. Wearable cameras can gather large amounts of image data daily, providing richer visual information about ADLs than other wearable sensors. In this paper, we explore the classification of ADLs from images captured by a low temporal resolution wearable camera (2 fpm) using a Convolutional Neural Network (CNN) approach. We show that the classification accuracy of a CNN improves substantially when its output is combined, through a random decision forest, with contextual information from a fully connected layer. The proposed method was tested on a subset of the NTCIR-12 egocentric dataset, consisting of 18,674 images, and achieved an overall activity recognition accuracy of 86% on 21 classes.
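The abstract does not spell out the fusion step beyond this description, but the late-fusion idea — concatenating the CNN's class probabilities with fully connected layer features and letting a random decision forest make the final decision — can be sketched as follows. Random arrays stand in for the CNN outputs, and the feature dimension and forest size are illustrative assumptions, not values from the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_train, n_test = 200, 50
n_classes = 21   # activity classes, as in the paper
fc_dim = 64      # hypothetical fully connected layer width

# Stand-ins for per-image CNN outputs: softmax probabilities over the
# 21 activity classes, plus features from a fully connected layer.
softmax_train = rng.dirichlet(np.ones(n_classes), size=n_train)
fc_train = rng.normal(size=(n_train, fc_dim))
y_train = rng.integers(0, n_classes, size=n_train)

softmax_test = rng.dirichlet(np.ones(n_classes), size=n_test)
fc_test = rng.normal(size=(n_test, fc_dim))

# Late fusion: concatenate both sources of evidence and let a random
# decision forest produce the final activity label.
X_train = np.hstack([softmax_train, fc_train])
X_test = np.hstack([softmax_test, fc_test])

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
pred = forest.predict(X_test)
```

With real data, `softmax_*` and `fc_*` would be extracted from the trained CNN for each egocentric image; the forest then learns which combination of predicted class scores and contextual features best separates the 21 activities.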
