UT Austin Villa Publications


Robot-centric Activity Recognition 'in the Wild'

Ilaria Gori, Jivko Sinapov, Priyanka Khante, Peter Stone, and J.K. Aggarwal. Robot-centric Activity Recognition 'in the Wild'. In Proceedings of the International Conference on Social Robotics (ICSR), October 2015.

Download

[PDF] 361.3kB

Abstract

This paper considers the problem of recognizing spontaneous human activities from a robot's perspective. We present a novel dataset, where data is collected by an autonomous mobile robot moving around in a building and recording the activities of people in the surroundings. Activities are not specified beforehand and humans are not prompted to perform them in any way. Instead, labels are determined on the basis of the recorded spontaneous activities. The classification of such activities presents a number of challenges, as the robot's movement affects its perceptions. To address it, we propose a combined descriptor that, along with visual features, integrates information related to the robot's actions. We show experimentally that such information is important for classifying natural activities with high accuracy. Along with initial results for future benchmarking, we also provide an analysis of the usefulness and importance of the various features for the activity recognition task.
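
To make the idea of a "combined descriptor" concrete, here is a minimal sketch of the general approach: concatenating a visual feature vector with a robot-action feature vector before training a standard classifier. This is an illustrative assumption, not the paper's actual pipeline; the feature names, dimensions, and the choice of an SVM classifier are all hypothetical.

import numpy as np
from sklearn.svm import SVC

def combined_descriptor(visual_feat: np.ndarray, robot_action_feat: np.ndarray) -> np.ndarray:
    # Concatenate visual features with robot-action features
    # (e.g., translational/rotational velocity) into one descriptor.
    return np.concatenate([visual_feat, robot_action_feat])

# Toy data: 100 clips, 256-d visual features, 4-d robot-action features, 5 activity labels.
rng = np.random.default_rng(0)
visual = rng.normal(size=(100, 256))
actions = rng.normal(size=(100, 4))
labels = rng.integers(0, 5, size=100)

# Build combined descriptors and fit a classifier on the labeled clips.
X = np.stack([combined_descriptor(v, a) for v, a in zip(visual, actions)])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:3]))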

BibTeX

@inproceedings{ICSR2015-gori,
  title={Robot-centric Activity Recognition 'in the Wild'},
  author={Ilaria Gori and Jivko Sinapov and Priyanka Khante and Peter Stone and J.K.\ Aggarwal},
  booktitle={Proceedings of the International Conference on Social Robotics (ICSR)},
  month={October},
  year={2015},
  abstract={This paper considers the problem of recognizing spontaneous
human activities from a robot's perspective. We present a novel dataset,
where data is collected by an autonomous mobile robot moving around
in a building and recording the activities of people in the surroundings.
Activities are not specified beforehand and humans are not prompted to
perform them in any way. Instead, labels are determined on the basis
of the recorded spontaneous activities. The classification of such
activities presents a number of challenges, as the robot's movement
affects its perceptions. To address it, we propose a combined descriptor
that, along with visual features, integrates information related to the
robot's actions. We show experimentally that such information is important
for classifying natural activities with high accuracy. Along with initial
results for future benchmarking, we also provide an analysis of the
usefulness and importance of the various features for the activity
recognition task.},
}

Generated by bib2html.pl (written by Patrick Riley) on Tue Nov 19, 2024 10:29:30