UT Austin Villa Publications


Socially CompliAnt Navigation Dataset (SCAND): A Large-Scale Dataset Of Demonstrations For Social Navigation

Haresh Karnan, Anirudh Nair, Xuesu Xiao, Garrett Warnell, Soren Pirk, Alexander Toshev, Justin Hart, Joydeep Biswas, and Peter Stone. Socially CompliAnt Navigation Dataset (SCAND): A Large-Scale Dataset Of Demonstrations For Social Navigation. IEEE Robotics and Automation Letters (RA-L), 7(4):11807–11814, October 2022.
Dataset; Poster; Video Presentation

Download

[PDF] (4.2MB)

Abstract

Social navigation is the capability of an autonomous agent, such as a robot, to navigate in a "socially compliant" manner in the presence of other intelligent agents such as humans. With the emergence of autonomously navigating mobile robots in human-populated environments (e.g., domestic service robots in homes and restaurants and food delivery robots on public sidewalks), incorporating socially compliant navigation behaviors on these robots becomes critical to ensuring safe and comfortable human-robot coexistence. To address this challenge, imitation learning is a promising framework, since it is easier for humans to demonstrate the task of social navigation than to formulate reward functions that accurately capture the complex multi-objective setting of social navigation. The application of imitation learning and inverse reinforcement learning to social navigation for mobile robots, however, is currently hindered by a lack of large-scale datasets that capture socially compliant robot navigation demonstrations in the wild. To fill this gap, we introduce the Socially CompliAnt Navigation Dataset (SCAND)—a large-scale, first-person-view dataset of socially compliant navigation demonstrations. Our dataset contains 8.7 hours, 138 trajectories, and 25 miles of socially compliant, human tele-operated driving demonstrations, comprising multi-modal data streams including 3D lidar, joystick commands, odometry, and visual and inertial information, collected on two morphologically different mobile robots—a Boston Dynamics Spot and a Clearpath Jackal—by four different human demonstrators in both indoor and outdoor environments. We additionally perform preliminary analysis and validation through real-world robot experiments and show that navigation policies learned by imitation learning on SCAND generate socially compliant behaviors.

BibTeX

@Article{RAL22-Karnan,
  author = {Haresh Karnan and Anirudh Nair and Xuesu Xiao and Garrett Warnell and Soren Pirk and Alexander Toshev and Justin Hart and Joydeep Biswas and Peter Stone},
  title = {Socially CompliAnt Navigation Dataset (SCAND): A Large-Scale Dataset Of Demonstrations For Social Navigation},
  journal = {IEEE Robotics and Automation Letters (RA-L)},
  location = {Japan},
  month = {October},
  year = {2022},
  pages = {11807--11814},
  volume = {7},
  number = {4},
  abstract = {Social navigation is the capability of an autonomous agent, such as a robot, to navigate in a "socially compliant" manner in the presence of other intelligent agents such as humans. With the emergence of autonomously navigating mobile robots in human-populated environments (e.g., domestic service robots in homes and restaurants and food delivery robots on public sidewalks), incorporating socially compliant navigation behaviors on these robots becomes critical to ensuring safe and comfortable human-robot coexistence. To address this challenge, imitation learning is a promising framework, since it is easier for humans to demonstrate the task of social navigation than to formulate reward functions that accurately capture the complex multi-objective setting of social navigation. The application of imitation learning and inverse reinforcement learning to social navigation for mobile robots, however, is currently hindered by a lack of large-scale datasets that capture socially compliant robot navigation demonstrations in the wild. To fill this gap, we introduce the Socially CompliAnt Navigation Dataset (SCAND)—a large-scale, first-person-view dataset of socially compliant navigation demonstrations. Our dataset contains 8.7 hours, 138 trajectories, and 25 miles of socially compliant, human tele-operated driving demonstrations, comprising multi-modal data streams including 3D lidar, joystick commands, odometry, and visual and inertial information, collected on two morphologically different mobile robots—a Boston Dynamics Spot and a Clearpath Jackal—by four different human demonstrators in both indoor and outdoor environments. We additionally perform preliminary analysis and validation through real-world robot experiments and show that navigation policies learned by imitation learning on SCAND generate socially compliant behaviors.},
  wwwnote={
<A href="https://dataverse.tdl.org/dataset.xhtml?persistentId=doi:10.18738/T8/0PRYRH">Dataset</a>; 
<a href="https://hareshkarnan.github.io/data/SCAND_poster.pdf">Poster</a>; 
<a href="https://youtu.be/QgBfMjWpQIw">Video Presentation</a>}
}

Generated by bib2html.pl (written by Patrick Riley) on Sun Nov 24, 2024 20:30:02