Peter Stone's Selected Publications



Improving Grounded Natural Language Understanding through Human-Robot Dialog

Improving Grounded Natural Language Understanding through Human-Robot Dialog.
Jesse Thomason, Aishwarya Padmakumar, Jivko Sinapov, Nick Walker, Yuqian Jiang, Harel Yedidsion, Justin Hart, Peter Stone, and Raymond Mooney.
In Proceedings of the International Conference on Robotics and Automation (ICRA 2019), May 2019.

Download

[PDF] 1.6MB

Abstract

Natural language understanding for robotics can require substantial domain- and platform-specific engineering. For example, for mobile robots to pick-and-place objects in an environment to satisfy human commands, we can specify the language humans use to issue such commands, and connect concept words like "red can" to physical object properties. One way to alleviate this engineering for a new domain is to enable robots in human environments to adapt dynamically — continually learning new language constructions and perceptual concepts. In this work, we present an end-to-end pipeline for translating natural language commands to discrete robot actions, and use clarification dialogs to jointly improve language parsing and concept grounding. We train and evaluate this agent in a virtual setting on Amazon Mechanical Turk, and we transfer the learned agent to a physical robot platform to demonstrate it in the real world.
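
The abstract describes a parse-then-ground pipeline in which clarification questions both resolve ambiguity and provide additional training signal. The short Python sketch below is an illustration of one way such a loop could be organized; it is not the authors' code, and every name, the confidence threshold, and the toy grounding scores are hypothetical placeholders.

# Illustrative sketch (not the authors' implementation) of a
# parse -> ground -> clarify loop: a command is semantically parsed,
# concept words are grounded against perceived object properties, and a
# yes/no clarification question is asked when grounding is ambiguous.

from dataclasses import dataclass

@dataclass
class Candidate:
    object_id: str
    confidence: float  # how well this object matches the requested concept

# Toy perception: grounding scores for the concept words "red can".
TOY_GROUNDING = {
    "red can": [Candidate("obj_3", 0.55), Candidate("obj_7", 0.40)],
}

def parse_command(utterance: str) -> dict:
    """Toy semantic parser: maps an utterance to a flat action frame."""
    # A real system would use a trained semantic parser here.
    return {"action": "relocate", "patient": "red can", "goal": "kitchen"}

def ground_concept(concept: str) -> list[Candidate]:
    """Toy perceptual grounding: returns visible objects scored for the concept."""
    return sorted(TOY_GROUNDING[concept], key=lambda c: c.confidence, reverse=True)

def resolve_with_dialog(utterance: str, confirm, threshold: float = 0.7):
    """Parse a command and clarify ambiguous groundings with yes/no questions."""
    frame = parse_command(utterance)
    candidates = ground_concept(frame["patient"])
    best = candidates[0]
    # Low confidence -> ask the human; in the paper's setting the answers
    # also serve as training data for the parser and grounding models.
    while candidates and best.confidence < threshold:
        if confirm(f"You want me to act on {best.object_id}, is that right?"):
            return frame, best
        candidates = candidates[1:]
        best = candidates[0] if candidates else None
    return frame, best

if __name__ == "__main__":
    frame, obj = resolve_with_dialog(
        "move the red can to the kitchen",
        confirm=lambda q: input(q + " (y/n) ").strip().lower() == "y",
    )
    print("action frame:", frame, "| grounded object:", obj)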

BibTeX Entry

@InProceedings{ICRA19-thomason,
  author = {Jesse Thomason and Aishwarya Padmakumar and Jivko Sinapov and Nick Walker and Yuqian Jiang and Harel Yedidsion and Justin Hart and Peter Stone and Raymond Mooney},
  title = {Improving Grounded Natural Language Understanding through Human-Robot Dialog},
  booktitle = {Proceedings of the International Conference on Robotics and Automation (ICRA 2019)},
  location = {Montreal, Canada},
  month = {May},
  year = {2019},
  abstract = {
  Natural language understanding for robotics can require substantial domain- 
  and platform-specific engineering. For example, for mobile robots to 
  pick-and-place objects in an environment to satisfy human commands, we can 
  specify the language humans use to issue such commands, and connect concept 
  words like red can to physical object properties. One way to alleviate this 
  engineering for a new domain is to enable robots in human environments to 
  adapt dynamically — continually learning new language constructions and 
  perceptual concepts. In this work, we present an end-to-end pipeline for 
  translating natural language commands to discrete robot actions, and use 
  clarification dialogs to jointly improve language parsing and concept 
  grounding. We train and evaluate this agent in a virtual setting on Amazon 
  Mechanical Turk, and we transfer the learned agent to a physical robot 
  platform to demonstrate it in the real world.
  },
}

Generated by bib2html.pl (written by Patrick Riley) on Tue Nov 19, 2024 10:24:43