Temporal Grounding Graphs for Language Understanding with Accrued Visual-Linguistic Context

by Rohan Paul, Andrei Barbu, Sue Felshin, Boris Katz, Nicholas Roy

Published in International Joint Conference on Artificial Intelligence by International Joint Conferences on Artificial Intelligence Organization.

2017, pp. 4506-4514

Abstract

A robot's ability to understand or ground natural language instructions is fundamentally tied to its knowledge about the surrounding world. We present an approach to grounding natural language utterances in the context of factual information gathered through natural-language interactions and past visual observations. A probabilistic model estimates, from a natural language utterance, the objects, relations, and actions that the utterance refers to and the objectives for future robotic actions it implies; it then generates a plan to execute those actions while updating a state representation to include newly acquired knowledge from the visual-linguistic context. Grounding a command necessitates a representation for past observations and interactions; however, maintaining the full context consisting of all possible observed objects, attributes, spatial relations, actions, etc., over time is intractable. Instead, our model, Temporal Grounding Graphs, maintains a learned state representation for a belief over factual groundings (those derived from natural-language interactions) and lazily infers new groundings from visual observations using the context implied by the utterance. This work significantly expands the range of language that a robot can understand by incorporating factual knowledge and observations of its workspace into its inference about the meaning and grounding of natural-language utterances.
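The abstract describes keeping a belief over dialogue-derived factual groundings while deferring visual grounding until an utterance demands it. The short Python sketch below illustrates that lazy-inference idea only; it is not the authors' implementation, and every name in it (TemporalContext, Fact, lazy_ground, the example detections) is a hypothetical stand-in chosen for illustration.

```python
"""Illustrative sketch (not the paper's code) of lazy grounding with accrued
visual-linguistic context: dialogue facts are stored directly, raw visual
observations are stored cheaply, and only the referents implied by the
current utterance are resolved against past observations."""

from dataclasses import dataclass, field


@dataclass
class Fact:
    """A factual grounding asserted in dialogue, e.g. ('box', 'contains', 'crayons')."""
    subject: str
    relation: str
    obj: str


@dataclass
class TemporalContext:
    """Accrued context: language-derived facts plus past per-frame observations."""
    facts: list[Fact] = field(default_factory=list)
    observations: list[dict] = field(default_factory=list)

    def assert_fact(self, subject: str, relation: str, obj: str) -> None:
        # Facts from language are recorded directly; no perception is needed.
        self.facts.append(Fact(subject, relation, obj))

    def record_observation(self, detections: dict) -> None:
        # Raw observations are appended as-is and are NOT grounded eagerly.
        self.observations.append(detections)

    def lazy_ground(self, referents: set[str]) -> dict[str, list]:
        """Resolve only the symbols the current utterance refers to, scanning
        stored observations on demand instead of maintaining the full
        cross-product of objects, attributes, and relations over time."""
        grounded = {r: [] for r in referents}
        for t, frame in enumerate(self.observations):
            for label, pose in frame.items():
                if label in referents:
                    grounded[label].append((t, pose))
        return grounded


if __name__ == "__main__":
    ctx = TemporalContext()
    ctx.assert_fact("box", "contains", "crayons")                   # from dialogue
    ctx.record_observation({"box": (0.4, 1.2), "cup": (0.1, 0.3)})  # past frame 0
    ctx.record_observation({"box": (0.5, 1.1)})                     # past frame 1
    # "Pick up the box with the crayons" -> only 'box' needs visual grounding.
    print(ctx.lazy_ground({"box"}))
```

The point mirrored from the abstract is that the full context is never enumerated: observations sit untouched until an utterance names a referent, at which point only that referent is grounded against the stored history.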

Archived Files and Locations

application/pdf, 14.6 MB (file_w6upatlay5a45oovgdgkn2zcwu)
  web.archive.org (webarchive)
  dspace.mit.edu (web)
application/pdf, 14.6 MB (file_cuiqzstsfzbntoeiwggkfm7wvu)
  web.archive.org (webarchive)
  www.ijcai.org (web)
Type: paper-conference
Stage: published
Year: 2017
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: edc3d81f-1e32-4da5-85f0-12e270bad2d0