Project Info


Confidence-Based Category Transition of Spatial Gestures

Tom Williams | twilliams@mines.edu

Situated human-human communication typically involves a combination of natural language and gesture, especially deictic gestures intended to draw the listener's attention to target referents. To engage in natural communication, robots must likewise be able not only to generate natural language, but also to generate appropriate gestures to accompany that language. In this project, students will examine the gestures humans use to accompany spatial language, specifically the way that these gestures continuously degrade in specificity and then discretely transition into non-deictic gestural forms as confidence in the referent's location decreases. Students will further use the collected data to design more human-like gestures for language-capable robots.
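As a rough illustration of the transition the project studies, one might model gesture selection as a function of confidence: above some threshold the robot produces a deictic point whose precision degrades continuously (here, a widening pointing cone) as confidence drops, and below the threshold it discretely switches to a non-deictic form. This is a minimal sketch; the threshold, cone widths, and function names are illustrative assumptions, not parameters from the cited study.

```python
from dataclasses import dataclass


@dataclass
class Gesture:
    form: str        # "deictic_point" or "non_deictic"
    cone_deg: float  # aperture of the pointing cone (0.0 when non-deictic)


def select_gesture(confidence: float, threshold: float = 0.4,
                   min_cone: float = 5.0, max_cone: float = 60.0) -> Gesture:
    """Map confidence in the referent's location (0..1) to a gesture.

    Illustrative model only: above `threshold`, specificity degrades
    continuously (wider cone as confidence falls); at the threshold,
    the gesture discretely transitions to a non-deictic form.
    """
    if confidence >= threshold:
        # Continuous degradation: span is 0.0 at full confidence,
        # 1.0 at the threshold, so the cone widens as confidence drops.
        span = (1.0 - confidence) / (1.0 - threshold)
        return Gesture("deictic_point", min_cone + span * (max_cone - min_cone))
    # Discrete transition: below the threshold, no pointing at all.
    return Gesture("non_deictic", 0.0)
```

For example, `select_gesture(1.0)` yields a tight point, `select_gesture(0.5)` a much wider one, and `select_gesture(0.2)` a non-deictic gesture.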

For more information:
https://mirrorlab.mines.edu/publications/stogsdill2020hrinlg/

Grand Challenge: Reverse-engineer the brain

Student Preparation


Qualifications

Familiarity with both human-subjects research and machine learning research techniques.

Time Commitment

5 hours/week

Skills/Techniques Gained

Students will gain interdisciplinary knowledge in machine learning, robotics, psychology, and communication.

Mentoring Plan

Students will meet with Dr. Williams and/or a graduate student at least once per week. The project is expected to lead to publication.