Research Highlight: Glove Prototype Interprets American Sign Language

May 15, 2008

A student team advised by Professor Priya Narasimhan has created a research prototype of a glove that can translate American Sign Language (ASL) into text and sound. Narasimhan, who has led research in other assistive technologies, approached students with an idea for a gesture-recognition glove as part of her Embedded Systems Design capstone course.

The glove works through sensors that detect finger movements and send signatures of those movements to a smartphone over Bluetooth. The smartphone, loaded with gesture-to-text and text-to-speech software, then translates the signatures into audible words. The goal is to enable a person with a hearing disability to communicate with people who do not know ASL.
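The article does not describe the team's actual recognition algorithm, but the gesture-to-text stage it mentions can be sketched as a simple nearest-neighbor match: each gesture arrives as a vector of per-finger flex readings, and the phone picks the stored signature closest to it. Everything below (the signature values, the letters chosen, the distance metric) is a hypothetical illustration, not the prototype's implementation.

```python
import math

# Hypothetical reference signatures: per-finger flex readings
# (0.0 = straight, 1.0 = fully bent) plus a thumb value, for a few
# ASL letters. A real glove would calibrate these per user.
SIGNATURES = {
    "A": (1.0, 1.0, 1.0, 1.0, 0.2),   # fist, thumb alongside
    "B": (0.0, 0.0, 0.0, 0.0, 0.9),   # fingers straight, thumb tucked
    "C": (0.5, 0.5, 0.5, 0.5, 0.5),   # hand curved into a C shape
}

def classify(reading):
    """Return the letter whose stored signature is nearest (Euclidean)
    to the incoming sensor reading."""
    def dist(sig):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(reading, sig)))
    return min(SIGNATURES, key=lambda letter: dist(SIGNATURES[letter]))

# A noisy reading close to the "A" signature still classifies as "A".
print(classify((0.9, 1.0, 0.95, 1.0, 0.3)))  # prints "A"
```

The matched letters would then be buffered into words and handed to the phone's text-to-speech engine, as the article describes.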

Three ECE seniors, Bhargav Bhat, Jorge Meza and Hemant Sikaria, and an ECE graduate student, Wesley Jin, made up the team that created the prototype during the Spring 2008 course. Narasimhan and the team intend to continue its development.

Narasimhan's previous research in assistive technology includes the Trinetra project for the blind. She is co-director of the CyLab Mobility Research Center, based at Carnegie Mellon in Silicon Valley.

