The problem with sensor gloves for translating American Sign Language so far has been that they interpret only letters, which are signed with quick finger gestures, while sign language also uses more complex gestures for whole words and meanings.
A team project started by Kendall Lowrey and fellow students at Carnegie Mellon University aims to fix that problem, and also connects the glove to a voicebox shield for on-the-spot verbalization of the translated signs.
It uses flex sensors for the fingers, and an accelerometer in an attempt to interpret arm gestures. There's also a tiny LCD, an even tinier speaker, a bunch of wires and inputs (belonging mostly to an Arduino Mega), and a lot of math and audio filtering going on.
Currently the prototype glove can recognize only the full alphabet and about ten words, but if it can do that, it can be expanded to do a whole lot more.