
When the wearer moves their tongue, air is pushed around the mouth, producing pressure changes that provide a unique signature of the movement.
The earphone detects the tongue being pressed against the sides of the mouth. Pressing it a different number of times generates different vibrations, which are displayed as signals on a monitor's screen, and the device carries out preset functions according to those signals.
The pressure changes are conveyed from the mouth to the ear canal via the Eustachian tube. The microphone detects these pressure shifts and converts them into electrical signals, which are sent to a computer that translates them into commands to steer the wheelchair.
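As a rough illustration of that pipeline, the Python sketch below shows how a window of in-ear microphone samples might be reduced to a count of tongue presses and mapped to a preset wheelchair command. The article does not describe Think-A-Move's actual software, so the threshold, the command table, and the function names here are all hypothetical.

    # Illustrative sketch only: the article does not describe Think-A-Move's
    # actual algorithms or APIs. The threshold, command set, and function
    # names below are hypothetical, chosen to show the general idea of
    # turning in-ear pressure signals into wheelchair commands.

    # Hypothetical mapping from number of tongue presses detected in a
    # short window to a preset wheelchair command.
    PRESS_COMMANDS = {
        1: "FORWARD",
        2: "STOP",
        3: "TURN_LEFT",
        4: "TURN_RIGHT",
    }

    PRESS_THRESHOLD = 0.5  # assumed amplitude threshold for a pressure spike


    def count_presses(samples):
        """Count rising edges above the threshold, i.e. distinct pressure
        spikes picked up by the in-ear microphone."""
        presses = 0
        above = False
        for value in samples:
            if value > PRESS_THRESHOLD and not above:
                presses += 1
                above = True
            elif value <= PRESS_THRESHOLD:
                above = False
        return presses


    def signal_to_command(samples):
        """Translate one window of microphone samples into a command."""
        presses = count_presses(samples)
        return PRESS_COMMANDS.get(presses, "NO_OP")


    if __name__ == "__main__":
        # Simulated window: two distinct pressure spikes -> "STOP" here.
        window = [0.0, 0.1, 0.8, 0.9, 0.2, 0.0, 0.7, 0.85, 0.1, 0.0]
        print(signal_to_command(window))

Counting rising edges over a fixed threshold is only one simple way to detect distinct pressure spikes; a real system would presumably need per-user calibration and rejection of noise from speech, chewing and head movement.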
The chair, invented by Ravi Vaidyanathan at the UK-based University of Southampton and Lalit Gupta of the US-based Southern Illinois University Carbondale, is an improvement on present technology.
Quadriplegics typically have to suck or blow into a straw, which can be unhygienic, or move a computer cursor to guide the wheelchair.
The tongue-guided device is to be launched by American company Switch-IT by the end of the year, according to the report, which appears in next Saturday’s issue of New Scientist.
The InnerPilot technology could also be adapted to enable disabled persons to control lights, electrical appliances, and personal computers on a hands-free basis.
“This technology also has applications in computer gaming and military or law enforcement markets for applications that require the hands to be free for other tasks,” noted Jim West, CEO of Think-A-Move.
http://www.mumbaimirror.com/net/mmp...&contentid=2007070602375057834f1fcbb&pageno=1