Improving touch-screen accuracy
The personal touch

OVER the past few years, the way in which people point has changed. Throughout most of human history, small things, such as fingers, were used to point to bigger things, such as antelopes. But thanks to gadgets with touch-screens, people are now trying to point to things smaller than their fingers, such as hyperlinks or items on a crowded map. As would be expected, accuracy suffers.

Software engineers have cleverly concealed the extent of the most obvious difficulty, which is that people tend to miss the intended target. When using a touch-screen keyboard, for example, the letters already tapped out give the software clues about which letter might come next. But, at present, software cannot correct for a person who, say, consistently points left of where he intends to point. Nor can it take account of fingers poked from different angles: approaching from directly above a screen often produces a different result from coming in at a low angle.

Patrick Baudisch, a computer scientist at the Hasso Plattner Institute in Potsdam, Germany, thinks that he can improve the accuracy of touch-screens by identifying exactly where individuals intend to point. He and his graduate student, Christian Holz, have used a fingerprint scanner like those employed by the immigration authorities in some of the world’s more paranoid countries to capture a detailed image of a person’s finger as it touches the screen, along with the finger’s yaw, pitch and roll once it is in contact.

When a finger is placed on the scanner’s screen, it scatters light coming from within the device. A camera collects this light and produces a high-quality fingerprint. The researchers used this information to develop a pointing profile by determining the core of a person’s “point”—that place on their finger where their pointing intention is centred, regardless of the direction from which the finger has come or the surface area of the contact. They are hoping, too, to slim the device down using a technology called “in cell” which, in effect, adds a light sensor to every pixel in a screen.
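The idea of a pointing profile can be sketched in code. The snippet below is an illustration only, not the researchers' actual algorithm: it assumes a calibration phase in which the user taps known targets, then stores the average miss (offset) for each bucket of finger poses, so that later touches made from a similar angle can be corrected. All class and method names here are hypothetical.

```python
# Illustrative sketch of a per-user "pointing profile" (hypothetical names):
# learn the average offset between where a finger lands and where the user
# intended to point, bucketed by the finger's pose (yaw, pitch, roll).
from collections import defaultdict


class PointingProfile:
    """Stores a user's average miss for each finger-pose bucket."""

    def __init__(self, bucket_deg=15):
        self.bucket_deg = bucket_deg
        # pose bucket -> list of (dx, dy) calibration offsets
        self.offsets = defaultdict(list)

    def _bucket(self, yaw, pitch, roll):
        b = self.bucket_deg
        return (round(yaw / b), round(pitch / b), round(roll / b))

    def calibrate(self, yaw, pitch, roll, touched, intended):
        """Record one calibration tap: where the finger landed vs the target."""
        dx = intended[0] - touched[0]
        dy = intended[1] - touched[1]
        self.offsets[self._bucket(yaw, pitch, roll)].append((dx, dy))

    def correct(self, yaw, pitch, roll, touched):
        """Shift a raw touch point by the mean offset seen for this pose."""
        samples = self.offsets.get(self._bucket(yaw, pitch, roll))
        if not samples:
            return touched  # no calibration data for this pose
        dx = sum(s[0] for s in samples) / len(samples)
        dy = sum(s[1] for s in samples) / len(samples)
        return (touched[0] + dx, touched[1] + dy)
```

For example, a user who consistently lands a few pixels left of the target when approaching from a shallow angle would, after calibration, have every touch from that angle nudged rightward toward the intended point.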

To hasten the adoption of their technology, Dr Baudisch and Mr Holz have also developed algorithms to match a person’s fingerprint to his pointing profile. These algorithms could be built into gadget software so that, for example, a mobile phone recognises who is using it and adjusts itself accordingly. It might even try to guess which number you will ring next.
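The matching step could work along these lines. This is a deliberately simplified sketch with hypothetical names: real fingerprint matching scores a scanned print against stored templates rather than looking up an exact key, but the structure, a registry mapping recognised users to their stored profiles, is the same.

```python
# Hypothetical sketch: map a recognised fingerprint to its owner's stored
# pointing profile, so a shared device can adjust itself per user.
class ProfileRegistry:
    def __init__(self):
        self._profiles = {}

    def enroll(self, fingerprint_id, profile):
        """Associate an identified fingerprint with a pointing profile."""
        self._profiles[fingerprint_id] = profile

    def lookup(self, fingerprint_id):
        # Real matching would be fuzzy (template scoring); an exact
        # dictionary key stands in for that here.
        return self._profiles.get(fingerprint_id)
```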
 