Over the past decade, touch screens have changed the way we interact with our electronic devices. Gone are the days of clicking mice and pecking at keyboards, with these gestures replaced by swipes, taps, and long-presses on most of our newer devices. From the early years spent swapping out vacuum tubes and reading light indicators, human interaction with computers has been constantly evolving. Can Qeexo's FingerSense usher in the next era of manual input?
While still in development, FingerSense promises to utilize more than just our fingertips. Knuckles, nails, and even different parts of a stylus can be differentiated, allowing for many different input modes.
The technology operates by collecting data from both the touch screen and the microphone. The tap of a fingernail on the screen sounds different from that of a stylus, and FingerSense knows it.
By differentiating between these taps, FingerSense makes various operations easier to perform, with a distinct action associated with each type of touch.
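To make the idea concrete, such a pipeline might look something like the toy sketch below. The feature names, thresholds, and action mapping here are invented purely for illustration; they are not drawn from Qeexo's actual system:

```python
# Illustrative sketch only -- NOT Qeexo's real implementation.
# Assume the platform has already extracted two simple features from a
# touch event: the contact area from the screen, and the dominant
# frequency of the tap sound from the microphone.

def classify_touch(contact_area_mm2, dominant_freq_hz):
    """Toy rule-based classifier for touch type.

    The thresholds are made up for demonstration: a soft fingertip pad
    produces a lower-frequency thud, while nails, knuckles, and styli
    produce sharper, higher-frequency taps.
    """
    if dominant_freq_hz < 1000:
        return "fingertip"       # soft pad, low-frequency sound
    if contact_area_mm2 > 20:
        return "knuckle"         # hard but broad contact
    return "nail_or_stylus"      # small, sharp, high-frequency tap

# Each touch type could then trigger its own action, as FingerSense
# promises (hypothetical mapping):
ACTIONS = {
    "fingertip": "select",
    "knuckle": "context_menu",
    "nail_or_stylus": "draw",
}

touch_type = classify_touch(35.0, 2500.0)
print(touch_type, "->", ACTIONS[touch_type])  # knuckle -> context_menu
```

A production system would of course replace these hand-picked rules with a trained classifier over richer acoustic and capacitive features, but the basic flow of sense, classify, and dispatch is the same.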
The technology is not quite ready, but as seen in the video above, FingerSense is already functional and being tested on a Nexus 5 (and, judging by the images, a Galaxy S5 as well). Perhaps the future is not so far away, after all.