Over the past decade, touch screens have changed the way we interact with our electronic devices. Gone are the days of clicking mice and pecking at keyboards, with these gestures replaced by swipes, taps, and long-presses on most of our newer devices. From the early years spent swapping out vacuum tubes and reading light indicators, human interaction with computers has been constantly evolving. Can Qeexo's FingerSense usher in the next era in manual input?
While still in development, FingerSense promises to utilize more than just our fingertips. Knuckles, nails, and even different parts of a stylus can be differentiated, allowing for many different input modes.
The technology operates by collecting data from both the touch screen and the microphone. The tap of a fingernail on the screen sounds different from that of a stylus, and FingerSense can tell them apart.
By differentiating between all of these different taps, FingerSense makes various operations easier to perform by having associated actions for each gesture.
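To make the idea concrete, here is a toy sketch of that kind of input dispatch. This is not Qeexo's actual algorithm: the single acoustic feature (dominant frequency of the tap sound), the thresholds, and the action mapping are all invented for illustration, on the assumption that harder surfaces like nails produce sharper, higher-frequency transients than soft fingertip pads.

```python
# Toy illustration (NOT Qeexo's implementation): classify a tap by one
# hypothetical acoustic feature, then dispatch an action per input mode.

def classify_touch(dominant_freq_hz: float) -> str:
    """Guess the touch type from the tap sound's dominant frequency.

    Assumed ordering for this sketch: fingertip pads ring lowest,
    knuckles in the middle, hard fingernails highest. The cutoff
    values are placeholders, not measured data.
    """
    if dominant_freq_hz < 800:
        return "pad"
    elif dominant_freq_hz < 2000:
        return "knuckle"
    return "nail"

# Hypothetical per-mode actions, echoing the article's idea that each
# kind of tap can trigger its own operation.
ACTIONS = {
    "pad": "select",
    "knuckle": "open context menu",
    "nail": "erase",
}

def handle_tap(dominant_freq_hz: float) -> str:
    """Map a tap event to the action bound to its input mode."""
    return ACTIONS[classify_touch(dominant_freq_hz)]
```

A real system would of course fuse many features from the microphone and touch sensor and learn the decision boundaries from training data rather than hard-coding thresholds, but the dispatch structure would look broadly similar.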
The technology is not quite ready, but as seen in the video above, FingerSense is already functional and being tested on a Nexus 5 (and, judging from the images, a Galaxy S5). Perhaps the future is not so far away, after all.