Imagine a mobile device that intelligently anticipates your intended action even before you touch the screen.
That’s the idea behind Pre-Touch Sensing for Mobile Interaction. Ken Hinckley, a principal researcher at Microsoft who led the project, said the research is based on a whole different philosophy of interaction design.
The research uses the phone’s ability to sense how you are gripping the device as well as when and where the fingers are approaching it.
“It uses the hands as a window to the mind,” Hinckley said.
Because the interface adapts to you on the fly, it is always tailored to the specific context of how you are currently holding or using your phone.
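To make that adaptation concrete, here is a minimal sketch of the idea in Python. The sensor inputs (a grip classification and the position of an approaching finger) are hypothetical, since the pre-touch hardware APIs described in the research are not public; the function names and thresholds are illustrative, not Microsoft's implementation.

```python
# Illustrative sketch only: the grip labels and hover coordinates below are
# hypothetical stand-ins for the pre-touch sensor data described in the paper.

def place_menu(grip, hover_x, screen_width=1080):
    """Choose an x-position for a context menu based on how the phone is
    held and where a finger is approaching the screen.

    grip: 'left', 'right', or 'two-handed' (hypothetical grip classes)
    hover_x: x-coordinate of the approaching finger, in pixels
    """
    if grip == "two-handed":
        # Both thumbs available: center the menu under the hovering finger,
        # clamped so a 300px-wide menu stays on screen.
        return max(0, min(hover_x - 150, screen_width - 300))
    if grip == "left":
        # One-handed left grip: keep the menu within easy thumb reach.
        return min(hover_x, screen_width // 3)
    # One-handed right grip: bias the menu toward the right edge.
    return max(hover_x, 2 * screen_width // 3)

print(place_menu("right", 500))  # -> 720: pulled toward the right thumb
```

The point of the sketch is the design principle, not the numbers: the same tap target can land in different places depending on which hand is holding the phone and where the finger is headed.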
“I think it has huge potential for the future of mobile interaction,” he said. “And I say this as one of the very first people to explore the possibilities of sensors on mobile phones, including the now ubiquitous capability to sense and auto-rotate the screen orientation.”