Microsoft is experimenting with display technology and has shown off a research project that would enable a smartphone's touchscreen to predict your movements.
The technology, called "pre-touch sensing," uses sensors embedded in a device to predict how and when you'll touch its display. It can detect how you're holding the phone as well as interactions with the display itself.
Microsoft highlights how this technology could work with a video app. When it senses a hand near the display, playback controls appear on screen. If you happen to be using your device one-handed, these controls will appear on the side where your hand is.
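Microsoft hasn't published implementation details, but the adaptive behavior described above can be sketched as simple placement logic. Everything below (the `PreTouchState` type, the hover threshold, the function name) is a hypothetical illustration, not Microsoft's actual code:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PreTouchState:
    # Distance of the approaching finger above the screen, in mm;
    # None means no hand is detected near the display.
    hover_distance_mm: Optional[float]
    # Which side the phone is gripped on: "left", "right", or None.
    grip_side: Optional[str]

# Assumed proximity at which the UI starts showing controls (invented value).
HOVER_THRESHOLD_MM = 30.0

def place_playback_controls(state: PreTouchState) -> str:
    """Decide where (or whether) to show video playback controls."""
    if state.hover_distance_mm is None or state.hover_distance_mm > HOVER_THRESHOLD_MM:
        return "hidden"            # no hand near the display yet
    if state.grip_side in ("left", "right"):
        return state.grip_side     # one-handed: put controls under the thumb
    return "bottom"                # two-handed or unknown grip: default layout

print(place_playback_controls(PreTouchState(20.0, "left")))   # left
print(place_playback_controls(PreTouchState(None, "right")))  # hidden
```

The key idea the sketch captures is that the UI reacts before contact: proximity drives visibility, and grip drives placement.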
“I think it has huge potential for the future of mobile interaction,” said Ken Hinckley, a principal researcher at Microsoft. “And I say this as one of the very first people to explore the possibilities of sensors on mobile phones, including the now ubiquitous capability to sense and auto-rotate the screen orientation.”
This project is still in its early phases, and it's unclear when, or even if, it will make its way to consumer devices.