Short Bytes: In the future, you'll be able to control Windows 10 using your eyes. A new accessibility feature in the making, called Eye Control, will let you perform various tasks with the help of supported eye tracking hardware, making Windows usable for a wider range of users. It's available now in Windows Insider Build 16257.

Microsoft is preparing to make a significant addition to its list of accessibility features in Windows 10. Citing the needs of people living with neuromuscular conditions such as ALS, Microsoft has announced that eye tracking will be a built-in feature in future Windows 10 releases.
The upcoming feature, called Eye Control, will allow users to operate Windows 10 without using their hands. Instead, their eye movements are tracked by compatible eye tracking hardware; Microsoft has partnered with Tobii to support its eye trackers.
Users will be able to open and close apps and scroll pages simply by focusing on the relevant area of the screen. And eye tracking won't be limited to basic tasks; users will also be able to type using the on-screen keyboard.
Eye Control is currently available as a beta for Windows 10 Insiders (Build 16257), but the company is yet to announce its availability in mainstream Windows 10 builds. The idea for the new accessibility feature first sparked during One Week, Microsoft's 2014 employee hackathon. Back then, Steve Gleason, a former NFL player diagnosed with ALS, emailed Microsoft urging it to build eye tracking technology so that he could become more self-dependent and talk to others with ease.
The resulting Eye Gaze project, which closely matched what Steve had suggested, was shortlisted for first prize from around 3,000 internal teams. Its Eye Gaze Wheelchair allowed Steve to move around using only his eyes.
Got something to add? Drop your thoughts and feedback.