Eye Control by Microsoft


How can Windows empower every person and every organization on the planet to achieve more? Our ongoing mission is behind a feature that Windows Insiders voted as one of their top 10 favorites: Eye Control.

While Eye Control was first released in-box with the Fall Creators Update, it has seen some great improvements in the April 2018 Update. We sat down with Microsoft engineer Jake Cohen to get the story behind the accessibility feature that enables users to control Windows with their eyes and the help of a compatible eye-tracking device (such as those made by Tobii and EyeTech).

“Accessibility has been super important for us for the past 20-plus years,” Jake explained. “For the past few years, we’ve been working hard to really aspire towards our mission statement by empowering every person of every level of ability.”

The Windows team started down the road to Eye Control during the 2014 Microsoft company-wide hackathon, when Steve Gleason, a former NFL player for the New Orleans Saints, emailed Microsoft with a challenge. Living with ALS, Steve wanted to spark technology for people living with ALS that would help him communicate more easily, play with his son, and move his wheelchair independently.

Since the hackathon, Microsoft has been working closely with Team Gleason, the nonprofit foundation Steve founded, to develop technologies that empower people living with ALS. Continuing this work, the Windows team has been steadily building eye tracking into Windows 10.

With a compatible device, Eye Control uses infrared lighting and cameras to detect where on the screen a user is looking. Windows takes that gaze information and enables the user to control a mouse and keyboard.

“Eye Control starts with a launch pad, which is UI that’s always present on the screen,” Jake said. “When you dwell your eyes on an icon, which is the act of fixating your eyes somewhere on the screen and waiting, it’ll activate a click. So it’s basically a press and hold with your eyes.”
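To make the dwell mechanic concrete, here is a minimal sketch of how dwell-to-click detection can work in general: a click fires when the gaze stays within a small radius of one point for long enough. This is an illustrative example only, not Microsoft's implementation; the class name, thresholds (DWELL_RADIUS_PX, DWELL_TIME_S), and sample data are all hypothetical.

```python
# Hypothetical dwell-to-click sketch (not Microsoft's implementation).
# A gaze is "dwelling" when it stays within a small radius of its starting
# point for a set duration; completing the dwell triggers one click.

import math
import time

DWELL_RADIUS_PX = 40   # how far the gaze may drift and still count as a fixation
DWELL_TIME_S = 0.8     # how long the gaze must hold before a click fires

class DwellDetector:
    def __init__(self, radius=DWELL_RADIUS_PX, duration=DWELL_TIME_S):
        self.radius = radius
        self.duration = duration
        self.anchor = None        # (x, y) where the current fixation began
        self.anchor_time = None   # when the current fixation began

    def update(self, x, y, now=None):
        """Feed one gaze sample; return the (x, y) of a click, or None."""
        now = time.monotonic() if now is None else now
        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius:
            # Gaze moved away: start a new fixation at the current point.
            self.anchor, self.anchor_time = (x, y), now
            return None
        if now - self.anchor_time >= self.duration:
            click_at = self.anchor
            self.anchor = None    # reset so one dwell fires exactly one click
            return click_at
        return None

# Example: a gaze stream that settles on a launch pad icon near (500, 300).
detector = DwellDetector()
samples = [(100, 100), (500, 300), (505, 298), (502, 301), (503, 300)]
for t, (x, y) in enumerate(samples):
    click = detector.update(x, y, now=t * 0.3)  # samples 0.3 s apart
    if click:
        print(f"dwell click at {click}")
```

Run on the sample stream above, the detector ignores the initial saccade, starts a fixation once the gaze lands near the icon, and fires a single click after the dwell time elapses.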

“You have access on the launch pad to the mouse, the keyboard, text-to-speech, and now in the April 2018 release, many more options like Quick Access, Start, Task View, Device Calibration, Settings, and more. For browsing the Web or scrolling an app, you can also fixate your eyes somewhere on the screen and then use the arrows that are provided to scroll up and down using your eyes.”

What’s in store for the future of Eye Control? Jake’s team is continuing to work with Microsoft Research and Team Gleason to collect feedback from the ALS community and make Eye Control even more useful.

“It’s really inspiring to get this feedback because we hear people say, ‘This is amazing technology. This is really helping me,’” Jake said. “And also, ‘This is the next thing I need.’ It’s about empowering them to do everything they can think of, not just a subset of interactions or abilities.”

Jake also mentioned that sparking more third-party tools is a priority for the team.

“The next step we’re taking is releasing public developer APIs and open-source libraries to allow third-party developers to build apps and experiences that can leverage eye tracking,” Jake said.

“Imagine all of the gaps that third-party developers can fill for customers who are living with mobility impairments. It comes down to Microsoft’s core roots. We can’t fulfill this mission statement alone to empower everyone; we have to build a platform that empowers everyone to empower other people. I’m excited to see what developers can come up with in order to make an impact.”

(Source: windows.com)
 
