Microsoft's Windows and Cortana absolutely must learn sign language

Windows Central

WinC Bot
Staff member
Dec 17, 2013
[Image: Satya Nadella at Microsoft Ignite 2017]

Microsoft's inclusive design efforts must include sign language recognition in Windows 10 and Cortana to support the deaf and hard of hearing community.
Microsoft's inclusive design mission guides the company in ensuring its products and services are designed, from conception forward, with users of all ability levels in mind. Though the company has made admirable strides in this regard, the work is far from finished.

Microsoft's Seeing AI app helps people who are blind, Project Fizzyo supports children with cystic fibrosis, the Emma Watch and Project Emma aid people with Parkinson's disease, and Microsoft's Immersive Reader helps children with dyslexia. Still, millions of people with varying levels of ability are either excluded from interacting with the technologies of modern society or prevented by physical limitations from fully participating in everyday tasks.

Microsoft has embraced the challenge of creating targeted solutions, like the tremor-dampening Emma Watch, which addresses a particular aspect of a disability. It has also built solutions that level the playing field directly into its technologies, like gaze control in Windows, which enables people with para- or quadriplegia to navigate the OS. Given that integrated solution, and in a world where "speaking to AI is becoming the norm," an OS-level feature that lets Windows or Cortana understand sign language seems like a natural goal for Microsoft, one that could serve the 466 million people with disabling hearing loss. And since a developer "modified Alexa" to do just that, we know it's possible.
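To make the idea concrete, here is a minimal, hypothetical sketch of the core step such a feature would need: mapping an observed hand pose to a text command an assistant can act on. Everything here is an assumption for illustration, not Microsoft's or the Alexa modder's actual method: a real pipeline would extract hand landmarks from camera frames with a hand-tracking model, whereas this sketch uses synthetic landmark vectors and a simple nearest-neighbor match against stored sign templates.

```python
import numpy as np

# Hypothetical sketch: classify a sign from a 42-dim vector of 21 hand-landmark
# (x, y) pairs by nearest-neighbor distance to stored templates. The template
# vectors below are random placeholders standing in for real captured poses.
RNG = np.random.default_rng(0)

templates = {
    "hello": RNG.random(42),
    "stop": RNG.random(42),
    "music": RNG.random(42),
}

def recognize_sign(landmarks: np.ndarray) -> str:
    """Return the template sign closest to the observed landmark vector."""
    return min(templates, key=lambda s: np.linalg.norm(templates[s] - landmarks))

def to_assistant_text(landmarks: np.ndarray) -> str:
    """Map a recognized sign to the text an assistant's NLU layer would receive."""
    return recognize_sign(landmarks)

# A slightly noisy observation of "stop" should still resolve to "stop".
observed = templates["stop"] + RNG.normal(scale=0.01, size=42)
print(to_assistant_text(observed))
```

Once a sign resolves to text, the rest of the assistant pipeline (intent parsing, responses rendered as captions rather than speech) can remain unchanged, which is what makes an OS-level hook plausible.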

Full story from the WindowsCentral blog...
 
