You might assume deafness is largely hereditary, but parents usually don’t see it coming. The National Institute on Deafness and Other Communication Disorders reports that nine out of 10 deaf children are born to hearing parents. Children who are born deaf pick up sign language like any other child learning a first language, but their parents face an uphill battle teaching themselves this kinetic language. SmartSign, a program that offers digital sign language lessons, is out to improve its software with the help of some cutting-edge technology.
Google Glass is the next generation of mobile technology, and SmartSign has developed an app that uses the wearable device to deliver lessons throughout the day. When Google Glass hits the open market sometime in 2014, parents of deaf children may look to it to assist their sign-language learning.
Developed by the Georgia Tech Research Corporation, SmartSign is an Android app that teaches American Sign Language (ASL) on your mobile phone. The program features demonstrations of 993 ASL signs. You can organize vocabulary by category and track your progress with regular report cards. The app also lets users record themselves on camera for playback and correction, and when they can’t think of a sign, they can type a word into the dictionary feature and watch the demonstration. It’s the most comprehensive sign language app in the Google Play store, and it’s about to upgrade to Glass.
Google Glass and Sign Language
The SmartSign Google Glass app moves sign language lessons from your smartphone’s screen into your immediate view. The Glass app pushes lessons directly to the small display at the corner of the user’s vision, so users can watch their own signing and the lesson simultaneously. With both visuals in view, signs are easier to master and remember. SmartSign will deliver lessons throughout the day, giving students a regular dose of practice. The video below demonstrates SmartSign on Google Glass.
The first SmartSign app is still in beta testing, and developers are already planning future features. A “record and search” function would record gestures and deliver a translation; someday, this technology could translate sign language altogether. As users record information, SmartSign may store their data on cloud servers, enabling them to access their progress from any platform, whether a smartphone, Google Glass or a desktop computer. Technology is coming together to enhance sign language learning, and it’s only the beginning.
Post submitted by Paige Calahan