Following my recent find of “seeing with your ears”, today I ran across some research being done at the University of Washington suggesting that “hearing-impaired users might soon be able to use sign language over a mobile phone, like in Japan or Sweden”. The technology is based on American Sign Language (ASL):
“The goal of the MobileASL project is to increase accessibility by making the mobile telecommunications network available to the signing Deaf community. Video cell phones enable Deaf users to communicate in their native language, American Sign Language (ASL).”
According to the article, a big obstacle in the U.S. is, not surprisingly, the difficulty of doing two-way video on mobile phones. In any case, this is another interesting way in which mobile phones will be used in the near future.
Via Golden Swamp and SmartMobs
Image Credit: UW (see the ZDNet article)
As I was perusing Phoload, a new sharing site with free mobile downloads, I ran across the vOICe MIDlet for Mobile Camera Phone, which is a cool piece of technology for mobile devices. According to the description,
The vOICe seeing-with-sound MIDlet translates live views from a camera phone into sounds that you hear via the phone’s speaker or headset, thus targeting sensory substitution applications for the totally blind, and even synthetic vision. The vOICe uses pitch for height and loudness for brightness in a left to right scan of any view. Includes a talking color identifier.
According to the developer’s site, it runs on laptops and UMPCs with cameras, but I think the most fascinating and useful version is the one that runs on camera phones. There is a lot of info on the site, but make sure to read the user comments. Makes you appreciate the gift of vision a whole lot more!
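To make the quoted description a bit more concrete: the idea is that vertical position in the image becomes pitch, brightness becomes loudness, and the image is swept left to right over time. Here is a minimal Python/NumPy sketch of that kind of mapping; the function name, frequency range, and scan time are my own illustrative choices and not the vOICe’s actual parameters.

```python
import numpy as np

def image_to_soundscape(image, scan_seconds=1.0, sample_rate=22050,
                        f_low=500.0, f_high=5000.0):
    """Sketch of a seeing-with-sound mapping (assumed parameters, not vOICe's).

    image: 2-D grayscale array (rows x cols), values in [0, 1].
    Returns a mono waveform in [-1, 1] sweeping the image left to right.
    """
    rows, cols = image.shape
    samples_per_col = int(scan_seconds * sample_rate / cols)
    t = np.arange(samples_per_col) / sample_rate
    # Pixels nearer the top of the image get higher pitches.
    freqs = np.linspace(f_high, f_low, rows)

    columns = []
    for c in range(cols):                  # left-to-right scan
        col_sound = np.zeros(samples_per_col)
        for r in range(rows):
            brightness = image[r, c]       # loudness encodes brightness
            if brightness > 0:
                col_sound += brightness * np.sin(2 * np.pi * freqs[r] * t)
        columns.append(col_sound)

    sound = np.concatenate(columns)
    return sound / (np.max(np.abs(sound)) + 1e-9)  # normalize
```

So a bright spot near the top-left of the view would come out as a loud, high-pitched tone early in the sweep, while a dark lower-right region would be nearly silent at the end of it.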
Image Credit: the vOICe MIDlet for Mobile Camera Phones site