Mobile Accessibility

The Fire Phone is Amazon’s first smartphone, and it comes with several accessibility features. The phone was unveiled in front of hundreds of customers, executives, and reporters on Wednesday in Seattle, Washington. It will be available from AT&T at a price of $199.99 for the 32GB version with a two-year contract.

Some of the accessibility features available on the phone are:

Users with vision disabilities

  • Screen Reader, powered by IVONA Text-to-Speech
  • Explore by Touch
  • Adjustable Reading Speed
  • Explore by Touch Tutorial
  • Screen Magnifier
  • High Contrast

Users with hearing disabilities

  • Closed Captioning
  • Hearing Aid Compatibility (HAC)
  • TTY Mode
  • Stereo to Mono Audio

Users with mobility disabilities

  • Amazon Voice Assist
  • One-Handed Navigation and Shortcuts
  • Low Motion Mode

The Fire Phone will ship July 25, ahead of Apple’s fall release of the iPhone 6. Users can pre-order the phone now.

TabAccess from Zyrobotics is the first assistive device of its kind to allow easier access to Android and iOS tablet devices. It is designed for people who have difficulty moving their hands and arms.

“Unfortunately, most applications for smartphones and tablets are not designed with accessibility in mind, especially for people with motor disabilities,” explains Dr. Ayanna Howard, founder and Chief Technology Officer of Zyrobotics. “Our strategic launch of TabAccess is both a technology game changer and life changer for so many.”

TabAccess provides access through multiple accessible input devices, such as sip/puff switches, button switches, and grasp switches.

Learn more about mobile accessibility with Kathy Wahlbin’s Mobile Accessibility on the Move Slides.

Researchers from the University of Alicante in Spain have developed a new smartphone application that uses a phone’s 3D camera to detect obstacles and warn people with vision disabilities. Nine people with vision disabilities tested the app by wearing a cell phone with a 3D camera on a lanyard around their neck. The binocular vision of the 3D camera allowed the software to estimate the distance of objects. The phone vibrated or sounded a tone when an object was closer than roughly six feet.
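The mechanism described above can be sketched in a few lines: a stereo (binocular) camera yields a pixel disparity between its two views, the standard relation Z = f·B/d converts that disparity to a distance estimate, and the phone alerts when the estimate drops below the threshold. This is a minimal illustration only; the focal length, baseline, and function names below are assumptions, not details from the researchers' app.

```python
# Hedged sketch of stereo distance estimation with a proximity alert,
# loosely following the app's described behavior (~six feet threshold).
# FOCAL_LENGTH_PX and BASELINE_M are illustrative assumed values.

FOCAL_LENGTH_PX = 700.0   # assumed camera focal length, in pixels
BASELINE_M = 0.06         # assumed separation between the two lenses, in meters
ALERT_DISTANCE_M = 1.8    # roughly six feet, per the article

def depth_from_disparity(disparity_px: float) -> float:
    """Estimate object distance (meters) from the pixel disparity
    between the two views: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity -> object effectively at infinity
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

def should_alert(disparity_px: float) -> bool:
    """Trigger the vibration or tone when the estimated distance
    falls below the alert threshold."""
    return depth_from_disparity(disparity_px) < ALERT_DISTANCE_M
```

With these assumed parameters, a disparity of 42 pixels corresponds to an estimated distance of 1 meter, which would trigger an alert, while a 10-pixel disparity (about 4.2 meters) would not.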

The team is also developing a version for Google Glass thanks to a grant they won from the Vodafone Spain Foundation in 2013 for a previous version of the app. A full version is expected to be available in 2015.

Even though WCAG 2.0 predates the smartphones that put mobile accessibility in the public eye, it was written to be forward-thinking and has proved to be so.

MyEardrod, an app developed by The Tecnalia Centre of Applied Research, helps people with hearing disabilities identify ordinary sounds that are found in a typical domestic environment. Doorbells, fire alarms and dripping taps are among the everyday situations that can be challenging for people who cannot hear.

MyEardrod can be easily downloaded from Google Play and installed on a mobile phone, giving the solution a flexibility and mobility that fixed installations lack. The app can also be personalized to ensure it identifies the sounds that are relevant to the user.
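The core idea of a personalized home-sound identifier can be sketched simply: match an incoming sound's dominant frequency against a small registry of sounds the user has chosen. This is an illustration under stated assumptions only; real sound-recognition systems use much richer features, and the registry entries and function names here are hypothetical, not taken from MyEardrod.

```python
import math

# Hedged sketch: classify a household sound by comparing its dominant
# frequency to a user-personalized registry. All names and frequency
# values are illustrative assumptions.

REGISTRY = {                # assumed user-registered sounds -> dominant frequency (Hz)
    "doorbell": 1000.0,
    "smoke alarm": 3000.0,
    "dripping tap": 200.0,
}

def dominant_frequency(samples, sample_rate):
    """Naive DFT scan for the strongest frequency bin (fine for a sketch;
    a real app would use an FFT and spectral features)."""
    n = len(samples)
    best_freq, best_mag = 0.0, 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_freq, best_mag = k * sample_rate / n, mag
    return best_freq

def identify(samples, sample_rate):
    """Label the sound with the registered entry closest in frequency."""
    freq = dominant_frequency(samples, sample_rate)
    return min(REGISTRY, key=lambda name: abs(REGISTRY[name] - freq))
```

Feeding in a pure 1 kHz tone, for example, would come back labeled "doorbell" under this toy registry; personalization amounts to letting the user add or replace registry entries.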

A new hardware and software bundle from Revel Systems provides features for people with vision disabilities. iPad touchscreens have no inherent tactile qualities, making them difficult for people with vision disabilities to use independently and effectively.

Revel’s new accessibility bundle allows people with vision disabilities to securely enter their debit card PINs or use signature screens when checking out. It uses Bluetooth-enabled keyboards with textured keys to provide the tactile feedback people with vision disabilities need to enter the information privately.

Grey Group Singapore (Grey) has developed two mobile apps with the goal of increasing the quality of life for people with hearing disabilities. Supported by the Singapore Association for the Deaf (SADeaf), the apps turn smartphones into intelligent devices that help people who are Deaf or hard of hearing.

The app, Say it With Signs, translates audio messages into signs, which are displayed on the phone. This makes it easier and quicker for the user to interpret the message. They can then reply via text.

Enhancing the way users who are blind use mobile devices, a new technology allows users to feel the screen. Senseg’s E-Sense technology, under development in Sweden, recreates the sensation of different textures on touchscreens. The technology uses “tixels,” or “tactile pixels,” to generate an electric field above the screen’s surface, enabling the skin to feel finely tuned sensations that replicate different textures.

The technology has far-reaching implications for users who are blind or visually impaired, the most immediate being Braille reading.

Samsung has announced three new assistive technology accessories that connect easily with the Galaxy Core Advanced and add to the device’s existing accessibility features. The accessories are designed for users who are blind or have low vision.

  • The Ultrasonic Cover: This innovative cover detects obstacles and helps users navigate unfamiliar surroundings by sending an alert through vibrations or Text-to-Speech feedback. The user holds the phone with the cover in front of them and it can detect people and objects up to two meters away.
  • The Optical Scan Stand: Allows the device to recognize text from an image by positioning the device to focus on printed materials and activating the Optical Scan application.
  • The Vocal Label: Distinguishes objects by allowing the user to make notes and tag voice labels easily on-the-go. The user can record, stop and access their notes using NFC technology.

A software designer in Perth, Australia has helped people who are blind or have low vision gain accessibility and independence on Perth’s public transport network. With the development of an application called Stop Announcer, which cost only a few dollars, people who are blind or have low vision can hear their stop announced through the app and no longer have to rely on the unreliable method of counting stops.

The user simply tells the app the route they are taking and the software announces when they are arriving at their location.
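The behavior described above, announcing a stop as the vehicle approaches it, can be sketched as a simple proximity check: compare the rider's GPS fix against the next stop on the chosen route and announce when the distance falls inside a radius. This is a minimal sketch under stated assumptions; the stop data, radius, and function names are illustrative and not taken from Stop Announcer.

```python
import math

# Hedged sketch of GPS-based stop announcement. ANNOUNCE_RADIUS_M and the
# stop record format are illustrative assumptions.

ANNOUNCE_RADIUS_M = 150.0  # assumed "arriving" radius, in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_arrival(position, next_stop):
    """Return the announcement text when the rider is within the radius
    of the next stop on the route, else None."""
    lat, lon = position
    if haversine_m(lat, lon, next_stop["lat"], next_stop["lon"]) <= ANNOUNCE_RADIUS_M:
        return f"Arriving at {next_stop['name']}"
    return None
```

A real app would run this check against a live GPS feed and hand the returned string to the platform's text-to-speech engine.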

