Users with vision disabilities who read text on an iPad by scanning a finger across the screen while the words are read aloud currently experience slow reading speeds. The problem is described in a paper titled “Digital Reading Support for The Blind by Multimodal Interaction,” by Francis Quek, a Texas A&M professor of visualization, and Yasmine N. El-Glaly, assistant professor of computer science at Port Said University in Egypt.
The two professors have developed two major refinements that improve the experience of people with vision disabilities who rely on the iPad as a touch-based reading device. Currently, if users run a finger too quickly along the lines of text, or switch the software’s access mode to read bigger chunks of words, it is easy for them to lose their place or wander between lines without realizing it.
To address these issues, the professors developed software for the iPad that predicts the direction of a user’s finger, audibly renders words in sequence, and alerts readers if they stray from the line. The work was supported with a $302,000 grant.
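The core ideas described above, inferring the finger's travel direction and flagging when it drifts onto a neighboring line, can be sketched in simplified form. This is not the authors' implementation; the function names, the assumed line height, and the drift tolerance are all hypothetical, chosen only to illustrate the kind of checks such software might perform on sampled touch coordinates.

```python
# Hypothetical sketch (not the paper's actual code): given (x, y) touch
# samples taken while a finger traces a line of text, estimate the reading
# direction and detect drift onto an adjacent line.

LINE_HEIGHT = 40.0  # assumed pixel height of one rendered line of text

def predicted_direction(touch_points):
    """Estimate reading direction from the first and last samples:
    +1 for left-to-right, -1 for right-to-left, 0 if ambiguous."""
    if len(touch_points) < 2:
        return 0
    dx = touch_points[-1][0] - touch_points[0][0]
    if dx > 0:
        return 1
    if dx < 0:
        return -1
    return 0

def drifted_off_line(touch_points, line_y, tolerance=0.4):
    """Return True if the average vertical position of recent samples
    strays from the line's center by more than tolerance * LINE_HEIGHT,
    i.e. the finger has likely wandered onto a neighboring line."""
    if not touch_points:
        return False
    avg_y = sum(y for _, y in touch_points) / len(touch_points)
    return abs(avg_y - line_y) > tolerance * LINE_HEIGHT

# Usage: samples collected as the finger moves across the screen
on_line = [(10, 100), (30, 102), (50, 104)]    # tracks the line closely
strayed = [(10, 100), (30, 125), (50, 140)]    # sinks toward the next line
print(predicted_direction(on_line))            # → 1 (left-to-right)
print(drifted_off_line(on_line, line_y=100))   # → False
print(drifted_off_line(strayed, line_y=100))   # → True (trigger an alert)
```

In a real system the alert would be rendered as audio or haptic feedback rather than printed, and the drift check would run continuously as new touch samples arrive.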