May 16, 2019
As smartphones continue to deliver more and more computing power, it’s important that the benefits extend to all users. A particularly exciting area within accessible design is the creation of software and add-on hardware aimed at making life easier for users with visual impairments, who number in the hundreds of millions worldwide.
The area is also ripe for innovation because, in some ways, touchscreens have made life more difficult for people who are blind or visually impaired. According to the IEEE Computer Society, “Flat digital touchpads have replaced physical buttons, which blind users could previously distinguish with their fingers.” To navigate appliance interfaces, “Blind people must rely on a sighted assistant to identify button functions and apply Braille labels,” hampering their independence.
Against this backdrop, the Computer Society writes about the impact that two new mobile technologies, VizLens and Facade, could have on solving this problem.
VizLens precisely guides the user on how to photograph the interface using their mobile device. “The app can tell if it is partially out of frame by detecting whether the corners of the interface are inside the camera frame. If they are not, the app will say something like ‘Move phone to the right,’” according to the article.
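The corner-based guidance described in the article can be sketched in a few lines. This is a hypothetical illustration, not the app's actual code: it assumes the four corners of the interface have already been detected (e.g. by a computer-vision step) and simply checks them against the camera frame to pick a spoken cue.

```python
def framing_cue(corners, frame_w, frame_h):
    """Suggest how to move the phone so the whole interface is in frame.

    corners: list of (x, y) points for the interface's four detected corners,
    in camera-frame pixel coordinates. Returns a spoken-style cue, or None
    if every corner is already visible.
    """
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    if min(xs) < 0:
        return "Move phone to the left"   # interface spills past the left edge
    if max(xs) > frame_w:
        return "Move phone to the right"  # spills past the right edge
    if min(ys) < 0:
        return "Move phone up"
    if max(ys) > frame_h:
        return "Move phone down"
    return None  # all corners visible; safe to capture the image
```

Real corner detection is the hard part; once the corners exist, the guidance logic itself is this simple.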
Once an image is captured, the labelling of the interface elements is crowdsourced online. The next time the user points their device at the interface, the app can announce which button is which on the panel.
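The lookup step can be sketched as follows. This is an assumption-laden illustration rather than VizLens's implementation: it presumes crowd workers have annotated each button as a labelled bounding box on the reference photo, and that the live camera view has already been aligned to that photo, so identifying a button reduces to a point-in-box search.

```python
def label_at(point, labelled_regions):
    """Return the crowd-sourced label under `point`, if any.

    labelled_regions: list of (label, (x0, y0, x1, y1)) bounding boxes
    annotated on the reference image of the interface.
    point: (x, y) position, e.g. the user's fingertip after aligning
    the live view to the reference image.
    """
    px, py = point
    for label, (x0, y0, x1, y1) in labelled_regions:
        if x0 <= px <= x1 and y0 <= py <= y1:
            return label  # found the button under the fingertip
    return None  # fingertip is not over any labelled element

# Example annotations for a hypothetical two-button panel:
regions = [("Start", (0, 0, 40, 20)), ("Stop", (0, 30, 40, 50))]
```

The returned label would then be read aloud by the phone's screen reader or text-to-speech engine.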
Facade works similarly in terms of capturing and labelling images. But instead of continuing to function through the phone, it allows the user to quickly 3D-print tactile overlays, removing the need to rely on an assistant for the process. These Braille guides can then be attached to microwaves and other appliances so the user can read what each button says.
Innovation in accessibility isn’t limited to phones. In a paper published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, researchers from MIT showcase a belt that uses a sensor array to detect obstacles near the user. A haptic chest strap alerts the wearer to the distance of nearby surfaces in all directions, including walls, stairs and hanging obstacles, allowing them to walk freely without relying on a probing cane.
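One way such a system could translate sensor readings into alerts is a distance-to-intensity mapping: the closer an obstacle, the stronger the vibration in the corresponding motor. The sketch below is purely illustrative (not the MIT team's design); the linear ramp and the two-metre cutoff are assumptions.

```python
def haptic_intensity(distance_m, max_range_m=2.0):
    """Map a sensed distance (metres) to a vibration intensity in [0, 1].

    Readings at or beyond max_range_m produce no vibration; intensity
    ramps up linearly to 1.0 as the obstacle approaches contact.
    """
    if distance_m >= max_range_m:
        return 0.0  # nothing close enough to warn about
    return 1.0 - distance_m / max_range_m

def strap_pattern(readings):
    """Convert per-direction distances into per-motor intensities.

    readings: dict mapping a direction name to its sensed distance in
    metres, e.g. {"front": 0.5, "left": 2.5}.
    """
    return {d: round(haptic_intensity(dist), 2) for d, dist in readings.items()}
```

For example, `strap_pattern({"front": 0.5, "left": 2.5, "right": 1.0})` would drive the front motor hardest, leave the left motor off, and pulse the right motor at half strength.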
For the researchers, the goal was simple: “Embedding small and lightweight sensors along with haptic devices in clothes allows for seamless integration of wearable navigation technology into the lives of visually impaired people.” In testing, 82% of subjects were satisfied with the experience.