AI for the blind: creating a more connected future

About 285 million people worldwide are blind or have low vision. Tasks such as using public transportation, navigating busy streets, shopping at retail stores or recognizing people can pose added challenges for these individuals. But augmented reality (AR) technology is helping transform their experiences.

Unlike virtual reality, augmented reality overlays digital information on the physical world rather than replacing it, keeping the user grounded in their real surroundings. Energy, manufacturing and other major enterprises have used AR-enabled smart glasses to streamline operational tasks, but startup company Aira saw a unique consumer opportunity for blind and low-vision people.

Aira created an assistive technology platform that uses augmented reality and smart glasses to connect people with diminished vision to certified human agents. With the tap of a button, an Aira customer, or “Explorer,” connects to an agent who can see from the Explorer’s perspective through the glasses’ camera. The agent then offers on-demand visual descriptions of the surrounding environment. But for agents to communicate with Explorers in near-real time, Aira needed a powerful and reliable network connection. So in 2016, Aira came to the AT&T Foundry in Houston for help providing wireless connectivity for its solution.

One recurring theme across Foundry projects is learning by doing. Getting to a viable, market-ready prototype is rarely a one-and-done process. We helped Aira experiment with different hardware, onboard our connectivity and test the full solution ahead of its launch in January 2017.

Milliseconds matter when working in real time with an individual who is blind, so we also used AT&T Dynamic Traffic Management to ensure a safe and seamless experience for Explorers. This mobile data solution is designed to help businesses prioritize critical applications and data transmissions on the AT&T 4G LTE network. That prioritization helps alleviate the effects of network congestion, especially when responsiveness is a top priority – like navigating a busy crosswalk.

Putting the “AI” in Aira

For some activities, connecting to a live human agent is necessary – guiding a marathon runner, for example, or describing to an Explorer the moment the moon passes in front of the sun during an eclipse. But what if AI could replace a human agent for simpler tasks?

As Aira gathered user feedback during its first months in operation, one of the most frequent requests from Explorers was medication recognition: What’s in the pill bottle in my hand? So our team helped Aira develop a recognition model for Aira’s AI assistant, Chloe, to correctly identify prescription and over-the-counter medications. This wasn’t a simple update. We used machine learning to train the technology to recognize a wide variety of common medications, which meant photographing different types of pill bottles from numerous pharmacies, from varying angles and in different environments. This ensured the model could accurately read medication labels under a diverse set of conditions – different fonts, text layouts and lighting – and then match the prescription against a database of existing medications.
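The flow described above – read the label text, then match it against a database of known medications – can be sketched in miniature. Everything here is illustrative: the tiny medication list, the `match_label` helper and the similarity cutoff are assumptions for the sketch, not Aira’s actual pipeline, and a production system would pair a trained vision model with a full pharmaceutical database rather than simple string similarity.

```python
import difflib

# Hypothetical mini-database of known medications (illustrative only;
# a real deployment would query a full pharmaceutical database).
MEDICATIONS = [
    "ibuprofen 200 mg tablets",
    "acetaminophen 500 mg caplets",
    "amoxicillin 250 mg capsules",
    "lisinopril 10 mg tablets",
]

def match_label(ocr_text, candidates=MEDICATIONS, cutoff=0.5):
    """Match noisy OCR text from a pill-bottle label against known meds.

    Returns (best_match, similarity) or (None, 0.0) if nothing clears
    the cutoff -- better to say "I don't know" than misread a medication.
    """
    text = ocr_text.lower().strip()
    best, best_score = None, 0.0
    for name in candidates:
        score = difflib.SequenceMatcher(None, text, name).ratio()
        if score > best_score:
            best, best_score = name, score
    if best_score < cutoff:
        return None, 0.0
    return best, best_score

# Simulated OCR output with typical noise: wrong case, a misread
# character ("0" for "o") and a dropped space.
name, score = match_label("IBUPR0FEN 200mg tablets")
```

Here the fuzzy match absorbs the OCR noise and still lands on the ibuprofen entry; the cutoff exists so garbage input returns no match rather than the nearest wrong medication.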

The end result? An agent-free, hands-free experience. Explorers simply hold the prescription up to the glasses and say, “Hey Chloe, read this.” The algorithm analyzes the text on the label and identifies the medication aloud to the customer. The camera offers a 120-degree field of view, both vertically and horizontally.

Our work with Aira was the first customer project to commercialize out of our Foundry in Houston. Since graduating from prototype to commercial product, Aira has helped travelers navigate busy airports, brought its assistive technology to new college freshmen, moved into other countries, aided athletes and expanded its service into 5,300 AT&T retail locations across the country.
