AI-powered glasses are helping people with vision loss navigate the world around them

For Andrew Tutty, driving on the open road meant freedom — the ability to get into a car, turn the key and go anywhere, at any time.
That all changed when he lost his eyesight as an adult, and his license to drive along with it.
"That loss of freedom instantly became apparent when I couldn't drive anymore," says Tutty, who lives in Kitchener, Ont.
Other everyday tasks, like cooking, have also become harder. Something as simple as setting a digital toaster or confirming the right box of noodles can become a challenge. But he may have found a way to get some of his freedom back, with the help of glasses powered by artificial intelligence (AI).

While they weren't originally designed as accessibility tools, Tutty says their use has "exploded in the blind community" to help navigate daily life with more independence. But some experts caution they also raise questions about privacy, security and data collection.
Seeing in new ways
In his kitchen, Tutty holds a box of noodles. Before dropping them into the pan, he asks his AI-powered glasses what noodles they see. A voice answers back: lasagna.
The glasses — in this case, Meta's AI-powered smart glasses — connect to a smartphone and respond to voice commands.
He uses them in tandem with separate accessibility apps like Be My Eyes to identify objects, describe his surroundings and even connect with human volunteers for extra support.
In the morning, the glasses help him choose clothes that match based on the colours of the items, giving him more confidence when heading out.
"It provides a lot of independence," he said.
Compared to other assistive tech, it's relatively affordable. Tutty says he got his pair on sale for about $250, though Meta's Canadian website lists models running from $369 up to $539. Other vision-assistive devices like the OrCam MyEye can run into the thousands of dollars.
Like Tutty, Emilee Schevers of Hamilton, Ont., who is legally blind, also uses the Meta AI glasses to check her clothing. But the colour recognition is useful in other ways.

"I can ask, is the crosswalk green? And it'll say yes or no," she told Ian Hanomansing on Cross Country Checkup. "At the same time, I'm listening to traffic, so I have that reassurance of knowing both."
When navigating, Schevers often pauses to double-check traffic signs. Now, the glasses let her skip the step of pulling out her phone or asking for help.
Schevers stresses that the technology works alongside her other accessibility tools and skills, rather than replacing them.
"With the help of the glasses, combined with my skills, I have extra confidence in being able to cross," she said.
The price of independence
Peter Lewis, an associate professor at Ontario Tech University and Canada Research Chair in trustworthy artificial intelligence, cautions that these AI-powered looking glasses can also look back at the user.
The devices work by constantly processing what their built-in cameras see — and sending that data to large companies.
He says that users shouldn't have to give up their privacy just to be able to live independently in the world with a disability.
Smart glasses can transmit everything in the wearer's field of view: street signs or menus, for example, but also people's faces, private conversations, even the insides of homes or workplaces.
"Essentially, what you're doing is carrying a camera around, pointing at the world on your head all the time, streaming it to a large data company," he said.

Lewis says the data could be stored, analyzed, or used for purposes far beyond accessibility — from training future AI models to targeted advertising.
Tutty says he doesn't trust AI alone to tell him whether stairs are ahead or when it's safe to cross the street. Instead, he relies on his cane and, when needed, turns to his wife or a small circle of loved ones who volunteer to help in person — making sure safety-critical information comes from people, not algorithms.
"AI technologies are not fully reliable. They're not flawless," Lewis said. " It is really important for people to understand that these systems will and do fail, and to be able to make an informed judgment about when to trust them."
That's why he stresses the importance of what he calls "dumb technology" — simple, reliable tools such as a walking cane. They're consistent, predictable and don't come with the risks of data collection or algorithmic errors.

For Lewis, the question of using AI for accessibility isn't simply a matter of "a balanced scorecard where we give up some things to gain others."
Instead, he argues many of the risks come from design choices that don't have to be there.
False trade-offs, like giving up privacy for independence, are "baked in" because large tech companies profit from harvesting data, he said.
"We ought to be able to design tools that don't rely on those assumptions, and actually respect people's privacy."
He hopes the next generation of assistive technology puts users' needs at the centre — empowering without compromise.
"There's always been this idea of a sort of disappearing computer," Lewis said. "The more we can have technology that disappears into the background and helps people live fluid, normal lives — independently — that would be the ideal."