For years, everyday details of life have been just out of reach for Jordan Reiche.
The 33-year-old addiction counselor from Tampa can still make out vague shapes. But faces blur into soft outlines, like looking through thick fog.
Now, a new piece of technology is quietly changing that experience.
Reiche recently received a pair of Ray-Ban Meta Wayfarer smart glasses through a pilot program run by Dogs Inc. — and for the first time in years, everyday objects around her are beginning to come into focus in a new way.
Reiche was diagnosed with retinitis pigmentosa when she was just seven years old.
The genetic condition slowly damages the retina’s cells, causing vision to deteriorate over time. Many people with the disorder lose most of their sight by adulthood.
By the time Reiche was 12, she had already been declared legally blind. Today, she estimates she has about five percent of her vision left.
“I can make out some shapes,” she said. “But faces are a blur.”
The glasses pair Meta AI with a small camera and a tiny speaker built into the frame.
When connected to a smartphone via Bluetooth, they can describe what’s in front of the wearer. They can read signs, identify objects and colors, and even scan text from receipts, menus, or mail.
For someone navigating daily life with limited vision, those details matter.
A stack of unopened letters on the counter, for example, is no longer a mystery. Even checking something as simple as the expiration date on a carton of milk becomes possible.
Reiche had encountered the glasses before.
She tested a pair at Best Buy, curious about what they might do for her daily routine.
But the roughly $500 price tag made the decision difficult — especially for a device she couldn’t fully test beforehand.
The pilot program through Dogs Inc. changed that. Reiche received the glasses at no personal cost, giving her the chance to try the technology in her everyday life.
“I’ve been wanting this for so long,” she said when she received them.
Technology isn’t the only support helping Reiche navigate the world.
For the past seven years, she’s had a guide dog named Blue, who helps her avoid obstacles and move safely through unfamiliar spaces.
Together they’ve already traveled widely — from an Alaskan cruise to hikes in the Appalachian Mountains, and even a trip through Ireland.
Before Blue, Reiche often kept her head down, focusing on the ground to avoid hazards.
Now, she says she gets to look up more often.
The glasses may add another layer of freedom.
Reiche hopes they’ll help with tasks like reading airport gate numbers while traveling or sorting through mail at home.
She works remotely as an addiction counselor, meeting clients over Zoom. But outside of work, small daily tasks can still present challenges — from grocery shopping to navigating unfamiliar places.
That’s where assistive technology is beginning to play a bigger role.
AI-powered wearables are still new, but stories like Reiche’s hint at how they might reshape everyday independence for people with visual impairments.
Guide dogs, mobility training, and assistive tools have long helped people navigate the world safely.
Now, artificial intelligence is starting to add another layer — offering real-time descriptions of the environment in ways that were once impossible.
For Reiche, that means new possibilities for travel, daily errands, and moments many people take for granted.
She’s already thinking about future trips — perhaps to Joshua Tree or Amsterdam.
This time, she’ll bring Blue.
And a pair of glasses that can help describe the world around her.
