AI spurs ‘revolution’ for some visually impaired people


AI has helped visually impaired Louise Plunkett

“AI has revolutionised my daily life,” says Louise Plunkett from Norwich.

Ms Plunkett has Stargardt disease, a rare genetic eye condition that causes progressive vision loss, which she says “impacts everything I do”.

“I can’t recognise people, even my own husband or my children. When my children were younger, I used to have to teach them how to come to me when I met them at the school playground.”

Ms Plunkett is comfortable with digital tools – her business advises companies on how to ensure their online content is suitable for the visually impaired community.

She has used services like Alexa, Google Home and Siri for years to help with tasks such as setting alarms and checking the weather.

Now she is finding an assistant called Be My AI useful.

The app uses ChatGPT to generate and then read out detailed descriptions of pictures.

“I’m quite a stubborn person,” says Ms Plunkett. “I don’t like asking for help or admitting I need help, so using the AI tool is useful for things when other humans aren’t around.”

She says she might use it to work out which toilet is the women’s, read the ingredients on food packaging, or read a letter.

However, she feels that AI can sometimes be hit or miss. “The downside with AI is that sometimes it gives you too many details. You sometimes just want the basic information of what is in front of you, but it will go above and beyond, and offer up mood and emotions.

“For example, it might say ‘a swirling carpet evoking memories of times gone by’. It feels like it is one step too far.”

Be My AI was developed by Danish firm Be My Eyes. Its original service put human volunteers in touch with its users: via mobile phone, the volunteers would describe what was in front of the person with vision problems.

However, some of its 600,000 users are switching to its AI tool for help, says Jesper Hvirring Henriksen, chief technology officer.

“We have a woman who was one of our first users 10 years ago, and within the first six months [of releasing Be My AI], she did more than 600 image descriptions.”

He’s also discovering people are using the app in ways the company hadn’t imagined. “We’re finding people using it to check pictures that have been sent to them on WhatsApp groups,” he says.

“Maybe they’re not going to call another human each time to ask them about a picture sent on a WhatsApp group, but they use AI.”


Be My Eyes connects volunteers with the visually impaired

As for where it might go in the future, he says live-streaming video – with the tech describing buildings and movements around the user – might be an area the company moves into. “This is going to be a gamechanger. It’s like having a little person in your shirt pocket all day telling you what is going on.”

Be My Eyes, which is free to users, makes money by signing up companies to its paid-for directory service where they can provide information and numbers to the blind and low-vision community.

Mr Henriksen says AI won’t replace the need for human connection.

“At Be My Eyes, people are still choosing to call a volunteer too. The blind population in the Western world are generally not young when they start to experience vision loss… it’s more skewed towards the elderly population and this [AI] might add an extra layer of complexity. Humans are faster and potentially more accurate.”


WeWalk is an AI-powered cane that detects obstacles and gives directions

Other firms also have products to help visually impaired people.

Featuring a voice assistant, WeWalk is an AI-powered cane that detects obstacles and offers accessible navigation and live public transport updates.

Connecting to a smartphone app with built-in mapping, it can tell users where places of interest are, including the nearest café, in more than 3,000 cities.

“The cane is very important for us; it helps with navigation and is a very important symbol, as it shows our independence and autonomy,” says Gamze Sofuoğlu, WeWalk’s product manager.

“Our latest version helps users navigate with the cane through voice commands – for example, when you say ‘take me home’ or ‘the nearest café’, it starts navigating, and you can get information about public transport. You don’t need to touch your phone. It provides freedom for blind and low-vision people.”

Ms Sofuoğlu, who is blind, says she has been using it in cities she has visited recently such as Lisbon and Rome.

Robin Spinks, head of inclusive design at the RNIB (Royal National Institute of Blind People), who has low vision, is a huge advocate of AI and uses it most days.

For example, he turns to ChatGPT to assist with his workflow, giving him summaries of developments in areas related to his work, or even to help plan a paddle board trip, and to Google’s Gemini AI tool to help him locate items.

Last year was all about conversational AI and ChatGPT, he says. Now he argues 2024 is the year of what he calls “multimodal AI”.

He goes on to say: “That might be showing video and images, and being able to extract meaningful information and assist you in an exciting way.”

He points to Google Gemini. “For example, with that you can record meetings and it assists you with voice labels and an account of a meeting. It’s genuinely helpful and it’s about making people’s lives easier.”

Mr Spinks says AI has been transformational for people who are blind or low vision.

“I sympathise with people who are genuinely scared of AI but when you have a disability, if something can genuinely add value and be helpful that has to be a great thing. The benefits are too great to ignore.”
