NUS team develops headset for visually impaired to 'see'

A new headset developed by a team of university researchers aims to give “sight” to the visually impaired – at a price tag of under $500.

Dubbed AiSee, the prototype works by analysing images captured by a built-in camera and relaying information about the object to the user through spoken prompts.

Associate Professor Suranga Nanayakkara, lead researcher of Project AiSee at the National University of Singapore (NUS), said: “We want to fundamentally rethink how interfaces between human and technology can be made to fit the abilities and expectations of the target users.”

During his postdoctoral studies at the Massachusetts Institute of Technology in 2012, Prof Nanayakkara saw how a blind friend would take pictures of lecture notes with his phone camera.

He would use his hands to feel the edges of the paper and then hold the phone above it to take a picture before using KNFB Reader – a mobile app for blind, low-vision and other print-disabled users that converts text to speech – to listen to the text.

Prof Nanayakkara thought an all-in-one device with a camera could do the job faster and better.

AiSee is a compact device that lets users identify objects by holding them up and capturing an image with the press of a button. The device then extracts features such as text and logos from the image for processing.

An AI-powered image processing unit in the headset then uses large language models (LLMs) such as OpenAI’s GPT-4 to comprehend and respond to the user’s queries quickly.
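The NUS team has not published AiSee’s software stack, but the pipeline described above (capture an image, send it together with the user’s question to a multimodal LLM, then speak the answer) can be sketched in a few lines of Python. The model name gpt-4o, the describe_object function and the capture.jpg file below are illustrative assumptions, not details from the project.

```python
import base64
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def describe_object(image_path: str, question: str) -> str:
    """Send a captured image and the user's (already transcribed) question
    to a vision-capable LLM and return the answer as text."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",  # assumption: any vision-capable GPT-4-class model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content


# Example: the user holds up a can and asks what it is.
print(describe_object("capture.jpg", "What am I holding?"))
```

In the actual headset, the returned text would then be passed to a text-to-speech engine and played back to the user.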

The headset uses bone conduction, bypassing the ears and transmitting sound through the skull. This lets visually impaired users receive auditory information while still hearing what is going on around them, keeping them safer, especially in risky situations.

AiSee, first developed in 2018 by Prof Nanayakkara and his team, has since been modified from a finger-worn, ring-like interface into a headset, making it hands-free and easy to wear.

The new prototype incorporates LLMs to allow users to interact with the device more naturally.

Prof Nanayakkara and his team are working to make the next prototype lighter than the current 140g and adjustable to fit all head sizes. The button used to capture images will also be replaced with a wake word.

The team also aims to reduce the time the AI takes to process information and respond, and to enable it to answer multiple questions.

There are also plans to commercialise the product, pending tie-ups with the private sector.

It is hoped that the product can be priced at under $500 – about a tenth of the price of existing assistive tools such as Israeli firm OrCam Technologies’ wearable camera OrCam MyEye Pro, and about a sixth of the price of the Netherlands-based Envision Glasses Home Edition, built on the since-discontinued Google Glass Enterprise Edition 2.
