- The School of Computing at the National University of Singapore has introduced AiSee, an affordable wearable assistive device that helps visually impaired people “see” objects around them with the help of artificial intelligence.
- With the help of artificial intelligence, AiSee will identify the object and provide more information when queried by the user.
- Unlike most wearable assistive devices, which require pairing with a smartphone, AiSee operates as a standalone system without the need for any additional devices.
Grocery shopping is a common activity for most of us, but for visually impaired people, identifying grocery items can be daunting. A group of researchers from the School of Computing at the National University of Singapore (NUS Computing) have introduced AiSee, an affordable wearable assistive device that helps visually impaired people “see” objects around them with the help of artificial intelligence.
Visually impaired individuals face daily challenges, especially with object identification, which is vital for both simple and complex decision-making processes. Although breakthroughs in artificial intelligence have significantly improved visual recognition capabilities, real-world applications of these advanced technologies remain challenging and error-prone. First developed in 2018 and gradually upgraded over a five-year period, AiSee aims to overcome these challenges by leveraging the latest artificial intelligence technologies.
Associate Professor Suranga Nanayakkara from the Department of Information Systems and Analytics at NUS Computing, principal investigator of the AiSee project, said about the device: “Our aim with AiSee is to empower users with a more natural interaction. By following a human-centered design process, we found reasons to question the typical approach of using camera-enhanced eyewear. People with visual impairments may be reluctant to wear glasses to avoid stigma. Therefore, we recommend an alternative hardware design that includes a discreet bone conduction headset.”
The user simply holds an object and activates the built-in camera to capture an image of the object. With the help of artificial intelligence, AiSee will identify the object and provide more information when queried by the user.
Unlike most wearable assistive devices, which require pairing with a smartphone, AiSee operates as a standalone system without the need for any additional devices.
AiSee consists of three basic components:
Eye: Vision engine computer software
AiSee includes a micro-camera that captures the user’s field of view. The captured images are processed by AiSee’s software component, called the “vision engine computer,” which can extract features such as text, logos and labels from the image for further processing.
Brain: Artificial intelligence-supported image processing unit and interactive question-answer system
After the user takes a photo of the object of interest, AiSee uses advanced cloud-based artificial intelligence algorithms to process and analyze the captured images to identify the object. The user can also ask a series of questions to learn more about the object.
AiSee uses advanced speech-to-text and text-to-speech processing to understand the user’s queries and deliver spoken answers about identified objects. Backed by a large language model, AiSee excels in interactive Q&A exchanges, enabling the system to accurately understand the user’s queries and respond to them quickly and informatively.
Speaker: Bone conduction sound system
AiSee’s headset uses bone conduction technology, which transmits sound through the bones of the skull. This allows visually impaired individuals to effectively receive auditory information while also accessing external sounds such as conversations or traffic noise. This is especially vital for visually impaired people, as environmental sounds provide the information necessary to make decisions, especially in safety-related situations.
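The three components above form a simple capture–identify–respond loop: the “Eye” captures an image, the “Brain” identifies the object and answers follow-up questions, and the “Speaker” reads the answers aloud. A minimal Python sketch of that flow is shown below; the camera, cloud recognition service, language model, and audio output are all stubbed out as hypothetical placeholders, since none of these function names or behaviors come from the AiSee project itself.

```python
# Illustrative sketch of a capture -> identify -> Q&A loop like AiSee's.
# Every component here is a stub; a real device would call camera hardware,
# a cloud vision API, a language model, and a text-to-speech engine.

def capture_image() -> bytes:
    """Stand-in for the micro-camera ("Eye")."""
    return b"raw-image-bytes"

def identify_object(image: bytes) -> str:
    """Stand-in for the cloud-based vision engine ("Brain")."""
    return "a 400 g can of chickpeas"

def answer_query(obj: str, question: str) -> str:
    """Stand-in for the language-model Q&A exchange ("Brain")."""
    return f"Regarding {obj}: you asked '{question}'."

def speak(text: str) -> str:
    """Stand-in for text-to-speech over the bone-conduction headset ("Speaker")."""
    return f"[audio] {text}"

def aisee_session(questions: list[str]) -> list[str]:
    """Run one capture, announce the object, then answer follow-up questions."""
    image = capture_image()
    obj = identify_object(image)
    outputs = [speak(f"This looks like {obj}.")]
    for question in questions:
        outputs.append(speak(answer_query(obj, question)))
    return outputs

if __name__ == "__main__":
    for line in aisee_session(["Is it past its expiry date?"]):
        print(line)
```

The point of the sketch is the standalone control flow the article describes: everything happens on the device in one loop, with no smartphone in the middle.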
Assoc Prof Nanayakkara explained: “Currently, visually impaired people in Singapore do not have access to this level of advanced assistive AI technology. Therefore, we believe that AiSee has the potential to enable visually impaired people to independently perform tasks that currently require assistance. Our next step is to make AiSee affordable and accessible to the masses. We are making further improvements to achieve this, including a more ergonomic design and a faster processing unit.”
NUS student Mark Myres, who helped test AiSee as a visually impaired user, said the product was very useful for both partially sighted and blind people.
Assoc Prof Nanayakkara and his team are currently in talks with SG Enable in Singapore to conduct user testing with visually impaired people. The findings will help improve AiSee’s features and performance.
Beyond this project, SG Enable also aims to collaborate with NUS to explore how artificial intelligence, human-computer interface and assistive technology can provide more technological options for people with disabilities.
Compiled by: Esin Özcan