A team of researchers has developed a wearable assistive device that helps people with visual impairment ‘see’ objects around them with the help of artificial intelligence (AI). AiSee aims to overcome the limitations of existing assistive technologies by leveraging state-of-the-art AI.
The wearable incorporates a discreet bone conduction headphone. The user simply holds an object and activates the in-built camera to capture an image of it. With the help of AI, AiSee identifies the object and provides additional information when queried by the user.
AiSee incorporates a micro-camera that captures the user's field of view. The captured images are processed by the software component of AiSee, also referred to as the vision engine, which is capable of extracting features such as text, logos, and labels from the captured image for processing.
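The feature-extraction step can be sketched roughly as follows. AiSee's actual vision engine is not public, so the class, field, and function names here are illustrative assumptions, and the detector output is mocked:

```python
from dataclasses import dataclass, field

# Hypothetical record of the features pulled from one captured frame.
@dataclass
class ImageFeatures:
    text: list = field(default_factory=list)    # OCR'd text fragments
    logos: list = field(default_factory=list)   # detected brand logos
    labels: list = field(default_factory=list)  # generic object labels

def extract_features(detections):
    """Sort raw detector output into the feature groups the text describes.

    `detections` is a list of (kind, value) pairs standing in for a real
    detector's output.
    """
    features = ImageFeatures()
    buckets = {
        "text": features.text,
        "logo": features.logos,
        "label": features.labels,
    }
    for kind, value in detections:
        # Anything unrecognised is treated as a generic label.
        buckets.get(kind, features.labels).append(value)
    return features

# Example: a can of soup seen by the camera.
mock_detections = [
    ("text", "Tomato Soup"),
    ("logo", "BrandX"),
    ("label", "canned food"),
]
features = extract_features(mock_detections)
print(features.text)   # ['Tomato Soup']
```

Grouping the raw detections this way lets later stages (identification and question answering) query each feature type separately.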
After the user snaps a photo of the object of interest, AiSee utilises sophisticated cloud-based AI algorithms to process and analyse the captured images to identify the object. The user can also ask a range of questions to find out more about the object.
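The capture-then-query flow described above might look like the sketch below. `CloudVisionStub` stands in for whatever cloud service AiSee actually uses; the method names and canned responses are assumptions for illustration only:

```python
# Sketch of the capture -> cloud-identify -> follow-up-question flow.
class CloudVisionStub:
    """Mock of a cloud-based recognition backend (not AiSee's real API)."""

    def identify(self, image_bytes):
        # A real service would run object recognition on the image;
        # here we return a canned result.
        return {"object": "instant noodles", "confidence": 0.93}

    def answer(self, object_name, question):
        # A real backend would ground its answer in the recognised object.
        return f"Sorry, I have no further details about the {object_name}."

def on_capture(image_bytes, client):
    """Handle one button press: upload the snapshot, return the object name."""
    result = client.identify(image_bytes)
    return result["object"]

client = CloudVisionStub()
obj = on_capture(b"fake jpeg bytes", client)
print(obj)                                   # instant noodles
print(client.answer(obj, "Is it spicy?"))
```

Keeping identification and follow-up questions behind one client interface mirrors the article's description: the heavy analysis happens in the cloud, while the wearable only captures and forwards images.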
AiSee employs advanced speech-to-text technology to comprehend the user’s spoken queries and text-to-speech technology to deliver its responses aloud. Powered by a large language model, AiSee excels in interactive question-and-answer exchanges, enabling the system to accurately comprehend and respond to the user’s queries in a prompt and informative manner.
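The voice interaction loop the article describes — speech-to-text, a language-model answer, then text-to-speech — can be sketched as below. All three stages are stubs, since AiSee's actual models and interfaces are not public:

```python
# Minimal sketch of the STT -> LLM -> TTS loop (all stages mocked).

def speech_to_text(audio):
    # Stand-in for a real speech recogniser; we pretend the audio
    # bytes are already the transcript.
    return audio.decode("utf-8")

def llm_answer(context_object, question):
    # Stand-in for a large-language-model call grounded on the
    # recognised object.
    return f"About the {context_object}: I cannot answer '{question}' offline."

def text_to_speech(text):
    # Stand-in for a TTS engine driving the bone conduction headphone.
    return f"[spoken] {text}"

def handle_query(audio, context_object):
    """One round of the question-and-answer exchange."""
    question = speech_to_text(audio)
    answer = llm_answer(context_object, question)
    return text_to_speech(answer)

print(handle_query(b"What brand is this?", "cereal box"))
```

The key design point is that the recognised object is carried as context into every follow-up question, which is what lets the language model answer about the specific item the user is holding.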
The headphone of AiSee utilises bone conduction technology, which enables sound transmission through the bones of the skull. This ensures that individuals with visual impairment can effectively receive auditory information while still having access to external sounds, such as conversations or traffic noise. This is particularly vital for visually impaired people as environmental sounds provide essential information for decision-making, especially in situations involving safety considerations.