Scientists from IBM Research (IBM) and Carnegie Mellon University (CMU) have launched an open software platform designed to support the creation of smartphone apps that can enable the blind to better navigate their surroundings.
The researchers used the platform to create a free pilot app, called NavCog, that draws on existing sensors and cognitive technologies to inform blind people on the CMU campus about their surroundings by ‘whispering’ into their ears through ear-buds or by creating vibrations on smartphones.
The app analyses signals from Bluetooth beacons located along walkways and from smartphone sensors to help enable users to move without human assistance, whether inside campus buildings or outdoors. Researchers are exploring additional capabilities for future versions of the app to detect who is approaching and their mood.
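The article does not describe NavCog's actual localisation algorithms, but the general idea of positioning a user from Bluetooth beacon signals can be sketched as below. The transmit power, path-loss exponent, beacon coordinates and function names here are all illustrative assumptions, not details of the real system.

```python
import math

# Hypothetical log-distance path-loss model. NavCog's real algorithms
# are not published in this article; this only illustrates how beacon
# signal strengths could be turned into a rough position estimate.

TX_POWER = -59.0   # assumed RSSI (dBm) measured at 1 m from a beacon
PATH_LOSS_N = 2.0  # assumed path-loss exponent for open walkways

def rssi_to_distance(rssi):
    """Convert a received signal strength (dBm) to an estimated distance in metres."""
    return 10 ** ((TX_POWER - rssi) / (10 * PATH_LOSS_N))

def locate(beacons):
    """Weighted-centroid position estimate from (x, y, rssi) beacon readings.

    Beacons that appear closer (stronger signal, smaller estimated
    distance) receive larger weights in the average.
    """
    weights = [1.0 / max(rssi_to_distance(r), 0.1) for _, _, r in beacons]
    total = sum(weights)
    x = sum(w * bx for w, (bx, _, _) in zip(weights, beacons)) / total
    y = sum(w * by for w, (_, by, _) in zip(weights, beacons)) / total
    return x, y

# Three beacons spaced along a walkway; the strongest reading comes
# from the middle beacon, so the estimate lands near x = 5.
readings = [(0.0, 0.0, -75.0), (5.0, 0.0, -60.0), (10.0, 0.0, -75.0)]
print(locate(readings))
```

In a real deployment the raw estimate would be smoothed over time (for example with a particle or Kalman filter) and fused with the smartphone's inertial sensors, which is presumably part of what the platform's localisation algorithms handle.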
The first set of cognitive assistance tools for developers is now available via the cloud through IBM Bluemix. The open toolkit consists of an app for navigation, a map editing tool and localisation algorithms that can help the blind identify in near real time where they are, which direction they are facing and additional surrounding environmental information. The computer vision navigation application tool turns smartphone images of the surrounding environment into a 3-D space model to help improve localisation and navigation.
CMU’s School of Computer Science has posted a new YouTube video – Cognitive Assistant for the Visually Impaired – showcasing the project.