Shape-changing device helps people with visual impairment perform location task

A groundbreaking navigation device can help people with visual impairment perform a location task as well as sighted people, new research shows.

Researchers from Imperial College London, working with the company MakeSense Technology and the charity Bravo Victor, have developed a shape-changing device called Shape that helps people with visual impairment navigate through haptic perception – the way people understand information about objects through touch. The device, which looks like a torch, bends to indicate where a person needs to move and straightens when the user is facing the correct direction.
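The paper does not publish the device's control law, but the bend-toward-target, straighten-when-aligned behaviour described above can be sketched as a simple proportional mapping from heading error to bend angle. Everything here (function name, deadband, bend range) is a hypothetical illustration, not the authors' implementation:

```python
import math

def bend_command(user_heading_deg: float, target_bearing_deg: float,
                 max_bend_deg: float = 40.0, deadband_deg: float = 5.0) -> float:
    """Hypothetical sketch: map the user's heading error to a bend angle.

    Returns a signed bend in degrees (positive = bend right, negative = left).
    Returns 0.0 when the user faces the target within a small deadband,
    mirroring the device straightening when the direction is correct.
    """
    # Signed shortest-path angular error, normalised into (-180, 180]
    error = (target_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    if abs(error) <= deadband_deg:
        return 0.0  # aligned: the device straightens
    # Proportional bend, clamped to the mechanism's assumed range
    return max(-max_bend_deg, min(max_bend_deg, error * (max_bend_deg / 180.0)))
```

For example, a target 90 degrees to the user's right produces a rightward bend, and the bend returns to zero once the user turns to face the target.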

In a study published in Scientific Reports, researchers tested how well people with visual impairment were able to locate targets in a 3D virtual reality (VR) space using Shape and using vibration feedback technology – which is commonly used to help visually impaired people navigate. Sighted individuals were also recruited for the study to locate the targets in the VR space using only their natural vision.

Dr Ad Spiers, lead researcher for the study, from Imperial’s Department of Electrical and Electronic Engineering, said: “The exciting thing about this study is we’ve managed to demonstrate that Shape can help people with visual impairment perform a navigation task as well as sighted people. This is something that we haven’t seen before with other navigation devices.

“Shape is unusual because it uses our ability to understand information through touch in a way that goes beyond vibration. Humans have an innate ability to feel and interpret shapes through our hands, with very little concentration. Exploiting this allows us to create a device that is simple to learn and isn’t tiring to use.”

The study compared 10 participants with visual impairment and 10 sighted participants, measuring how quickly and how efficiently each group located virtual targets in a controlled indoor environment.

The trial found that there was no significant difference in the performance between visually impaired participants using Shape and sighted participants using only natural vision. It also found that participants with visual impairment located targets significantly faster using Shape than with vibration technology. Feedback showed that participants with visual impairment preferred using Shape to vibration technology.

The researchers hope that the device, which they believe to be the most advanced of its kind, could shape the future of navigation technology for people with visual impairment, as it has notable advantages over the tools currently used to guide them.

Dr Robert Quinn, CEO of MakeSense Technology, said: “The impressive results from this study demonstrate the enormous potential of this technology to make life-changing improvements in mobility for people with visual impairment.

“Building upon the research described in this paper, MakeSense is developing a blind wayfinding product which leverages the latest advancements in spatial artificial intelligence and computer vision without the need for interpretive training. We are aiming for our first product to be available from the end of 2025.”

Currently, individuals with visual impairment most commonly use aids such as white canes or guide dogs. While guide dogs are often effective, they require expensive expert training and can cost thousands of pounds per year to keep. White canes enable navigation through a process of elimination by telling users where not to go, rather than where they should go. This process limits a user's ability to navigate freely in complex environments.

Recent technological developments have tended to focus on auditory interfaces, which give audio cues such as “turn left at the next corner”, or on vibration feedback, which alerts users through vibration patterns that indicate where to move.

Auditory interfaces can prevent people from hearing important warning sounds of imminent hazards and can dampen users’ ability to engage fully with the world. Vibration feedback can lead to numbness after prolonged use, and studies have shown that users can quickly become irritated and distracted by frequent vibration sensations.

To test the performance of Shape against vibration technology and natural sight in a controlled environment, the researchers designed a simulation of real-world navigation that minimised variation between experiments.

In a real-world navigation scenario, it is expected that there would be significant variation in conditions due to changes in weather and the presence of other pedestrians or objects. There would also often be multiple potential targets, rather than the single targets presented individually in the experiment.

Further research is needed to understand how the Shape device performs in more variable real-world scenarios.

The Shape device was developed with MakeSense Technology, a startup co-founded at Imperial by Dr Robert Quinn, an Imperial PhD graduate in Mechanical Engineering. The company received support in its early stages from Imperial’s thriving entrepreneurial ecosystem, which aims to develop innovative solutions with the potential to change the world for the better.

Following the completion of the Shape study, MakeSense has worked on developing the technology further to be used for real-world outdoor navigation. It is hoped that the device could be ready for practical use in real-world environments in the coming years.

The research published in Scientific Reports was supported by funding from Innovate UK’s SMART Grant, which was awarded to MakeSense Technology Ltd, Bravo Victor, and Imperial College London.

'A shape-changing haptic navigation interface for vision impairment' by Adam J. Spiers et al. is published in Scientific Reports.