A new support system for the visually impaired combines artificial intelligence with a stereoscopic camera. Designed to go unnoticed, the system fits into a backpack and a fanny pack, and provides voice guidance.
Visually impaired people could soon do without canes and guide dogs thanks to a new device developed by researchers at the University of Georgia in the United States. Carried in a backpack and a fanny pack so as to go unnoticed, the system uses cameras and artificial intelligence (AI) to perceive the world around the wearer and guide them.
The wearer is equipped with a Luxonis OAK-D camera, placed either in a vest with holes cut for the lenses or in the fanny pack. The camera has two lenses for stereoscopic vision, used in particular to gauge the distance of objects, plus a third color lens with 4K resolution (3,840 x 2,160 pixels). It is powered by a battery in the fanny pack and offers around eight hours of battery life.
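Stereoscopic distance estimation rests on a simple geometric idea: an object appears at slightly different positions in the two lenses, and the closer it is, the larger that shift (the disparity). A minimal sketch of the standard pinhole stereo formula, with illustrative numbers rather than the OAK-D's actual calibration values:

```python
# Sketch of stereo depth estimation: depth = focal_length * baseline / disparity.
# The focal length (in pixels), baseline (lens spacing, in meters) and
# disparity below are illustrative values, not OAK-D calibration data.

def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to a point, from the pixel shift between the two lens images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A distant object shifts little between the two images; a near one shifts a lot.
print(stereo_depth_m(800.0, 0.075, 30.0))  # 2.0 (meters)
print(stereo_depth_m(800.0, 0.075, 60.0))  # 1.0 (meters)
```

Computing this per pixel yields a depth map, which is what lets the system attach a distance to each detected obstacle.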
Neural networks in the camera
The backpack contains a computer connected to the camera, along with a GPS unit. Most of the processing, however, happens directly in the camera, which integrates an Intel Movidius Myriad X chip. Using deep neural networks, the camera detects obstacles both on the ground and overhead, such as tree branches, and identifies signs, other people, pedestrian crossings, changes in ground level and even building entrances.
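Conceptually, the pipeline combines the network's detections with the stereo depth map and keeps only what is close enough to matter. A hypothetical sketch of that filtering step — the labels and the 2-meter threshold are assumptions for illustration, not the researchers' actual code:

```python
# Hypothetical sketch: turn neural-network detections plus stereo distances
# into spoken-style warnings. Labels and threshold are illustrative.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # class predicted by the network, e.g. "tree branch"
    distance_m: float  # range taken from the stereo depth map

WARN_DISTANCE_M = 2.0  # announce anything closer than this (assumed value)

def warnings(detections: list[Detection]) -> list[str]:
    """Nearest obstacles first, far-away detections silently dropped."""
    return [
        f"{d.label} at {d.distance_m:.1f} meters"
        for d in sorted(detections, key=lambda d: d.distance_m)
        if d.distance_m < WARN_DISTANCE_M
    ]

print(warnings([Detection("tree branch", 1.2),
                Detection("car", 8.0),
                Detection("pedestrian crossing", 1.9)]))
# ['tree branch at 1.2 meters', 'pedestrian crossing at 1.9 meters']
```

Running the detection network on the camera's own chip keeps this loop off the backpack computer, which only has to handle GPS and the voice interface.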
The system provides voice guidance through a Bluetooth headset connected to the computer, which also lets the wearer issue voice commands, for example to request a description of the surroundings or to save a location in memory. The researchers intend to release their project under a free license, allowing DIYers to rebuild it or propose their own versions. The system currently requires a computer, but in the future it could conceivably run on a smartphone, or perhaps one day be integrated into connected glasses.
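A voice interface like the one described usually boils down to a dispatcher that maps recognized phrases to handlers. The sketch below assumes two commands mirroring the features in the article ("describe", "save"); the command names, handlers and placeholder GPS coordinates are all illustrative:

```python
# Hypothetical voice-command dispatcher. Command names and handler
# behavior are assumptions; a real system would call the speech
# recognizer, detection pipeline and GPS receiver here.

saved_places: dict[str, tuple[float, float]] = {}

def describe(_arg: str) -> str:
    return "Describing surroundings"  # would summarize current detections

def save_place(name: str) -> str:
    saved_places[name] = (33.95, -83.37)  # would read the live GPS fix
    return f"Saved {name}"

COMMANDS = {"describe": describe, "save": save_place}

def handle(utterance: str) -> str:
    """Split the recognized phrase into a verb and argument, then dispatch."""
    verb, _, arg = utterance.partition(" ")
    handler = COMMANDS.get(verb)
    return handler(arg) if handler else "Unknown command"

print(handle("save office"))  # Saved office
print(handle("describe"))     # Describing surroundings
```

Keeping the command set small and explicit is what makes such an interface usable over a Bluetooth headset without a screen.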