Wednesday, May 2, 2012

Research team uses robot eye technology to help the blind

A research team from Pierre and Marie Curie University in Paris has ported technology originally developed to help robots maneuver in real-world environments to Braille-enabled devices that help vision-impaired people do the same.

The new technology uses a pair of glasses fitted with sensors and cameras, connected to hand-held devices that allow the blind wearer to “feel” the 3D environment around them.

The team from the university's Institute of Intelligent Systems and Robotics will present its findings at this month’s IEEE International Conference on Robotics and Automation.

The new technology incorporates cameras and sensors initially developed for use with robotics technology, along with image processing hardware and software, to gather information about the surrounding environment.

From there, though, the two technologies diverge. Instead of routing the collected data to AI systems for use by a robot, the system sends it to a processor that converts the 3D imagery into Braille signals, which are passed along to real-time Braille devices held in the user's hands.

The result, the team says, is situational awareness unlike anything else currently available to help the blind move around without the benefit of a cane, guide dog or other assistive device.

To create true 3D imagery, the system uses two cameras, one on each side of the glasses, connected to an image processor that picks out objects, edges and other pertinent details of the view ahead.
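The article doesn't detail the team's image-processing pipeline, but the stereo principle is standard: matching points between the left and right views and measuring their horizontal shift (disparity) yields depth. Here is a minimal sketch using OpenCV's block matcher; the file names, focal length and baseline are illustrative assumptions, not the team's values:

```python
# Sketch of stereo depth estimation with OpenCV block matching.
# This only illustrates how two side-by-side cameras can yield a depth map;
# it is not the team's actual pipeline.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # left-eye camera frame
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # right-eye camera frame

# Block matcher: compares small windows between the two views; a larger
# horizontal shift (disparity) means the object is closer.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Depth is inversely proportional to disparity:
# depth = focal_length * baseline / disparity (parameter values are made up).
focal_length_px = 700.0   # hypothetical focal length in pixels
baseline_m = 0.12         # hypothetical distance between the two cameras
depth = np.where(disparity > 0, focal_length_px * baseline_m / disparity, 0.0)
```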

From that, a map is generated to represent the scene from the user’s perspective. That map, along with data from a gyroscope and accelerometers providing information about speed and current location, is then converted by another processor into a series of signals sent to the Braille devices in the hands.

Each Braille device is a flat pad with pins that can be raised or lowered using heat, creating a real-time tactile image of the surrounding environment. Because the system generates new maps at a rate of about ten per second, the developers say, the user is able to move at walking speed through a real-world environment.
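As a rough illustration of that last step, here is one way a depth map could be reduced to a coarse grid of pin heights; the grid size, range cutoff and mapping are assumptions for the sketch, not the team's published design:

```python
# Illustrative sketch: turning a depth map into a coarse grid of pin heights
# for a tactile pad. Grid size and range cutoff are assumed values.
import numpy as np

ROWS, COLS = 8, 8     # hypothetical pin-grid resolution
MAX_RANGE_M = 4.0     # treat anything beyond this distance as background

def depth_to_pins(depth):
    """Downsample a depth map (meters) to per-pin heights in [0, 1],
    where 1.0 means a fully raised pin, i.e. a nearby obstacle."""
    h, w = depth.shape
    crop = depth[: h - h % ROWS, : w - w % COLS]
    # Zero depth usually marks invalid pixels; push them to max range.
    crop = np.where(crop > 0, crop, MAX_RANGE_M)
    cells = crop.reshape(ROWS, crop.shape[0] // ROWS,
                         COLS, crop.shape[1] // COLS)
    # Take the nearest point in each cell so small obstacles
    # aren't averaged away.
    nearest = np.clip(cells.min(axis=(1, 3)), 0.0, MAX_RANGE_M)
    return 1.0 - nearest / MAX_RANGE_M   # closer objects raise pins higher
```

Regenerating such a grid roughly ten times per second, as the article reports, is what would let the wearer track a changing scene at walking pace.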

Only time will tell, of course, just how useful the new technology will be. Those who use it will likely develop a connection with the system over time, akin to the way others grow accustomed to keyboards, canes and guide dogs, and only they will be able to judge whether systems such as this are as useful as they appear in demonstrations.

This is not the only robotics project to be repurposed for the blind. Software that predicts how far a robot has travelled based on information from its on-board sensors is being modified to track a person's movements based on their stride length.
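The underlying idea, counting steps from inertial data and multiplying by stride length, can be sketched briefly; the peak threshold, sampling rate and default stride below are illustrative assumptions rather than values from the Nevada project:

```python
# Sketch of pedestrian dead reckoning of the kind the article describes:
# count steps as peaks in the accelerometer signal, multiply by stride length.
import numpy as np

def estimate_distance(accel_magnitude: np.ndarray,
                      sample_rate_hz: float = 50.0,
                      stride_m: float = 0.7,
                      threshold: float = 11.0) -> float:
    """Estimate distance walked from |acceleration| samples (m/s^2).
    A step is a local peak above `threshold`; `stride_m` would come
    from per-user calibration. All constants here are assumed."""
    steps = 0
    min_gap = int(0.3 * sample_rate_hz)  # ignore peaks closer than ~0.3 s
    last_step = -min_gap
    for i in range(1, len(accel_magnitude) - 1):
        a = accel_magnitude[i]
        if (a > threshold
                and a >= accel_magnitude[i - 1]
                and a >= accel_magnitude[i + 1]
                and i - last_step >= min_gap):
            steps += 1
            last_step = i
    return steps * stride_m
```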

The low-cost system, being developed by Eelke Folmer and Kostas Bekris at the University of Nevada in Reno, would help blind people navigate around buildings using just a smartphone.

The new system uses freely available 2D digital indoor maps and the smartphone's built-in accelerometer and compass. Directions are provided using synthetic speech. 
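In outline, each detected step advances an estimated position on the 2D map along the compass heading. A minimal, hypothetical sketch of that update (function and parameter names are illustrative, not from the project):

```python
# Sketch: advance a 2D map position by one detected step, using the
# phone's compass heading (0 degrees = map north, measured clockwise).
import math

def update_position(x: float, y: float,
                    heading_deg: float, stride_m: float) -> tuple[float, float]:
    """Return the new (x, y) map position after one step."""
    theta = math.radians(heading_deg)
    return x + stride_m * math.sin(theta), y + stride_m * math.cos(theta)
```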

To help the smartphone calibrate to a user's individual stride length, the user must initially locate landmarks in their environment by touch, such as corridor intersections, doors and elevators.
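Once the user has confirmed two landmarks by touch, the map distance between them divided by the steps counted in between gives an average stride length. A hypothetical sketch of that calibration:

```python
# Sketch of the calibration idea described above; names and the worked
# numbers are illustrative, not the project's actual interface.

def calibrate_stride(map_distance_m: float, steps_counted: int) -> float:
    """Given the map distance between two confirmed landmarks (e.g. a door
    and a corridor intersection) and the steps counted walking between
    them, return the user's average stride length in meters."""
    if steps_counted <= 0:
        raise ValueError("need at least one counted step")
    return map_distance_m / steps_counted

# e.g. 12 m between a door and an elevator, covered in 17 steps:
# calibrate_stride(12.0, 17)  # ~0.71 m per step
```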

This system will also be presented at the IEEE International Conference on Robotics and Automation in St Paul, Minnesota, in May.

David Ross at the Atlanta Vision Loss Center in Decatur, Georgia, says that the sensing problems faced by robots and blind people are similar but there are big differences. "Sensing systems developed for mobile robots may have some application, but must be adapted considerably to suit a wide variety of human needs and situations," he says.
