John Tolley, November 6, 2016
You're at the grocery store. You're looking for that one crucial, yet out-of-the-ordinary, ingredient for the recipe you've been dying to make. You scour the shelves from top to bottom, moving from aisle to aisle. Your frustration grows.
Now imagine that the item you're looking for isn't some hard-to-find import, but rather something as common as a box of the apple-cinnamon oatmeal you enjoy. The situation would be downright maddening, no?
Yet, for the visually impaired, this is an all too common experience when shopping alone.
Now a group of Penn State researchers led by Vijaykrishnan Narayanan, professor of computer science and engineering and electrical engineering in the College of Engineering, is looking to alleviate that frustration via The Third Eye Project.
The Third Eye is a device developed through a multi-university research project called Visual Cortex on Silicon, funded by the National Science Foundation as part of its Expeditions in Computing program. The project aims to model the human visual cortex in order to provide assistive technology to people with visual impairment.
According to researcher John Carroll, the innovative endeavor combines image recognition, computer learning, and user feedback to streamline the shopping process.
"The concept is that a person with visual impairment has a camera mounted on a glove," says Carroll, Distinguished Professor of Information Sciences and Technology at PSU. "They direct their hand towards places and targets of interest. The computer recognition system takes the camera data, understands what's in front of the camera and signals back to the person through vibrations on the glove or verbal or tone signaling through earphones."
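In rough outline, that loop runs continuously: capture a frame from the glove-mounted camera, recognize what is in view, and signal the result back to the shopper. A minimal sketch of such a loop, in Python with OpenCV, might look like the following; the recognize and signal_user functions here are hypothetical placeholders, not the project's actual recognition model or glove interface.

```python
import cv2

def recognize(frame):
    """Placeholder recognition step; a real system would run a trained
    model over the frame and return any product it identifies."""
    return None  # e.g. "apple-cinnamon oatmeal" when a match is found

def signal_user(label):
    """Placeholder feedback step; a real glove would vibrate or play a
    tone or spoken cue. Here we simply print the result."""
    print(f"Match found: {label}")

cap = cv2.VideoCapture(0)           # stands in for the glove-mounted camera
while cap.isOpened():
    ok, frame = cap.read()          # grab the current view
    if not ok:
        break
    label = recognize(frame)        # understand what's in front of the camera
    if label:
        signal_user(label)          # cue the shopper that the item is in view
cap.release()
```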
Since 2014, The Third Eye Project team has been working with members of the local chapter of the National Federation of the Blind to identify specific needs.
While shopping with a trusted friend or assistant is preferred by most visually impaired persons, it isn't always an option. Barcode-scanning devices grant a level of independence, but require the shopper to locate the UPC on a number of packages before finding the right one. For Michelle McManus, who has been blind since birth, the independence those devices afford almost isn't worth the hassle.
"You have to find the right aisle," says McManus, who works as an IT Specialist for Penn State. "You then have to scan products in that aisle to see if they're similar to what you're looking for, and then go to the next aisle. It's not a time saver."
A device like the Third Eye glove, however, would drastically cut down on the guesswork involved. Rather than searching for a specially coded label to scan, the camera on the glove works much like the human eye, visually identifying the product. The packaging is compared against a database of product images to help shoppers correctly locate their desired item. All the shopper has to do is scan the shelf with the glove.
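To illustrate the idea of matching a shelf view against a database of package photos, the sketch below compares local image features between a camera frame and a set of reference images using OpenCV's ORB detector. The product_images/ folder, the match threshold, and the ORB-based approach are all assumptions for illustration; the project's own system presumably relies on a trained recognition model rather than this simple matcher.

```python
import cv2
import glob

orb = cv2.ORB_create()                                    # feature detector/descriptor
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Hypothetical database: one reference photo per product package.
database = {}
for path in glob.glob("product_images/*.jpg"):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = orb.detectAndCompute(img, None)
    if desc is not None:
        database[path] = desc

def best_match(shelf_frame, min_matches=25):
    """Compare one camera frame against every reference image and return
    the product whose features match best, if any clear the threshold."""
    gray = cv2.cvtColor(shelf_frame, cv2.COLOR_BGR2GRAY)
    _, desc = orb.detectAndCompute(gray, None)
    if desc is None:
        return None
    best, best_count = None, 0
    for name, ref_desc in database.items():
        count = len(matcher.match(desc, ref_desc))
        if count > best_count:
            best, best_count = name, count
    return best if best_count >= min_matches else None
```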
In addition to shopping assistance, the team is also looking at how augmented reality and driver assistance technologies could help the visually impaired lead drastically more independent lives.
As McManus sees it, that freedom will pay dividends.
"[The Third Eye Project] strengthens people's mobility skills and ability to get out and do more. You learn more about your community because you're out and you can hear people talking. It gives you more engagement with the people that are around you."