Accessibility is one of the areas technology companies have been focusing on and improving in recent years. A new feature on the Amazon Echo Show helps blind and low-vision customers identify everyday household pantry items that are difficult to distinguish by touch. The feature uses computer vision and machine learning to recognize whatever item is placed in front of the device, and it will be available on the first- and second-generation versions of the Echo Show. This Alexa-powered smart speaker is geared toward kitchens, helping out with kitchen-related tasks such as setting timers and watching recipe videos.
Users simply say something like "Alexa, what am I holding?" or "Alexa, what's in my hand?" and the Echo Show responds with verbal cues identifying the product. Amazon developed the feature with blind Amazon employees, including its principal accessibility engineer, Josh Miele, gathered feedback from both blind and low-vision customers, and collaborated with the Vista Center for the Blind in Santa Cruz. The feature is currently only available in the US, but we're hoping it gets a broader rollout in the future.
Source: TechCrunch