Tuesday, September 24, 2019

Amazon Echo Show helps identify household pantry items in the US

Accessibility is one of the areas technology companies have been focusing on and improving in recent years. A new feature on the Amazon Echo Show helps blind and low-vision customers identify everyday household pantry items that are difficult to distinguish by touch. The feature uses computer vision and machine learning to recognize whatever item is placed in front of the device, and it's available on the first- and second-generation versions of the Echo Show. The Alexa-powered smart display is geared toward kitchens, since it already helps out with kitchen-related tasks such as setting timers and watching recipe videos.

Users simply say something like "Alexa, what am I holding?" or "Alexa, what's in my hand?" and the Echo Show responds with verbal cues identifying the product. Amazon developed the feature with blind Amazon employees, including its principal accessibility engineer, Josh Miele, gathered feedback from blind and low-vision customers, and collaborated with the Vista Center for the Blind in Santa Cruz. The feature is currently only available in the US, but we're hoping it gets a broader rollout in the future.

Source: TechCrunch
