Amazon Is Developing A Wearable That Can Read Human Emotions, Because That’s What The AI Overlords Demanded
Oh, c’mon, Amazon! We already have to deal with Google, Facebook, Instagram, et al., trying to “hijack our minds,” but now you’re getting into that business too?!
Isn’t it bad enough that ONCE AGAIN a database of tens of millions of Instagram users (and growing) was just exposed online to anyone who wants to see it?
According to TechCrunch, “The database, hosted by Amazon Web Services, was left exposed and without a password allowing anyone to look inside.”
Can you imagine what would happen if a company had access to data that drills all the way down to everyone’s emotions? Because that’s exactly what Amazon is trying to collect with a new wearable device it is developing.
Internal documents reviewed by Bloomberg claim that the device is being described as a “health and wellness product.”
Designed to work with a smartphone app, the device has microphones paired with software that can discern the wearer’s emotional state from the sound of his or her voice, according to the documents and a person familiar with the program. Eventually the technology could be able to advise the wearer how to interact more effectively with others, the documents show.
It’s unclear how far along the project is, or if it will ever become a commercial device. Amazon gives teams wide latitude to experiment with products, some of which will never come to market. Work on the project, code-named Dylan, was ongoing recently, according to the documents and the person, who requested anonymity to discuss an internal matter. A beta testing program is underway, this person said, though it’s unclear whether the trial includes prototype hardware, the emotion-detecting software or both.
A U.S. patent filed in 2017 describes a system in which voice software uses analysis of vocal patterns to determine how a user is feeling, discerning among “joy, anger, sorrow, sadness, fear, disgust, boredom, stress, or other emotional states.” The patent, made public last year, suggests Amazon could use knowledge of a user’s emotions to recommend products or otherwise tailor responses.
A diagram in the patent filing says the technology can detect an abnormal emotional condition and shows a sniffling woman telling Alexa she’s hungry. The digital assistant, picking up that she has a cold, asks the woman if she would like a recipe for chicken soup.
Uh, hard pass. Amazon already has its employees listening to everything people say to Alexa (and laughing about it); I don’t think we need the company knowing what kind of mood we’re all in at any given moment, too.
You’ll have to excuse me if I don’t believe you.