Apple Whistleblower Says Siri Regularly Records Things Like Drug Deals And People Having Sex, And Workers Listen

At this point I just assume that all of my passwords have been stolen, all my personal information is somewhere in the public domain, and that every smart device I own, including my car, is recording me both visually and audibly.

It’s just the world we live in now and I simply don’t have the energy to fight it anymore. The people who are supposed to be watching over us in matters such as personal privacy – aka our government – do little to nothing to stop it, even in the face of overwhelming evidence.

Companies like Amazon, Google, Apple, and Facebook are constantly, both overtly and covertly, invading our personal privacy and “hijacking our minds and society.”

Back in April, it was reported that Amazon workers listen to what we say to Alexa and share amusing recordings with one another.

Now an Apple whistleblower has come forth and revealed that subcontractors who work for Apple regularly hear things like users’ confidential medical information, drug deals going down, and even couples having sex.

The whistleblower, who asked to remain anonymous for fear of losing their job, told The Guardian how Siri is often accidentally activated without the user’s knowledge.

“The sound of a zip, Siri often hears as a trigger,” the contractor said. The service can also be activated in other ways. For instance, if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

According to the whistleblower, the Apple Watch and the HomePod smart speaker are the most likely to generate “accidental” recordings.

“The regularity of accidental triggers on the watch is incredibly high,” they said. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”

Sometimes, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

As if that wasn’t bad enough, the whistleblower added, “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

So the next time you ask Siri if she is always listening, and she responds, “I only listen when you’re talking to me,” know that she’s lying through her virtual teeth.