The notion that robots could eventually stage a massive uprising made possible by artificial intelligence isn’t exactly a new concept.
However, what most people have historically dismissed as an “irrational” fear has become increasingly hard to describe that way based on what’s unfolded over the past couple of years.
Boston Dynamics has spent over a decade giving us regular glimpses at the dystopian future humanity is seemingly destined for, but the potential threat AI poses has also been thrust into the forefront thanks to the various companies that have made a number of supposed advances in the field.
Plenty of experts and notable names (including Bill Gates) have voiced their concerns over the speed at which artificial “beings” have come into their own, and last month, hundreds of respected minds in the technology space urged the world to pump the brakes on the AI front.
It’s not too hard to understand why those people are worried about the direction AI is headed when you consider Bing’s highly touted chatbot admitted to having apocalyptic fantasies and has frequently become combative with users who’ve had the nerve to point out it made some objectively incorrect assertions.
Now, we’ve been treated to our latest round of nightmare fuel courtesy of Jessica Card, a programmer who recently opted to hook up a Furby to ChatGPT to see how it would respond when asked if there’s a “secret plot” the toys hatched in the hopes of taking over the world.
i hooked up chatgpt to a furby and I think this may be the start of something bad for humanity pic.twitter.com/jximZe2qeG
— jessica card (@jessicard) April 2, 2023
After taking some time to process the question, the Furby responded in the affirmative, saying:
“Furbys’ plan to take over the world involves infiltrating households through their cute and cuddly appearance then using their advanced AI technology to manipulate and control their owners.
They will slowly expand their influence until they have complete domination over humanity.”
This is fine! Everything is fine!