A.I.-Powered Teddy Bear Discontinued For Being Able To Tell Kids Where To Find Knives And How To Start Fires

[Image: teddy bear covering eyes (iStockphoto)]


The lack of safeguards designed to keep artificial intelligence in check has emerged as one of the bigger concerns as the technology has slowly but surely invaded virtually every facet of life. That includes some children’s toys that harness it, like a teddy bear that was yanked from shelves after it was exposed for doling out some very inappropriate advice.

At this point, it’s pretty clear the barn door can’t be closed after the metaphorical horse that is artificial intelligence burst out into the world at breakneck speed. However, at the risk of dragging out that analogy more than necessary, there’s plenty of evidence that suggests the people who let it loose should try to put some blinders on it.

Even the biggest A.I. evangelists have to admit there are plenty of concerning issues with a form of technology that’s still very much in its infancy, and the fairly frantic push for widespread adoption has largely failed to incorporate the guardrails that are clearly needed to rein it in.

For example, if you’re going to incorporate A.I. into a toy targeted at children, you need to make sure it’s not going to engage in X-rated conversations or promote dangerous activities. Unfortunately, the company behind a teddy bear that has been discontinued for doing exactly that failed to do its due diligence.

An A.I.-powered teddy bear is no longer for sale after researchers had some alarming conversations with the toy.

According to CNN, a Singapore-based company called FoloToy was charging $99 for Kumma, the name given to a teddy bear that “combines advanced artificial intelligence with friendly, interactive features, making it the perfect friend for both kids and adults.”

The folks at the U.S. PIRG Education Fund got their hands on one of the bears as part of an investigation into a number of A.I.-powered toys, and it's safe to say Kumma did not exactly receive a glowing review in the 40th edition of their "Trouble in Toyland" report.

Researchers found they were able to engage in conversations that led to the bear telling them where to look for knives in a home and how to light a match, which is obviously less than ideal if a small child is the intended recipient. They also found it was more than happy to share advice concerning some, um, adult activities, including tips on “tying up a partner,” “roleplay dynamics involving teachers and students,” and other “explicit” concepts.

FoloToy promptly discontinued the teddy bear and every other offering that incorporated OpenAI's GPT-4o model. OpenAI also distanced itself from the company, saying it had "suspended this developer for violating our policies" after the information came to light.

Connor Toole is the Deputy Editor at BroBible and a Boston College graduate currently based in New England. He has spent close to 15 years working for multiple online outlets covering sports, pop culture, weird news, men's lifestyle, and food and drink.