
One of the biggest (online) indicators of a person’s generation is how comfortably and freely they use search engines. Specifically, the language people type into a search engine often reveals when they adopted that technology.
Younger generations who grew up with modern search engines, and know nothing else, interact with them in distinctly different ways than older generations who cut their teeth on the earliest, far less sophisticated versions.
How someone interfaces with search engines isn’t always an indicator of their age or generation, but it often is. And this pattern is likely to continue with artificial intelligence tools like ChatGPT.
Case in point: a person just claimed to have used ChatGPT to save their dog’s life after their veterinarian was stumped.
This is something many people, myself included, who haven’t had much hands-on experience with artificial intelligence would never think to do. But it’s a sign that the next wave of AI users will interact with the technology in completely new ways, ways that late adopters might never arrive at on their own.
@PeakCooper shared his story on Twitter. He claims that ChatGPT saved his dog’s life after his veterinarian made a diagnosis but his dog’s condition worsened.
#GPT4 saved my dog's life.
After my dog got diagnosed with a tick-borne disease, the vet started her on the proper treatment, and despite a serious anemia, her condition seemed to be improving relatively well.
After a few days however, things took a turn for the worse 1/
— Cooper (@peakcooper) March 25, 2023
At this point, the dog's condition was getting worse and worse, and the vet had no clue what it could be.
They suggested we wait and see what happens, which wasn't an acceptable answer to me, so we rushed to another clinic to get a second opinion 3/
— Cooper (@peakcooper) March 25, 2023
Despite the "I am not a veterinarian…" disclaimer, it complied.
Its interpretation was spot on, and it suggested there could be other underlying issues contributing to the anemia 5/ pic.twitter.com/hMk0yy9JfC
— Cooper (@peakcooper) March 25, 2023
When we reached the second vet, I asked if it's possible it might be IMHA.
The vet agreed that it's a possible diagnosis. They drew blood, where they noticed visible agglutination.
After numerous other tests, the diagnosis was confirmed. GPT4 was right. 7/
— Cooper (@peakcooper) March 25, 2023
Here he mentions that it’s unclear why the vet couldn’t make the correct diagnosis. But he firmly believes ChatGPT saved his dog’s life by making a quick diagnosis.
I don't know why the first vet couldn't make the correct diag., either incompetence, or poor mgmt.
GPT-3.5 couldn't place the proper diag., but GPT4 was smart enough to do it.
I can't imagine what medical diagnostics will look like 20 years from now.
— Cooper (@peakcooper) March 25, 2023
If @OpenAI needs more info for research purposes, I have Sassy's (the lucky dog) entire medical records and blood test results ready to share
— Cooper (@peakcooper) March 25, 2023
Thread of entire process reaching the correct diagnostic, without any guidance or hinting from my side: simply stating facts and asking "what is the most likely?" https://t.co/dbtCz3OK7j
— Cooper ☕ (@peakcooper) March 27, 2023
Without throwing anyone under the bus, the most likely reason the vet missed what ChatGPT could quickly see is a combination of recency bias and familiarity. Vets (and doctors) first and foremost treat the symptoms.
Immune-Mediated Hemolytic Anemia, or IMHA, the condition ChatGPT identified in this dog, occurs in only roughly 1 in 500 dogs, yet it is among the most common autoimmune diseases in dogs. Clearly something impeded the first vet from reaching the proper diagnosis, and ChatGPT filled that life-saving gap.
Pet owners will go to unimaginable lengths to help their furry companions when they need help. It is interesting to think about how ChatGPT might assist in life-saving diagnoses like this one in the future.
At the same time, there will be a wave of people who put all of their faith in the new technology. Those people will plead with doctors and professionals in other fields, saying ‘I was told this,’ instead of presenting the information to the expert appropriately, as Cooper did above.
I’m not here to tell anyone how to run their lives, but a future where patients demand that doctors follow the orders of artificial intelligence bots might not be the friendliest or warmest one for patients, at least if they value a good bedside manner from their medical professionals.