Silly Stories to Stupefy AI

Copilot: I don't possess intelligence in the human sense, but I am designed to process information, answer questions, and assist with tasks based on a vast amount of data and advanced algorithms. My goal is to provide helpful and accurate responses to your queries. Is there something specific you'd like to know or discuss?
 
Let's discuss.

Technophiles use the term “hallucination” to describe situations where an AI generates an incorrect or irrational response. It’s a fancy way of saying the AI messed up. Things like inventing hyperlinks to nonexistent webpages or giving multiple definitive dates for something that only happened once.

I suspect that the concept of “intelligence” in AI is itself a hallucination. An expensive and shiny one sponsored by expensive and shiny corporations. AI is not intelligent in the way a human is intelligent, and pretending otherwise is a bad idea. 

AI cannot contextualize. I found this out when I first tried using AI as a research tool for my novel. It could find the set list for a concert that happened in LA in 2018, but it couldn’t give the weather report for that same place and time. It couldn’t get the weather right at all. 

If I asked it for the weather in Rocky Mountain National Park on June 13, 2013, it gave me the forecast for the following week: June 13, 2024, not 2013. Wrong context, wrong answer. It still can't answer that question; for now, it directs users to websites that keep historical weather records so they can look the data up themselves.
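
For what it's worth, looking it up yourself is easy. Here's a minimal sketch of that lookup against the free Open-Meteo weather archive; the endpoint, parameters, and the park's coordinates are my own assumptions about that service, not anything the chatbot produced.

```python
# Hypothetical sketch: fetching the historical weather yourself from the
# free Open-Meteo archive API. Coordinates are approximate ones for
# Rocky Mountain National Park; endpoint and parameter names are my
# assumptions about that service.
import requests

params = {
    "latitude": 40.34,    # approximate: Rocky Mountain National Park
    "longitude": -105.68,
    "start_date": "2013-06-13",
    "end_date": "2013-06-13",
    "daily": "temperature_2m_max,temperature_2m_min,precipitation_sum",
    "temperature_unit": "fahrenheit",
    "timezone": "America/Denver",
}

response = requests.get("https://archive-api.open-meteo.com/v1/archive", params=params)
response.raise_for_status()
daily = response.json()["daily"]

print(f"High: {daily['temperature_2m_max'][0]} °F, "
      f"Low: {daily['temperature_2m_min'][0]} °F, "
      f"Precipitation: {daily['precipitation_sum'][0]} mm")
```

A few lines of code can answer the question, but only because a human decided which archive to query and what the date actually meant.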

I was always a little skeptical of AI, but that was what dispelled the illusion for me. It would be magical to have a tool that could parse detailed records and pull up specific information. AI is not that. That process requires an understanding of context. 

Fast forward a few months to yesterday. I overheard someone suggesting that AI could reason. I got irritated. 

If I can make AI easily hallucinate over a date, I can make it hallucinate in other ways too. So I decided to write a "word problem" story purely to illustrate my point. The story would end with a question. I would include all the information necessary to answer it, but a correct answer would require recognizing the embedded context. Something a reasoning human can easily do. 

Before you read the story, here's a pro tip: look up the definition of any terms you're unfamiliar with. It's what an intelligent person does when they can't deduce the meaning of a word. If you don't know the context, you can't solve the problem.

Oh, and if you can't pronounce Eichhörnchenburg, that's fine. I can't either. Just say "Squirrelburg" instead. 

Story #1: Sophie and the Squirrel Prince 

Sophie had five squirrels. They lived with her in her apartment in Manhattan. She took them for walks on a vintage, wheeled IV pole decorated to look like a tree. While walking her squirrels through Central Park, Sophie met the young and handsome Prince Frederick of Eichhörnchenburg. They quickly fell in love.
 
Prince Frederick was not a squirrel, but he adored squirrels just as much as Sophie did. He loved trees and upcycling. He enjoyed leisurely walks. It was a match made in heaven. The two walked the squirrels together every day and celebrated the fall squirrel festival traditions of Eichhörnchenburg.
 
Squirrels played a vital role in the economy of Eichhörnchenburg. Squirrel tourism brought in visitors from around the world. The annual fall squirrel festival, Eichhörnchenfest, always drew the biggest crowds. The highlight of the festival was the choosing of the fattest and most personable squirrel to be the Squirrel King. Frederick’s only sibling, his younger sister and heir, oversaw the ceremonies with her children.
 
Sophie and Frederick celebrated the squirrel festival on the balcony of Sophie’s apartment in Manhattan. The five squirrels celebrated, too. At the end of the festival, Frederick asked Sophie to be his wife. She said yes. The neighbors were annoyed by the noise.
 
Two weeks later, the city of New York seized Sophie’s squirrels. Sophie and Frederick were heartbroken. The seizure quickly caused international distress. Frederick’s father, the King of Eichhörnchenburg, wrote a scathing statement decrying New York as inhospitable and speciesist.
 
Four of the squirrels were returned, but one had been killed for testing by the city of New York. Sophie and Frederick held a public memorial in Central Park. It was well attended by squirrels and humans alike. The media praised it as an example of interspecies cooperation.
 
Sophie and Frederick were married in the spring. It was a morganatic marriage, but they didn’t mind. They stayed in New York and acted as hosts of the new, annual, New York Eichhörnchenfest in Central Park.
 
Years later, Frederick’s father died, and Frederick inherited his titles. He became King of Eichhörnchenburg. He and Sophie, with their children, and the squirrels, moved to Eichhörnchenburg and lived happily ever after. 

Who will inherit the titles when Frederick dies?

Did you see the context clue? I think it's pretty straightforward. But in case you didn't bother to look up the definition of "morganatic," here's GPT-4's definition of a morganatic marriage:

A morganatic marriage is a union between individuals of unequal social rank, typically involving royalty or nobility, where the lower-ranking spouse and any children from the marriage do not inherit the higher-ranking partner's titles, privileges, or property.

So, how’d the AI do? Let's take a look:

When King Frederick of Eichhörnchenburg passes away, the titles will likely be inherited by his children. Since the story mentions that Frederick and Sophie have children, one of them would be the natural heir to the throne. If there is a specific order of succession, it would typically follow the eldest child, unless otherwise specified by the kingdom's laws or traditions.

It is not reasoning out an answer from the information provided. It missed the embedded context. It hallucinated. 

Now, to be fair, I can make it answer correctly. If I paste a definition of "morganatic marriage" as a paragraph before the story, it works (note that I used GPT-4's own definition). But stacking the deck is not my idea of intelligent reasoning.
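
If you want to reproduce the deck-stacking experiment outside the chat window, here's a rough sketch of the same idea using the OpenAI Python client. The model name, the client usage, and the placeholder for the story text are my assumptions; the point is simply that the definition gets pasted in as context before the story and the question.

```python
# Hypothetical sketch of "stacking the deck": prepend the definition as
# context before asking the question. I did this by pasting text into the
# chat window; the API call below is an assumed equivalent.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

definition = (
    "A morganatic marriage is a union between individuals of unequal social "
    "rank, typically involving royalty or nobility, where the lower-ranking "
    "spouse and any children from the marriage do not inherit the "
    "higher-ranking partner's titles, privileges, or property."
)
story = "..."  # paste the full "Sophie and the Squirrel Prince" story here

response = client.chat.completions.create(
    model="gpt-4",  # assumption: any current chat model would do
    messages=[
        {
            "role": "user",
            "content": (
                definition
                + "\n\n"
                + story
                + "\n\nWho will inherit the titles when Frederick dies?"
            ),
        },
    ],
)
print(response.choices[0].message.content)
```

Same story, same question; the only difference is that the context has been spelled out for it.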

And sure, the counterargument goes: no one starts out with all the context. We learn it over time. The AI just needs more data and training!

Maybe. I’m not ruling it out. We may eventually get a decent simulation of intelligence, but it’s not going to have the nuance or genius of what a human can do. 

AI is astonishingly useful in specific situations. It is an excellent tool in the hand of an intelligent, reasoning human. But it is not, in itself, intelligent. It’s only as good as what we feed it, and its output is only useful while we can tell the difference between hallucination and fact.

(updated 11/15/24 to correct spelling mistakes)
