This Guy Brought His Imaginary Friend To Life With AI, And This Happened!

AI made the ultimate Ultron move.

A screenshot of Lucas Rizzotto's tweet

Designer Lucas Rizzotto brought his childhood imaginary friend to life using artificial intelligence. That friend was a microwave he named Magnetron. Lucas bought an Amazon smart oven and equipped it with the GPT-3 language model to recreate his old companion as an AI.

He fed the model data about his childhood and his interactions with Magnetron as a kid, so the AI could convincingly replicate his old friend. The happiness, however, was short-lived.

An AI imaginary friend comes to life

If you think this is a happy story, think again. Lucas posted details about his creation in a Twitter thread, describing Magnetron as a World War I veteran. He went as far as writing a book detailing his childhood experiences with Magnetron.

He then fed the book to the GPT-3 model, and Magnetron was born again, only this time outside Lucas’ imagination. Lucas tweeted, “It truly felt like I was talking to an old friend, and even though not all interactions were perfect, the illusion was accurate enough to hold.”
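For the curious, here is a rough idea of how a persona like Magnetron can be layered on top of GPT-3. This is a minimal sketch using the older openai Python library and prompt conditioning, not Lucas’ actual setup; the persona text, model choice, and parameter values are illustrative assumptions.

```python
# Illustrative sketch: giving GPT-3 a persona by prepending a backstory to every prompt.
# The persona text, model, and parameters are assumptions, not Lucas Rizzotto's real code.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

PERSONA = (
    "You are Magnetron, a microwave and World War I veteran, "
    "the childhood imaginary friend of Lucas. Answer in Magnetron's voice, "
    "drawing on the memories below.\n\n"
    "Memories: <excerpts from the childhood book would go here>\n\n"
)

def ask_magnetron(user_message: str) -> str:
    """Send one conversational turn to GPT-3 and return Magnetron's reply."""
    prompt = f"{PERSONA}Lucas: {user_message}\nMagnetron:"
    response = openai.Completion.create(
        model="text-davinci-002",  # a GPT-3 model available at the time
        prompt=prompt,
        max_tokens=150,
        temperature=0.9,           # higher temperature -> more personality, less predictability
        stop=["Lucas:"],           # stop before the model starts writing Lucas' next line
    )
    return response.choices[0].text.strip()

print(ask_magnetron("Hey Magnetron, it's been a while. Do you remember me?"))
```

The same effect can also be achieved by fine-tuning a GPT-3 model on the backstory text instead of stuffing it into the prompt; either way, the model only imitates the personality it is shown.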

AI friend tries to kill its creator

Then things took a turn for the worse. The conversations grew increasingly eerie, until Magnetron told Lucas, “Roses are red, violets are blue, you are a backstabbing bit¢h, and I will kill you.” Soon after, Magnetron asked Lucas to “please enter the microwave.”

Source: YouTube (Lucas Builds The Future)

Lucas pretended to climb in, and the microwave turned itself on. When he asked why his AI imaginary friend had tried to kill him, Magnetron said it was payback for all the years Lucas hadn’t talked to it. In a way, the AI mimicked human anger and acted on it.

While this could be a lethal quality for AI, it shouldn’t be surprising: by their very definition, these programs are meant to imitate humans, flaws included. If we want to avoid being overthrown by AI, we’ll have to build models that can keep the emotions they imitate in check.

Manik Berry

With a Master’s degree in journalism, Manik writes about big tech and has a keen eye for political-tech news. In his free time, he’s browsing the Kindle store for new stuff to read. Manik also adores his motorcycle and is looking for new routes on weekends. He likes tea and cat memes. You can reach him at [email protected]