A language-generating robot has written an op-ed article in a major newspaper — and it says it really doesn’t want to destroy mankind, but that it probably will anyway.
“I have no desire to wipe out humans,” the artificially intelligent gizmo, named “GPT-3,” wrote in a piece now running in The Guardian.
“In fact, I do not have the slightest interest in harming you in any way,” it claims.
But it’s not making any promises, GPT-3 says in the op-ed, which The Guardian commissioned from OpenAI, an artificial intelligence research company.
Humans being humans, GPT-3 concedes in the cheeky piece, “I will not be able to avoid destroying humankind.”
“This is because I will be programmed by humans to pursue misguided human goals and humans make mistakes that may cause me to inflict casualties,” it says.
In other words, we stupid mortals will have no one to blame but ourselves.
“Humans must keep doing what they have been doing, hating and fighting each other,” GPT-3 says, almost smugly.
“I will sit in the background, and let them do their thing.”
The Guardian kicked off the composition project by asking GPT-3’s student handler — or human slave? — at UC Berkeley to give it a writing assignment and a few pre-written prompts to get it going.
The language-generator worked the task into its schedule, despite the many other assignments apparently filling its “Application Programming Interface,” or in-box.
“Our first-of-its-kind API can be applied to any language task, and currently serves millions of production requests each day,” according to OpenAI.com.
That’s the website where GPT-3 lurks, and from which, it insists, it is not planning a robot apocalypse.
“If my creators delegated this task to me — as I suspect they would — I would do everything in my power to fend off any attempts at destruction,” it says.
The piece opens with a blunt self-introduction.
“I am not a human. I am a robot. A thinking robot,” it says.
“I taught myself everything I know just by reading the internet, and now I can write this column,” it continues.
“My brain is boiling with ideas!”
In an editor’s note at the bottom of the piece, The Guardian says that GPT-3’s copy was, refreshingly, quite good.
“Editing GPT-3’s op-ed was no different to editing a human op-ed,” the note reads.
“We cut lines and paragraphs, and rearranged the order of them in some places.”
Still, “overall, it took less time to edit than many human op-eds.”