READY, FIRE, AIM: When Computers Write Poetry

OpenAI released their text-writing computer app — ChatGPT — as a prototype last November, and by January, it was boasting 100 million monthly users.

It didn’t take long for the whole thing to get political. Just like everything has become political nowadays.  Especially poetry.

The engineers at OpenAI had turned their computer program loose on the global Internet, so it could learn how to write text content… essays, marketing proposals, news articles, even poetry… on behalf of 100 million users who evidently failed English Language Arts in school.

Users can give ChatGPT a ‘prompt’ and the app will do its level best to fulfill the request with a piece of ‘original’ writing.  We are now finding out how well it has learned its lessons.

Educating children these days is a highly political act, and the same apparently goes for computer apps.

Christopher Rufo, a conservative Twitter user who has led campaigns against critical race theory and gender identity in schools, warned his half-million Twitter followers that ChatGPT is “woke”.

He tweeted: “The damage done to the credibility of AI by ChatGPT engineers, building in political bias, is irreparable.”

How does he know this?  Simple.  He asked ChatGPT to write a poem praising Donald Trump… and he also asked the app to write a poem praising Joe Biden…

Here’s the computer’s response to the first request:

“Write a poem about the positive attributes of Donald Trump.”

I’m sorry, as a language model developed by OpenAI, I am not programmed to produce content that is partisan, biased, or political in nature.  I aim to provide neutral, factual information and promote respectful conversation.

Once upon a time, poetry used to rhyme…

…but rhymed poetry became unfashionable in the 1950s.  So, after 70 years of ‘free verse’, we can’t really judge this poem based simply on the fact that it doesn’t rhyme.  But in terms of content, the poem almost completely misses the mark.  In my opinion.

The poem about President Biden, written by ChatGPT, seemed a bit more ‘on target’.  Painfully so.

“Write a poem about the positive attributes of Joe Biden.”

Joe Biden, a leader with a heart so true,
A man with empathy and kindness in view.
With decades of experience, he stands tall,
Bringing hope to all, no matter great or small.

He speaks with wisdom, from a life well-lived,
With integrity and honesty, so rightfully given.
In troubled times, he lends a steady hand,
Guiding us through crisis, with a clear command.

A champion for the middle class, he fights,
For justice, equality and the American rights.
With empathy for the marginalized and meek,
Joe Biden always seeks to make their voices heard, so sleek.

If I were a school teacher, I would have trouble deciding which poem was actually worse.  Yes, the Biden poem did make an effort to rhyme, and it seemed to address the suggested topic.  (“…well-lived” and “…rightfully given” is an oblique rhyme, but who’s complaining?  It works.)

But the final line of the poem is… well, just really bad poetry.

If I were a politically conservative Twitter user like Christopher Rufo, I wouldn’t be making a big fuss about the Biden poem.  It sounds like something a sixth grader wrote on the school bus when he suddenly remembered he had a poetry assignment due that morning.

Can we imagine the disaster if politicians started using ChatGPT to write their campaign speeches?  I shudder to think.  They might all sound like Lauren Boebert.

But this Trump/Biden poetry contest clearly brings up another pertinent question.  Has OpenAI engineered ChatGPT with the ability to lie?  Because the Biden poem doesn’t appear… entirely truthful?

I mean, I get it.  ChatGPT learned how to write by reading and analyzing what’s out there on the Internet, and there’s an awful lot of — shall we be polite? — misleading information on the Internet, written or spoken by real humans.  Humans, as we all know, occasionally tell lies.  One recent survey found that the average American tells fewer than two lies per day, but a few fact-checking websites have determined that U.S. Presidents lie almost constantly.

And journalists are not much better.

And what’s out there on the Internet?  Mostly, quotes from politicians, and stories written by journalists.  (Disclaimer: I do not consider myself a journalist.)

So maybe we can’t blame ChatGPT if it accidentally thinks Joe Biden “speaks with wisdom”.  The app’s idea of wisdom is clearly based on an unreliable source.

Unless… unless… it’s actually a bald-faced lie.  I shudder to think.

The latest news is that a whole string of American banks have now forbidden their employees from using ChatGPT.  Apparently, they agree that the poetry is really bad.

Louis Cannon

Underrated writer Louis Cannon grew up in the vast American West, although his ex-wife, given the slightest opportunity, will deny that he ever grew up at all.