One Day, ChatGPT Will Move You to Tears

Generative AI is mechanizing writing itself. Is poetry far behind?

A 3D illustration of a robotic metal hand holding up and analyzing a human skull against an ominous dark background.

In November 2022, OpenAI released a free research preview of ChatGPT, a large language model that identifies patterns in a vast collection of data to teach itself how to generate human-like responses to user prompts—a prediction machine for language.

The prediction power of artificial intelligence has already helped people map protein structures and coordinate planetary defence. It outperforms humans in the analysis of medical imaging. The efficiency and purported objectivity of algorithms also mean AI will be increasingly relied upon to prop up overburdened social systems and institutions facing a crisis of legitimacy (hence the dystopian horror of “predictive policing”). AI’s potential impact on humanity has been compared to that of electricity. Now, it is creating images, stories, and poems on an unprecedented scale.

I will spare you a recap of all the AI-generated content that has since made the rounds. Some of it is fascinating. Unfortunately, a lot of the poetry sounds like this:

A cup of tea, a sunny day,
A friendly chat, a child’s play.
These little things we take for granted,
Are the ones that make life enchanted.

(From “Everyday Moments” by ChatGPT)

Most AI poetry is a series of insipid clichés arranged in rhyming quatrains. This is not ChatGPT’s fault. Its job is to guess the next word based on the patterns it has picked up from processing web content, and the example above is a fair imitation of poetry on the internet. If a language model trained itself exclusively on data sets of carefully curated poetry—if the LLM got an MFA—it might generate less predictable language, but it is not clear what we would do with that. Reduce the poetry shortage? This brings us to the question of what poetry is for.

Poets should not be threatened by the fact that every person with internet access can now create the poetic equivalent of hotel art. Although it involves technique, poetry is not a technical problem. We write it because we want to, not because we lack technology that can do it for us. Nor is poetry a zero-sum endeavour. Even if ChatGPT wrote better poems than we ever could, we would still write poetry, because there is more to writing than generating text. I write poetry for the same reason other people dance: because it is fun and probably good for my heart. Great poems offer a recognition and mutual understanding that I have not experienced outside of intimate conversation, and I write poetry because there is pleasure, value, and meaning in sharing this connection with others. I would not ask a robot to write my poems, just as I would not ask it to hang out with my friends or savour my food. So AI will not take the poet's job, but it will change it.

In March 2021, researchers at Google published a paper titled "On the Dangers of Stochastic Parrots," which noted the significant risks associated with "synthetic but seemingly coherent text" entering the discourse without any accountable author. We are entering an age in which much of what we read day to day will be created by agents that lack the capacity to mean. Meaning consists of more than the interpretation of symbols, according to the media critic Neil Postman. "Meaning also includes those things we call feelings, experiences, and sensations that do not have to be, and sometimes cannot be, put into symbols." At its core, artificial intelligence is pure symbol, a vast string of binary code. Our feelings, experiences, and physical sensations are continuous and multifaceted, so when we express them using discrete values (words, binary), something is lost. Symbols cannot fully express the richness and range of human feeling and experience.

Fortunately, every human reader also brings a lifetime of feelings, sensations, and associations to the table, which enables poetry to evoke introspection beyond what can be captured by symbols. Thus, a poem amounts to more than the sum of its words. In contrast, a large language model that has processed every bit of writing ever created cannot be moved by the words because it has no feeling or physical experience to draw upon; it can only engage with poetry as symbols representing instructions, a series of 1s and 0s corresponding to electrical currents that are either on or off. This is why ChatGPT cannot mean as humans do. Any meaning it generates is ontologically synthetic. Ground beef.

Nevertheless, the text AI generates is meaningful to us. We may interact with it unknowingly and assume it is human. It may one day generate poetry that moves us to tears. Coming to terms with this requires a conceptual shift. AI cannot become intelligent so long as our idea of intelligence is rooted in human experience. Instead, our idea of intelligence will become more artificial. The metaphors we use to understand technology and humanity work both ways. When we speak of “processing” experiences, for example, we conceive of experience as information that can be integrated through a set of procedures, which makes the self a sort of central processing unit. “To discover truth,” writes Meghan O’Gieblyn in God, Human, Animal, Machine, “it is necessary to work within the metaphors of our time, which are for the most part technological.”

Conceptually, the humanization of machines follows more than a century of the mechanization of humans, from the nineteenth-century principles of scientific management that segmented human work into a series of optimizable processes, to the assembly lines and mind-control experiments of the twentieth century, to the modern-day conception of the brain as a supercomputer that can be hacked and rewired. The latest manifestation of this trend was encapsulated by Sam Altman, CEO of OpenAI (the company behind ChatGPT), who tweeted on December 4, 2022, "i am a stochastic parrot, and so r u." If prediction machines can write our cover letters, help us come up with ideas, and seemingly relate to us, then maybe we are prediction machines.

I used to think that all language intermediates between ideas and reality, but if software applications with no ideas and no sense of reality can produce coherent language, then my anthropocentric understanding of language must change. As Maggie Nelson wrote in The Argonauts, “Words change depending on who speaks them; there is no cure.”

There was a time when poetry did not exist outside of the human voice. It could not be recorded or otherwise converted into a system of symbols beyond speech. Anyone who recited poetry essentially gave life to it, but the work was not attributed to an individual author. Poetry changed with the written word. It changed again after the printing press mechanized the reproduction of writing. Now AI is mechanizing writing itself. It remains to be seen exactly how this will change poetry, but it undoubtedly will.

Excerpted from Best Canadian Poetry 2024, edited by Bardia Sinaee, with permission from Biblioasis. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Bardia Sinaee
Bardia Sinaee’s first book of poetry is called Intruder. He lives in Toronto.