ChatGPT launched two years ago this month, becoming the fastest-growing app in history.

Right from the start, the chatbot’s fluency when conversing with users was game-changing. And one game it changed, maybe irreversibly, was the way we look for information online.

No surprise why. Underpinning ChatGPT is an AI system able to ingest billions of words about a topic from every precinct of the internet and spit out bite-sized summaries in straightforward English. Type “What are Pierre Poilievre’s policies?” into a search engine, and you’ll get pages and pages of links to sift through. Type the same question into ChatGPT, and you might get several paragraphs on how, if he becomes prime minister, he plans to tackle everything from the size of government to the cost of living. If you have a query, wouldn’t you rather get a human-sounding reply instead of a bunch of websites?

ChatGPT is not technically a search engine—it doesn’t cite any sources, for one. But the summarization tool it has popularized is already being incorporated into the search functions of companies like Meta, Microsoft, and, of course, Google (whose AI Overviews, which provide thumbnail reports at the top of search results, are being rolled out across the United States). Such tools will be a boon for people seeking quick answers, but a bane for publishers. When curious users no longer click through to a news site for additional information (a trend called zero-click search), less traffic flows to the media outlets that invest in the costly reporting that AI machines are scraping, strip-mining, and synthesizing. According to a recent study by Datos and SparkToro, more than half of all Google searches in the United States and the European Union are already zero click.

But AI-powered search won’t simply steal journalism; as it gets better at generating convincing and comprehensive responses, it could also erode the attention that readers need to properly consume the original work. More specifically, the convenience of powerful searchbots might discourage users from vetting sources themselves. Worse: it might make reading critically seem too tedious to bother with at all.

This makes it urgent that organizations like The Walrus defend what they do. A search engine won’t break a story about the years of horrific abuse students suffered at a military school, or spend months interviewing a visionary curator to learn about her controversial departure from a major Canadian art gallery, or pin down the harmful stereotypes that inform the cover designs of books by racialized authors. And it certainly won’t ask twelve of Canada’s savviest writers to examine various scenarios and implications of a possible Poilievre administration. (All of which we do in this issue.)

What a search engine can do, and increasingly will, is drain all colour from the prose driving the enterprise of those specific stories and excrete what remains in bullet points. The stagecraft of storytelling can’t be replaced by reeled-off facts and stats. That’s because the “return” on a stylish, reportorial, narrative-driven article is pleasure. Pleasure in a writer’s skill at laying out a series of anecdotes, or translating complex research into an accessible vignette, or simply describing a room. Pleasure in a well-chosen verb, a deft turn of phrase, or the bravado of a sentence. Pleasure in voice.

My favourite definition of long-form journalism comes from the critic Kenneth Burke: to “use all there is to use.” Using everything they know and everything they’ve learned, good magazine writers operate at a level of detail few other formats can afford them. For their art to survive, we have to resist the idea of a future in which we learn about one another through an AI curator that reduces and depletes language to fit the minimum requirements of a query. And that resistance starts with reading The Walrus.

Carmine Starnino
Carmine Starnino (@cstarnino) is editor-in-chief of The Walrus.