So when is a robot supposed to take my job, exactly?
Why I'm not worried about AI replacing journalists
When I started in journalism 15 years ago, first as an intern and then as a magazine assistant, one of my most frequent, least pleasant duties was transcribing interviews. This meant typing out conversations that were sometimes an hour long, painstakingly clicking back in the audio file over and over while dealing with background noise, accents, fast talkers, and obscure slang or technical jargon. The story that used the resulting transcript might only quote the interview a few times, but you still had to spend hours sitting, listening, typing.
Now a service called Otter will automatically transcribe long interviews in a matter of minutes. You still have to check the transcript for accuracy before publishing a quote — it’s not always precisely correct — but it’s good enough that no one has to get an intern to transcribe anything ever again.
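For the curious, the mechanics here are no longer exotic: automatic transcription is now a few lines of code on top of an open speech-to-text model. Here is a minimal sketch using the open-source Whisper model (Otter’s actual pipeline isn’t public, so the model choice and file name below are purely illustrative):

```python
# Minimal sketch of automated transcription with the open-source Whisper
# model (pip install openai-whisper; requires ffmpeg). A stand-in for
# illustration, not Otter's actual system.
import whisper

model = whisper.load_model("base")          # small, CPU-friendly model
result = model.transcribe("interview.mp3")  # hypothetical hour-long audio file
print(result["text"])                       # raw transcript: still needs a human accuracy check
```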
This is just one of a dozen stories I could tell about technology making journalism more efficient and less labor intensive. You used to have to look through newspaper archives physically, using microfilm; now there’s Nexis. Breaking news reporters used to call their offices and read their notes over the phone; now they can email copy to their editors or write and post stories directly from their phones. Similar transformations have taken place across every industry in the world. The third industrial revolution shudders on. Yada yada yada.
What’s novel here is that Otter uses AI, and AI will transform the world, or so we’ve been told. ChatGPT and similar programs are already capable of writing essays, producing images from prompts, even videos. AI can answer any question you can ask. You can use it as a kind of virtual assistant, it can recommend movies, it can help you come up with workout plans. “It also powers search engines, social media, online shopping, and even gaming. In essence, AI is becoming increasingly integrated into various aspects of daily life.” At least, that’s what Google’s AI told me when I typed in “common uses of AI.”
It will also replace journalists, supposedly. And here I have some doubts.
Journalists and writers are always on those “which jobs are threatened by AI” lists, the logic being that journalists gather information and generate content from it, and AI can do that instantaneously. Workers in the industry are authentically worried about being replaced by “clankers,” the brand-new slur for robots. Just as unions representing dockworkers and grocery clerks have tried to limit the number of driverless trucks and automated checkout machines, unions representing writers (including my own) have tried to negotiate contracts that prohibit companies from laying them off in favor of AI. This worry seems to me oddly accepting of the premise that a clanker could replace a writer in the first place.
Usually unacknowledged in the hype and panic over AI is that for nearly a decade, companies have been using “robo journalists” to write up things like quarterly earnings reports and recaps of sporting events, the kind of journalism that is so formulaic it can be automated. And current iterations of AI are still mainly reproducing forms of writing that take little ability or effort: A recent Wall Street Journal article reported that freelance copywriters were losing clients to AI, but that this was especially pronounced for those doing “low value” tasks on work-for-hire platforms like Fiverr.
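To be concrete about what “so formulaic it can be automated” means, here is a toy sketch of my own (not any newsroom’s actual system): an earnings recap is largely a template filled in with a few structured numbers.

```python
# Toy "robo journalist": an earnings recap built by dropping structured
# figures into a fixed template. My own illustration, not a real system.

def earnings_recap(company, quarter, eps, eps_expected, revenue_bn):
    verdict = "beat" if eps >= eps_expected else "missed"
    return (
        f"{company} reported {quarter} earnings of ${eps:.2f} per share, "
        f"which {verdict} analyst expectations of ${eps_expected:.2f}. "
        f"Revenue came in at ${revenue_bn:.1f} billion."
    )

print(earnings_recap("Acme Corp", "Q2 2024", 1.42, 1.35, 3.8))
```

Swap a box score in for the earnings table and you have the sports-recap version.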
AI is clearly good enough now to produce readable copy for BS content purposes. Way back in the day, I ran an indie music blog for a shoe brand that saw music blogging as a way to burnish its image as the shoe worn by cool indie music people; the shoe brand marketing guy I worked for could not have cared less about what I wrote so long as it didn’t mention smoking, drugs, or competing brands. That job would now surely be done by AI. With time AI will likely be able to do more complex stuff, like recapping major news events (presidential debates, natural disasters) by summarizing social media posts and press releases.[1]
But journalism presents challenges I’m not sure AI can overcome. Here is a recent story I wrote about some restaurant workers forming a union, which is based on interviews with union organizers and restaurant management. AI could not have written that because, while it is superlatively gifted at regurgitating information it finds online, the information in that article didn’t exist before I reported it. AI could help you come up with questions for an interview,[2] but it can’t conduct interviews, it can’t ask follow-up questions, it can’t form relationships with people and gradually prod them to divulge their secrets. It can’t learn new things and share them with others, the fundamental goal of journalism, because it can’t learn anything, at least not in the human sense.
The list of AI’s limitations is long. It can’t watch a movie, experience a work of art, or eat at a restaurant and form an independent opinion about such things. As an incredibly powerful version of autocomplete, AI can aggregate and summarize human opinions, but it can’t think for itself. Will a future version of AI make the jump from mashing out summaries of news events to crafting original stories, pieces that make you sit up and pay attention? Could AI write the next “Frank Sinatra Has a Cold”? The next “The Case for Reparations”? The next “Negroni Season”?
The catch here is that a lot of the journalism industry is not focused on producing wildly original, impossible-to-replicate work. In the digital media era, outlets have built entire (flimsy) business models on aggregating and repackaging news events, churning out SEO-optimized headlines and shareable social content. If the point of writing, and writers, is to generate pageviews, shares, and engagement, well, why couldn’t AI-penned articles generate those same numbers for a minuscule fraction of the cost?
Last year, when io9 (an entertainment website owned by Gizmodo) published a machine-written piece about Star Wars, it got basic details wildly wrong and became a total laughingstock, but it’s worth lingering on what the piece was supposed to be: a list of Star Wars TV shows and movies in the order the fictional events take place. AI shouldn’t write that story; no one should write that story. It’s boring, it offers no new perspective and certainly no new information, and it only exists for SEO purposes. What’s more, once AI becomes able to make a correct chronological list of Star Wars films, no one is going to search for such a thing; they’ll just ask ChatGPT or Google’s in-house AI. We can argue about whether AI will take journalism jobs, but it’s almost guaranteed to take the SEO dreck jobs.
The threat to journalism jobs isn’t from AI, it’s from C-suite types who imagine AI can replace journalists. In a previous generation these geniuses were getting rid of experienced reporters and replacing them with cheaper, less well-sourced greenhorns (or in a truly visionary move, replacing them with no one). A decade ago, the trend was to have readers, who were now called users, write the content themselves, for free.[3] Now some people are running the “more with less” playbook again, only instead of underpaid freelancers or interns, they’re having machines do the writing.
Case in point: BuzzFeed’s uber-popular, nutrient-free quizzes — once written by “BuzzFeed community members” for basically nothing — are now written by robots. It’s the logical endpoint of the “more with less” slide of the content industry: a piece of fluff designed to be forgotten, written by literally nobody and probably read by nobody either.
Every time you hear about a website replacing writers with AI, that website first decided to chase clicks and embrace the content churn. Once your boss has decided you should be writing things like “7 Reasons ‘House of the Dragon’ Needs to Show Dick… Starting Tonight,” it’s awfully easy for that same boss to decide that actually, why don’t they just get an LLM to do your job? But the problem in this example isn’t the LLM, it’s your boss. And maybe a machine will end up taking his job first.
[1] That’s assuming engineers can figure out how to get AIs to stop “hallucinating” and making up things. If they can’t do that, AI is going to be a lot less useful than anyone imagines.
[2] Do you really need help with that?
[3] Sites ranging from the Huffington Post to Forbes to the long-forgotten Mic relied to some degree on unpaid, amateur contributors.
"The threat to journalism jobs isn’t from AI, it’s from C-suite types who imagine AI can replace journalists." feels like a logic model issue, where AI is a newish resource available, and downstream of that, leadership may decide to use that resource instead of journalists. If so, regardless of whether we blame AI for existing or leadership for using it, the result is fewer journalism jobs, particularly lower level ones that help people get a foot in the door and experience. For many in leadership, journalism doesn't need to be GOOD, just GOOD ENOUGH to get clicks. If people will click on AI generated content, there's less demand for real live journalism and therefore fewer jobs, in many job sectors.
Yess! Thank you.