Opinion: AI is already in journalism, and generative models like ChatGPT won’t rip the industry out of people’s hands
As ChatGPT’s first anniversary nears, I don’t believe that artificial intelligence will take jobs away from me or my peers. If anything, it could make our work better in the future. Compared to the opinions of my friends and family outside of journalism, my stance may seem optimistic or naive, but I think it’s unrealistic to expect AI to reduce writing, or the industry, to algorithms.
ChatGPT’s arrival in November 2022 brought expectations and fears to the front of many people’s minds. In academia, professors fear that students will stop writing original work for assignments and rely too heavily on AI tools. Reporters and fiction writers alike worry that if AI gets good enough, their industries could shed jobs. Even though AI and algorithms are not new, the generative chatbot sparked the belief that technology could complete writing-based tasks on its own.
Generative AI platforms are not the first “game-changers” to face high expectations from the journalism industry and the general public. When Apple launched the iPad in 2010, newspapers like The New York Times were facing financial instability and hoped the new device would draw readers back to the paper. The waves of excitement in 2010 and now demonstrate how expectations can outstrip a technology’s actual abilities.
Newsrooms have used AI for automated news and machine-written stories over the last decade. Most AI-written pieces cover data-focused topics: companies’ financial reports, sports results and athlete statistics, weather and traffic updates. For straightforward data reporting, then, AI has already proved its usefulness.
“Until you get to bigger analyses about these sorts of topics, it’s like Mad Libs,” Bryce Newell, an associate professor of media law and policy at the University of Oregon, said. “You can take certain information and have a template. You can plug and play statistics.”
When discussing how AI writes, whether for journalism or academia, Newell said templates for data reporting can be more reliable than generating an original story. Prompts that ask AI to create original writing can lead to mistakes and false information, known as AI hallucinations. These hallucinations are unfounded leaps and conclusions drawn by algorithms, with results ranging from offensive headlines to books attributed to the wrong authors.
Last month, MSN, Microsoft’s news and content website, published an AI-written obituary when former NBA player Brandon Hunter died. The automated headline read: “Brandon Hunter useless at 42.” MSN retracted the story after becoming aware of its error.
“Where it’s gone wrong has been where there needs to be sort of a nuance to a story, or there is just a way in which humans communicate [that] is not always understood by an AI,” Damian Radcliffe, a Carolyn S. Chambers professor in journalism at UO, said.
Of course, automated news can also go wrong for other reasons, like editors failing to fact-check or AI basing its reporting on inaccurate data. Whether or not AI gets the information right, publications should disclose machine involvement so reporters and editors can step in to correct or remove misinformation when necessary.
Although AI can and does produce flawed journalism, success stories have multiplied as publications figure out how to incorporate it into the industry. The Associated Press, for example, has used AI to produce quarterly earnings reports since 2014. But for AI to succeed, people must be involved in directing it toward the proper result.
“This technology, in many respects, is only as smart as the operators who are using it,” Radcliffe said.
Incorporating AI into journalism can increase efficiency in newsrooms by raising the volume of published stories and freeing up time for reporters to pursue intensive or time-consuming projects like investigative pieces. AI has not taken over the storytelling work of journalism; instead, it allows people to write more in-depth narratives.
“Reading a transcript is not the same as being in a room and reading the tone of a room, and seeing people’s body language and then being able to follow up and talk to people individually,” Radcliffe said.
AI and its generative models should prompt questions about their ethical role and accuracy in journalism, but the high hopes, and fears, of them replacing human reporters are far-fetched. ChatGPT is just one of many new technologies that will join a writer’s toolkit.
Moore: Stop expecting AI to be the game-changer in journalism
Maddy Moore
November 9, 2023