Imagine an understaffed and overworked local newsroom with no one available to cover a town board meeting. A TV news producer proposes placing a camera in the meeting room, recording the meeting and then having generative AI transcribe it, surface what it judges to be the most important and newsworthy elements and pull them together into a first draft of a story. An editor then puts eyes on the draft, makes a few changes and hits publish.
This scenario was posed to a news organization leader, a news technologist and a news ethicist. Their task: hash out the viability of the process and examine whether it is possible today.
“We have a tool in place called SearchMinutes.com that’s going to be integrated more into workflow solutions with ENPS, but it does exactly that,” said the news technologist, Aimee Rinehart, senior project manager for AI strategy at the Associated Press, speaking at TVNewsCheck’s NewsTECHForum conference on Tuesday. “It takes recordings of any public meeting, does a transcription of that, can identify key words that a reporter has said they’re interested in — maybe it’s ‘pothole repair’ — and it will take you right into the video where they take place.”
Rinehart pointed out that the tool has plenty of issues: sometimes the microphone doesn’t work, and people forget to hit record. But with newsrooms across the country lacking resources, she said, the technology can help a newsroom “try and close a gap.” With the gen AI tool in place, the newsroom can at least provide some coverage of the town board meeting instead of none.
The news organization leader, Gina Chua, executive editor of Semafor, also endorsed the technology and said it’s “important” that newsrooms engage with it now so they get a sense of its accuracy rate. That engagement, she said, will only lead to improvements in the technology.
“The real issue at the end of the day is, what is your error rate?” Chua said. “Your error rate is not zero, but let’s be honest, your error rate with journalists is also not zero. So, the question is, what is the delta between those two?”
The news ethicist, though, had a warning for the technologist and the organization leader: Transparency is a must. Inviting this technology into newsrooms creates a new disclosure requirement that leaders and producers must meet if they want to stay in line with good journalism ethics.
“There’s many ways you can do that,” said Kyle Plantz, senior program manager at the Craig Newmark Graduate School of Journalism at the City University of New York. “I’ve seen stories that have that printed at the bottom or at the top, or they have news outlets that have dedicated pages to explaining how they’re using AI in their reporting.”
Plantz also observed that without a reporter physically present at the town board meeting, the newsroom might never learn of, say, an altercation between two attendees before the meeting officially begins. He added that the technology as it stands is not good at deciphering crosstalk, which means more of the meeting’s discussion will be missed or misinterpreted.
“If AI gets something wrong, who is responsible for it?” Plantz asked. “Of course, the newsroom. So, make sure you’re creating that credibility and ensuring that the reputation of your brand is not threatened.”
This scenario was one of several presented by moderator Jon Accarrino, founder of Ordo Digital and TVNewsCheck’s AI columnist.
The second focused on versioning, which Accarrino called “one of the most time-consuming aspects of many journalists’ work days, especially those who work in television.” He asked Chua and Rinehart what AI is doing to help right now.
“This moment for AI is for versioning,” Rinehart said. She revealed that the AP will soon roll out a new product in its Electronic News Production System called Storytelling that will turn the “hopes and dreams of people who always wanted to have versioning made easy” into a reality.
“The technology’s here,” said Chua. “It’s capable … The only thing that holds us back is some imagination and some ethics.”
Plantz didn’t dive directly into the ethics concerns of gen AI handling versioning, at least not without first “encouraging a culture of experimentation.” After all, he said, newsrooms will only get AI-touched stories “right” through trial and error.
Plantz returned to the question of transparency and accountability in newsrooms, reminding listeners to take them into consideration when engaging with gen AI tools. Then, he added that newsroom leaders must also ask “a larger fundamental question” of whether the extra coverage enabled by gen AI is “actually serving the community in any way.”
“Some coverage does not equal good coverage,” he said. “You have to have that conversation with your organization and your audience about what is important to them in order to cover stories accurately and understand the issues that they are facing. Do they want you to cover that school board meeting? Do they need that versioning to happen there? I just want to make sure that we avoid becoming AI content farms where we’re just churning out content or versions of content, and we lose the journalistic value of the original story.”
Accarrino presented a third scenario to the panel: A newsroom is considering AI-created avatar anchors, but only so it can reinvest the money saved on personnel into field reporters.
“Maybe it’s not replacing an anchor with an avatar, but maybe leaning on the reporting team to anchor you through the broadcast,” Rinehart said. “That’s how I would see it. I like to keep it very human.”
Rinehart added that should such a scenario come to pass, TV news producers may lose a direct connection with the audience, “but that visible presence of the reporting team,” even in place of an anchor, could keep the connection intact.
Chua said viewers already understand that anchors haven’t reported every story they present. A shift to avatar anchors, however, may alter the relationship between consumers and publishers.
“I don’t know where it’s going to land, but I think anybody who’s in the business of spoken information with some presentation to people needs to understand how that’s tracking,” Chua said. “Because we are not driving this. This is happening, and we need to stay on top of it. Because whatever happens, whatever we think, we are going to be swept up in it.”
Though AI-generated avatar on-air personalities may not bode well for the longevity of U.S.-based TV anchors, or even reporters, in other places the technology may ironically save them.
Rinehart said that for a long time she believed AI-generated talent could not and should not ever replace humans. Then she heard about journalists in Argentina leveraging the tech to deliver news anonymously. Why? To avoid persecution by the country’s press-targeting president, Javier Milei.
“Avatars can’t be arrested,” Rinehart said.