A Wyoming newspaper reporter resigned after being caught using AI in several news stories that included fake quotes from officials.
The editor of a small Wyoming newspaper wrote a mea culpa after a reporter on staff was caught using artificial intelligence. The editorial does not name the reporter. In fact, the editor took all of the blame, saying, “the buck stops here.”
The controversy came to light when a reporter at a competing paper noticed a few strange things in its stories. He said some quotes seemed off.
But thank Larry the Cable Guy for helping expose what went wrong
But he knew something was definitely wrong in a story about, of all people, the comedian known as Larry the Cable Guy, who had been chosen as grand marshal for a parade. Here’s the excerpt as quoted from ABC News and the Associated Press…see if you can spot what’s suspicious:
The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy’s most beloved figures. … This structure ensures that the most critical information is presented first, making it easier for readers to grasp the main points quickly.
The second sentence derailed the story. That sentence described a concept known as the “inverted pyramid.” It’s a style of news writing for print and digital. In a nutshell, it dictates that the newest and most important details go at the top. Older background information goes toward the bottom of the story.
As one story of the inverted pyramid’s origin goes, it dates back to the Civil War. Reporters had to telegraph stories to their newspapers, and telegraph lines being knocked down in battle was a real concern. So they organized their stories to get the newest details out right away. If the line stayed live for the entire transmission, everything got through. But if the line went dead halfway through, presumably, the most critical information had still made it.
But Poynter calls that a myth, suggesting that it actually came into practice a bit later: as President Abraham Lincoln lay dying from an assassin’s bullet.
Either way, the definition of an inverted pyramid — which in no possible way belongs in a story about Larry the Cable Guy serving as grand marshal of a parade — led to the reporter being caught using AI.
Is there a valid excuse to use AI in news stories?
The reporter who spotted the problem said he met with the reporter accused of using AI. He says the reporter, who was fairly new to journalism, admitted to using it in stories shortly before resigning from the paper.
No matter how new a reporter is to journalism, it should be clear that this is unacceptable. Journalism is supposed to be about reporting the truth. Journalists receive press releases every day. But when they pull details from a release, they’re supposed to say so. That’s especially true when they can’t immediately confirm the facts the release contains.
It’s called attribution, and it’s a critical practice in journalism. It clarifies where information comes from. In this case, it would have clarified that the reporter didn’t personally talk to the people he quoted (or apparently misquoted).
In my real job, I work for a company with a specific policy about artificial intelligence: We’re not to use it. The policy states that folks much higher up the chain of command must approve such use in advance, and that they must carefully vet the AI vendor to make sure it can handle the task appropriately. To put it simply, we’re not permitted to just go to an AI platform and have it generate content.
You shouldn’t have to tell a journalist not to rely on AI.
The Associated Press says it has experimented with AI in certain cases, one of which is translating content from English to Spanish. That usage shouldn’t be a problem, one might think. So I tried the same thing to see if there was a dependable translation source out there.
In my experiment, I used two different translation services to “double translate.” I had the first translate a story from English to Spanish. Then I used the second to translate the Spanish version back to English. I wanted to see how well the original meaning held up.
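If you want to try the same round-trip test, it only takes a few lines. Here’s a minimal sketch in Python, using the third-party deep-translator package purely as a stand-in; I used two different services, while this sketch leans on a single provider for both directions, so treat it as an illustration rather than a replica of my experiment:

```python
# pip install deep-translator
from deep_translator import GoogleTranslator

def round_trip(text: str) -> str:
    """Translate English -> Spanish, then back to English,
    so the result can be compared against the original."""
    spanish = GoogleTranslator(source="en", target="es").translate(text)
    return GoogleTranslator(source="es", target="en").translate(spanish)

# A made-up sample sentence; watch idioms like "lifted,"
# where a literal translation can flip the meaning.
original = "The governor outlined the steps he would take once the order was lifted."
print("Original:  ", original)
print("Round trip:", round_trip(original))
```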
To be fair, I never intended to publish the translated version. I wanted to compile a few translations just to see how much we might be able to depend on such a tool in the future…if it ever won any kind of approval.
Overall, it did fairly well. But there are certain phrases in English that don’t translate well into Spanish. (And vice versa.) For instance, I wrote a story about South Carolina’s governor outlining the steps he would call for after the federal government lifted the Title 42 COVID-19 emergency order.
In English, we use the word “lift” in that context to mean “cancel.” The translation AI took it more literally, rendering it as what the governor would do once the government elevated the Title 42 order.
Oops.
I expected as much, honestly. I’ve told you what happened when I experimented with an even simpler AI-generated task here at this blog. Spoiler alert: It produced an embarrassing blunder right from the start.
A journalist can’t trust artificial intelligence to do what a journalist does. No matter how much faster it may be able to generate content, a journalist should consider that content suspect at best.
How did it slip by?
But in the Larry the Cable Guy story, one would hope the reporter would have caught the obvious non sequitur. One would hope the editor would have, too.
The editor of the paper, in his editorial, titled “Eating Crow,” recalled recent editorials criticizing the director of the Secret Service for failing to take responsibility after the assassination attempt on Donald Trump.
The editor then said this:
When you are the captain of the ship, you are responsible. You hold full responsibility for everything that happens to it, even when you are not aboard. When the buck stops here, this is the deal, either when you encounter mighty storms at sea or slightly sloped roofs.
If I were to make excuses about how busy I am or how new I am to editing, I would be no different than Director Cheatle. Failures are failures and must be owned by responsible people.
I have no doubt that in the struggling world of journalism, this editor — like every other — faces a seemingly impossible workload. I’ve been there in my real job.
Mistakes, unfortunately, are human nature. No matter how hard you strive for perfection, you’ll fall short sooner or later. I credit this editor for taking responsibility and for his vow to “do better.”
Those of us who work in the industry take mistakes personally. No matter how much a typo might irritate a reader, it will irritate the editor who missed it exponentially more. It’s easy for a reader to spot a mistake and be ready to call someone out. It’s a lot harder to catch those mistakes when you’re on deadline, trying to get as much content reviewed and edited as possible.
Every editor should be eagle-eyed. Nothing should ever get missed: no typo, no fact error, no curious line that doesn’t make sense.
But no one, including that reader who takes delight in pointing out a mistake, is perfect.
I’m sorry, honestly, that he had to “eat crow.” But I’m glad he did so publicly. It’s a reality check for everyone in the industry, and sometimes, a reality check can be healthy.
A reporter getting caught using AI to create stories is not what journalism needs. A public commitment to make sure it doesn’t happen again by an editor who cares is exactly what journalism does need.
Along with, it should go without saying, a commitment by reporters never to make the same mistake.