
AI can’t accurately summarize news: BBC


  • The BBC asked four top AI platforms to summarize news stories. The British public broadcaster found that 51% of their responses had “significant issues,” including factual errors and altered quotes.
  • Researchers are concerned about the potential harm caused by inaccuracies and distortions in AI-generated news summaries. They fear it could undermine trust in news.
  • Nineteen percent of AI-generated summaries introduced factual errors, and 13% altered quotes from BBC stories.

Full Story

Despite billions of dollars in research and technology invested in artificial intelligence, the programs still struggle to give users an executive summary of the news. That’s according to testing done by the BBC. 

The British public broadcaster ran 100 of its stories through four of the most advanced AI platforms: OpenAI’s ChatGPT, Microsoft’s Copilot, Google’s Gemini and Perplexity AI. It asked each platform to summarize the stories it had been given.

The result left researchers unimpressed at best and, at worst, worried about the potential for harm.

The study found that 51% of all AI answers to questions about the news had “significant issues.” Nineteen percent of the AI answers that cited BBC content introduced factual errors, including incorrect statements, numbers and dates. Thirteen percent of the quotes the platforms sourced from BBC stories had either been altered from the original or were not present in the article at all.

In one test, Google’s Gemini produced a summary claiming “The [National Health Service] advises people not to start vaping, and recommends that smokers who want to quit should use other methods.” 

The UK’s NHS says vaping is one of the most effective tools to quit smoking. 

“This matters because it is essential that audiences can trust the news to be accurate, whether on TV, radio, digital platforms, or via an AI assistant,” the researchers wrote. “It matters because society functions on a shared understanding of facts, and inaccuracy and distortion can lead to real harm.”

The report notes how much damage an incorrect or misleading headline could cause should it go viral via social media.

In response to the study, a spokesperson for OpenAI told the BBC, “We support publishers and creators by helping 300 million weekly ChatGPT users discover quality content through summaries, quotes, clear links, and attribution.”

The BBC didn’t test DeepSeek, which the news rating agency NewsGuard found provided accurate responses to news and information prompts only 17% of the time. In 30% of cases, it repeated false claims, and 52% of the time it gave “vague or unhelpful responses” to news-related questions.
