November 6, 2025
EBU, WAN-IFRA and FIPP are calling on those who value reliable news and information to support FACTS IN : FACTS OUT, a global campaign demanding that AI systems stop distorting news content.
AI platforms are already the gateway to news for many people. However, recent BBC/EBU research has shown that these systems often decontextualize, misattribute or strip sourcing from trusted journalism across countries, languages and platforms.
“Despite its power and potential, AI is still not a reliable source of news and information, but the AI industry has not made it a priority,” said Liz Corbin, director of news at the EBU.
“If enough organizations support FACTS IN: FACTS OUT, I hope AI companies will take action on this issue quickly…The public rightly wants access to quality, trustworthy journalism, regardless of the technology they use, so it’s clear we need to work together.”
Vincent Peyrègne, CEO of WAN-IFRA, said: “Everyone who invests in the production and publication of news is encouraged to stand up for trusted journalism by supporting FACTS IN: FACTS OUT. But this is not about condemnation; it is an invitation to collaborate.”
The consortium calls on global media organizations to support FACTS IN : FACTS OUT by:

- Endorsing the five principles.
- Sharing their logo by contacting info@newsintegrity.org.
- Visiting www.newsintegrity.org to access resources, information and talking points.
- Sharing the BBC/EBU report with their networks.
- Opening dialogue with regulators and technology partners.
- Using the hashtag #FactsInFactsOut on social media.
About FACTS IN : FACTS OUT
FACTS IN : FACTS OUT is part of the “News Integrity in the Age of AI” initiative, which sets out five key principles that AI developers must follow to ensure their tools do not compromise the integrity of news.
- No Consent – No Content. News content should be used in AI tools only with its originator’s permission.
- Fair Recognition. The value of trusted news content must be recognized when it is used by third parties.
- Accuracy, Attribution and Sources. The original sources behind AI-generated content must be visible and verifiable.
- Pluralism and Diversity. AI systems must reflect the diversity of the global news ecosystem.
- Transparency and Dialogue. Technology companies must engage openly with news organizations to develop common standards for safety, accuracy and transparency.
For more information, visit newsintegrity.org.
