Join the call for truth in AI with FACTS IN : FACTS OUT – a global stand for trustworthy news

The EBU, WAN-IFRA and FIPP are inviting those who value trustworthy news and information to endorse FACTS IN : FACTS OUT – a global campaign demanding that AI systems stop distorting news content.

AI platforms are already many people’s go-to gateways to news. Yet a recent BBC/EBU study showed that these systems often alter, misattribute or strip context from trusted journalism – no matter the country, language or platform.

“For all its power and potential, AI is not yet a reliable source of news and information – but the AI industry is not making that a priority,” said Liz Corbin, EBU Director of News.

“If enough organisations endorse FACTS IN : FACTS OUT, we hope the AI companies will address the problem urgently… The public rightly demands access to quality and trustworthy journalism, no matter what technology they use, so it’s clear we need to work together.”

Vincent Peyregne, CEO of WAN-IFRA, said: “Anyone invested in making and publishing news is encouraged to stand up for trusted journalism by endorsing FACTS IN : FACTS OUT. But this is not about blame – it’s an invitation to collaborate.”

The consortium invites the world’s media organisations to put their weight behind FACTS IN : FACTS OUT by:

  • Endorsing the five principles. Contact info@newsintegrity.org and share your logo;
  • Visiting www.newsintegrity.org to access resources, information and talking points;
  • Sharing the BBC/EBU report with your networks;
  • Seeking and enabling dialogue with regulators and technology partners;
  • Using the hashtag #FactsInFactsOut on social media.

About FACTS IN : FACTS OUT 

FACTS IN : FACTS OUT is part of the News Integrity in the Age of AI initiative, which sets out five key principles AI developers must respect to ensure their tools don’t damage the integrity of news:

  1. No consent – no content. News content must only be used in AI tools with the authorisation of the originator.
  2. Fair recognition. The value of trusted news content must be recognised when used by third parties.
  3. Accuracy, attribution, provenance. The original source behind any AI-generated content must be visible and verifiable.
  4. Plurality and diversity. AI systems should reflect the diversity of the global news ecosystem.
  5. Transparency and dialogue. Technology companies must engage openly with media organisations to develop shared standards of safety, accuracy, and transparency.

More at newsintegrity.org
