Two Badges. One Commitment.

Individuals and organisations who sign the Charter may earn an official Responsible AI badge — a visible symbol of leadership, trust, and responsible AI practice across the UK’s media and creative sectors.

  • Responsible AI Practitioner (for individuals)
  • Responsible AI Organisation (for companies and institutions)

Badge access is exclusive to AI in Media Institute members after signing the Charter.

About the Charter

Artificial Intelligence is reshaping how media is created, distributed, and consumed. It influences how we write, design, publish, produce, broadcast, edit, and communicate. These technologies offer extraordinary opportunity, but without clear standards they also carry real risk.

The Responsible AI Charter sets out shared principles for ethical, transparent, and accountable AI use across the UK’s creative and media industries. It is designed to strengthen trust, protect creative value, and support the safe and responsible adoption of emerging technologies.

The Charter is open to individuals and organisations across journalism, broadcasting, film, television, publishing, advertising, design, digital media, cultural institutions, and the wider creative economy.

  • 1. Human Creativity First

    AI should elevate human imagination and enhance the cultural and editorial value of creative work.

  • 2. Transparency and Attribution

    AI-generated or AI-assisted content should be disclosed clearly and appropriately to maintain audience trust.

  • 3. Ethical Data and Fair Use

    AI systems should respect copyright, creator rights, and lawful data practices, ensuring ethical use of training data.

  • 4. Accuracy, Integrity and Public Trust

    AI must be used in ways that strengthen truth, uphold journalistic standards, and minimise harm or misinformation.

  • 5. Diversity, Equity and Inclusion

    AI tools and workflows must avoid bias, reflect diverse voices, and support fair and inclusive representation.

  • 6. Accountability and Oversight

    Clear responsibility should be assigned for how AI tools are deployed, monitored, and integrated into creative and editorial workflows.

  • 7. Safety and Risk Management

    Organisations should ensure safeguards are in place to protect people, communities, and cultural institutions when deploying AI technologies.

  • 8. Sustainability and Long-Term Impact

    AI innovation should contribute positively to society, culture, and the creative economy, supporting long-term public value.

  • 9. Evidence, Proportionality and Due Process

    AI must support fair, proportionate, and evidence-based decision-making, with clear routes for scrutiny and redress.

Why The Charter Matters

The rapid rise of AI in creative and media workflows demands shared standards and a collective commitment to integrity. The Charter:

  • Creates a trusted benchmark for responsible AI practice
  • Supports fair, ethical, and transparent innovation
  • Signals professionalism to audiences, clients, funders, and collaborators
  • Aligns individuals and organisations with national leadership in responsible AI
  • Helps shape the UK’s future media and creative ecosystem

It is not a legal document or regulatory code; it is a public commitment to responsible, human-centred innovation.

How Signing Works

Signing the Charter involves three steps:

  • 1. Review the principles
  • 2. Commit to upholding them in your work or organisation
  • 3. Submit the signatory form (individual or organisational)

Signatories may be publicly recognised as part of the UK’s commitment to responsible AI in media and creativity.

Note: Badges are only available to AI in Media Institute members who have signed the Charter. Visit the Membership page for details.