UK Must Protect Creative Industries, Not Big AI, Urge Open Markets Institute, Center for Journalism & Liberty, and Partners

“The major AI companies are set to collectively spend upwards of $1 trillion on AI development over the next five years, profiting off the backs of creators and publishers whose works have been unlawfully scraped for training data.”

A coalition of prominent organizations representing hundreds of news outlets and millions of journalism consumers in the UK and around the world has urged the UK to compel AI companies to respect copyright and to avoid granting dominant foreign incumbents free rein over the UK’s intellectual and creative labor. In a submission to the UK Intellectual Property Office's consultation on copyright and artificial intelligence, the groups call for the urgent implementation of an opt-in protocol for AI training data collection to protect creators' rights and to compel technology companies to respect those rights through fair compensation.

“The UK's proposed opt-out approach fundamentally undermines copyright principles and unfairly privileges the AI industry instead of its longstanding creative industries, with perilous implications for not just the UK economy but for democracy as well,” said Dr. Courtney Radsch, Director of the Center for Journalism & Liberty at Open Markets. “Of particular concern is the rampant exploitation of creators and publishers by AI companies that are strip-mining copyrighted works without permission or compensation, threatening the very foundation of the UK's creative and information industries.”

Key recommendations from the submission include:

  • Opt-In Framework: Implement a consent-based approach requiring explicit permission from rights holders before their works can be used for AI training.

  • Transparency Requirements: Mandate detailed disclosure of training data sources, including web crawler specifications and data acquisition methods.

  • Standardized Protocols: Develop freely available, machine-readable, standardized opt-in protocols that are user friendly and easy to implement for both creators and AI companies.

  • Strong Enforcement: Establish clear legal consequences for violations, including fines and disgorgement of profits from infringing activities.

  • Protection for Individual Rights: Implement specific provisions for personal likenesses, especially image and voice rights.

As the submission notes, "The UK's creative industries represent a crown jewel of its economy, producing some of the world's most celebrated cultural works across literature, art, music, and some of the world's most respected and trusted news sources. This informational and creative excellence requires careful stewardship to maintain its vitality."

While technology corporations speculate about the potential transformative benefits of generative AI, the very existence of these AI systems, along with their safety and effectiveness, depends heavily on access to high-quality training data, which only human creators can reliably provide. Establishing regulatory frameworks to ensure fair compensation for creators is not just about protecting their rights; it is also about preserving the very source of innovation that AI companies depend on for advancement.

The coalition includes the Open Markets Institute, its Center for Journalism & Liberty, the Independent Media Association (UK), Impress (UK), the Global Forum for Media Development (global), and the Danish Press Publications’ Collective Management Organization (Denmark). 

Open Markets and the Center for Journalism & Liberty (CJL) at Open Markets have published extensively on the growing threat of market concentration in the AI sector and its impact on creators and on the risk of entrenching Big Tech AI monopolies, and have made the Case for Consent in the AI Data Gold Rush. They have previously submitted expert input to the UK Competition and Markets Authority on anticompetitive practices in AI, including Microsoft and OpenAI’s partnership and Google’s pending strategic market status designation.