Our joint statement calling for transparency, fairness and respect for creators’ rights in the age of AI
DACS, alongside a coalition of organisations representing over 100,000 visual artists, photographers, illustrators and image libraries, has spoken out against the widespread, unauthorised use of copyright-protected visual works in AI training.
Together, we are calling for retrospective settlements for past unauthorised use, transparent disclosure of training datasets and fair licensing agreements to ensure creators are properly credited and compensated for their contributions to AI development.
Below is the full statement released today.
A statement from leading organisations representing Visual Artists, Photographers, Illustrators and Image Libraries on Artificial Intelligence and Copyright
Our members include hundreds of thousands of professional visual artists, photographers, and illustrators working in the UK and internationally whose creative, copyright-protected works have been unlawfully scraped and used for AI training without remuneration.
Whilst we recognise the developmental potential of AI innovations for the creative sector and beyond, the fundamental principle that creators should receive fair attribution and compensation for their contributions to AI development has not been upheld.
We believe that AI companies operating in the UK should be transparent, equitable and respect creators' rights and economic interests.
Generative AI & Visual Works
Addressing Past Uses
Professional visual artists, photographers and illustrators have experienced a reduction in commissions, opportunities and pay due to competition from AI-generated content in the marketplace. Meanwhile, these creators have received no financial compensation for the use of their work to train the very systems that are displacing them.
Many past ingestions of creative works for AI model development likely infringed rights in multiple countries. We seek retrospective settlement mechanisms where models trained on unlicensed works are exploited. This is essential to restore trust and enable future cooperation between the creative and AI sectors.
Transparency Requirements
Many generative AI models do not declare what works they have used for training, which is a barrier to achieving compensation for creators’ indispensable contribution to the AI value chain.
Transparency measures are being developed in other territories, such as California, and should be available to UK creators.
We call for comprehensive disclosures on the use of creative works in existing AI systems, including:
- What content was used: Clear documentation of works included in training datasets
- How it was obtained: The methods and sources through which content was acquired
- How it was used: The weight, relevance, and relationship of creative works to the commercial value generated by AI systems
Fair compensation for value creation
We seek appropriate compensation for the unauthorised use of creative works in AI training. While we acknowledge the complexities involved, including questions of valuation, distribution mechanisms, and identifying rightsholders, we firmly believe existing copyright licensing models can be replicated in the AI context, as they have been for many burgeoning technologies and innovations.
Collective and transactional licensing schemes currently operated by collective management organisations, individual artists and representative companies demonstrate that administrative and commercial challenges can be overcome to create solutions to market demands.
We believe that a collaborative dialogue between creators, representatives and technology companies can produce fair remuneration models which compensate visual artists for their contribution to the AI value chain – which starts with their work.
Roadmap for Ongoing Remuneration
By addressing past uses and restoring the trust needed to initiate good-faith negotiations between AI firms and creators, both sectors can move forward with developing a roadmap for fair remuneration models.
Fair pay for high-quality works
Indiscriminate scraping for AI training without remuneration benefits neither ethical AI firms nor creators. Whilst some high-quality works – which are key to high-quality AI products – are accessible online, an extensive portion of creators’ works remains behind paywalls, in private digital archives or stored offline. By agreeing fair terms for past and ongoing use, creators may be more willing to grant AI firms lawful access to these catalogues, improving model quality.
Recognition of investment
Professional visual creators spend significant resources developing their practice, purchasing equipment and acquiring new skills to produce high-quality works. These works, often of high resolution with accurate metadata, are valuable for AI products. The personal investment creators put into their works must be recognised and recompensed.
Licensing partnerships
Emerging licensing partnerships between AI firms and publishing, legacy media and social media companies have demonstrated that remuneration for the use of copyright works is achievable. The visual sector has seen a significantly lower volume of such agreements, and visual image rightsholders have not derived value from existing deals.
We call on AI companies to seek licensing deals to access high-quality works, ensuring fair remuneration for creators.
This statement reflects the collective position of AOI, AOP, DACS, PICSEL, and the creators we represent. We are committed to ongoing engagement with all relevant stakeholders to achieve fair and sustainable solutions.
Together, we urge AI firms, policymakers and the wider public to support fair compensation and transparency across the creative and technology sectors. If you want to support this statement you can:
- Share our statement on social media, see here
- Write to your MP via the Creative Rights in Coalition campaign here