What changing terms and privacy policies mean for artists in the age of AI
You may have seen recent conversations around changes to WeTransfer’s Terms of Service, which sparked concern among many artists, designers, photographers and other creatives. As part of our ongoing support for visual artists, DACS is taking a closer look at what these kinds of updates mean. We want to help artists understand what to watch out for in terms and privacy policies, especially when using online tools to store, share or showcase their work.
What's happening?
Over the past 12 to 18 months, several popular platforms used by artists have quietly changed their terms and privacy policies. In many cases, the updates introduce broad or vague wording about how user content may be used. This often includes phrases such as “developing and improving services,” which could cover machine-learning processes. For artists and creatives, the worry is that work uploaded for one purpose - sharing with a client, storing files or publishing online - could later be used in different ways without their clear understanding or consent.
The shift over time
These changes have become more frequent as AI tools have become more commonplace. The most visible example for many came in 2024, when Meta announced plans to train its AI models using public Facebook and Instagram content. In the UK, adult users’ public posts may be used unless they exercise their right to object.
In 2025, Pinterest, a platform widely used by artists and creators, updated its Privacy Policy to allow user content, including previously posted Pins, to be used for generative AI training. UK users are included under this policy but can exercise their right to object by adjusting their account settings.
CapCut, a popular video-editing app, updated its Terms of Service to grant a very broad, perpetual licence over uploaded content. The terms do not explicitly say user content will be used to train AI, but the breadth of the licence and integrated AI features have raised reasonable concerns among creators.
Most recently, WeTransfer’s updated Terms of Service prompted criticism because of unclear language regarding how uploaded files could be used. WeTransfer has since clarified the Terms of Service language and stated that it does not use user files to train AI.
These are just a few examples of platforms that have changed their terms and privacy policies over the past couple of years, prompting concerns over whether user content may be used for AI training. Other platforms have quietly made similar changes, and the broader discussion around how our content is being used on these platforms continues.
Why it matters
For many creatives, these platforms are an essential part of daily work - whether for sharing high-resolution images, showcasing portfolios, or delivering commissioned projects. When the language in terms and privacy policies becomes less transparent, it raises important questions about consent, ownership, and the future use of creative work. The risk is that artists might unwittingly agree to terms that allow their work to be used in ways they never intended, especially in the fast-moving world of AI.
What to look out for in terms and privacy documents
Examples of wording that may signal repurposing or machine learning/AI processing include:
- “To operate, develop and improve the services”
- “For research and development purposes”
- “To develop new features or technologies”
- “Analyse content using machine learning to improve accuracy and performance”
If you see these without clear limits or opt-outs, proceed cautiously.
Can artists opt out?
In most cases, opting out of AI-related data use is difficult - and sometimes impossible. Meta users in the UK can follow steps to object to public content being used to develop AI models; in other regions, comparable opt-outs are often unavailable. Artists concerned about this trend can take practical steps to reduce the risk of their works being repurposed:
- Review before you upload. Use platforms that advertise stronger protections or a clear opt-out - but remember policies can change quickly.
- Check settings regularly. Look for any available AI-use or personalisation controls.
- Protect sensitive files. Add passwords or encryption when sending or storing high-value assets.
- Limit what’s public. When posting publicly, use lower-resolution images where appropriate and consider visible watermarks.
- Speak up. Voice concerns publicly or directly to the platform; user pressure has, in some cases, led to clearer wording or policy changes.
What's next?
AI technology is evolving rapidly, and more platforms are likely to adjust their terms to allow broader uses of user content. At DACS, we will continue to monitor these developments and share updates with artists, beneficiaries and members. Through our policy work in Parliament and direct engagement with MPs and government departments, we push for greater transparency, choice and protection in how creative work is used online. Our goal is to ensure that, even in an increasingly digital and AI-driven world, artists retain meaningful control over their work.