
Organisations demand investigation of Meta's new AI training privacy policies

Following Meta’s proposal to update its privacy policies to allow the training of AI models, DACS, together with societies representing the copyright interests of writers, performers and directors, has written to the Digital Regulation Cooperation Forum (DRCF), established in 2020 to coordinate the regulation of online platforms, asking it to investigate the matter.

The letter follows an announcement in May that Meta, the parent company of Instagram and Facebook, had updated its privacy policies to include provisions for using user data to develop its AI models.

The change is due to come into force on 26 June, and may have significant implications for users, particularly visual artists and creators who share their works on these platforms.

In June, the Irish Data Protection Commission revealed that Meta had paused its plans to train its large language model using public content shared by adults on Facebook and Instagram in the EU, following complaints filed in Europe. It is not clear, however, whether this pause would extend to UK creators and users of Meta’s products.

DACS, together with the Authors’ Licensing and Collecting Society (ALCS), the British Equity Collecting Society (BECS), PICSEL and Directors UK, has written to the DRCF and its members (the Information Commissioner’s Office, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority), asking them to investigate whether Meta’s actions could harm the rights of individuals in the UK.

Lord Clement-Jones CBE, who supports the letter, said:

“Meta’s proposed privacy policy changes will have a very chilling impact on all users, but particularly on creators and performers who use social media platforms to bring audiences to their work. UK regulators must look into the proposed changes, which would allow Meta and third parties to use content posted onto the platforms to develop AI models, and investigate whether individuals in the UK are at risk.”
