Meta Requires Disclosure of Digitally Altered Content
Meta has announced it will require advertisers to disclose when they digitally create or alter content for advertisements about social issues, elections, or politics.
Between now and the end of 2024, more than 2 billion people will vote in national elections around the world. Protecting citizens from misinformation has become essential to preserving the integrity of democratic processes, and big tech platforms are taking action to ensure their users have access to accurate, authentic information.
Meta will apply its new policy worldwide at the beginning of 2024 and will begin informing users when advertising has been digitally created or altered.
All advertisers will have to disclose when an ad depicts a real person saying or doing something they never said or did, depicts a realistic person or event that does not exist, or alters or fabricates footage of an event that actually or allegedly occurred.
Advertisers are not required to disclose digitally altered images or videos when the alterations are “inconsequential or immaterial to the claim, assertion, or issue raised in the ad,” such as size adjustments, color correction, or image sharpening.
Meta will reject digitally altered ads that are not properly disclosed, and advertisers that repeatedly fail to disclose content accurately will risk penalties.
The new policy was announced only a week after Meta disclosed that advertisers in regulated industries, including political advertisers, will be barred from using the generative AI advertising tools it released in October.
Microsoft has also announced measures to promote free and fair elections. First, it will introduce tools that let users digitally sign and authenticate images and videos with credentials based on the Coalition for Content Provenance and Authenticity (C2PA) standard. This cryptographically signed metadata travels with the content, so publishers can tell whether the source has been digitally altered or produced with generative AI tools.
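The core idea behind content provenance can be illustrated with a small sketch. The code below is not the real C2PA format (which uses X.509 certificates and embedded manifests); it is a simplified, hypothetical model showing how signed metadata bound to a content hash lets a publisher detect that either the content or its provenance record has been altered:

```python
# Illustrative sketch only -- NOT the actual C2PA manifest format.
# Shows the principle: metadata is bound to a hash of the content and
# signed, so tampering with either one breaks verification.
import hashlib
import hmac
import json

SIGNING_KEY = b"publisher-secret"  # hypothetical shared key; C2PA uses certificates


def attach_provenance(content: bytes, metadata: dict) -> dict:
    """Bundle a content hash with metadata and sign the result."""
    manifest = {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "metadata": metadata,
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return manifest


def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Check the signature, then check the content is unaltered."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, manifest["signature"])
            and unsigned["content_sha256"] == hashlib.sha256(content).hexdigest())


image = b"original pixels"
manifest = attach_provenance(image, {"tool": "generative-ai", "creator": "Example Campaign"})
print(verify_provenance(image, manifest))            # unmodified content verifies
print(verify_provenance(b"edited pixels", manifest))  # altered content fails
```

Because the signature covers both the content hash and the metadata, an edit to the image, to the "created with generative AI" label, or to both will cause verification to fail, which is the property publishers rely on when deciding whether a file's provenance can be trusted.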
Second, Microsoft will offer services to advise and support political campaigns as they navigate the new world of AI, protect themselves against misinformation, and ensure the authenticity of their own content.
Finally, Microsoft will create a new “Elections Communications Hub” to give election authorities access to security support in the run-up to their elections. Microsoft will also back legislation that protects the electoral process in the face of new technologies.