Google will soon require political advertisers to “prominently disclose” when they made their ads with AI. Starting in November, Google says advertisers must include a disclosure when an election ad features “synthetic content” that depicts “realistic-looking people or events.”
That includes political ads that use AI to make someone appear to say or do something they never did, as well as ads that alter footage of an actual event (or fabricate a realistic-looking one) to depict a scene that never happened.
Google says these types of ads must contain a disclaimer in a “clear and conspicuous” place, noting that it will apply to images, videos, and audio content. The labels will need to state things like, “This audio was computer generated,” or “This image does not depict real events.” Any “inconsequential” tweaks, such as brightening an image, background edits, or removing red eye with AI, won’t require a label.
“Given the growing prevalence of tools that produce synthetic content, we’re expanding our policies a step further to require advertisers to disclose when their election ads include material that’s been digitally altered or generated,” Google spokesperson Allie Bodack says in a statement to The Verge.
Update September 6th, 7:12PM ET: Added a statement from a Google spokesperson.