YouTube is establishing rules for disclosing generative AI use on its platform, while admitting it'll be difficult to police. Altered or synthetic content must be labeled as such if it isn't otherwise obvious.
The new label is meant to strengthen transparency with viewers and build trust between creators and their audience. Examples of content that requires disclosure include:
• Using the likeness of a real person: Digitally altering content to replace one individual's face with another's, or synthetically generating a person's voice to narrate a video.
• Altering footage of real events or places: Such as making it appear that a real building caught fire, or altering a real cityscape so it looks different than it does in reality.
• Generating realistic scenes: Showing a realistic depiction of fictional major events, like a tornado moving toward a real town.
Don't worry, guys, we don't have to disclose our beauty filters.