YouTube has announced a new tool that requires creators to disclose when their videos contain AI-generated or synthetic material. This self-labeling feature aims to help viewers identify realistic content that could easily be mistaken for a real person, place, or event.
According to the YouTube Team, the new tool will be available in Creator Studio, where creators can disclose altered or synthetic content during the upload and publishing process. The disclosures will appear as labels in the expanded description or, for videos touching on more sensitive topics, more prominently on the video player itself.
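For creators who manage uploads programmatically rather than through the Creator Studio interface, the same disclosure may be settable via the YouTube Data API. The sketch below is a minimal illustration only: it assumes the Data API v3 exposes the disclosure as a boolean `status.containsSyntheticMedia` field on the video resource, which is an assumption about the API surface rather than something stated in YouTube's announcement.

```python
# Minimal sketch: flag an already-uploaded video as containing altered or
# synthetic media via the YouTube Data API v3 (assumed field name:
# status.containsSyntheticMedia). Requires google-api-python-client and an
# OAuth2 credential with the youtube scope.
from googleapiclient.discovery import build


def mark_video_as_synthetic(youtube, video_id: str) -> dict:
    """Set the altered/synthetic-content disclosure flag on a video."""
    # Fetch the current status first so existing fields (e.g. privacyStatus)
    # are preserved when the status part is overwritten by update().
    current = youtube.videos().list(part="status", id=video_id).execute()
    status = current["items"][0]["status"]
    status["containsSyntheticMedia"] = True  # assumed disclosure field
    return youtube.videos().update(
        part="status",
        body={"id": video_id, "status": status},
    ).execute()


# Usage (credential handling omitted; `creds` is an authorized OAuth2 credential):
# youtube = build("youtube", "v3", credentials=creds)
# mark_video_as_synthetic(youtube, "VIDEO_ID")
```

If the field isn't available in a given API version, the disclosure can still be made through the checkbox in the Creator Studio upload flow described above.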
Examples of content that requires disclosure include using the likeness of a realistic person, altering footage of real events or places, and generating realistic scenes. However, disclosure won't be necessary for clearly unrealistic content, such as animation, special effects, or beauty filters.
The introduction of this self-labeling tool follows YouTube's November 2023 announcement of its commitment to greater transparency around digital content. The platform has been collaborating with the Coalition for Content Provenance and Authenticity (C2PA) to develop industry-wide standards for content transparency.
While YouTube wants to give creators time to adjust to the new process, the platform plans to introduce enforcement measures in the future for creators who consistently choose not to disclose AI-generated content. In some cases, YouTube may add a label itself even when a creator hasn't disclosed the use of altered or synthetic media, particularly if the content has the potential to mislead viewers.
As generative AI continues to transform the creative landscape, YouTube recognizes the importance of empowering creators while ensuring viewers have access to transparent information about the content they consume. The self-labeling tool is just one step in an ongoing process to navigate the evolving world of AI in content creation.
YouTube acknowledges that this will be an ever-evolving process and pledges to continue improving as it learns. The platform hopes that increased transparency will foster a deeper appreciation for the ways AI can empower human creativity while maintaining trust between creators and their audiences.