YouTube's new AI deepfake tracking tool is alarming experts and creators

Core Insights
- YouTube has introduced a "likeness detection" tool to help creators remove AI-generated videos that exploit their likeness, but concerns have been raised about the use of creators' biometric data for training AI models [1][3][5]

Group 1: YouTube's Likeness Detection Tool
- The likeness detection tool scans videos to identify unauthorized use of a creator's face in deepfakes and is being expanded to millions of creators in the YouTube Partner Program [3][9]
- To use the tool, creators must upload a government ID and a biometric video of their face, which raises concerns about the potential misuse of this sensitive data [4][5]
- YouTube maintains that the biometric data is used only for identity verification and to power the safety feature, but experts caution that the policy leaves room for future misuse [5][8]

Group 2: Industry Concerns and Expert Opinions
- Experts have expressed concerns about YouTube's biometric policy, stating that creators should be cautious about giving a platform control of their likeness [7][8]
- Third-party companies like Vermillio and Loti are working with creators to protect their likeness rights, emphasizing the value of likeness in the AI era [7]
- The rapid improvement of AI-generated video tools raises new concerns for creators, as their likeness and voice are central to their business [11]