
YouTube’s New AI Deepfake Detection Tool Sparks Concerns Over Creator Biometrics


Key Highlights:

  • YouTube recently expanded its likeness detection tool to millions of creators amid rising AI-generated deepfake concerns.
  • To use the feature, creators must upload a government ID and a biometric video, which is raising questions about Google’s AI training policy.
  • Experts warn that Google’s broad data policy could lead to future misuse of creators’ biometrics despite YouTube’s assurances.

Google has recently made a solid comeback in the AI race with the launch of its Gemini 3 models and is now posing a threat to OpenAI’s position. The launch has reportedly rattled OpenAI’s CEO enough that he declared a “code red” inside the company and asked everyone to shift their focus entirely to making ChatGPT better than ever.

But this article isn’t about OpenAI’s fear. It’s about YouTube’s attempt to protect its creators from AI-generated deepfakes, an attempt that has reportedly raised concerns over how Google might use the biometric data required to operate the tool.

YouTube’s new likeness detection tool raises concerns among creators, as it requires them to share their biometrics

YouTube’s new AI deepfake detection tool launched back in October to scan the platform for videos where a creator’s face has been manipulated, altered, or fully generated using AI. As deepfakes flood social media, the feature is now being opened up to millions of YouTube Partner Program creators. But there’s a catch: creators must upload a government ID and a biometric video of their face to use it. And that’s exactly where security experts say the trouble begins.

YouTube insists that Google has never used the biometric data from this tool to train any AI model, and the company says it’s reviewing the wording in its sign-up form to avoid confusion. But even with that clarification, YouTube is not changing the underlying privacy policy that governs the data.

The policy still states that public content — including biometric information — can be used to help train Google’s AI systems. That gap between policy language and YouTube’s assurances is what experts call a red flag. With Google aggressively pushing its generative AI roadmap, security experts worry that creators may unknowingly open the door for their likeness to train models in the future.

YouTube says it’s an opt-in feature and biometrics are used only for verification

YouTube says the feature is completely optional, but insists that a visual reference is required to identify deepfaked clips. A spokesperson reiterated that the data is used only for verification and for powering the safety feature. However, rights advocates say they flagged the risks months ago and received little clarity in response. In a statement provided to CNBC, YouTube spokesperson Jack Malon said:

“Our approach to that data is not changing. As our Help Center has stated since the launch, the data provided for the likeness detection tool is only used for identity verification purposes and to power this specific safety feature.”

As you may know, a creator’s face is a core part of their brand identity. As AI-generated videos become increasingly convincing, losing control over that identity may have lasting consequences. Dan Neely, CEO of Vermillio, a company dedicated to monitoring likeness misuse online, says, “Your likeness will be one of the most valuable assets in the AI era, and once you give that control away, you may never get it back.”

Another expert, Loti CEO Luke Arrigoni, warned that the existing policy wording is broad enough to allow synthetic identities closely tied to real creators, raising the stakes for fraud, impersonation, and malicious deepfakes. Both Neely and Arrigoni say they cannot currently recommend that their clients enroll in the tool.

Are you a creator who has signed up for YouTube’s likeness detection tool? What do you think about it? Please let us know in the comments below.

Rishaj Upadhyay
Rishaj is a tech journalist with a passion for AI, Android, Windows, and all things tech. He enjoys breaking down complex topics into stories readers can relate to. When he's not breaking the keyboard, you can find him on his favorite subreddits or listening to music and podcasts.