The Deepfake Dilemma: How AI Video Is Devaluing Likeness

We’re entering an era where your face is no longer exclusively yours. AI-generated video technology has reached a threshold where anyone’s likeness can be convincingly replicated with minimal effort and expense. This technological shift fundamentally changes the economics and ethics of personal identity in ways we’re only beginning to grapple with.

For decades, a person’s likeness held clear economic value. Celebrities could command millions for endorsement deals. The right to use someone’s image was a protectable asset, governed by publicity rights and carefully negotiated contracts. Your face, your voice, your mannerisms—these were intrinsically scarce resources that only you could provide.

AI video generation shatters this scarcity. Tools like Runway, Pika, and numerous deepfake applications can now generate video of anyone saying or doing virtually anything. The barrier to entry is no longer technical expertise or expensive equipment—it’s simply access to reference footage, which for public figures means anything available online.

The implications ripple across multiple domains. In entertainment, actors face a future where studios might license their likeness once and generate infinite performances without their ongoing participation. We’ve already seen this with posthumous performances and de-aging technology, but generative AI accelerates and democratizes the capability. The unique value proposition of being physically present for a performance diminishes when your digital twin can be conjured at will.

For influencers and content creators, the disruption cuts deeper. Their entire business model rests on authentic connection with audiences—the sense that they’re seeing the real person. When AI can generate endless synthetic content that looks identical to the genuine article, what happens to that authenticity premium? How do audiences distinguish between real and fake when the fakes are indistinguishable?

The devaluation extends beyond economics into questions of consent and autonomy. When anyone can create a video of you without permission, likeness rights become nearly unenforceable. The legal frameworks designed to protect personality rights were built for a world where creating unauthorized reproductions required significant resources. Those frameworks struggle to address a reality where infringement is trivial and ubiquitous.

Some argue this democratizes media production, breaking the monopoly that individuals held over their own images. Why should celebrities control every use of their publicly visible face? There’s a case that once someone becomes a public figure, their likeness enters a kind of cultural commons. But this perspective overlooks power asymmetries and the potential for harm—from non-consensual pornography to political manipulation to simple impersonation fraud.

The counterargument focuses on scarcity of a different kind: authentic presence and verifiable reality. Perhaps what becomes valuable isn’t the likeness itself but proof that content is genuine. We might see the emergence of cryptographic verification systems, authenticated content chains, or premium platforms where users pay specifically for confirmed authentic material. The value shifts from the image to the verification.
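To make "the value shifts from the image to the verification" concrete, here is a minimal sketch, in Python, of what signature-based content verification could look like: the creator signs a hash of the footage, and anyone holding the matching public key can confirm it has not been altered. The function names and the use of Ed25519 via the third-party cryptography package are illustrative assumptions rather than a reference to any particular standard; real provenance schemes attach much richer signed metadata.

```python
# Illustrative sketch only: sign a hash of the footage at publish time,
# then let anyone with the creator's public key verify it later.
# Assumes the third-party "cryptography" package; names are hypothetical.

import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


def sign_content(video_bytes: bytes, private_key: ed25519.Ed25519PrivateKey) -> bytes:
    """Creator signs the SHA-256 digest of the published footage."""
    digest = hashlib.sha256(video_bytes).digest()
    return private_key.sign(digest)


def verify_content(video_bytes: bytes, signature: bytes,
                   public_key: ed25519.Ed25519PublicKey) -> bool:
    """A platform or viewer checks the footage against the creator's public key."""
    digest = hashlib.sha256(video_bytes).digest()
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False


# The creator publishes the footage plus a signature; tampered footage fails the check.
creator_key = ed25519.Ed25519PrivateKey.generate()
footage = b"...raw video bytes..."
signature = sign_content(footage, creator_key)

assert verify_content(footage, signature, creator_key.public_key())
assert not verify_content(footage + b"tampered", signature, creator_key.public_key())
```

In practice the hard part is not the signature math but key distribution and binding a key to a real person, which is where platforms, certificate authorities, or identity providers would have to step in.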

We’re also likely to see markets split between synthetic and authentic content. AI-generated videos might dominate low-budget productions, advertising, and content where perfection and malleability matter more than authenticity. Meanwhile, verified authentic content could command a premium in contexts where genuine human presence matters—live performances, verified testimonials, or content explicitly marketed as human-created.

The technology isn’t going away. The capabilities will only improve and become more accessible. This means we need new frameworks for thinking about likeness, identity, and authenticity in a post-scarcity image environment. That might include stronger authentication standards, new forms of consent-based licensing, or social norms that stigmatize unauthorized deepfakes the way we’ve developed norms around other forms of impersonation.

What’s clear is that we can’t simply preserve the old model where individuals maintained exclusive control over their likenesses through scarcity. That scarcity is gone. The question now is what comes next—and whether we can build systems that protect autonomy and consent while acknowledging the transformed technological landscape. The value of likeness hasn’t disappeared entirely, but it’s being radically redistributed in ways we’re still learning to navigate.