CapCut’s New T&Cs Turn Creators into Collateral

By Miranda Aitken | Journalist, Blair Beven | Principal Partner, Samantha Ludemann | Associate

CapCut, the popular video editing app owned by TikTok’s parent company, ByteDance, has recently updated its terms of service – and it’s a harsh edit. 

Under the new terms, ByteDance and its affiliates are granted a perpetual, irrevocable, royalty-free, worldwide license to use, adapt, create derivative works from, distribute, and store user content. 

In other words, if you upload a photo, voiceover, or video to CapCut, you’re effectively handing over the rights for CapCut to use it however they see fit – forever.

More concerning still, the updated terms specify that CapCut can use your username, image, and likeness to identify you as the source of user content, including in sponsored material. This means your face could appear in an ad campaign you weren’t paid for, and you’d have no legal grounds to remove it.

In essence, by using CapCut, you grant ByteDance the right to commercialise your image, voice, and likeness indefinitely and globally, with or without your knowledge.

If this sounds familiar, it’s because it is. Last year’s strikes by SAG-AFTRA and the Writers Guild of America highlighted similar concerns. A significant point of contention was studios’ use of AI to scan and store actors’ likenesses, particularly background actors, for future digital use without ongoing compensation. The industry rightly identified this as a move to secure cheap, permanent talent.

Now, it’s not just an issue reserved for Hollywood extras and stars. It’s happening on your smartphones.

What the law says

Unlike some jurisdictions, Australia does not have explicit “publicity rights” or “personality rights” that protect an individual’s likeness [1].

Instead, individuals must rely on a combination of laws, such as those related to defamation, passing off, and the Australian Consumer Law, to address unauthorised use of their image. This patchwork approach offers only limited protection, particularly where users have already agreed to terms granting broad usage rights to platforms like CapCut.

While many U.S. states have laws that protect the “right of publicity” – the right of individuals, principally celebrities, to prevent others from misappropriating the commercial value of their identity or performance – the United States lacks a federal right of publicity, leaving a fragmented legal landscape.

Some states, like California and New York, offer robust protections, recognising the right as both an intellectual property and privacy interest. However, other states provide minimal or no protections, making enforcement inconsistent across the country [2].

Regulatory Landscape

The EU has taken proactive steps with the introduction of the AI Act, the world’s first comprehensive legal framework on AI. This legislation aims to address risks associated with AI, including those related to fundamental rights and safety [3].

While the AI Act doesn’t specifically tackle the use of personal likenesses, its emphasis on transparency and accountability sets a precedent for future regulations in this space. 

In Australia, the Privacy Act 1988 (Cth) governs the handling of personal information, including biometric data. However, its provisions are often seen as inadequate for addressing the complexities of modern biometric technologies and the challenges posed by AI-driven platforms [4].

As with the US “right of publicity”, there is currently no specific law regulating the use of artificial intelligence (AI) in Australia; instead, a patchwork of privacy, intellectual property and consumer protection laws applies.

Legal experts argue this isn’t enough. They’re calling for more specific regulations to protect individuals’ biometric data and likenesses.

Thankfully, the federal government has publicly stated that it intends to reform Australia’s privacy laws and to regulate artificial intelligence, with new laws expected in or around 2025.

But wait, is this really legal?

Technically, yes – if you’ve agreed to it. These terms are legally enforceable in most jurisdictions unless deemed unconscionable, misleading, or in breach of other established consumer rights. However, the ethical implications of such broad usage rights, especially when buried in lengthy terms of service, are increasingly being questioned.

What can I do?

  • Delete CapCut if you’re uncomfortable with the trade-off.
  • Read the terms of any app that handles your face, voice, or name.
  • Consider alternative platforms with clearer terms of service or less aggressive data policies.
  • Advocate for reform. Support stronger consumer protections and AI-specific legislation.

If you are interested in learning more or require assistance in protecting your social media content, connect with us at XVII Degrees. 

References:

[1] Arts Law Centre of Australia. (n.d.). Unauthorised use of your image. https://www.artslaw.com.au/information-sheet/unauthorised-use-of-your-image/ 

[2] Legal Information Institute. (n.d.). Publicity. Cornell Law School. https://www.law.cornell.edu/wex/publicity 

[3] European Parliament. (2025, February 19). EU AI Act: First regulation on artificial intelligence. https://www.europarl.europa.eu/pdfs/news/expert/2023/6/story/20230601STO93804/20230601STO93804_en.pdf 

[4] Dentons. (2024, November 18). Data protection, privacy and artificial intelligence laws. https://www.dentons.com/en/insights/articles/2024/november/18/data-protection-privacy-and-artificial-intelligence-laws 

Featured Image by Mika Baumeister on Unsplash.
