
Sora Changed the Deepfake Game. Can You Tell Whether a Video Is Real or AI?


If you're even a little bit online, the odds are you've seen an image or video that was AI-generated. I know I've been fooled before, like I was by that viral video of bunnies on a trampoline. But Sora is taking AI videos to a whole new level, making it more important than ever to know how to spot AI.

Sora is the sister app of ChatGPT, made by the same parent company, OpenAI. The app takes its name from OpenAI's AI video generator, which launched in 2024. The generator recently got a major overhaul with the new Sora 2 model, along with a brand-new social media app of the same name. The TikTok-like app went viral, with AI enthusiasts determined to hunt down invite codes. But it isn't like any other social media platform. Everything you see on Sora is fake; all the videos are AI-generated. Using Sora is an AI deepfake fever dream: innocuous at first glance, with dangerous risks lurking just beneath the surface.


From a technical standpoint, Sora videos are impressive compared to competitors such as Midjourney's V1 and Google's Veo 3. Sora videos have high resolution, synchronized audio and surprising creativity. Sora's most popular feature, dubbed "cameo," lets you use other people's likenesses and insert them into nearly any AI-generated scene. It's an impressive tool, resulting in scarily realistic videos.

That's why so many experts are concerned about Sora, which could make it easier than ever for anyone to create deepfakes, spread misinformation and blur the line between what's real and what's not. Public figures and celebrities are especially vulnerable to these potentially dangerous deepfakes, which is why unions like SAG-AFTRA pushed OpenAI to strengthen its guardrails.

Identifying AI content is an ongoing challenge for tech companies, social media platforms and all of us who use them. But it's not totally hopeless. Here are some things to look out for to determine whether a video was made using Sora.

Look for the Sora watermark

Every video made on the Sora iOS app includes a watermark when you download it. It's the white Sora logo -- a cloud icon -- that bounces around the edges of the video. It's similar to the way TikTok videos are watermarked.

Watermarking content is one of the biggest ways AI companies can visually help us spot AI-generated content. Google's Gemini "nano banana" model, for example, automatically watermarks its images. Watermarks are great because they serve as a clear sign that the content was made with the help of AI.

But watermarks aren't perfect. For one, if the watermark is static (not moving), it can easily be cropped out. Even for moving watermarks like Sora's, there are apps designed specifically to remove them, so watermarks alone can't be fully trusted. When OpenAI CEO Sam Altman was asked about this, he said society will have to adapt to a world where anyone can create fake videos of anyone. Of course, prior to OpenAI's Sora, there wasn't a popular, easily accessible, no-skill-needed way to make those videos. But his argument raises a valid point about the need to rely on other methods to verify authenticity.
