We've all been advised not to believe everything we see on the internet, and that's never been more true in the age of generative AI.
AI-generated videos are everywhere, from deepfakes of celebrities and false disaster broadcasts to viral videos of bunnies on a trampoline. Sora, the AI video generator from ChatGPT's parent company, OpenAI, has only made it more difficult to separate truth from fiction. And Sora 2, the model behind OpenAI's brand-new invite-only social media app, is becoming more sophisticated by the day.
In the last few months, the TikTok-like app has gone viral, with AI enthusiasts determined to hunt down invite codes. But Sora isn't like any other social media platform: everything you see on it is AI-generated, and nothing in the feed is real. I described it as an AI deepfake fever dream, innocuous at first glance, with dangerous risks lurking just beneath the surface.
From a technical standpoint, Sora videos hold up well against competitors such as Midjourney's V1 and Google's Veo 3, with high resolution, synchronized audio and surprising creativity. Sora's most popular feature, dubbed "cameo," lets you insert other people's likenesses into nearly any AI-generated scene, and the results are scarily realistic.
That's why so many experts are concerned about Sora. The app makes it easier for anyone to create dangerous deepfakes, spread misinformation and blur the line between what's real and what's not. Public figures and celebrities are especially vulnerable to these deepfakes, and unions like SAG-AFTRA have pushed OpenAI to strengthen its guardrails.
Identifying AI content is an ongoing challenge for tech companies, social media platforms and everyone else. But it's not totally hopeless. Here are some things to look out for to determine whether a video was made using Sora.
Look for the Sora watermark
Every video made on the Sora iOS app includes a watermark when you download it. It's the white Sora logo -- a cloud icon -- that bounces around the edges of the video. It's similar to the way TikTok videos are watermarked.
Visible watermarks are one of the most direct ways AI companies can help us spot AI-generated content. Google's Gemini "nano banana" image model, for example, automatically watermarks its images. A watermark is useful because it's a clear, at-a-glance signal that the content was made with the help of AI.
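If you're comfortable with a bit of scripting, you can rough out a watermark check yourself. The sketch below is a hypothetical example, not anything OpenAI publishes: it uses OpenCV template matching to look for a logo image you've cropped from a known Sora clip (the file names, similarity threshold and edge margin are all assumptions). Because the watermark moves, and because downloads can be cropped or blurred to hide it, treat any match as a hint rather than proof.

```python
# Minimal sketch: scan a video for a recurring logo near the frame edges.
# "sora_logo.png" is a template you crop yourself from a known Sora video;
# the threshold and edge margin below are rough guesses, not official values.
import cv2

VIDEO_PATH = "clip.mp4"          # video you want to check (assumption)
TEMPLATE_PATH = "sora_logo.png"  # hand-cropped watermark template (assumption)
MATCH_THRESHOLD = 0.75           # similarity cutoff; tune for your footage
EDGE_FRACTION = 0.25             # the logo tends to hug the borders

template = cv2.imread(TEMPLATE_PATH, cv2.IMREAD_GRAYSCALE)
cap = cv2.VideoCapture(VIDEO_PATH)

hits = 0
frame_index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_index += 1
    if frame_index % 10:  # sample every 10th frame to keep the scan fast
        continue
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val >= MATCH_THRESHOLD:
        # Only count matches near the edges, since the logo bounces
        # around the border rather than sitting in the middle.
        h, w = gray.shape
        x, y = max_loc
        near_edge = (x < w * EDGE_FRACTION or x > w * (1 - EDGE_FRACTION)
                     or y < h * EDGE_FRACTION or y > h * (1 - EDGE_FRACTION))
        if near_edge:
            hits += 1

cap.release()
print(f"Possible watermark matches near the edges: {hits}")
```

A simple check like this will miss plenty (a semi-transparent, moving logo is genuinely hard to catch with template matching), which is exactly why the other signs covered here still matter.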