
Is That Video Real or AI? Why It's So Hard to Spot a Deepfake and What to Look For


Don't feel bad if you've been fooled by an AI-generated image or video. AI is creating content more convincing than ever before, and long gone are the days when a "fake" on the internet was easy to spot, like a badly Photoshopped picture.

New AI tools, including OpenAI's Sora and Google's Veo 3 and Nano Banana, have erased the line between reality and AI-generated fantasies. Now, we're swimming in a sea of AI-generated videos and deepfakes, from bogus celebrity endorsements to false disaster broadcasts.

If you're struggling to separate the real from the AI, you're not alone. Here are some tips to help you cut through the noise and get to the truth of each AI-inspired creation. For more, check out the problem behind AI video's energy demands and what we need to do in 2026 to avoid more AI slop.


Why it's hard to spot Sora AI videos

From a technical standpoint, Sora videos are impressive compared to competitors such as Midjourney V1 and Google Veo 3. They have high resolution, synchronized audio and surprising creativity. Sora's most popular feature, dubbed "cameo," lets you use other people's likenesses and insert them into nearly any AI-generated scene. It's an impressive tool, resulting in scarily realistic videos.

Sora joins the likes of Google's Veo 3, another technically impressive AI video generator. These are two of the most popular tools, but certainly not the only ones. Generative media has become an area of focus for many big tech companies in 2025, with the image and video models poised to give each company the edge it desires in the race to develop the most advanced AI across all modalities. Google and OpenAI have both released flagship image and video models this year in an apparent bid to outdo each other.

That's why so many experts are concerned about Sora and other AI video generators. The Sora app makes it easier for anyone to create realistic-looking videos that feature its users. Public figures and celebrities are especially vulnerable to these deepfakes, and unions like SAG-AFTRA have pushed OpenAI to strengthen its guardrails. Other AI video generators present similar risks, raising concerns about flooding the internet with nonsensical AI slop and handing bad actors a dangerous tool for spreading misinformation.

Identifying AI content is an ongoing challenge for tech companies, social media platforms and everyone else. But it's not totally hopeless. Here are some things to look out for to determine whether a video was made using Sora.

Look for the Sora watermark
