is The Verge’s senior AI reporter. She has covered the AI beat for more than five years, and her work has also appeared in CNBC, MIT Technology Review, Wired UK, and other outlets.
An anime version of Jesus Christ flipping tables. OpenAI employees performing in Hamilton costumes. News anchors discussing a story on television. A man doing a thirst-trap TikTok dance. Sam Altman — stealing GPUs on CCTV, listening to a business pitch, crying.
Such were the contents of my feed on Sora, OpenAI’s new social media app for AI-generated video. The company released the iOS app on Tuesday with the ability to create 10-second videos of virtually anything you can dream up, including “cameos,” or videos featuring your own AI-generated self and anyone else who approves of you using their likeness. OpenAI employees called Sora a potential “ChatGPT moment for video generation” in a briefing with reporters earlier this week. On Friday, Sora topped the free apps chart in Apple’s App Store.
Already, reception has been mixed. Multiple viral posts juxtaposed the company’s lofty science- and research-related goals with its current release, with the digs becoming so popular that Altman himself had to respond. Many have expressed concern that the ultra-realistic videos incorporating real people are a misinformation nightmare. Others simply call it an AI slop machine.
Some OpenAI employees have publicly posted about their concerns, too. John Hallman, who works on pre-training at OpenAI, wrote in a post, “I won’t deny that I felt some concern when I first learned we were releasing Sora 2. That said, I think the team did the absolute best job they possibly could in designing a positive experience.” Boaz Barak, a member of OpenAI’s technical staff, wrote on X that he feels a “mix of worry and excitement. Sora 2 is technically amazing but it’s premature to congratulate ourselves on avoiding the pitfalls of other social media apps and deepfakes.” He added that he’s happy with some of the safeguards, “But as always, there is a limit to how much we can know before a product is used in the real world.”
But compared to other AI “social” apps like Meta’s Vibes, Sora has an at least temporarily compelling hook: the ability to meme-ify yourself and your friends. OpenAI seems to have noticed that most popular AI trends involve transforming yourself — into a Studio Ghibli character, into a sort of boring doll. Now, it’s built an entire app around this. And Sora’s popularity seems to have eclipsed Vibes so far; judging from online anecdotes, some people are scrolling it like TikTok. The question is whether these trends can meaningfully replace real people expressing their real opinions using their real voices in a real setting, once the initial novelty of Altman meowing in a full-body cat costume wears off.
When I signed up for Sora, the app gave me a content advisory, saying, “You are about to enter a creative world of AI-generated content.” It also told me, “We may train on your content and use ChatGPT memories for recommendations, all controllable in settings.”
So far, my feed is essentially made up of OpenAI employees parodying themselves and the company, a lot of deepfake instructional videos on how to use Sora, and a handful of animal videos. The volume of OpenAI people isn’t necessarily surprising — they’ve been using the app for a while, and invites to the public are still restricted. But it was still striking how hard it was to find anything else.
No matter how people feel about Sora so far, though, the broad consensus seems to be that our perception of what’s real and what isn’t may never be the same.
I reluctantly completed the signup flow allowing Sora to generate videos using my own likeness, which involved moving my head from side to side and saying a sequence of three numbers. When I first tried to generate a video of myself, the app told me that it was under “heavy load” and to “try again later.” Then when I asked for a video of myself “running through a meadow,” it said that was a “content violation” and couldn’t be made, adding that “this content may include suggestive or racy material.” When I traded the word “running” for “frolicking,” though, the app came through.