At an OpenAI DevDay keynote today, CEO Sam Altman announced that the company is launching an SDK preview that will allow developers "to build real apps inside of ChatGPT." Altman said that, starting today, the new SDK will give developers "full stack" control over app data, action triggers, and even interactive user interfaces for apps that can appear inline as part of an existing ChatGPT conversation window.

The SDK is built on the open source Model Context Protocol (MCP), Altman said. That means developers who already use MCP only need to add an HTML resource to enable ChatGPT integration, he added.

The new integration means a ChatGPT user can directly ask Figma to turn a sketch into a diagram, for instance, and get the results integrated into their ChatGPT conversation. It also means that ChatGPT can suggest apps suited to a more general query, such as recommending and creating a Spotify playlist when someone asks for song suggestions.

In a live onstage demo, OpenAI software engineer Alexi Christakis showed how the new API can "expose context back to ChatGPT from your app," a process he likened to ChatGPT "talking to apps." For instance, the LLM can expand in real time on what's being said in an embedded Coursera video. "I don't need to explain what I'm seeing in the video, ChatGPT sees it right away," Christakis said on stage.

Other onstage demos showed ChatGPT using Canva to generate poster ideas in the background while the user consulted an inline Zillow map for information. Even when the Zillow window was expanded to full screen, the user could ask ChatGPT for additional context via an overlaid chat window.
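To give a rough sense of what "adding an HTML resource" to an MCP server could look like, here is a minimal Python sketch that builds the kind of JSON-RPC 2.0 response MCP servers use to return a resource's contents, with an HTML payload a ChatGPT-style client could render inline. The URI scheme, resource name, and markup here are illustrative assumptions, not OpenAI's actual API:

```python
import json

def build_html_resource_response(request_id: int, uri: str, html: str) -> str:
    """Build a JSON-RPC 2.0 response in the shape MCP uses for
    resources/read results, carrying an HTML payload."""
    response = {
        "jsonrpc": "2.0",
        "id": request_id,
        "result": {
            "contents": [
                {
                    "uri": uri,                # hypothetical ui:// URI, for illustration
                    "mimeType": "text/html",   # marks the payload as renderable markup
                    "text": html,
                }
            ]
        },
    }
    return json.dumps(response)

# Usage: serialize a tiny widget for an imagined playlist app.
raw = build_html_resource_response(
    1,
    "ui://playlist/widget.html",
    "<div id='playlist'>Suggested songs render here</div>",
)
print(raw)
```

The point of the sketch is only that the integration surface is small: an existing MCP server already speaks this resource protocol, so exposing a UI reduces to returning one more resource whose MIME type happens to be HTML.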