ServiceNow announced a multi-year partnership with OpenAI to bring GPT-5.2 into its AI Control Tower and Xanadu platform, reinforcing ServiceNow’s strategy of focusing on enterprise workflows, guardrails, and orchestration rather than building frontier models itself.

For enterprise buyers, the deal underscores a broader shift: general-purpose models are becoming interchangeable, while the platforms that control how they’re deployed and governed are where differentiation now lives.

ServiceNow lets enterprises develop agents and applications, plug them into existing workflows, and manage orchestration and monitoring through its unified AI Control Tower. The partnership does not mean ServiceNow will stop using other models to power its services, said John Aisien, senior vice president of product management at ServiceNow.

“We will remain an open platform. There are things we will partner on with each of the model providers, depending on their expertise. Still, ServiceNow will continue to support a hybrid, multi-model AI strategy where customers can bring any model to our AI platform,” Aisien said in an email to VentureBeat. “Instead of exclusivity, we give enterprise customers maximum flexibility by combining powerful general-purpose models with our own LLMs built for ServiceNow workflows.”

What the OpenAI partnership unlocks for ServiceNow customers

ServiceNow customers get:

- Voice-first agents: speech-to-speech and voice-to-text support
- Enterprise knowledge access: Q&A grounded in enterprise data, with improved search and discovery
- Operational automation: incident summarization and resolution support

ServiceNow said it plans to work directly with OpenAI to build “real-time speech-to-speech AI agents that can listen, reason and respond naturally without text intermediation.” The company is also interested in tapping OpenAI’s computer-use models to automate actions across enterprise tools such as email and chat.

The enterprise playbook

The partnership reinforces ServiceNow’s positioning as a control layer for enterprise AI, separating general-purpose models from the services that govern how they’re deployed, monitored, and secured. Rather than owning the models, ServiceNow is emphasizing orchestration and guardrails, the layers enterprises increasingly need to scale AI safely.

Some companies that work with enterprise customers see the partnership as a positive. Tom Bachant, co-founder and CEO of AI workflow and support platform Unthread, said the deal could further reduce integration friction. “Deeply integrated systems often lower the barrier to entry and simplify initial deployment,” he told VentureBeat in an email. “However, as organizations scale AI across core business systems, flexibility becomes more important than standardization. Enterprises ultimately need the ability to adapt performance benchmarks, pricing models, and internal risk postures, none of which remain static over time.”

As enterprise AI adoption accelerates, partnerships like this suggest the real battleground is shifting away from the models themselves and toward the platforms that control how those models are used in production.