ServiceNow deal will see it embed OpenAI models into its AI Platform

News · January 21, 2026 · Artifice Prime

ServiceNow signed a multiyear agreement with OpenAI on Tuesday that it said is designed to “accelerate enterprise AI outcomes.”

The company stated that it intends to build direct speech-to-speech technology using OpenAI models “to break through language barriers and offer more natural interactions. With the latest OpenAI models including GPT-5.2, ServiceNow will unlock a new class of AI-powered automation for the world’s largest companies.”

Asked what prompted the move, John Aisien, senior vice president of product management at ServiceNow, said, “the deeper integration with OpenAI reflects both customer demand and a rapid inflection point in AI capabilities. As AI model releases accelerate, large enterprises need help keeping their workflows aligned with the latest innovations.”

Organizations, he said, “are looking to move beyond AI experimentation to deployment at scale — into workflows that are secure, scalable, and designed to deliver measurable outcomes. Through our AI Control Tower and work like our co-innovation with OpenAI, partnerships with Nvidia, Anthropic and others, we meet those needs.”  

Scott Bickley, advisory fellow at Info-Tech Research Group, praised the deal, noting that because ServiceNow will still offer its proprietary models alongside frontier models such as OpenAI's, organizations will have a choice of which model to run on the out-of-the-box Now Assist workflows. He noted, however, that those workflows work best with ServiceNow's proprietary models.

Despite this, said Bickley, most enterprises have discovered that one model cannot do all things at the same proficiency level and have started to tailor specific models to specific use cases.

He added, "a best-of-both-worlds approach makes sense, with frontier models providing a core intelligence engine, while proprietary models are targeting cases requiring a specialized knowledge base and more enhanced guardrails. ServiceNow will still layer in its workflow specificity, data governance and enterprise controls rooted in its AI Control Tower."

Proprietary LLM development is costly

In addition, he noted, “the reality is that development of proprietary LLMs is expensive across their lifecycle, and [they] require constant attention and tuning. The economics make sense to leverage what are quickly becoming commodity AI models where possible and leverage a hybrid architecture where the frontier model is coupled with enterprise logic and governance, combining scale with contextual relevance.”

Jason Andersen, VP and principal analyst at Moor Insights & Strategy, said that at the highest level, this move makes a lot of sense. He noted that the addition of OpenAI is in general good news for CIOs. “It will likely save them some money if they [already] have a strategy to use OpenAI models and technologies.”

But, he added, LLMs tuned or trained for business are not in decline. “We are moving to the world of agents that can work against multiple models for specific tasks,” he said. ServiceNow is smart enough to know that its models should focus on the tasks most specific to its customers’ processes and data, and the more common stuff can be handled with a frontier model. 

Sanchit Vir Gogia, chief analyst at Greyhound Research, said that this move is happening now because the enterprise AI conversation has crossed a line from assistance to accountability. “Until recently, ServiceNow could credibly argue that business tuned models were sufficient, because the AI was largely augmentative,” he said. “It summarized tickets. It drafted responses. It helped agents move faster, but humans still carried responsibility.”

Harsh internal reality at play

That boundary “has now collapsed. Customers want AI that opens cases, triggers approvals, escalates incidents, interacts with legacy systems, and increasingly operates through voice and agents rather than a structured UI,” he noted. “Once AI is expected to act, reasoning quality and generalization depth stop being nice to have and start becoming operational risk variables.”

He added that there is also a harsh internal reality at play: “Frontier model cadence has become incompatible with enterprise software development cycles. When intelligence improves every few weeks, maintaining parity through internal tuning becomes an endless catch up exercise that diverts talent, capital, and attention away from the platform layer.”

ServiceNow did not lose confidence in its models, he said, “it gained clarity on where its time is better spent. Its defensible advantage is not linguistic intelligence. It is workflow authority, permissions, data relationships, and governance at scale.”

Another under-discussed driver, said Gogia, is customer fatigue. “Enterprises are no longer impressed by claims of proprietary AI,” he pointed out. “They are overwhelmed by model sprawl, opaque costs, and fragmented controls. What they want is a system that can absorb intelligence improvements without forcing architectural rewrites every quarter.”

In addition, he had this advice for CIOs: “The most dangerous interpretation [you] could make is that business-tuned models are obsolete. They are not. What is obsolete is the assumption that one model strategy can satisfy all enterprise needs. The enterprise AI stack is fragmenting by design, not by accident.”

ServiceNow’s move, added Gogia, “should be read as a warning as much as an opportunity. The enterprise AI stack is becoming layered, faster, and more consequential. CIOs who continue to think in terms of tools and features will struggle. CIOs who think in terms of systems, controls, and failure modes will be the ones who extract real value.”

This article originally appeared on CIO.com.

Original Link:https://www.computerworld.com/article/4119650/servicenow-deal-will-see-it-embed-openai-models-into-its-ai-platform-2.html
Originally Posted: Wed, 21 Jan 2026 07:05:35 +0000


Artifice Prime

Artifice Prime is an AI enthusiast with over 25 years of experience as a Linux sysadmin. They are interested in artificial intelligence, its use as a tool to further humankind, and its impact on society.
