Bottom line: At this year's Cisco Live event in San Diego, one thing became apparent: the opportunity to run modern applications like GenAI and autonomous agents is no longer limited to the cloud. To be sure, Cisco highlighted several new products and services specifically targeted at major cloud computing providers and large ISPs, but the company also made clear that interest in expanding the capabilities of on-premises data centers is not only still alive, it's being reinvigorated by the rapid transition to AI-powered workloads.
From new AI-optimized, on-premises-focused routers and switches delivered through its traditional networking hardware business, to AI-enhanced versions of its communications, collaboration, and customer support software platforms, and even its new agentic AI offerings, Cisco made it clear at this year's Live event that it is eager and willing to meet the growing demand for capabilities that businesses can run within their own environments.
In some ways, it's a stunning development, given the relentless focus on moving everything to the cloud that the IT industry has witnessed over the past 10 to 15 years. And a cynic might argue it's an attempt by an old-school tech company to return to its glory days of providing essential enterprise equipment. But in truth, it's part of a much broader industry movement that many large companies are starting to see and talk about.
Part of what's driving this is the traditional data gravity argument – that you need to bring the workloads to the data – along with the fact that the most valuable corporate data (and the most useful for AI model fine-tuning) often still sits behind the firewall. Combine that with the growing range of hardware and software offerings now capable of running these advanced workloads in the enterprise, and it all starts to make sense.
While companies will unquestionably continue to use cloud computing resources for some of their workloads, the interest in – and ability to do – more internally is very real. To put it succinctly: the world of Hybrid AI, where companies use both public and private clouds and data centers for AI workloads, has arrived.
Viewed through that Hybrid AI lens, many of the specific announcements from Cisco Live gain important relevance. For example, in hardware, new additions to the Catalyst 9000 series switches and 8000 series routers leverage the latest Cisco Silicon One chips to run latency-sensitive AI workloads, while also integrating support for post-quantum security, zero-trust networking, and other capabilities.
The 8000 series routers also incorporate enhanced versions of Cisco's SASE (Secure Access Service Edge), SD-WAN (Software-Defined Wide Area Network), and next-generation firewall (NGFW) functionality.
Cisco AI PODs
In computing, Cisco launched its own version of Nvidia's RTX Pro 6000 Blackwell GPU servers (or, as Cisco calls them, AI PODs), specifically designed for companies to run within their own data centers.