
AMD debuts AMD Instinct MI350 Series accelerator chips with 35X better inferencing


AMD unveiled its comprehensive end-to-end integrated AI platform vision and introduced its open, scalable rack-scale AI infrastructure built on industry standards at its annual Advancing AI event.

The Santa Clara, California-based chip maker announced its new AMD Instinct MI350 Series accelerators, which are four times faster on AI compute and 35 times faster on inferencing than prior chips.

AMD and its partners showcased AMD Instinct-based products and the continued growth of the AMD ROCm ecosystem. The company also showed its powerful new open rack-scale designs and a roadmap that extends leadership rack-scale AI performance beyond 2027.

“We can now say we are at the inference inflection point, and it will be the driver,” said Lisa Su, CEO of AMD, in a keynote at the Advancing AI event.

In closing, in a jab at Nvidia, she said, “The future of AI will not be built by any one company or within a closed system. It will be shaped by open collaboration across the industry with everyone bringing their best ideas.”

Lisa Su, CEO of AMD, at Advancing AI.

AMD unveiled the Instinct MI350 Series GPUs, setting a new benchmark for performance, efficiency and scalability in generative AI and high-performance computing. The MI350 Series, consisting of both Instinct MI350X and MI355X GPUs and platforms, delivers a four times generation-on-generation AI compute increase and a 35 times generational leap in inferencing, paving the way for transformative AI solutions across industries.

“We are tremendously excited about the work you are doing at AMD,” said Sam Altman, CEO of OpenAI, on stage with Lisa Su.

He said he couldn’t believe it when he first heard the MI350’s specs, and he was grateful that AMD had taken his company’s feedback.

AMD said its latest Instinct GPUs can beat Nvidia chips.
