2025: A Look Back at the Year’s Top Advances

Catch up on 2025’s highlights: ROCm 7.0, liquid-cooled AI servers, server processors for SMBs, and a highly efficient MicroBlade server.

  • December 19, 2025 | Author: Peter Krass

2025 was a year to remember. But in case you’ve forgotten, here are some of the year’s top advances.

ROCm for the AI Era

This past fall, AMD introduced version 7.0 of its ROCm software stack. This latest edition features capabilities designed especially for AI.

ROCm, part of AMD’s portfolio since 2016, is the open software stack that translates code written by developers into instructions that AMD GPUs and CPUs can understand and execute.

Now AMD has purpose-built ROCm 7.0 for GenAI, large-scale AI training, and AI inferencing. Essentially, ROCm now offers the tools and runtime to make the most complex GPU workloads run efficiently.

The full ROCm 7.0 stack contains multiple components. These include drivers, the Heterogeneous-Compute Interface for Portability (HIP), math and AI libraries, compilers, and system-management tools.
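For developers, the most hands-on of those components is HIP, the C++ runtime and kernel language that ROCm’s hipcc compiler builds for AMD GPUs. As a rough illustration only (the sketch below is ours, not AMD’s, and its names and sizes are arbitrary), here is what a minimal HIP vector-add program looks like:

  // Illustrative sketch of a HIP program; error handling omitted for brevity.
  #include <hip/hip_runtime.h>
  #include <cstdio>
  #include <vector>

  __global__ void vector_add(const float* a, const float* b, float* c, int n) {
      int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
      if (i < n) c[i] = a[i] + b[i];
  }

  int main() {
      const int n = 1 << 20;                  // 1M elements (arbitrary size)
      const size_t bytes = n * sizeof(float);

      std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);  // host data

      float *da, *db, *dc;                    // device (GPU) buffers
      hipMalloc((void**)&da, bytes);
      hipMalloc((void**)&db, bytes);
      hipMalloc((void**)&dc, bytes);
      hipMemcpy(da, a.data(), bytes, hipMemcpyHostToDevice);
      hipMemcpy(db, b.data(), bytes, hipMemcpyHostToDevice);

      // 256 threads per block; enough blocks to cover all n elements
      const int threads = 256;
      const int blocks = (n + threads - 1) / threads;
      vector_add<<<blocks, threads>>>(da, db, dc, n);
      hipDeviceSynchronize();

      hipMemcpy(c.data(), dc, bytes, hipMemcpyDeviceToHost);
      std::printf("c[0] = %.1f\n", c[0]);     // expect 3.0

      hipFree(da); hipFree(db); hipFree(dc);
      return 0;
  }

With ROCm installed, building it is a one-liner (for example, hipcc vector_add.cpp -o vector_add), though the exact flags depend on your GPU and ROCm version.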

Liquid-Cooled AI Servers

Supermicro introduced two rackmount AI servers in June, both of them powered by AMD Instinct MI350 Series GPUs and dual AMD EPYC 9005 CPUs.

One of the two new servers, Supermicro model number AS-4126GS-NMR-LCC, is a 4U liquid-cooled system. This server can handle up to eight GPUs, the user’s choice of AMD’s Instinct MI325X or MI355X.

The other server, Supermicro model number AS-8126GS-TNMR, is a larger 8U system that’s air-cooled. It also offers a choice of AMD GPUs, either the AMD Instinct MI325X or AMD Instinct MI350X.

Both servers feature PCIe 5.0 connectivity; memory capacities of up to 2.3TB; support for AMD’s ROCm open-source software; and support for AMD Infinity Fabric Link connections for GPUs.

In June, Supermicro CEO Charles Liang said the new servers “strengthen and expand our industry-leading AI solutions—and give customers greater choice and better performance as they design and build the next generation of data centers.”

EPYCs for SMBs

In May, AMD introduced a CPU series designed specifically for small and medium businesses.

The processors, known as the AMD EPYC 4005 Series, deliver a full suite of enterprise-level features and performance. But they’re designed for SMBs running on-premises systems and for cloud service providers that need cost-effective solutions in a 3U form factor.

“We’re delivering the right balance of performance, simplicity, and affordability,” said Derek Dicker, AMD’s corporate VP of enterprise and HPC.

That balance includes the same AMD ‘Zen 5’ core architecture behind the AMD EPYC 9005 Series processors used in data centers run by large enterprises.

The AMD EPYC 4005 Series CPUs for SMBs come in a single-socket package. Depending on the model, they offer anywhere from 6 to 16 cores and boost frequencies of up to 5.7 GHz.

One model of the AMD EPYC 4005 line also includes integrated AMD 3D V-Cache technology for a larger 128MB L3 cache and lower latency.

MicroBlades for CSPs

The AMD EPYC 4005 Series processors made a star appearance in November, when Supermicro introduced a 6U, 20-node MicroBlade server (model number MBA-315R-1G) powered by the new CPUs.

These servers are intended for small and midsize cloud service providers.

Each blade is powered by a single AMD EPYC 4005 CPU. With 20 blades combined in the 6U enclosure, the system offers 3.3x higher density than traditional 1U servers (20 nodes in 6U of rack space, versus six 1U servers in the same space). It also reduces cabling by up to 95%, cuts space requirements by up to 70%, and lowers energy costs by up to 30%.

The same AMD EPYC 4005-based node design is also available as a motherboard (model number BH4SRG) for use in Supermicro A+ servers.

~~~~~~~~~

Happy holidays from all of us at Performance Intensive Computing, and best wishes for the new year! We look forward to serving you in 2026.

~~~~~~~~~~
