AI-ML Articles
Meet Supermicro’s new AMD-powered edge systems
- February 4, 2026
- Author: Peter Krass
The five new systems range from compact edge units to rackmount servers. They’re designed for use outside the traditional data center.
Looking for AI's ROI? Try purpose-fitting
- January 29, 2026
- Author: Peter Krass
Delivering a return on investment from AI can be challenging. A new IDC white paper offers a solution: purpose-fit the infrastructure to the use case.
Tech Explainer: What’s an AI Factory?
- January 23, 2026
- Author: KJ Jacoby
Discover how AI factories work—and how your clients might benefit from building an AI factory of their own.
2025: Look Back at the Year’s Top Advances
- December 19, 2025
- Author: Peter Krass
Catch up on 2025’s highlights: ROCm 7.0, liquid-cooled AI servers, server processors for SMBs, and a MicroBlade server that’s highly efficient.
Research Roundup: Server Sales Rise, AI Helps Customer Service, Social Media is for Adults, LLMs Know What You Need
- December 17, 2025
- Author: Peter Krass
Catch up on the latest research from leading technology analysts and market watchers.
Tech Explainer: What are CPU Cores, Threads, Cache & Nodes?
- December 16, 2025
- Author: KJ Jacoby
Today’s CPUs are complex. Find out what the key components actually do—and why, in an age of AI, they still matter.
Check out Supermicro’s new AMD GPU-powered server—it’s air-cooled
- November 25, 2025
- Author: Peter Krass
Supermicro’s new 10U server is powered by AMD’s EPYC CPUs and Instinct MI355X GPUs. And it’s kept cool by nearly 20 fans.
Tech Explainer: What’s new in AMD ROCm 7?
- November 20, 2025
- Author: KJ Jacoby
Learn how the AMD ROCm software stack has been updated for the era of AI.
Research Roundup: IT budgets, server sales, IoT analytics, AI at work
- October 28, 2025
- Author: Peter Krass
Catch up on the latest intelligence from leading IT market watchers and pollsters.
Tech Explainer: What’s liquid cooling? And why might your data center need it now?
- October 22, 2025
- Author: KJ Jacoby
Liquid cooling offers big efficiency gains over traditional air cooling. And while there are upfront costs, the savings can be substantial for data centers running high-performance AI and HPC servers. Learn how it works.