Financial services companies earn their keep by investing in stocks, bonds and other financial instruments. Now these companies are also making big investments in artificial intelligence technology.
To help these financial services industry (FSI) players adopt AI, Supermicro and AMD are working together. The two have partnered to offer advanced computing solutions designed to speed the finance industry’s move to technology and business leadership.
FSI companies can use these systems to:
- Detect risks faster, uncovering patterns and anomalies by ingesting ever-larger data sets
- Supercharge trading with AI in both the front- and back-office
- Modernize core processes to lower costs while boosting resilience
- Engage and delight customers by meeting—even exceeding—their expectations
Big Spenders
Already, FSI spending on AI technology is substantial. Last year, when management consulting firm Bain & Co. surveyed nearly 110 U.S. FSI firms, it found that those respondents with annual revenue of at least $5 billion were spending an average of $221 million on AI.
The companies were getting a good return on AI, too. Bain found that 75% of financial services companies said their generative AI initiatives were either achieving or exceeding their expected value. In addition, GenAI users reported an impressive average productivity gain of 20% across all use cases.
Based on those findings, Bain estimates that by embracing AI, FSI firms can reduce their customer-service costs by 20% to 30% while increasing their revenue by about 5%.
Electric Companies
One big issue facing all users of AI is meeting the technology’s energy needs. Power consumption is a big-ticket item, accounting for about 40% of all data center costs, according to professional services firm Deloitte.
Greater AI adoption could push that even higher. Deloitte believes global data center electricity consumption could double as soon as 2030, driven by big increases in GenAI training and inference.
As Deloitte points out, some of that will be the result of new hardware requirements. While general-purpose data center CPUs typically draw 150 to 200 watts per chip, the GPUs used for AI draw up to 1,200 watts per chip.
This can also increase the power demand per rack. As of early 2024, data centers typically supported rack power requirements of at least 20 kilowatts, Deloitte says. But with the growth of GenAI, that’s expected to reach 50 kilowatts per rack by 2027.
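To see how those per-chip numbers add up to 50-kilowatt racks, here’s a rough back-of-the-envelope sketch in Python. Only the per-chip wattages come from the Deloitte figures above; the GPU count, server count and overhead are illustrative assumptions, not figures from the report.

```python
# Rough, illustrative estimate of AI rack power draw.
# Per-chip wattages are taken from the Deloitte figures cited above;
# everything else (GPUs per server, overhead, servers per rack) is an
# assumption made for this sketch, not a figure from the report.

GPU_WATTS = 1_200        # upper bound for AI GPUs cited above
CPU_WATTS = 200          # upper bound for general-purpose server CPUs cited above
GPUS_PER_SERVER = 8      # assumed: a typical dense GPU server
CPUS_PER_SERVER = 2      # assumed: dual-socket host
OVERHEAD_WATTS = 1_000   # assumed: fans, NICs, drives, power-conversion losses
SERVERS_PER_RACK = 4     # assumed: limited by power and cooling, not rack space

server_watts = (GPUS_PER_SERVER * GPU_WATTS
                + CPUS_PER_SERVER * CPU_WATTS
                + OVERHEAD_WATTS)
rack_kilowatts = SERVERS_PER_RACK * server_watts / 1_000

print(f"Per server: {server_watts / 1_000:.1f} kW")  # ~11.0 kW
print(f"Per rack:   {rack_kilowatts:.1f} kW")         # ~44 kW, closing in on 50 kW
```

Even at these assumed densities, a handful of GPU servers pushes a rack well past the 20-kilowatt designs that were typical in early 2024.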
That growth is almost sure to come. Market watcher Grand View Research expects the global market for data center GPUs across all industries to rise over the next eight years at a compound annual growth rate (CAGR) of nearly 36%. That translates into worldwide data center GPU sales leaping from $14.48 billion last year to $190.1 billion in 2033, Grand View predicts.
Partner Power
FSI companies don’t have to meet these challenges alone. Supermicro and AMD have partnered to deliver advanced computing systems that combine high levels of compute performance and flexibility with a comparatively low total cost of ownership (TCO).
They’re boosting performance with dense 4U servers built around the latest AMD EPYC CPUs and AMD Instinct GPUs. Some of these servers offer up to 60 storage drive bays, 9TB of DDR5 RAM and 192 CPU cores.
For AI workloads, AMD offers the EPYC 9575F as an AI host-node CPU, with 64 cores and a maximum boost frequency of up to 5 GHz.
Flexibility is another benefit. Supermicro offers modular Datacenter Building Block Solutions. Among other offerings, these include system-level units that have been pre-validated to ease the task of data center design.
AMD and Supermicro are also delivering efficiencies that lower the cost of transforming with AI. Supermicro’s liquid cooling slashes TCO. AMD processors are designed for power efficiency. And Supermicro’s multi-node designs give you more processing capability per rack.
Are you working with FSI customers looking to lead the way with AI investments? The latest Supermicro servers powered by AMD CPUs and GPUs have your back.
Do More: