Converge Digest

Dell’Oro: Hyperscalers Deployed 5 Million AI Accelerators in 2024

Accelerator revenue, spanning both GPUs and custom accelerators, increased 130 percent in 3Q 2024, according to a new report from Dell’Oro Group. The US hyperscalers―Amazon, Google, Meta, and Microsoft―are on track to deploy over 5 million AI training-capable accelerators in 2024.

“Demand for accelerators has been growing at a breakneck pace as the hyperscalers race to deploy infrastructure for the training and inference of large language models,” said Baron Fung, Senior Research Director at Dell’Oro Group. “In addition to commercially available GPUs, the US hyperscalers are also increasing their deployment of AI infrastructure with custom accelerators. As large language models continue to grow in size, driving the need for larger compute clusters, hyperscalers are accelerating their adoption of custom accelerators. Often co-developed with chipmakers like Broadcom and Marvell, these custom solutions aim to boost performance efficiency, lower costs, and reduce dependency on NVIDIA GPUs,” Fung added.

Additional highlights are detailed in the 3Q 2024 Data Center IT Semiconductors and Components Quarterly Report.
