Converge Digest

Ampere targets 256-core CPU

Ampere Computing has released its annual update, showcasing the company’s commitment to sustainable, power-efficient computing for the Cloud and AI sectors. The announcement included a collaboration with Qualcomm Technologies, Inc. to develop a joint AI inferencing solution, pairing Qualcomm’s high-performance, low-power Cloud AI 100 inference accelerators with Ampere’s CPUs to deliver robust AI inferencing capabilities.

Renee James, CEO of Ampere, emphasized the growing importance of Ampere’s silicon design, particularly in addressing the increasing power demands of AI. She highlighted that Ampere’s approach has successfully combined low power consumption with high performance, challenging the traditional notion that low power equates to low performance. James pointed out that as AI continues to advance rapidly, the energy consumption of data centers becomes a critical issue, and Ampere’s solutions are designed to address this by enhancing efficiency and performance without compromising sustainability.

Jeff Wittich, Chief Product Officer at Ampere, outlined the company’s vision for “AI Compute,” which integrates traditional cloud-native capabilities with AI. He noted that Ampere CPUs are versatile enough to handle a range of workloads, from data processing and web serving to media delivery and AI applications. Both James and Wittich introduced the upcoming AmpereOne® platform, featuring a 256-core CPU with 12 memory channels, built on the N3 process node, which promises a significant boost in performance.

