Converge Digest

Microsoft Announces Microfluidic Cooling for AI Chips

September 23, 2025
in All

Microsoft researchers have demonstrated a new microfluidic cooling technology that channels liquid directly into silicon chips, removing heat up to three times more effectively than today’s cold plate systems. The approach etches hair-thin channels into the back of the chip, allowing coolant to flow directly onto the hotspots. Early lab tests showed a 65% reduction in maximum GPU temperature rise compared with cold plates.
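To put the reported 65% figure in concrete terms, the sketch below works through what that reduction means for peak die temperature. The absolute numbers (coolant inlet temperature, cold-plate temperature rise) are hypothetical illustrations, not values from Microsoft's tests; only the 65% reduction comes from the article.

```python
# Illustrative arithmetic for the reported 65% reduction in maximum
# GPU temperature rise. Inlet and cold-plate figures are assumed,
# not from Microsoft's published results.
coolant_inlet_c = 30.0      # assumed coolant inlet temperature (°C)
cold_plate_rise_c = 40.0    # assumed peak rise above inlet with a cold plate (°C)

# A 65% lower rise means the microfluidic design keeps 35% of it.
microfluidic_rise_c = cold_plate_rise_c * (1 - 0.65)

print(f"Cold plate peak:   {coolant_inlet_c + cold_plate_rise_c:.1f} °C")
print(f"Microfluidic peak: {coolant_inlet_c + microfluidic_rise_c:.1f} °C")
```

Under these assumed numbers, the peak die temperature drops from 70 °C to roughly 44 °C, which is the headroom that would make denser packing or higher clocks thinkable.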

The company combined this hardware innovation with AI algorithms that map heat signatures and direct coolant flow for greater efficiency. By integrating cooling at the silicon level, Microsoft believes microfluidics could enable denser datacenter designs, higher overclocking potential, and more sustainable operations. The technology also promises to improve power usage effectiveness (PUE), cut operational costs, and support future 3D chip architectures that would otherwise be constrained by heat.

Microsoft has collaborated with Swiss startup Corintis on bio-inspired microchannel designs and has produced four iterations of prototypes in the past year. The company is now working on packaging, etching, and manufacturing techniques to move toward production deployment. The breakthrough comes as Microsoft invests more than $30 billion in capital expenditures this quarter, including custom chips such as its Cobalt and Maia families.

• Removes heat up to 3x better than cold plates

• 65% lower maximum GPU temperature rise in lab tests

• Bio-inspired channel design resembles veins in leaves

• Potential to enable overclocking without thermal damage

• Supports higher server density and future 3D chip stacking

“Microfluidics would allow for more power-dense designs that will enable more features that customers care about and give better performance in a smaller amount of space,” said Judy Priest, corporate vice president and chief technical officer of Cloud Operations and Innovation at Microsoft.

🌐 Analysis: Thermal management has become one of the biggest constraints for AI infrastructure buildouts, with GPUs such as NVIDIA’s Blackwell and AMD’s MI300X consuming increasing amounts of power. Microsoft’s in-chip microfluidics effort reflects a wider industry trend as hyperscalers test advanced liquid cooling approaches, including immersion and rear-door heat exchangers. If successfully commercialized, microfluidics could reshape datacenter cooling strategies and accelerate the adoption of more compact, power-dense silicon designs.

Original article by Catherine Bolgar on Microsoft.com: https://news.microsoft.com/source/features/ai-innovation/microfluidics-cooling-at-the-micro-level-for-microsofts-datacenters


Jim Carroll

Editor and Publisher, Converge! Network Digest, Optical Networks Daily - Covering the full stack of network convergence from Silicon Valley

© 2025 Converge Digest - A private dossier for networking and telecoms.
