Converge Digest

NVIDIA intros Mellanox 400G InfiniBand – the 7th generation

November 18, 2020

NVIDIA introduced the seventh generation of Mellanox InfiniBand delivering ultra-low latency and double data throughput with NDR 400Gb/s, as well as new NVIDIA In-Network Computing engines to provide additional acceleration.

NVIDIA Mellanox 400G InfiniBand offers 3x the switch port density and boosts AI acceleration power by 32x. It also increases aggregate bi-directional switch-system throughput 5x, to 1.64 petabits per second, enabling users to run larger workloads with fewer constraints.

Edge switches, based on the Mellanox InfiniBand architecture, carry an aggregated bi-directional throughput of 51.2Tb/s, with a landmark capacity of more than 66.5 billion packets per second. The modular switches, based on Mellanox InfiniBand, will carry up to an aggregated bi-directional throughput of 1.64 petabits per second, 5x higher than the last generation.
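As a rough sanity check, the quoted aggregates follow directly from the NDR link rate. The port counts below (64 for an edge switch, 2,048 for a modular chassis) are illustrative assumptions, not figures stated in the release, but they reproduce the quoted numbers:

```python
# Sanity-check the quoted InfiniBand throughput figures.
# Port counts (64 edge, 2048 modular) are assumptions for illustration;
# the release quotes only the aggregate throughput.

NDR_GBPS = 400  # NDR link speed, Gb/s per direction

def aggregate_tbps(ports: int, gbps: int = NDR_GBPS) -> float:
    """Aggregated bi-directional throughput in Tb/s."""
    return ports * gbps * 2 / 1_000

edge = aggregate_tbps(64)        # 64 * 400 * 2   = 51,200 Gb/s
modular = aggregate_tbps(2_048)  # 2048 * 400 * 2 = 1,638,400 Gb/s

print(f"edge switch:    {edge:.1f} Tb/s")             # 51.2 Tb/s
print(f"modular switch: {modular / 1_000:.2f} Pb/s")  # ~1.64 Pb/s
```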

NVIDIA said the world’s leading infrastructure manufacturers — including Atos, Dell Technologies, Fujitsu, GIGABYTE, Inspur, Lenovo and Supermicro — plan to integrate NVIDIA Mellanox 400G InfiniBand into their enterprise solutions and HPC offerings. These commitments are complemented by extensive support from leading storage infrastructure partners including DDN and IBM Storage, among others.

“The most important work of our customers is based on AI and increasingly complex applications that demand faster, smarter, more scalable networks,” said Gilad Shainer, senior vice president of networking at NVIDIA. “The NVIDIA Mellanox 400G InfiniBand’s massive throughput and smart acceleration engines let HPC, AI and hyperscale cloud infrastructures achieve unmatched performance with less cost and complexity.”

“Microsoft Azure’s partnership with NVIDIA Networking stems from our shared passion for helping scientists and researchers drive innovation and creativity through scalable HPC and AI. In HPC, Azure HBv2 VMs are the first to bring HDR InfiniBand to the cloud and achieve supercomputing scale and performance for MPI customer applications with demonstrated scaling to eclipse 80,000 cores for MPI HPC,” said Nidhi Chappell, head of product, Azure HPC and AI at Microsoft Corp. “In AI, to meet the high-ambition needs of AI innovation, the Azure NDv4 VMs also leverage HDR InfiniBand with 200Gb/s per GPU, a massive total of 1.6Tb/s of interconnect bandwidth per VM, and scale to thousands of GPUs under the same low-latency InfiniBand fabric to bring AI supercomputing to the masses. Microsoft applauds the continued innovation in NVIDIA’s Mellanox InfiniBand product line, and we look forward to continuing our strong partnership together.”
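The per-VM figure in the Azure quote is consistent with the per-GPU rate; the implied GPU count per NDv4 VM below is an inference from those two numbers, not something the quote states:

```python
# Check that the Azure NDv4 numbers quoted above are internally consistent.
# The quote gives 200 Gb/s per GPU and 1.6 Tb/s total per VM; the implied
# GPU count per VM is inferred here, not stated in the quote.
GBPS_PER_GPU = 200    # HDR InfiniBand per GPU (quoted)
VM_TOTAL_GBPS = 1600  # 1.6 Tb/s per VM (quoted), expressed in Gb/s

gpus_per_vm = VM_TOTAL_GBPS // GBPS_PER_GPU
print(gpus_per_vm)  # 8
```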

“High-performance interconnects are cornerstone technologies required for exascale and beyond. Los Alamos National Laboratory continues to be at the forefront of HPC networking technologies,” said Steve Poole, chief architect for next-generation platforms at Los Alamos National Laboratory. “The Lab will continue its relationship with NVIDIA, evaluating and analyzing its latest 400Gb/s technology aimed at solving the diverse workload requirements at Los Alamos.”

“Amid the new age of exascale computing, researchers and scientists are pushing the limits of applying mathematical modeling to quantum chemistry, molecular dynamics and civil safety,” said Professor Thomas Lippert, head of the Jülich Supercomputing Centre. “We are committed to leveraging the next generation of Mellanox InfiniBand to further our track record of building Europe’s leading, next-generation supercomputers.”

Source: https://nvidianews.nvidia.com/news/nvidia-announces-mellanox-infiniband-for-exascale-ai-supercomputing

Tags: Blueprint columns, InfiniBand, NVIDIA
Staff

Categories

  • 5G / 6G / Wi-Fi
  • AI Infrastructure
  • All
  • Automotive Networking
  • Blueprints
  • Clouds and Carriers
  • Data Centers
  • Enterprise
  • Explainer
  • Feature
  • Financials
  • Last Mile / Middle Mile
  • Legal / Regulatory
  • Optical
  • Quantum
  • Research
  • Security
  • Semiconductors
  • Space
  • Start-ups
  • Subsea
  • Sustainability
  • Video
  • Webinars

Archives

Tags

5G All AT&T Australia AWS Blueprint columns BroadbandWireless Broadcom China Ciena Cisco Data Centers Dell'Oro Ericsson FCC Financial Financials Huawei Infinera Intel Japan Juniper Last Mile Last Mille LTE Mergers and Acquisitions Mobile NFV Nokia Optical Packet Systems PacketVoice People Regulatory Satellite SDN Service Providers Silicon Silicon Valley StandardsWatch Storage TTP UK Verizon Wi-Fi
Converge Digest

A private dossier for networking and telecoms

Follow Us

  • Home
  • Events Calendar
  • Blueprint Guidelines
  • Privacy Policy
  • Subscribe to Daily Newsletter
  • NextGenInfra.io

© 2025 Converge Digest - A private dossier for networking and telecoms.
