IP Infusion has released OcNOS Data Center (OcNOS-DC) version 6.6.1, the latest release of its open networking software tailored for AI and machine learning workloads in modern data centers. The software runs on Broadcom’s Tomahawk 5 ASIC and powers 51.2 Tbps white-box switches from Edgecore (AS9817-64D) and UfiSpace (S9321-64E), providing low-latency, lossless Ethernet fabrics built specifically for distributed GPU clusters.
OcNOS-DC 6.6.1 addresses critical requirements of AI/ML environments: high-throughput east-west traffic, tight latency constraints, and zero-packet-loss performance. Key features include Layer 3 Priority-based Flow Control (PFC), Enhanced Transmission Selection (ETS), DCBX, and Dynamic Load Balancing (DLB). For orchestration, the system exports gNMI telemetry for real-time network awareness, integrates with Kubernetes for job reallocation, and works with Ansible to automate queue and traffic-class configuration.
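The Ansible automation mentioned above can be sketched as a minimal playbook. Note that the inventory group, interface name, and QoS CLI lines below are illustrative assumptions, not documented OcNOS-DC commands; `ansible.netcommon.cli_config` is a standard Ansible module for pushing configuration to devices over a `network_cli` connection.

```yaml
# Hypothetical sketch: push PFC/QoS queue settings to OcNOS-DC leaf switches.
# Host group, interface name, and CLI syntax are assumptions for illustration
# only, not verified OcNOS-DC commands.
- name: Configure lossless queues on AI fabric leaf switches
  hosts: ocnos_leaf_switches          # assumed inventory group
  gather_facts: false
  connection: ansible.netcommon.network_cli
  tasks:
    - name: Apply priority-flow-control and traffic-class settings
      ansible.netcommon.cli_config:
        config: |
          interface ce0
           priority-flow-control mode on
           qos trust dscp
```

In practice an operator would use the vendor's own Ansible collection where available; the point is that queue and traffic-class settings become version-controlled, repeatable configuration rather than per-switch manual CLI work.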
By promoting a disaggregated architecture, IP Infusion enables hyperscalers to scale their AI infrastructure with vendor-agnostic optics, flexible hardware sourcing, and cost transparency. The collaboration with HYPER SCALERS highlights the platform’s viability in real-world deployments where lossless Ethernet is essential for GPU performance optimization.
- OcNOS-DC 6.6.1 targets AI/ML workloads using Broadcom Tomahawk 5 and 51.2 Tbps white-box switches
- Supports Layer 3 PFC, ETS, DCBX, and DLB for lossless, low-latency fabrics
- Integrates with Kubernetes and Ansible for dynamic orchestration and automated traffic optimization
- Enables cost-effective scaling via disaggregated, open networking
- Used in production by HYPER SCALERS for GPU-intensive AI services
“OcNOS Data Center sets a new standard for AI/ML networking, delivering unparalleled performance and scalability for next-generation data centers,” said Kiyo Oishi, CEO of IP Infusion.
