
Edgecore Launches Nexvec AI Infrastructure Platform

Edgecore Networks introduced Nexvec, its solution for Enterprise AI, combining disaggregated networking with software-defined composable infrastructure to address the performance and resource challenges of AI workloads. Targeted at inference, agentic AI, and reasoning tasks, Nexvec enables GPU and memory pooling over PCIe and CXL fabrics, reducing the power and space overhead of traditional high-end server clusters. The solution supports orchestration frameworks such as VMware, SLURM, and Kubernetes.

Nexvec integrates Liqid’s Matrix software to dynamically allocate GPUs, memory, and storage across workloads, while leveraging Edgecore’s open Ethernet switch portfolio. The solution supports both scale-up and scale-out AI architectures with Broadcom Tomahawk and Jericho chipsets and includes the new Nous fabric controller for full lifecycle automation. Edgecore also emphasized its support for SONiC and third-party NOS integration through a certification program.

Limited availability for Nexvec begins immediately, with general availability expected by year-end. Edgecore’s approach builds on its long-standing open networking strategy while expanding into full-stack AI infrastructure for enterprise environments.

Key Points:

“We’re moving beyond networking to deliver a full-stack solution, integrating disaggregated networking and composable compute to simplify Enterprise AI adoption,” said Jun Shi, CEO of Accton Technology Group.
