Converge Digest

ZEDEDA and NVIDIA Boost Enterprise Edge AI Orchestration

March 12, 2025
in Enterprise

ZEDEDA has expanded its support for NVIDIA’s edge AI platform, deepening its integration with NVIDIA Jetson systems, the NGC catalog, and the TAO Toolkit. The enhancement lets enterprises deploy, secure, and manage AI models at the edge while addressing challenges such as resource constraints, heterogeneous hardware, and intermittent connectivity. By leveraging ZEDEDA’s edge-first orchestration platform, organizations can optimize AI models with NVIDIA’s TAO Toolkit and deploy them efficiently across distributed environments without increasing operational complexity.

The platform enhancements include direct CLI integration with the NVIDIA NGC catalog, enabling seamless AI model deployment to edge nodes, and native support for NVIDIA Jetson GPUs, including Orin NX and AGX, for optimized AI performance. Automated edge management allows organizations to scale AI deployments across tens of thousands of nodes, while integrated observability tools using Grafana and Prometheus provide real-time visibility into AI performance. ZEDEDA’s zero-trust security model ensures that AI workloads are deployed with enterprise-grade protection, making it easier for industries such as manufacturing, energy, retail, transportation, and robotics to integrate AI into their operations.

Key Benefits cited by ZEDEDA

• Seamless AI Model Deployment – Direct CLI integration with NVIDIA’s NGC catalog simplifies edge AI deployment.

• Optimized Edge AI Performance – Native support for NVIDIA Jetson GPUs improves efficiency and inference speed.

• Automated Edge Management – Zero-touch deployment scales AI workloads across thousands of edge nodes.

• Advanced AI Observability – Integrated Grafana and Prometheus dashboards provide real-time monitoring and analytics.

• Enterprise-Grade Security – Zero-trust security framework ensures AI models are deployed safely across distributed environments.

“By expanding our integration with NVIDIA’s AI platform, we’re enabling enterprises to innovate faster with simplified AI deployment workflows, reduce risk through zero-touch management and security, and lower costs by automating manual processes at the edge,” said Said Ouissal, CEO and founder of ZEDEDA.

  • ZEDEDA, founded in 2016 and headquartered in San Jose, California, specializes in cloud-native edge virtualization platforms designed to monitor, visualize, and secure real-time edge applications. The company was co-founded by CEO Said Ouissal, who brings extensive experience from roles at Juniper Networks, Ericsson, and Redback Networks, and CTO Erik Nordmark, a seasoned technologist in networking and distributed systems with prior roles at Cisco and Sun Microsystems.
  • In February 2024, ZEDEDA secured $72 million in a Series C funding round led by Smith Point Capital, bringing its total funding to over $140 million following additional investments later in 2024. This investment supports global expansion, including a new Middle East headquarters in Abu Dhabi, and addresses the rising demand for edge computing solutions. In 2024, ZEDEDA doubled its annual revenue and edge nodes under management, serving 12 Fortune Global 500 companies by mid-year. Notable partnerships include a collaboration with VMware to simplify edge computing deployments through an OEM agreement, and with Edge Impulse to enhance AI model development and deployment at the edge.
Tags: ZEDEDA

Jim Carroll

Editor and Publisher, Converge! Network Digest, Optical Networks Daily - Covering the full stack of network convergence from Silicon Valley

© 2025 Converge Digest - A private dossier for networking and telecoms.