Converge Digest
Saturday, April 11, 2026
Cerebras sets a new record for largest AI model

June 22, 2022
in Semiconductors

Cerebras Systems announced the ability to train models with up to 20 billion parameters on a single CS-2 system.

The Cerebras WSE-2 is the largest processor ever built, boasting 2.55 trillion more transistors and 100 times as many compute cores as the largest GPU. 

By enabling a single CS-2 to train these models, Cerebras reduces the system engineering time needed to run large natural language processing (NLP) models from months to minutes. It also eliminates one of the most painful aspects of NLP: partitioning the model across hundreds or thousands of small graphics processing units (GPUs).
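The partitioning described above usually starts with slicing a network's layers into per-device shards. A minimal, purely illustrative Python sketch of the simplest contiguous scheme (the function name and layer counts are hypothetical, not Cerebras's or anyone's production method):

```python
def partition_layers(num_layers: int, num_gpus: int) -> list[range]:
    """Naive pipeline-style split: contiguous blocks of layers, one block per GPU."""
    base, extra = divmod(num_layers, num_gpus)
    shards, start = [], 0
    for gpu in range(num_gpus):
        size = base + (1 if gpu < extra else 0)  # spread any remainder over the first GPUs
        shards.append(range(start, start + size))
        start += size
    return shards

# A hypothetical 96-layer model spread over 48 GPUs: two layers per device,
# and every shard boundary becomes inter-GPU communication to engineer.
shards = partition_layers(96, 48)
print(len(shards), len(shards[0]))  # 48 2
```

Even this toy split hints at the engineering cost: load balancing, activation transfer at shard boundaries, and scheduling all have to be tuned per model, which is exactly the work a single large device avoids.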

“In NLP, bigger models are shown to be more accurate. But traditionally, only a very select few companies had the resources and expertise necessary to do the painstaking work of breaking up these large models and spreading them across hundreds or thousands of graphics processing units,” said Andrew Feldman, CEO and Co-Founder of Cerebras Systems. “As a result, only very few companies could train large NLP models – it was too expensive, time-consuming and inaccessible for the rest of the industry. Today we are proud to democratize access to GPT-3XL 1.3B, GPT-J 6B, GPT-3 13B and GPT-NeoX 20B, enabling the entire AI ecosystem to set up large models in minutes and train them on a single CS-2.”
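Why models of the sizes Feldman names overflow a single GPU follows from back-of-envelope arithmetic. A common mixed-precision rule of thumb (an assumption here, not a figure from Cerebras) is roughly 16 bytes per parameter: fp16 weights and gradients plus fp32 Adam optimizer state.

```python
def training_footprint_gb(params_billions: float, bytes_per_param: float = 16.0) -> float:
    """Rough training memory: weights + gradients + Adam state under mixed precision."""
    # 1e9 params per billion cancels against 1e9 bytes per GB
    return params_billions * bytes_per_param

for name, params in [("GPT-3XL", 1.3), ("GPT-J", 6.0), ("GPT-3 13B", 13.0), ("GPT-NeoX", 20.0)]:
    print(f"{name}: ~{training_footprint_gb(params):.0f} GB")
```

At 20 billion parameters this estimate lands around 320 GB just for model state, well beyond any single GPU's memory, which is why such models have traditionally been partitioned across many devices.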

Tags: AI, Cerebras, Silicon
Staff


© 2025 Converge Digest - A private dossier for networking and telecoms.
