OpenAI to Buy 750MW of Compute From Nvidia Rival Cerebras in $10 Billion Deal

By Joan Aimuengheuwa
January 15, 2026 | DisruptiveTECH

OpenAI has locked in a massive new supply of computing power, agreeing to buy 750 megawatts of compute capacity over three years from US chipmaker Cerebras in a deal valued at more than $10 billion.

The agreement, which runs through 2028, is designed to speed up how ChatGPT and other OpenAI products respond to users, as the world’s largest technology firms compete ever more fiercely to control the infrastructure behind advanced models.

Cerebras will provide the capacity through its own cloud services, powered by its wafer-scale chips, with new data centres to be built or leased specifically for the contract.

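For scale, here is a rough back-of-envelope on the headline numbers (a sketch only, assuming the deal is worth exactly $10 billion and the full 750 megawatts is delivered; the article says only "more than" $10 billion):

```python
# Hypothetical back-of-envelope on the figures reported in the article.
deal_value_usd = 10e9   # assumed lower bound; reported as "more than $10 billion"
capacity_mw = 750       # contracted compute capacity
years = 3               # roughly three years, running through 2028

usd_per_mw = deal_value_usd / capacity_mw
usd_per_mw_per_year = usd_per_mw / years

print(f"~${usd_per_mw / 1e6:.1f}M per megawatt over the contract")   # ~$13.3M/MW
print(f"~${usd_per_mw_per_year / 1e6:.1f}M per megawatt per year")   # ~$4.4M/MW/yr
```
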
Seven hundred and fifty megawatts is not incremental capacity; it signals industrial-scale demand.

OpenAI is no longer thinking only about training models but also about inference: the everyday work of answering questions, reasoning through problems and serving millions of users at once.

“Integrating Cerebras into our mix of compute solutions is all about making our AI respond much faster,” OpenAI stated. 

Talks between the two companies began last August after Cerebras showed that OpenAI’s open-source models could run more efficiently on its chips than on standard graphics processors. 

Months of negotiations followed. The result is a structure where Cerebras owns and operates the hardware, while OpenAI pays to access it as a service. Capacity will be added in stages over the next few years.

Cerebras is preparing to return to public markets after withdrawing its previous listing attempt in late 2024. It now plans to re-file for an initial public offering in the second quarter of 2026. This OpenAI contract helps address a long-standing concern among investors: dependence on a single customer. 

In 2024, UAE-based technology group G42 accounted for nearly 87% of Cerebras’ revenue. A multi-billion-dollar US client changes that picture.

The deal also fits neatly into OpenAI’s bigger strategy. The company is laying the foundation for a potential initial public offering that could value it at around $1 trillion. 

Its chief executive, Sam Altman, has publicly committed $1.4 trillion to building 30 gigawatts of computing capacity, enough to power roughly 25 million homes in the United States. 

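As a quick sanity check on the "25 million homes" comparison (a sketch under an assumed average US household consumption of roughly 10,500 kWh per year, a figure not from the article):

```python
# Sanity-checking "30 GW is enough to power roughly 25 million US homes".
HOURS_PER_YEAR = 8760
avg_household_kwh_per_year = 10_500        # assumed US average annual consumption
avg_household_kw = avg_household_kwh_per_year / HOURS_PER_YEAR   # ~1.2 kW continuous draw

planned_capacity_kw = 30 * 1_000_000       # 30 gigawatts expressed in kilowatts

homes_equivalent = planned_capacity_kw / avg_household_kw
print(f"Average household draw: {avg_household_kw:.2f} kW")
print(f"Homes equivalent to 30 GW: {homes_equivalent / 1e6:.1f} million")   # ~25 million
```
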
Competition is fierce. Nvidia still tops the market for training chips, while Google and Microsoft are pushing their own custom hardware through their cloud platforms.

By turning to Cerebras, OpenAI is clearly trying to reduce reliance on a single supplier and gain an edge in speed, particularly for reasoning-heavy models.

Still, the spending spree is unsettling some observers, who warn that the surge in valuations and capital commitments across the sector echoes the excesses of the dot-com era.

The counter-argument is equally strong: demand for inference compute is exploding as artificial intelligence moves from labs into daily use.
