
Sorry Nvidia, TPUs from Google now power part of OpenAI’s ChatGPT workloads


  • OpenAI adds Google TPUs to reduce dependence on Nvidia GPUs
  • TPU adoption highlights OpenAI’s push to diversify compute options
  • Google Cloud wins OpenAI as customer despite competitive dynamics

OpenAI has reportedly begun using Google’s tensor processing units (TPUs) to power ChatGPT and other products.

A Reuters report, citing a source familiar with the move, notes this is OpenAI's first major shift away from Nvidia hardware, which has so far formed the backbone of the company's compute stack.
