
OpenAI Unhappy with Nvidia Chip Speed: Cerebras Deal

According to a Reuters report, OpenAI is dissatisfied with the speed of some of Nvidia's AI chips and is in talks with alternative manufacturers such as Cerebras to cover part of its needs.


Criticism of Nvidia's Inference Chips

It has been reported that OpenAI, a leading company in the artificial intelligence field, is dissatisfied with the speed of some AI chips produced by Nvidia. According to a Reuters report citing eight sources, the criticism focuses on "inference" chips, which enable trained models to respond to user queries.

The sources indicate that Nvidia hardware's response generation speed fails to meet expectations, especially in applications where speed is critical, such as software development. Consequently, OpenAI has begun searching for new hardware for approximately ten percent of its future inference workload.

Search for a Different Design for Speed

Inference requires more memory access than training. Nvidia GPUs rely on external memory, which can limit processing speed. As a solution, OpenAI is reportedly turning to chip designs with SRAM embedded directly on the processor, which offers faster access.
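To illustrate why memory placement matters here, a rough back-of-envelope sketch (not from the report) helps: during token-by-token generation, a model's weights must be streamed through the processor for every new token, so throughput is roughly bounded by memory bandwidth divided by the bytes read per token. The model size, precision, and bandwidth figures below are illustrative assumptions, not measured values for any specific chip.

```python
# Back-of-envelope sketch: memory-bandwidth-bound decoding throughput.
# All figures are illustrative assumptions, not vendor specifications.

def decode_tokens_per_second(params_billions: float,
                             bytes_per_param: float,
                             bandwidth_tb_s: float) -> float:
    """Rough upper bound on tokens/s when each token requires reading all weights."""
    bytes_per_token = params_billions * 1e9 * bytes_per_param  # weights streamed once per token
    bandwidth_bytes_s = bandwidth_tb_s * 1e12
    return bandwidth_bytes_s / bytes_per_token

# Hypothetical 70B-parameter model stored at 1 byte per parameter (8-bit).
external_memory = decode_tokens_per_second(70, 1.0, 3.0)     # ~3 TB/s external memory (assumed)
on_chip_sram    = decode_tokens_per_second(70, 1.0, 1000.0)  # ~1 PB/s on-chip SRAM (assumed)

print(f"External-memory bound: ~{external_memory:.0f} tokens/s per sequence")
print(f"On-chip-SRAM bound:    ~{on_chip_sram:.0f} tokens/s per sequence")
```

Under these assumed numbers, the on-chip design is bandwidth-limited at a far higher token rate, which is the kind of gap that matters for latency-sensitive workloads such as coding assistants.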

In this context, it has been learned that the company has been in talks for several months with next-generation chip manufacturers like Cerebras and Groq. It was reported that an agreement was reached with Cerebras, and CEO Sam Altman confirmed last month that this agreement aims to meet the speed requirements needed for coding models. Talks with Groq ended when Nvidia signed a $20 billion licensing agreement with the startup.

$100 Billion Investment Delayed

Last September, it was announced that Nvidia planned to invest up to $100 billion in OpenAI. However, while the deal was expected to be finalized within weeks, negotiations are said to have been ongoing for months. One source suggested that OpenAI's changing product roadmap has slowed the progress of the talks.

Last Saturday, Nvidia CEO Jensen Huang dismissed reports of tension between the two companies as "nonsense" and emphasized that Nvidia still plans to invest tens of billions of dollars. An OpenAI spokesperson stated that the company continues to rely on Nvidia for the vast majority of its inference fleet.

Such moves around AI infrastructure show that competition in the sector is shaped not only by software but also by hardware and processing power. Even as sectors such as healthcare undergo an AI transformation with platforms like Amazon Bedrock, developments in the chip technologies that provide the underlying infrastructure remain critically important.

AI-Assisted Content
Sources: the-decoder.com
