OpenAI Shifts to AMD Chips, Eyes Custom AI Hardware by 2026

Sophia Steele

October 29, 2024 · 2 min read

In a significant move, OpenAI, the artificial intelligence research organization behind ChatGPT, is shifting its hardware strategy to incorporate AMD chips into its Microsoft Azure setup. According to Reuters, the change is part of an updated hardware plan intended to support the company's large-scale AI inference workloads.

The move marks a departure from OpenAI's previous reliance on Nvidia chips. The company is also reportedly working with Broadcom to develop custom silicon designed specifically for its AI workloads, and has assembled a team of around 20 people, including lead engineers from Google's Tensor Processing Unit (TPU) program, to drive the effort. Production of these custom chips is not expected to begin until 2026.

This development puts OpenAI on a similar path to other tech giants like Microsoft, Meta, Google, and Amazon, which are also investing in custom AI chip designs to manage costs and secure access to AI server hardware. Although OpenAI may be playing catch-up, the move signals a substantial investment in its AI infrastructure and a commitment to staying competitive in the rapidly evolving AI landscape.

The adoption of its chips is a notable win for AMD, whose data center business has doubled in a single year. OpenAI's decision to run AMD hardware through its Azure setup is likely to boost AMD's market share and challenge Nvidia's dominance in the AI chip market.

As the AI chip race heats up, OpenAI's shift stands out in the tech and startup community, underscoring the role of custom hardware designs in driving AI innovation and competitiveness.
