OpenAI Exploring Development of Its Own AI Chips
As the AI chip shortage intensifies, OpenAI may take chip design into its own hands. Sources say the startup, which drew an initial $1B in backing from Microsoft, has been exploring ways to secure more AI hardware for its growing models since 2021, and CEO Sam Altman has now made acquiring more chips a top priority.
Like its competitors, OpenAI depends on graphics processing units (GPUs), Nvidia's massively parallel chips, to train large models such as ChatGPT and to build what comes next. But the AI boom has strained suppliers, and insiders say GPUs are sold out until 2024. That GPU drought threatens to disrupt even Microsoft's advanced AI infrastructure.
The costs are skyrocketing too. One analysis estimated that running ChatGPT at one-tenth the scale of Google Search would require roughly $48B in GPUs up front, plus about $16B in chips annually. Faced with the GPU shortage and those costs, OpenAI is weighing options that range from acquiring a chip company to designing chips in-house.
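For a rough sense of where estimates like these come from, here is a back-of-envelope sketch in Python. Every input is an illustrative assumption, not a figure from the analysis cited above, and small changes to any of them swing the totals by billions.

    # Back-of-envelope sketch of a GPU cost estimate for a search-scale chatbot.
    # All inputs are hypothetical placeholders, not figures from the cited analysis.
    queries_per_day = 850_000_000      # assumed: about one-tenth of Google Search's daily query volume
    responses_per_gpu_per_day = 600    # assumed: sustained chatbot responses one data-center GPU can serve daily
    gpu_unit_cost_usd = 25_000         # assumed: purchase price of a single data-center GPU
    annual_chip_spend_fraction = 0.33  # assumed: yearly chip spend relative to the initial fleet

    gpu_fleet = queries_per_day / responses_per_gpu_per_day
    upfront_cost = gpu_fleet * gpu_unit_cost_usd
    annual_cost = upfront_cost * annual_chip_spend_fraction

    print(f"GPU fleet size:    {gpu_fleet:,.0f}")
    print(f"Up-front GPU cost: ${upfront_cost / 1e9:.1f}B")
    print(f"Annual chip cost:  ${annual_cost / 1e9:.1f}B")

The point is less the exact output than the structure: fleet size scales linearly with query volume and inversely with per-GPU throughput, so serving costs balloon quickly at search scale.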
Google and Amazon have already pioneered this path with custom silicon, and Microsoft is reportedly collaborating with AMD on a custom chip. OpenAI now appears ready to join the chip race, armed with roughly $11B in funding and annual revenue nearing $1B. A potential $90B valuation also signals that investors may back chip R&D costing hundreds of millions of dollars.
Custom AI hardware carries big risks, however, as chipmakers like Graphcore and Habana Labs have learned through layoffs, falling revenue and failed deals. Even Meta has struggled with its experimental designs.
An OpenAI chip could take years and a fortune to develop, but seriously tackling the company's GPU dependence may be the only long-term solution. Investors will have to stomach what could be a very expensive gamble.