OpenAI is finally opening a new chapter in its journey by designing its own in-house chip. The company, known for its widely used AI models, is now collaborating with Broadcom and Taiwan Semiconductor Manufacturing Company (TSMC) to manage costs and build a reliable, secure supply chain.
There had been speculation that OpenAI might build its own chip manufacturing foundries, but the high costs and long timelines led the company to focus on in-house chip design instead. Recently, OpenAI assembled a team of around 20 chip engineers, including experts who previously designed Google's Tensor Processing Units (TPUs).
The collaboration with TSMC comes at a crucial time, as many companies in the tech sector are working out how to source high-performance chips. OpenAI's custom chip design is also intended to give the company greater control over its hardware and to cut infrastructure costs.
The announcement of the in-house chip made a notable impact on the stock market, reflecting investors' confidence in OpenAI's ability to navigate the complexities of chip design while managing industry partnerships. Shares of its collaborators moved on the news, with Broadcom gaining around 4 percent and TSMC's US-listed shares rising about 1 percent.
OpenAI's current strategy is to secure a consistent chip supply and manage the ever-rising costs also faced by other tech giants such as Amazon, Google, and Meta. OpenAI has also recently decided to integrate AMD chips alongside Nvidia's GPUs. Although these chipmakers continue to dominate the market, their chips have been in limited supply. OpenAI's decision to source from multiple chipmakers while developing its own custom chip will have broader implications for the tech industry.
OpenAI is still deciding whether to develop or acquire other elements for its chip design. Several sources have stated that OpenAI aims to have its first custom-designed chip ready by 2026. The company's longer-term goal is to rely less on external suppliers and build its own infrastructure to power the next generation of AI technologies.
Reports also highlighted OpenAI's planned use of AMD chips through Microsoft Azure, detailing how AMD's new MI300X chips aim to capture a portion of the market dominated by Nvidia. AMD has projected USD 4.5 billion in AI chip sales for 2024, following the chip's launch in the fourth quarter of 2023.
Some sources indicate that OpenAI projects a USD 5 billion loss this year against USD 3.7 billion in revenue. The loss largely reflects the costs of hardware, electricity, and cloud services needed to process vast datasets and train models. Compute is expected to remain the company's largest expense, prompting initiatives to optimise resource use and diversify suppliers.
Reuters also reported that OpenAI has maintained positive relations with Nvidia, refraining from aggressively recruiting from the chip giant while continuing to use its high-performance GPUs.