OpenAI's $1.4 trillion question hangs over the AI industry, feeding investor fears of a bubble in the artificial intelligence boom. The company behind ChatGPT needs enormous amounts of computing power to train its models and generate responses – a staggering expense that dwarfs its annual revenue of $13 billion.
The cost of OpenAI's compute commitments – the chips and servers that power its chatbot – comes to $1.4 trillion over the next eight years. The chasm between that figure and OpenAI's current financials has investors asking how the company can possibly afford it. A convincing answer would go a long way towards calming market nerves about AI spending.
OpenAI's CEO, Sam Altman, attempted to address these concerns in an awkward exchange with leading investor Brad Gerstner of Altimeter Capital, who described the company's ability to pay for more than $1 trillion in compute as "a question hanging over the market". Altman responded that OpenAI was doing well and expressed confidence in its revenue growth.
There are also questions over the circular nature of some of OpenAI's compute deals. Oracle, for instance, will spend $300 billion building new datacentres for OpenAI in Texas, New Mexico, Michigan, and Wisconsin – and OpenAI will then pay roughly the same amount to use those datacentres. Such transactions have raised eyebrows among investors and analysts.
Benedict Evans, a tech analyst, notes that OpenAI is trying to match the spending of the other big AI players – Mark Zuckerberg's Meta, Google, and Microsoft – which are supported by already profitable business models. OpenAI has no such cushion, which raises concerns about its ability to generate enough revenue to cover its costs.
One Silicon Valley investor, who wished to remain anonymous, says OpenAI can build on its popularity, but its success is contingent on the models improving, the cost of operating them falling, and the chips that power them becoming cheaper. The open questions are at what scale OpenAI can build out these products and revenue models, and how good its models can get.
Despite being loss-making, OpenAI remains optimistic about its prospects. Altman believes revenue will come from several sources: growing demand for paid-for versions of ChatGPT, other companies using its datacentres, sales of the hardware devices it is building, and the "huge value" created by AI's achievements in scientific research.
Not everyone shares this optimism. Carl Benedikt Frey, an associate professor of AI and work at Oxford University, points to recent evidence of a slowdown in AI adoption in the US: the US Census Bureau reports that adoption has been declining among companies with more than 250 employees. Without new breakthroughs, Frey does not see OpenAI reaching $100 billion in revenue by 2027 – a figure Altman has hinted at.
Ultimately, the bet is that demand and ever-better iterations of OpenAI's products will eventually cover its enormous compute costs. Only time will tell whether the gamble pays off for the company or leaves investors with egg on their faces.