Commoditization & Competition in AI
Declining token costs mean declining revenues, even with increasing numbers of users. LLM companies need to build ancillary revenue streams to hedge their risk.
Introduction
I keep getting questions about my claim that declining token costs for AI computation will lead to large language models (LLMs) becoming commoditized tech. An implication of this observation is that the purveyors of large language models, such as OpenAI, will have to develop value-added services, with sustainable pricing power, to sit on top of their LLMs. ChatGPT is one example of a value-added service.
So let me try to make this more concrete.
The cost of tokens for interacting with large language models such as GPT-4 is influenced by several factors: technological advances, scaling efficiencies, competitive market dynamics, and the evolving business models of the companies that develop and deploy these models. The general trend has been declining costs for accessing and using LLMs, for reasons the rest of this post lays out.
Given that token costs across LLMs are declining, the revenue that companies like OpenAI earn from selling computational resources to developers will also decline over time, even as the number of customers grows. “I’ll make it up on volume!” doesn’t work when prices fall faster than usage grows. These companies will therefore need to build value-added services, with sustainable pricing models, on top of their LLMs.
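To make the “volume doesn’t save you” arithmetic concrete, here is a minimal sketch with invented numbers (every figure below is hypothetical, chosen only to illustrate the mechanism): when per-token prices fall faster than token volume grows, total revenue shrinks even though usage keeps rising.

```python
# Hypothetical illustration: revenue = price per token x tokens sold.
# All numbers are invented; they exist only to show the arithmetic.

# Price falls ~50-60% per period; volume grows ~40-50% per period.
price_per_million_tokens = [10.00, 5.00, 2.00, 0.80]
tokens_sold_millions = [1_000, 1_500, 2_200, 3_200]

for period, (price, volume) in enumerate(
    zip(price_per_million_tokens, tokens_sold_millions), start=1
):
    revenue = price * volume
    print(
        f"period {period}: ${price:.2f}/M tokens x {volume}M tokens "
        f"= ${revenue:,.0f} revenue"
    )
```

In this sketch revenue falls every period despite volume more than tripling, because the price decline compounds faster than the usage growth. Reverse the relative rates and revenue grows instead; the claim in the post is that competitive commoditization keeps pushing the price curve down faster.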