This is all perfectly reasonable-sounding, but I've also never seen a period in history where the demand for faster, cheaper computing ever saturated. The next-gen processor is essentially always in high demand.
The way I'd put it is: demand for capability rarely saturates. But demand for a specific form of compute capital does. When efficiency and substitution outpace new monetization channels, the growth curve flattens even as capability keeps rising.
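A quick back-of-the-envelope sketch of that trade-off, with made-up numbers and a constant-elasticity demand curve assumed purely for illustration: whether cheaper compute flattens or grows total spend comes down to whether demand elasticity sits below or above 1, which is also the crux of the Jevons argument raised below.

```python
# Toy model (made-up numbers, constant-elasticity demand assumed):
# does a 10x drop in cost per unit of compute shrink or grow total spend?

def total_spend(cost_per_unit, elasticity, base_cost=1.0, base_units=1.0):
    # Constant-elasticity demand: units = base_units * (cost / base_cost) ** (-elasticity)
    units = base_units * (cost_per_unit / base_cost) ** (-elasticity)
    return units * cost_per_unit

for e in (0.5, 1.0, 1.5):
    print(f"elasticity={e}: spend goes from 1.00 to {total_spend(0.1, e):.2f}")

# elasticity=0.5 -> 0.32  (efficiency shrinks the market)
# elasticity=1.0 -> 1.00  (a wash)
# elasticity=1.5 -> 3.16  (Jevons: cheaper compute, bigger total bill)
```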
It’s a good analysis, and timely as Nvidia seems poised to gobble up the world Godzilla-style.
In the long term, I can see more efficient workloads and migration to the edge converging. Those two trends are reinforcing, and can combine to create space for other players. Apple and Samsung may be major beneficiaries.
But I believe strongly in Jevons paradox here; and I doubt overall demand slows down anytime soon.
And if you’re waist-deep in semiconductors, you see just how daunting it is to develop competing GPUs or ASICs. There’s so much technical risk there. As a competitor you gotta think, “Nvidia has solved issue ‘i’ and we haven’t… we need to make the right design decisions here… so challenging…” Now repeat for issues i, j, k, … ad infinitum. It’s not impossible, but it’s very, very daunting. The Groqs of the world may win a decent share… or they may chase for a decade and wind up nowhere.
I think Intel’s path is instructive here. They came to dominate their original home turf (PCs), and in fact they still do; their lead was too great to ever be truly displaced. But they couldn’t step out into new emerging fields. When the GPU gets replaced by something truly new, some other winner will likely emerge. But I’d wager my long position that Nvidia becomes a $10T enterprise by then.
Yes, Jevons paradox is underrated. As for non-Nvidia chips, Amazon just announced that half of its inference at its huge Indiana data center will run through its own proprietary chips, called Inferentia.
On Amazon’s chip (and others like it): I respect the effort. But that’s different from buying it.
Yeah it just seems like Nvidia's margin is everyone else's opportunity.
As long as people keep buying GPUs that’s true. But if demand for GPUs declines, for any of the reasons I outline in my post…look out below.
It amazes me how history rhymes so well.
It feels like 2005. Everything structurally was in place in 2005 for the financial crisis to occur. It wasn't until 2007 that real estate prices started to meaningfully plateau, which then caused the 2008 meltdown. But not only that: the 2008 financial crisis was as bad as it was because of the interrelatedness of the relevant parties. Derivatives don't reduce systemic risk, they add to it.
Of course, an industry built off of confidently, loudly lying would not bother to pay attention. Rhymes, rhymes, rhymes....
I see the rhyme you’re pointing to — reflexive optimism funding overbuild — but I think the structural analogy to 2005 is limited. The 2008 crisis was credit-driven and balance-sheet-linked; the GPU boom is equity-funded and deflationary. If this cycle breaks, it’ll look more like a telecom or DRAM bust than a mortgage meltdown.
Substitution is definitely the risk, but I think the infra developers are far more exposed to that than Nvidia. We talk about "1 GW" datacenters, but the power draw is a means to an end, and more efficient algos or chips could kick the stool out from under the GE Vernovas and Vertivs of the world. The only thing that is truly a threat to Nvidia is workloads shifting off the CUDA platform entirely, just as Intel has been gutshot by the transition away from x86 dominance.