Great piece, Dave. One near-term lever you didn’t mention: wide-band-gap silicon-carbide (SiC) power electronics.
• ~30-40% of a rack’s load is overhead: AC-DC rectifiers, DC-DC converters, pumps, fans, and UPS. Swapping legacy silicon switches for SiC lifts those stages from ~96% to ~99% efficiency, eliminating 70-100 W of waste heat for every 10 kW GPU node and cutting site PUE by ~8-12%, with a <18-month payback.
• SiC ceramics as heat-spreaders/lids offer ~2× the thermal conductivity of today’s AlN, dropping GPU junction temps by 3-5 °C. Free headroom for those 1 kW Blackwells.
• It’s shipping (automotive volumes today, datacenter retrofit kits rolling out this year), so operators can buy time while photonics, analog in-memory, and neuromorphic chips mature.
SiC won’t replace Nvidia’s silicon logic, but it shrinks the thermal tax that threatens to stall AI build-outs right now: a practical bridge between overheated GPUs and the “post-GPU” future you outline. (Rough math behind those numbers is in the sketch below.)
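For anyone who wants to sanity-check the figures above, here’s a minimal Python sketch. The overhead fraction, die size, lid thickness, and conductivity values are illustrative assumptions of mine, not vendor data, so treat the output as order-of-magnitude only.

```python
# Back-of-the-envelope check of the SiC claims above. All constants are
# illustrative assumptions: the 70-100 W saving is assumed to apply to
# the ~25-35% of node power flowing through the upgraded conversion
# stages, and the lid is modeled as 1-D conduction through a 2 mm slab
# over a 40 mm x 40 mm die footprint.

NODE_W = 10_000                 # 10 kW GPU node
SI_EFF, SIC_EFF = 0.96, 0.99    # per-stage conversion efficiency

def conversion_loss_w(p_out_w: float, eff: float) -> float:
    """Heat dissipated while delivering p_out_w through a converter at eff."""
    return p_out_w * (1.0 / eff - 1.0)

for frac in (0.25, 0.35):
    p_stage = NODE_W * frac
    saved = conversion_loss_w(p_stage, SI_EFF) - conversion_loss_w(p_stage, SIC_EFF)
    print(f"{frac:.0%} of load through SiC stages -> ~{saved:.0f} W less waste heat")
# Prints ~79 W and ~110 W: the same ballpark as the 70-100 W claim.

# Lid check: temperature drop across a uniform slab is dT = P * t / (k * A).
P_DIE_W = 1_000                 # ~1 kW Blackwell-class package
THICK_M = 0.002                 # 2 mm lid
AREA_M2 = 0.04 * 0.04           # 40 mm x 40 mm die footprint
for material, k_w_mk in (("AlN", 180.0), ("SiC", 370.0)):
    dt = P_DIE_W * THICK_M / (k_w_mk * AREA_M2)
    print(f"{material} lid: ~{dt:.1f} C drop across the spreader")
# Prints ~6.9 C vs ~3.4 C: roughly the quoted 3-5 C junction improvement.
```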
Thanks. Super interesting comment. I actually had not heard of wide-band-gap SiC before. I will have to dig into it. Appreciate the information!
Another one is gallium nitride (GaN), which is even better; as you may have seen, NVIDIA is looking to test some GaN components with Navitas.
Love the analysis.
In addition, it will be interesting to see how the cost curve changes. Right now I feel the software layer has little to no margin to make money, as all the profits are with Nvidia. That has to shift a bit too. Sharing my articles on the economics of it, primarily saddled by GPU costs:
Tokenomics
https://open.substack.com/pub/pramodhmallipatna/p/the-token-economy
Private Model Economics
https://open.substack.com/pub/pramodhmallipatna/p/private-model-economics-for-enterprise
Agent Economics
https://open.substack.com/pub/pramodhmallipatna/p/the-economics-of-ai-agents-a-high
Forced Physics DCT has a simple solution.
Nvidia needs to go quantum, dude 😎