
"Of course everyone in the AI race - Microsoft, Google, their giant rivals and their startup challengers - views AI as the industry's next big platform and believes it will keep getting more useful and lucrative. But if or when the current investing frenzy subsides, or even if there's a big AI bust, a company that built a giant data center for AI still has a giant data center. That's why no one in Silicon Valley is terribly worried about the risk of overbuilding."
"Remember that the Nvidia chips powering today's massive AI model training projects were, until the advent of ChatGPT, known as GPUs - graphics processing units. Originally, GPUs were for manipulating images, and that made them valuable for gaming rigs as well as professional workstations. In the 2010s, the crypto industry discovered that the same devices were the perfect tool for bitcoin mining and other blockchain-related computing needs."
Major technology companies view AI as the industry's next big platform and expect it to keep growing in usefulness and profitability. Firms are investing billions in data centers to train frontier models and to support the inference workloads expected to reshape business back ends and consumer services. Large AI server farms can be repurposed for other computing tasks, which mitigates the financial risk of overbuilding. GPUs themselves evolved from graphics rendering to cryptocurrency mining to AI training, illustrating the hardware's versatility. Historical overinvestment in fiber during the dotcom era eventually proved useful once demand caught up. Geopolitical competition with China is cited as an additional rationale for the investment.
Read at Axios