Nvidia shares sold off sharply this spring, fueled by increased worry that spending on artificial intelligence infrastructure, such as Nvidia’s semiconductor chips, was peaking.
The argument was that hyperscalers operating massive cloud data centers were overbuilding capacity and double-ordering chips, setting the stage for a demand reckoning that would cause Nvidia’s sales of increasingly faster and pricier graphics processing units, or GPUs, like its latest Blackwell lineup, to swoon.
However, hyperscalers appear comfortable with their spending and are ready to increase it further.
On July 30, Microsoft, along with Meta Platforms, parent of Facebook and Instagram, offered capital expenditure guidance and comments on their AI buildout, reinforcing the idea that Nvidia’s best days aren’t behind it.
AI chatbots and agentic AIs send Nvidia sales surging
The launch of OpenAI’s ChatGPT in late 2022 was an eye-opener. The large language model’s ability to crunch and parse data more efficiently than traditional search led to it becoming the fastest app to reach one million users, kicking off a firestorm of chatbot research and development.
Recognizing that its long-time dominance in traditional search might be at risk, Google quickly responded with Gemini, a chatbot easily accessible for free on Google’s home page.
Related: Stocks respond to Fed cut decision, Meta Platforms & Microsoft earnings
Microsoft poured billions of dollars into a partnership deal with OpenAI to leverage ChatGPT, and eventually integrated its Copilot AI throughout its popular Office 365 platform.
Meta Platforms launched an open-source chatbot, Llama, and Amazon invested billions of dollars into Anthropic, which rolled out the Claude chatbot.
Eventually, enterprises got in on the act, too, developing their own AI programs to support their businesses' specific needs.
Banks use agentic AI, or AI agents that can assist or replace traditional workers, to hedge portfolio and loan risks. Manufacturers are using it to improve quality. Retailers are experimenting with AI in supply chains. And health care companies are investigating its use in drug development and treatment.
The pick-and-shovel sellers of semiconductor chips, memory, and servers have powered all the activity behind the scenes.
Companies such as Super Micro have seen sales of liquid-cooled server racks surge, while memory companies like Micron have rolled out bigger and faster DRAM solutions suited to AI.
However, Nvidia has been by far the biggest beneficiary of the surge in spending to build new data centers and upgrade existing ones.
Nvidia’s latest Blackwell GPUs can cost $30,000 to $40,000 each, and fully equipped server racks can fetch millions. When coupled with Nvidia's CUDA software, these GPUs are faster and better suited to crunching massive workloads associated with AI than CPUs traditionally powering servers in data centers.
More Nvidia:
- Fund manager who predicted Nvidia stock rally reboots forecast on China
- Major analyst revamps Nvidia stock price target after China surprise
- Nvidia CEO hits Warren Buffett milestone
As a result, Nvidia's revenue has skyrocketed to $130 billion from about $27 billion in 2022, and thanks to juicy margins, net income has similarly soared to $73 billion last fiscal year from $9.8 billion in 2022.
Unsurprisingly, investors have rushed to buy Nvidia shares, sending its stock price 1,080% higher over the same period.
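For a rough sense of scale, here is a quick back-of-the-envelope check of those growth multiples and the implied profit margin, using the rounded figures cited above (a sketch, not exact filing data):

```python
# Back-of-the-envelope check using the rounded figures cited above (billions of USD)
revenue_2022, revenue_latest = 27, 130
net_income_2022, net_income_latest = 9.8, 73

print(f"Revenue multiple:    {revenue_latest / revenue_2022:.1f}x")        # roughly 4.8x
print(f"Net income multiple: {net_income_latest / net_income_2022:.1f}x")  # roughly 7.4x
print(f"Implied net margin:  {net_income_latest / revenue_latest:.0%}")    # roughly 56%
```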
Given the massive run-up, perhaps it wasn’t shocking when many worried earlier this year that Nvidia’s growth was peaking and set to fall.
AI activity shifts from training chatbots and AI agents to using them
Training AI will remain a key cog supporting the need for chips and servers. Still, inference, or using existing AI models, will be a much bigger opportunity, particularly due to the rise of real-time AI applications, such as autonomous driving, which requires continuous AI activity.
Related: Leaked data shows Nvidia taking page from Zuckerberg's playbook
“Our analysis suggests that demand for AI-ready data center capacity will rise at an average rate of 33 percent a year between 2023 and 2030 in a midrange scenario,” wrote McKinsey, a global consulting firm, in 2024.
In short, AI inference will keep the gas pedal down on network infrastructure spending, supporting the need for ever more power.
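To put that 33% figure in context, a simple compounding sketch (my own arithmetic applied to McKinsey's midrange rate, not McKinsey's model) shows what it implies over the 2023-2030 window:

```python
# Compound a 33%-per-year growth rate over 2023-2030 (seven years of growth)
growth_rate = 0.33
years = 2030 - 2023

multiple = (1 + growth_rate) ** years
print(f"Implied increase in AI-ready data center demand by 2030: {multiple:.1f}x")  # about 7.4x
```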
Cloud service providers and hyperscalers, including Meta Platforms and Microsoft Azure, will spend the most on this buildout.
Cloud service providers account for over half of the planet’s AI-ready data center capacity. They’re already investing a lot in new and expanded data centers, but “This new capacity is not likely to keep pace with demand,” according to McKinsey's report.
As AI activity accelerates, so does the size of these centers and the power necessary to operate them. It wasn’t too long ago that 30-megawatt data centers were common. Now, “a 200-MW facility is considered normal,” said McKinsey's analysts.
And the size and scope of these data centers are likely to get eye-poppingly larger.
Meta Platforms, Microsoft point to more, not less, AI spending
It’s hard for big companies to generate significant revenue and profit growth because of the law of large numbers. Yet the latest results from Meta Platforms and Microsoft show top-line sales growth accelerating rather than slowing, thanks to AI-driven demand.
At Meta Platforms, revenue of $47.5 billion grew 22% year over year, up from 16% in the first quarter. Microsoft’s revenue of $76.4 billion increased by 18%, up from 13% in the prior quarter and 15% in the same quarter last year.
The companies don't see demand letting up and indicated that, if anything, they'll remain capacity constrained, leading to more spending on things like Nvidia’s chips.
Meta Platforms bets on superintelligence with massive data centers, Prometheus and Hyperion
Meta Platforms CEO Mark Zuckerberg spent time on the earnings conference call discussing the emergence of what he calls superintelligence, announcing a new group within the company, Meta Superintelligence Labs, that will work on the next generation of AI solutions.
Related: Meta delivers eye-popping AI announcement
“Our Prometheus cluster is coming online next year, and we think it's going to be the world's first gigawatt-plus cluster. We're also building out Hyperion, which will be able to scale up to 5 gigawatts over several years,” said Zuckerberg on Meta Platforms' earnings conference call.
The low end of this year’s capital expenditures guidance was raised to $66 billion from $64 billion. At the midpoint of the $66 billion to $72 billion range, Meta Platforms’ spending will increase by roughly $30 billion year over year.
Looking ahead to next year, Meta Platforms CFO Susan Li said, “We currently expect another year of similarly significant CapEx dollar growth in 2026.” If so, 2026 spending could top $85 billion.
“It [Meta Platforms] doubled its capital spending levels year over year to $17 billion, bringing that spending to just over $29 billion for the first half of 2025,” wrote portfolio manager Chris Versace on TheStreet Pro. “While Meta narrowed its 2025 capital spending forecast to $66 billion to $72 billion, up from $39.2 billion in 2024, that implies larger spending in the second half of 2025.”
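Checking that math with the rounded figures cited above (a rough sketch, not Meta's own guidance model):

```python
# Meta Platforms capex figures cited above (billions of USD)
capex_2024 = 39.2
guidance_low, guidance_high = 66, 72

midpoint_2025 = (guidance_low + guidance_high) / 2      # $69B
yoy_increase = midpoint_2025 - capex_2024               # roughly $30B
repeat_growth_2026 = midpoint_2025 + yoy_increase       # 2026 level if dollar growth repeats

print(f"2025 midpoint:                 ${midpoint_2025:.0f}B")
print(f"Implied year-over-year jump:   ${yoy_increase:.1f}B")
print(f"2026 if dollar growth repeats: ${repeat_growth_2026:.1f}B, comfortably above $85B")
```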
Microsoft plows big money into Azure to catch up to AI demand
Meanwhile, Microsoft CEO Satya Nadella isn’t resting on the company's laurels, either. Revenue at Azure, its cloud business, surged 39% year over year last quarter, beating Wall Street’s 34% growth estimate, and eclipsed $75 billion for the full fiscal year.
“We stood up more than 2 gigawatts of new capacity over the past 12 months alone. And we continue to scale our own data center capacity faster than any other competitor,” said Nadella.
Related: Microsoft analysts reboot stock price targets ahead of Q4 earnings
Microsoft’s capital expenditures totaled $24.2 billion last quarter, its fiscal fourth quarter.
“We will continue to invest against the expansive opportunity ahead across both capital expenditures and operating expenses given our leadership position in commercial cloud, strong demand signals for our cloud and AI offerings, and significant contracted backlog,” said Microsoft CFO Amy Hood on the conference call.
The pace of growth may slow, but total dollars spent are expected to increase, translating into more purchases of Nvidia’s next-generation GPUs.
“Capital expenditure growth, as we shared last quarter, will moderate compared to FY '25 with a greater mix of short-lived assets,” said Hood. “We expect [fiscal] Q1 capital expenditures to be over $30 billion.”
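A quick check of what that guidance implies sequentially, using the rounded figures above (annual growth may moderate, but the quarterly dollar amount still steps up):

```python
# Microsoft capex figures cited above (billions of USD)
capex_fiscal_q4 = 24.2        # latest reported quarter
capex_fiscal_q1_guide = 30.0  # "over $30 billion" guided for fiscal Q1

sequential_increase = capex_fiscal_q1_guide / capex_fiscal_q4 - 1
print(f"Implied quarter-over-quarter increase: at least {sequential_increase:.0%}")  # about 24%
```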
That spending is geared toward meeting an enormous backlog of customer demand, which has Microsoft still playing catch-up on supply.
“We have $368 billion of contracted backlog we need to deliver, not just across Azure but across the breadth of the Microsoft Cloud,” said Hood. “Our investments, particularly in short-lived assets like servers, GPUs, CPUs, networking storage, is just really correlated to the backlog we see and the curve of demand… I thought we'd be in better supply-demand shape by June. And now I'm saying I hope I'm in better shape by December… Even with accelerating the spend and trying to pull leases in and get CPUs and GPUs in the system as quickly as we can, we are still seeing demand improve.”
The chase to catch up to demand likely means more Nvidia GPU sales, including higher demand for its priciest fully outfitted systems.
Given that backdrop, it wouldn’t be surprising if Wall Street ups its current full-year revenue forecast for Nvidia of $201 billion.
Todd Campbell owns Nvidia shares.
Related: Billionaire Ackman has one-word message on stock market