3 key takeaways from Nvidia's earnings: China blow, cloud strength and AI future

The company's data center division continues to surge as companies, countries and cloud providers snap up GPUs for artificial intelligence software.



Nvidia reported strong fiscal first-quarter earnings on Wednesday.

Wall Street was pleased with Nvidia's continued sales growth, which hit 69% during the quarter. The company's data center division continues to surge as companies, countries and cloud providers snap up Nvidia graphics processing units, or GPUs, for artificial intelligence software.

"The team continues to maintain a 1- 2 step lead ahead of competitors with its silicon/hardware/software platforms and a strong ecosystem, and the team is further distancing itself with its aggressive cadence of new product launches and more product segmentation over time," wrote JPMorgan analyst Harlan Sur.

Here are three big takeaways from the company's earnings:

China could be a $50 billion market for Nvidia, but U.S. export controls are getting in the way

Nvidia expects to sell about $45 billion in chips during the July quarter, it said on Wednesday, but that figure excludes roughly $8 billion in sales the company would have recorded if the U.S. had not required a license to export its H20 chip to China.

Nvidia also said it missed out on $2.5 billion in sales during the April quarter because of the export restrictions on the H20.

Nvidia CEO Jensen Huang said on the company's earnings call that China represented a $50 billion market that had effectively been closed to Nvidia.

He also said the export controls were misguided and would merely push Chinese AI developers toward homegrown chips instead of making an American platform the world's choice for AI software.

"The U.S. has based its policy on the assumption that China cannot make AI chips. That assumption was always questionable, and now it's clearly wrong," Huang said.

He said that export controls were driving AI talent to use chips from homegrown Chinese rivals, such as Huawei.

"We want every developer in the world to prefer the American technology stacks," Huang told CNBC's Jim Cramer on Wednesday night.

Nvidia said it didn't have a replacement chip for China ready, but that it was considering options for "interesting products" that could be sold in the market.

Strength in the company's Blackwell business balanced out some concerns over the China impact.

"NVIDIA is putting digestion fears fully to rest, showing acceleration of the business other than the China headwinds around growth drivers that seem durable. Everything should get better from here," said Morgan Stanley analyst Joseph Moore.

Cloud providers are still Nvidia's most important customers

Nvidia says it has customers ranging from sovereign nations to universities to enterprises that want to research AI.

But it confirmed again on Wednesday that cloud providers — companies like Microsoft Azure, Google Cloud, Oracle Cloud Infrastructure and Amazon Web Services — still make up about half of its data center revenue, which totaled $39.1 billion during the quarter.

These companies tend to buy the fastest and latest Nvidia chips, including Blackwell, which comprised 70% of Nvidia's data center sales during the quarter, CFO Colette Kress said on the earnings call.

Microsoft, for example, had already deployed "tens of thousands" of Blackwell GPUs, the company said, processing "100 trillion tokens" in the first quarter. Tokens are a measure of AI output.

Microsoft will also be first in line to get Blackwell Ultra, an updated version of the chip with additional memory and performance. Nvidia said shipments of those systems will start during the current quarter.

Bernstein's Stacy Rasgon said the "general outlook and environment overall seems very encouraging" as the company ramps up its Blackwell rollout and compute requirements grow.

"Amid a messy quarter, NVIDIA is comporting themselves extremely well," he said.

Looking forward: Blackwell and AI inference

For the past few years, many Nvidia GPUs were used for a resource-intensive process called training, where data is processed through an AI model until it gains new abilities.

Now, Huang is talking up the potential for Nvidia's GPUs to serve AI models to millions of customers, a process the industry calls inference. He said on the earnings call that inference is where the new surge in demand is coming from.

"Overall, we believe NVDA's technology leadership remains strong, with growth in Blackwell shipments benefitting from exponential growth in reasoning AI and the achievement of economies of scale," said Deutsche Bank's Ross Seymore.

Huang says that the latest AI models need to generate more tokens — or create more output — in order to do "reasoning," which improves AI answers. Of course, Nvidia's latest Blackwell chips are designed for this, Huang said.

"We are witnessing a sharp jump in inference demand," Huang said. "OpenAI, Microsoft and Google are seeing a step-function leap in token generation."

Huang contrasted today's AI models with the "one-shot" approach ChatGPT used when it debuted in 2022, and said the new models need "a hundred, a thousand times more" computing.

"It's essentially thinking to itself, breaking down a problem step by step," Huang said. "It might be planning multiple paths to an answer. It could be using tools, reading PDFs, reading web pages, watching videos and then producing a result."

Bonus: Jensen's concerns

Huang struck a notably more somber tone during the call, focusing heavily on the impact of export controls rather than his usual evangelizing about AI's world-changing potential.

He spoke at length on the call about U.S. chip restrictions and clearly stated how much of an impact the limits have on current and future business.

"The AI race is not just about chips," he said. "It's about which stack the world runs on. As that stack grows to include 6G and quantum, U.S. global infrastructure leadership is at stake."

CNBC's Kristina Partsinevelos contributed to this article.

Correction: Stacy Rasgon is an analyst at Bernstein. An earlier version misspelled his name.
