Nvidia has continued its dominance of artificial intelligence (AI) datacentres, with its latest quarterly results showing revenue growth of 16% – a 93% increase compared with the same period last year.
The company's datacentre business reported quarterly revenue of $35.6bn, and revenue of $115bn for the full year – a 142% increase on the previous year.
In his prepared remarks, Nvidia CEO and founder Jensen Huang said: “Demand for Blackwell is amazing as reasoning AI adds another scaling law – increasing compute for training makes models smarter and increasing compute for long thinking makes the answer smarter.
“We've successfully ramped up the massive-scale production of Blackwell AI supercomputers, achieving billions of dollars in sales in its first quarter. AI is advancing at light speed as agentic AI and physical AI set the stage for the next wave of AI to revolutionise the largest industries.”
During the earnings call, financial analysts questioned Nvidia over DeepSeek, which requires less powerful graphics processing units (GPUs), and the fact that cloud service providers (CSPs) such as Microsoft are designing their own custom, optimised chips.
According to a transcript of the earnings call posted on Seeking Alpha, CSPs account for about half of Nvidia's business, but there is also growing demand from enterprise customers. Huang said: “We see the growth of enterprise going forward,” which he believes represents a larger opportunity to sell Nvidia GPUs long term.
Huang used the earnings call to discuss why he thinks new AI models will drive up demand, even as AI models become more computationally efficient. “The more the model thinks, the smarter the answer,” he said. “Models like OpenAI, Grok-3 and DeepSeek-R1 are reasoning models that apply inference-time scaling. Reasoning models can consume 100 times more compute. Future reasoning models can consume much more compute.”
When asked about the risk that CSPs were developing application-specific integrated circuits (ASICs) instead of using GPUs, Huang responded by talking about the complexity of the technology stack that sits on top of the GPUs, implying that this would be a challenge for custom chips. “The software stack is incredibly hard. Building an ASIC is no different to what we do – we build a new architecture,” he said.
According to Huang, the technology ecosystem that sits on top of the Nvidia architecture is 10 times more complex today than it was two years ago. “That's fairly obvious,” he said, “because the amount of software that the world is building on top of architecture is growing exponentially and AI is advancing very quickly. So bringing that whole ecosystem [together] on top of multiple chips is hard.”
Discussing the Nvidia results, Forrester analyst Alvin Nguyen said: “Having yet another record performance from Nvidia seems commonplace. The record earnings represent the continued demand for Nvidia AI products. The emphasis on reasoning models driving more, not less, computation is a good verbal counter to the worries about DeepSeek impacting their demand.”
However, in Nguyen's opinion, Huang's responses to analysts' questions were less convincing. “Their response to the question about custom chips from Amazon, Microsoft and Google threatening their business was dismissive and ignores the need for these companies to have options outside of Nvidia and to have semiconductors tailored specifically to their AI training and inferencing needs,” he said.