Nvidia’s next-generation graphics processor for artificial intelligence, called Blackwell, will cost between $30,000 and $40,000 per unit, CEO Jensen Huang told CNBC’s Jim Cramer.
“This will cost $30 to $40 thousand dollars,” Huang said, holding up the Blackwell chip.
“We had to invent some new technology to make it possible,” he continued, estimating that Nvidia spent about $10 billion in research and development costs.
The price suggests that the chip, which is likely to be in hot demand for training and deploying AI software like ChatGPT, will be priced in a similar range to its predecessor, the H100, or the “Hopper” generation, which cost between $25,000 and $40,000 per chip, according to analyst estimates. The Hopper generation, launched in 2022, represented a significant price increase for Nvidia’s AI chips over the previous generation.
Nvidia CEO Jensen Huang compares the size of the new “Blackwell” chip with the current “Hopper” H100 chip at the company’s developer conference in San Jose, California.
Nvidia announces a new generation of AI chips about every two years. The newest, like Blackwell, are generally faster and more energy efficient, and Nvidia uses the publicity around a new generation to rake in orders for new GPUs. Blackwell combines two chips and is physically larger than the previous generation.
Nvidia’s AI chips have driven a tripling of quarterly Nvidia sales since the AI boom kicked off in late 2022, when OpenAI’s ChatGPT was announced. Many of the top AI companies and developers have been using Nvidia’s H100 to train their AI models over the past year. Meta, for example, said this year that it is buying hundreds of thousands of Nvidia H100 GPUs.
Nvidia doesn’t reveal the list price for its chips. They come in several different configurations, and the price an end customer like Meta or Microsoft might pay depends on factors such as the volume of chips purchased, or whether the customer buys the chips from Nvidia directly as part of a complete system or through a vendor like Dell, HP, or Supermicro that builds AI servers. Some servers are built with as many as eight AI GPUs.
On Monday, Nvidia announced at least three different versions of the Blackwell AI accelerator: a B100, a B200, and a GB200 that pairs two Blackwell GPUs with an Arm-based CPU. They have slightly different memory configurations and are expected to ship later this year.