AI

Nvidia’s keynote at GTC held some surprises


SAN JOSE — “I hope you realize this is not a concert,” said Nvidia President Jensen Huang to an audience so large, it filled up the SAP Center in San Jose. This is how he introduced what is perhaps the complete opposite of a concert: the company’s GTC event. “You have arrived at a developers conference. There will be a lot of science describing algorithms, computer architecture, mathematics. I sense a very heavy weight in the room; all of a sudden, you’re in the wrong place.”

It may not have been a rock concert, but the leather-jacket-wearing, 61-year-old CEO of the world’s third-most-valuable company by market cap certainly had a fair number of fans in the audience. The company launched in 1993 with a mission to push general-purpose computing past its limits. “Accelerated computing” became Nvidia’s rallying cry: Wouldn’t it be great to make chips and boards that were specialized for particular workloads, rather than built for general use? Nvidia’s chips gave graphics-hungry gamers the tools they needed to play games at higher resolution, with higher quality and higher frame rates.

It is not a huge surprise, perhaps, that the Nvidia CEO drew parallels to a concert. The venue was, in a word, very concert-y. Image Credits: TechCrunch / Haje Kamps

Monday’s keynote was, in a way, a return to the company’s original mission. “I want to show you the soul of Nvidia, the soul of our company, at the intersection of computer graphics, physics and artificial intelligence, all intersecting inside a computer.”

Then, for the next two hours, Huang did a rare thing: He nerded out. Hard. Anyone who had come expecting him to pull a Tim Cook, with a slick, audience-focused presentation, was bound to be disappointed. Overall, the keynote was tech-heavy, acronym-riddled, and unapologetically aimed at developers.

We need bigger GPUs

Graphics processing units (GPUs) are where Nvidia got its start. If you’ve ever built a computer, you’re probably thinking of a graphics card that goes in a PCIe slot. That is where the journey started, but we’ve come a long way since then.

The company announced its brand-new Blackwell platform, which is an absolute monster. Huang says the core of the processor was “pushing the limits of physics” in terms of how big a chip could be. It combines the power of two chips, offering speeds of 10 TB/s.

“I’m holding around $10 billion worth of equipment here,” Huang said, holding up a prototype of Blackwell. “The next one will cost $5 billion. Luckily for you all, it gets cheaper from there.” Putting a bunch of these chips together can crank out some truly impressive power.

The previous generation of AI-optimized GPU was called Hopper. Blackwell is between 2 and 30 times faster, depending on how you measure it. Huang explained that it took 8,000 GPUs, 15 megawatts and 90 days to create the GPT-MoE-1.8T model. With the new system, you could use just 2,000 GPUs and use 25% of the power.
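To make those stage numbers concrete, here is a back-of-envelope sketch based purely on the figures Huang cited. The Blackwell system’s power draw is not a number Nvidia published; it is derived here from the “25% of the power” claim, and the assumption that the comparison refers to the same 90-day run is illustrative only.

```python
# Back-of-envelope comparison based on the figures Huang cited on stage.
# Illustrative only: real cluster sizing depends on interconnect, memory,
# and parallelism details not captured here.

hopper_gpus = 8_000        # GPUs Huang said were used for GPT-MoE-1.8T
hopper_power_mw = 15.0     # megawatts cited for that run
run_days = 90              # training time cited

blackwell_gpus = 2_000                          # "just 2,000 GPUs"
blackwell_power_mw = hopper_power_mw * 0.25     # "25% of the power" (assumed to mean 25% of 15 MW)

print(f"GPU count: {hopper_gpus / blackwell_gpus:.0f}x fewer ({hopper_gpus} -> {blackwell_gpus})")
print(f"Power: {hopper_power_mw} MW -> ~{blackwell_power_mw:.2f} MW")

# Rough energy comparison, assuming (hypothetically) the same 90-day run:
hopper_energy_mwh = hopper_power_mw * run_days * 24
blackwell_energy_mwh = blackwell_power_mw * run_days * 24
print(f"Energy for a 90-day run: {hopper_energy_mwh:,.0f} MWh vs ~{blackwell_energy_mwh:,.0f} MWh")
```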

These GPUs are pushing a fantastic amount of data around — which is a very good segue into another topic Huang talked about.

What’s next

Nvidia rolled out a new set of tools for automakers working on self-driving cars. The company was already a major player in robotics, but it doubled down with new tools for roboticists to make their robots smarter.

The company also introduced Nvidia NIM, a software platform aimed at simplifying the deployment of AI models. NIM leverages Nvidia’s hardware as a foundation and aims to accelerate companies’ AI initiatives by providing an ecosystem of AI-ready containers. It supports models from various sources, including Nvidia, Google and Hugging Face, and integrates with platforms like Amazon SageMaker and Microsoft Azure AI. NIM will expand its capabilities over time, including tools for generative AI chatbots.
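As a rough illustration of what “an ecosystem of AI-ready containers” means in practice, here is a minimal sketch of calling a locally deployed NIM container. The endpoint URL, port, model name, and request schema are assumptions for illustration, modeled on a typical OpenAI-style chat-completions API; consult Nvidia’s NIM documentation for the actual interface.

```python
# Minimal sketch: querying a hypothetical locally running NIM container.
# The URL, model identifier, and payload layout below are assumptions,
# not a documented NIM contract.
import requests

NIM_URL = "http://localhost:8000/v1/chat/completions"  # hypothetical local deployment

payload = {
    "model": "example-llm",  # placeholder model identifier
    "messages": [
        {"role": "user", "content": "Summarize what an AI-ready container is."}
    ],
    "max_tokens": 128,
}

response = requests.post(NIM_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

The appeal of prebuilt containers is that serving code like this stays the same regardless of whether the underlying model comes from Nvidia, Google, or Hugging Face.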

“Anything you can digitize: So long as there is some structure where we can apply some patterns, means we can learn the patterns,” Huang said. “And if we can learn the patterns, we can understand the meaning. When we understand the meaning, we can generate it as well. And here we are, in the generative AI revolution.”




by The Economic Times

IBM said Tuesday that it planned to cut thousands of workers as it shifts its focus to higher-growth businesses in artificial intelligence consulting and software. The company did not specify how many workers would be affected, but said in a statement the layoffs would “impact a low single-digit percentage of our global workforce.” The company had 270,000 employees at the end of last year. The number of workers in the United States is expected to remain flat despite some cuts, a spokesperson added in the statement. A massive supplier of technology to…

by The Economic Times

The number of Indian startups entering famed US accelerator and investor Y Combinator’s startup programme might have dwindled to just one in 2025, down from the high of 2021, when 64 were selected. But not so for Indian investors, who are queuing up to find the next big thing in AI by relying on shortlists made by YC to help them filter their investments. In 2025, Indian investors have invested in close to 10 Y Combinator (YC) AI startups in the US. These include Tesora AI, CodeAnt, Alter AI and Frizzle, all with Indian-origin founders but based in…

by TechCrunch

Lovable, the Stockholm-based AI coding platform, is closing in on 8 million users, CEO Anton Osika told this editor during a sit-down on Monday, a major jump from the 2.3 million active users number the company shared in July. Osika said the company — which was founded almost exactly one year ago — is also seeing “100,000 new products built on Lovable every single day.”