[Video: Jeff Dean and Bill Dally, "Advancing to AI's Next Frontier", Nvidia GTC 2026 - YouTube]

For more than 16 years, Nvidia's annual GTC event has been packed with talks and presentations about all the things you can do with a GPU beyond rendering 3D graphics.
This year's conference was no different, of course, but hidden amongst all the talks was an insight into how Nvidia goes about designing its chips, and AI is naturally a big part of it all. It was Bill Dally, Nvidia's chief scientist, who let us in on this behind-the-scenes glimpse, whilst chatting with his counterpart at Google, Jeff Dean, on the topic of 'advancing to AI's next frontier'.
The first bit that caught my attention (thanks to Bearly AI on X) was how Nvidia uses an AI agent when it switches to a new process node. "Every time we have a new semiconductor process, we have to port our standard cell library to it. It's about 2,500 to 3,000 cells."
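Dally doesn't describe how the agent is wired up, but the shape of the task, thousands of cells that each need porting and verifying against a new node's design rules, lends itself to a simple agent loop. The sketch below is purely illustrative: `port_cell`, `passes_drc`, the retry policy, and the cell and node names are all invented stand-ins, not anything Nvidia has disclosed.

```python
# Hypothetical sketch of an agent loop for porting a standard cell
# library to a new process node. Nothing here reflects Nvidia's actual
# tooling; port_cell, passes_drc, and the names are illustrative stubs.

def port_cell(cell_name: str, node: str) -> str:
    """Stand-in for an AI agent proposing a ported cell layout."""
    return f"{cell_name}@{node}"

def passes_drc(layout: str) -> bool:
    """Stand-in for design-rule checking of the proposed layout."""
    return layout is not None

def port_library(cells, node, max_retries=3):
    """Attempt each cell, retrying on failure; escalate the rest."""
    ported, failed = {}, []
    for cell in cells:
        for _ in range(max_retries):
            layout = port_cell(cell, node)
            if passes_drc(layout):
                ported[cell] = layout
                break
        else:
            failed.append(cell)  # hand off to a human engineer
    return ported, failed

# A real library is roughly 2,500-3,000 cells; three placeholders here.
ported, failed = port_library(["INVX1", "NAND2X1", "DFFX1"], "N2")
```

The appeal of this structure is that the verifier (design-rule checking) is deterministic, so the agent can iterate freely and only the cells it can't crack get routed to a person.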
Source: [PC Gamer](https://www.pcgamer.com/hardware/graphics-cards/not-that-we-should-be-at-all-surprised-but-nvidia-leans-on-ai-pretty-hard-for-speeding-up-how-it-plans-and-designs-its-next-generation-of-gpus/)
