
Into the GPU Chiplet Era: An interview with AMD’s Sam Naffziger


We recently had a chance to speak with Sam Naffziger, AMD's Senior Vice President, Corporate Fellow, and Product Technology Architect, about what's been happening with Radeon graphics over the past several years and what we can expect going forward. AMD recently provided some tantalizing details on its upcoming RDNA 3 GPU architecture, which is slated to launch before the end of the year with a chiplet-based design, and that provides the backdrop for our interview and Q&A session. Rather than presenting the conversation in interview format, we've distilled it down to the key points.

Naffziger has been at AMD for 16 years and is responsible for multiple product areas, with a focus on driving higher performance per watt and improving the overall competitiveness of AMD's CPUs and GPUs. He's also one of the primary people behind AMD's chiplet architecture, which has proven incredibly successful in the Ryzen and EPYC CPU lines and which will now be coming in some form to AMD's RDNA 3 graphics. Naffziger outlined the challenges facing the company, and how he feels that innovative tech, like a chiplet-based GPU architecture, can deliver both improved performance and power efficiency.

Running Into the Power Wall

[Image: AMD's Sam Naffziger]

(Image credit: AMD)

Power use and efficiency sit at the heart of modern microprocessor design, and they're becoming increasingly problematic; no company is immune to the side effects. All signs point to increased power consumption from next-gen GPUs: The PCIe 5.0 power interface and the upcoming power supplies that support it can deliver up to 600W over a single 16-pin connector, portending a broader industry shift to higher-power GPUs. It's widely expected that Nvidia's Ada architecture will push to higher power limits than we've seen in the past; current rumors suggest we might see 450W TBP (typical board power), and perhaps even as much as 600W for the top RTX 40-series GPUs. There's been no official word yet on TBPs for AMD's RDNA 3, but it's fair to think it could follow the same trend.
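As a rough illustration of what those figures imply (our own back-of-the-envelope arithmetic, not anything AMD or Nvidia has stated), the 16-pin connector carries its load on six +12V pins, so the rumored board powers translate into the following currents:

```python
# Illustrative sketch only: assumes a 12V rail and six +12V current-carrying
# pins on the 16-pin (12VHPWR) connector; real designs include safety margins.
RAIL_VOLTAGE = 12.0   # volts
POWER_PINS = 6        # +12V pins on the 16-pin connector

for board_power in (450, 600):  # rumored TBPs in watts
    total_current = board_power / RAIL_VOLTAGE
    per_pin_current = total_current / POWER_PINS
    print(f"{board_power}W -> {total_current:.1f}A total, "
          f"~{per_pin_current:.1f}A per power pin")
```

At 600W that works out to roughly 50A in total, which helps explain why the new connector exists at all; a legacy 8-pin PCIe connector is rated for just 150W.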
