Today at the AMD "together we advance_gaming" event, AMD revealed their new RDNA 3 architecture along with the RX 7900 XTX and RX 7900 XT GPUs. Both cards will be available on December 13th, and AMD threw plenty of shade at NVIDIA over its power use and connector issues, talking about how "easy" it is to upgrade to their new cards and pointedly noting the power draw.
Specifications:
| | AMD Radeon RX 7900 XT | AMD Radeon RX 7900 XTX |
| --- | --- | --- |
| Memory | 20 GB GDDR6, 80 MB Infinity Cache, 84 Ray Accelerators | 24 GB GDDR6, 96 MB Infinity Cache, 96 Ray Accelerators |
| Speed | Base 1500 MHz, Boost up to 2400 MHz, Game 2000 MHz | Base 1900 MHz, Boost up to 2500 MHz, Game 2300 MHz |
| Connections | DisplayPort 2.1, HDMI 2.1, USB Type-C | DisplayPort 2.1, HDMI 2.1, USB Type-C |
| Rendering | 4K HDMI support, 4K H.264 decode/encode, H.265/HEVC decode/encode, AV1 decode/encode | 4K HDMI support, 4K H.264 decode/encode, H.265/HEVC decode/encode, AV1 decode/encode |
| Power | Typical board power 300 W, minimum PSU 750 W | Typical board power 355 W, minimum PSU 800 W |
| Dimensions | Length 276 mm, 2.5 slots | Length 287 mm, 2.5 slots |
| Pricing | $899 | $999 |
They also teased FSR 3, which is due out next year, but didn't go into much detail on it. According to AMD, FSR 3 is "expected to deliver up to 2X more FPS compared to AMD FSR 2 in select games".
- AMD RDNA 3 Architecture – Featuring an advanced chiplet design, new compute units and second-generation AMD Infinity Cache technology, AMD RDNA 3 architecture delivers up to 54% more performance per watt than the previous-generation AMD RDNA 2 architecture. New compute units share resources between rendering, AI and raytracing to make the most effective use of each transistor for faster, more efficient performance than the previous generation.
- Chiplet Design – The world’s first gaming GPU with a chiplet design delivers up to 15% higher frequencies at up to 54% better power efficiency. It includes the new 5nm 306mm² Graphics Compute Die (GCD) with up to 96 compute units that provide the core GPU functionality. It also includes six of the new 6nm Memory Cache Dies (MCDs) at 37.5mm² each, with up to 16MB of second-generation AMD Infinity Cache technology.
- Ultra-Fast Chiplet Interconnect – Unleashing the benefits of second-generation AMD Infinity Cache technology, the new chiplets leverage AMD Infinity Links and high-performance fanout packaging to deliver up to 5.3TB/s of bandwidth.
- Expanded Memory and Wider Memory Bus – To meet the growing requirements of today’s demanding titles, the new graphics cards feature up to 24GB of high-speed GDDR6 memory running at 20Gbps over a 384-bit memory bus.
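A couple of those numbers are easy to sanity-check with back-of-envelope arithmetic. Here's a quick sketch in Python, just recombining the figures quoted above; the RX 7900 XT's 320-bit bus is my assumption based on its 20 GB capacity, not something stated in the press release:

```python
# Rough sanity checks on the RDNA 3 figures quoted above.

def gddr6_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Raw GDDR6 bandwidth in GB/s: per-pin data rate times bus width, in bytes."""
    return data_rate_gbps * bus_width_bits / 8

# RX 7900 XTX: 20 Gbps GDDR6 over a 384-bit bus.
print(gddr6_bandwidth_gbs(20, 384))  # 960.0 GB/s raw memory bandwidth

# RX 7900 XT: 20 GB of GDDR6 implies a 320-bit bus (assumption, not stated above).
print(gddr6_bandwidth_gbs(20, 320))  # 800.0 GB/s

# Total silicon across the chiplets: one 306 mm^2 GCD plus six 37.5 mm^2 MCDs.
total_die_area_mm2 = 306 + 6 * 37.5
print(total_die_area_mm2)  # 531.0 mm^2
```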
Based on the pricing, they seem like pretty great value to me. Having a flagship under $1K is a very good move compared to what NVIDIA are offering. If the performance is in any way comparable, they should sell quite well.
From the press release: “These new graphics cards are designed by gamers for gamers. As we were developing the new cards, we not only incorporated feedback from our customers, but we built in the features and capabilities we wanted to use,” said Scott Herkelman, senior vice president & general manager, Graphics Business Unit at AMD. “We also realized that we needed to do something different to continue pushing the envelope of the technology, and I’m proud of what the team has accomplished with AMD RDNA 3 and the Radeon RX 7900 Series graphics cards. I can’t wait for gamers to experience the powerhouse performance, incredibly vivid visuals and amazing new features these new graphics cards offer.”
Full event can be seen below:
Direct Link
Also, it's still fun to see the Steam Deck picture at events like this. AMD made the APU, so it's only natural for them to highlight it, but it's nice to see it shown off again for a device that's doing so much for Linux gaming as a whole.
Quoting: Shmerl
Though for a $1000 card, I don't see any point in using FSR.
FSR is a good option to keep native resolution while tweaking the internal render resolution.
Have you ever tried to run a 4K screen at 1440p or 1080p resolution? It looks like vomit. It's a MUCH better solution to upscale the render resolution rather than rely on pure resolution switching!
4K + FSR on a native 8K screen will look a lot better than running 4K resolution on an 8K screen. LCDs etc. are just shit at resolutions outside native; they just can't compete with how well CRT tube screens could do it.
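To make that concrete, here's a minimal sketch of what the internal-resolution tweak looks like. The scale factors below match the commonly documented FSR 2 quality modes, but this is illustrative arithmetic, not the actual FidelityFX API:

```python
# Rough sketch of how upscalers pick an internal render resolution.
# Preset scale factors follow the commonly documented FSR 2 modes (assumption).

PRESETS = {
    "quality": 1.5,            # render at 1/1.5 of the output resolution per axis
    "balanced": 1.7,
    "performance": 2.0,
    "ultra_performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and preset."""
    scale = PRESETS[preset]
    return round(out_w / scale), round(out_h / scale)

# A 4K output with the "quality" preset renders internally at ~1440p,
# then the upscaler reconstructs the full 3840x2160 image.
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```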
Last edited by TheRiddick on 4 November 2022 at 6:16 am UTC
Quoting: TheRiddick
FSR is a good option to keep native resolution while tweaking the internal render resolution.
Have you ever tried to run a 4K screen at 1440p or 1080p resolution?
I see. But what's the point of having a 4K screen if you want to run something at lower resolution? It's like overpaying for something you aren't using.
And I agree, it's better to always run an LCD at its native resolution. But then I want the game to run at it. That's why I'm not buying a 4K screen (yet): GPUs can't sustain that with high enough framerates for demanding games.
Last edited by Shmerl on 4 November 2022 at 6:32 am UTC
Fact is, many games will run above 60fps on my 6800 XT at 4K, but some just can't, especially if I mod them.
Nonetheless, I would only use FSR2 or similar. I'm not a big fan of FSR1, but I know some people really like it; if you can't hit 60fps even at native, it's a good temporary solution until card prices come down.
I bought this LG C1 OLED because it was a good price and OLED is GOD!
Quoting: TheRiddick
If you're buying a monitor based on the shittiest-performing game you own then you're forever going to be stuck at 1080p/1440p...
I'd measure it by whether the games I care about work well enough. For instance, I wanted to play Cyberpunk 2077 at a decent framerate. That means I'm not going to play it on a 4K screen. I get around 80-90 fps at 2560x1440 with a 6800 XT. I can imagine the framerate tanking way too much at 4K. And for these games I want good image quality, not upscaling. So that's a metric showing 4K just isn't ready for me to buy into.
I agree though, it depends on the type of game. Some less demanding games can hit an acceptable framerate at 4K with such a GPU.
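That "framerate tanking" guess is roughly what naive pixel math predicts. A small sketch, assuming a purely GPU-bound game whose frame time scales linearly with pixel count (real games only approximate this):

```python
# Naive framerate scaling by pixel count: assumes frame time grows linearly
# with pixels rendered, which only roughly holds for GPU-bound games.

def estimated_fps(fps_measured: float, res_measured: tuple, res_target: tuple) -> float:
    pixels = lambda r: r[0] * r[1]
    return fps_measured * pixels(res_measured) / pixels(res_target)

# ~90 fps at 2560x1440 on a 6800 XT would land around 40 fps at 3840x2160
# if the game scaled perfectly with resolution.
print(round(estimated_fps(90, (2560, 1440), (3840, 2160))))  # 40
```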
Last edited by Shmerl on 4 November 2022 at 6:43 am UTC
I did play it all the way through with my 6800 XT at 4K, but I had things dialed down, and in certain busy city areas it hit 40fps at times, which was pretty bad. But 60-70% of the game was well over 60fps.
Even the 4090 can't get that great fps in CP2077 at 4K with DLSS 3; it's like 80-90fps on the outskirts... you know, the areas of the game that aren't too taxing!
The good thing about CP2077 is that it does have all the upscaling tech now. I plan to go back to it sometime, maybe when the DLC releases.
Quoting: jordicoma
I'm not to impressed. It's to expensive. Yes it's better than nvidia price, but still to expensive.
Too impressed. Too expensive. It's "too", NOT "to". Please mind the difference.
And I don't understand the AI accelerator part.
But that's exactly the point. I want to play it at native resolution and preferably best settings (not counting RT at least). But I also want a high enough framerate, preferably something higher than 60. So it hitting 90 at 2560x1440 without any upscaling is pretty good. 4K just wouldn't work well enough for such a scenario.
My point is, if you go for 4K, it means you want better image quality. If you then need to upscale because the GPU can't cope, it kind of undermines the whole idea, since upscaling goes in the opposite direction image-quality wise.
Last edited by Shmerl on 4 November 2022 at 6:50 am UTC
I'd assume they will be using it for that. It can also be used to speed up RT performance, as shown by many custom RT projects on YouTube that use software techniques to speed things up, which can be run on hardware and accelerated with AI algorithms...
It's just going to take time for these great things to come to the market and to games. NVIDIA sort of kicks things off, but then they go all proprietary on everyone, and it takes AMD and Intel to expand what the tech can do.
Quoting: Shmerl
But that's exactly the point. I want to play it at native resolution and preferably best settings (not counting RT at least). But I also want a high enough framerate, preferably something higher than 60. So it hitting 90 at 2560x1440 without any upscaling is pretty good. 4K just wouldn't work well enough for such a scenario.
Then you should probably avoid running it through Linux/DXVK... just saying. Translation layers and Linux drivers are always going to be a big issue for next-gen gaming and reaching those performance demands.
If you run CP2077 on Windows it performs a lot better overall (I tried both and decided to keep playing it on Windows rather than DXVK under Linux, which was giving me too many woes!)
Another good example is Kingdom Come: Deliverance (KCD).
I get +20fps under Windows vs Linux... so under Linux I get 50fps but under Windows I get 70fps. If you're buying a card based on Linux performance then you should just stay away from certain games for the time being... the drivers aren't there yet... I don't give two shits what anyone says, they're JUST NOT THERE yet...
(The fact that 10-bit colour is still a no-go, and ray tracing is still experimental in most cases, kind of says it all.)
Last edited by TheRiddick on 4 November 2022 at 7:00 am UTC
Quoting: TheRiddick
Then you should probably avoid running it through Linux/DXVK... just saying. Translation layers and Linux drivers are always going to be a big issue for next-gen gaming and reaching those performance demands.
Well, we are here gaming on Linux, aren't we? So Windows isn't a consideration for me :) I obviously evaluate it all from the perspective of Linux gaming and choose hardware with that in mind.
Last edited by Shmerl on 4 November 2022 at 7:02 am UTC