Today at AMD's "together we advance_gaming" event, AMD revealed its new RDNA3 architecture along with the RX 7900 XTX and RX 7900 XT GPUs. Both cards will be available on December 13th, and AMD threw plenty of shade at NVIDIA over power use and connector issues, talking about how "easy" it is to upgrade to the new cards.
Specifications:
| | AMD Radeon RX 7900 XT | AMD Radeon RX 7900 XTX |
| --- | --- | --- |
| Memory | 20 GB GDDR6, 80 MB Infinity Cache, 84 Ray Accelerators | 24 GB GDDR6, 96 MB Infinity Cache, 96 Ray Accelerators |
| Speed | Base 1500 MHz, Boost up to 2400 MHz, Game 2000 MHz | Base 1900 MHz, Boost up to 2500 MHz, Game 2300 MHz |
| Connections | DisplayPort 2.1, HDMI 2.1, USB Type-C | DisplayPort 2.1, HDMI 2.1, USB Type-C |
| Rendering | HDMI 4K support, 4K H264 decode/encode, H265/HEVC decode/encode, AV1 decode/encode | HDMI 4K support, 4K H264 decode/encode, H265/HEVC decode/encode, AV1 decode/encode |
| Power | Typical Board Power (Desktop) 300 W, minimum PSU recommendation 750 W | Typical Board Power (Desktop) 355 W, minimum PSU recommendation 800 W |
| Dimensions | Length 276 mm, 2.5 slots | Length 287 mm, 2.5 slots |
| Pricing | $899 | $999 |
They also teased FSR3, due out next year, though they didn't go into much detail on it. According to AMD, FSR3 is "expected to deliver up to 2X more FPS compared to AMD FSR 2 in select games".
- AMD RDNA 3 Architecture – Featuring an advanced chiplet design, new compute units and second-generation AMD Infinity Cache technology, AMD RDNA 3 architecture delivers up to 54% more performance per watt than the previous-generation AMD RDNA 2 architecture. New compute units share resources between rendering, AI and raytracing to make the most effective use of each transistor for faster, more efficient performance than the previous generation.
- Chiplet Design – The world’s first gaming GPU with a chiplet design delivers up to 15% higher frequencies at up to 54% better power efficiency. It includes the new 5nm 306mm² Graphics Compute Die (GCD) with up to 96 compute units that provide the core GPU functionality. It also includes six of the new 6nm Memory Cache Dies (MCDs) at 37.5mm² each, with up to 16MB of second-generation AMD Infinity Cache technology per die.
- Ultra-Fast Chiplet Interconnect – Unleashing the benefits of second-generation AMD Infinity Cache technology, the new chiplets leverage AMD Infinity Links and high-performance fanout packaging to deliver up to 5.3TB/s of bandwidth.
- Expanded Memory and Wider Memory Bus – To meet the growing requirements of today’s demanding titles, the new graphics cards feature up to 24GB of high-speed GDDR6 memory running at 20Gbps over a 384-bit memory bus.
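As a quick sanity check on the figures above, the headline bandwidth and cache numbers both fall out of simple arithmetic. This is a back-of-the-envelope sketch, not AMD data:

```python
# 1) Peak memory bandwidth = effective per-pin data rate x bus width / 8 bits per byte
data_rate_gbps = 20        # GDDR6 effective data rate, Gbps per pin
bus_width_bits = 384       # memory bus width
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(bandwidth_gb_s)      # 960.0 GB/s peak theoretical bandwidth

# 2) Infinity Cache total: six MCDs at 16 MB each
total_cache_mb = 6 * 16
print(total_cache_mb)      # 96 MB, matching the RX 7900 XTX spec table
```

Note this is peak theoretical bandwidth over the GDDR6 bus alone; real sustained throughput (and the effect of the Infinity Cache on top of it) will differ.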
Based on the pricing, they seem like pretty great value to me. Having a flagship under $1K is a very good move compared to what NVIDIA is offering. If the performance is at all comparable, they should sell quite well.
From the press release: “These new graphics cards are designed by gamers for gamers. As we were developing the new cards, we not only incorporated feedback from our customers, but we built in the features and capabilities we wanted to use,” said Scott Herkelman, senior vice president & general manager, Graphics Business Unit at AMD. “We also realized that we needed to do something different to continue pushing the envelope of the technology, and I’m proud of what the team has accomplished with AMD RDNA 3 and the Radeon RX 7900 Series graphics cards. I can’t wait for gamers to experience the powerhouse performance, incredibly vivid visuals and amazing new features these new graphics cards offer.”
Full event can be seen below:
Also, it's still fun to see the Steam Deck picture at such events. AMD made the APU, so it's only natural for them to highlight it, but it's nice to see it again like this for a device that's doing so much for Linux gaming as a whole.
Well, we are here gaming on Linux, aren't we? So Windows isn't my consideration :) I obviously evaluate it all from perspective of Linux gaming and choose hardware with that in mind.
Well many people do run Windows via Linux VM passthrough.
The point I was trying to make is that you should consider how well a given game and piece of hardware run on non-prototype, non-experimental drivers before deciding on things (I'm talking about Windows drivers here).
Because RADV and AMDVLK are not perfect and will give you quite a skewed perspective on how the hardware should run. It also gives you an idea of how much further the Linux drivers need to come before the hardware meets its true potential under Linux.
You're going to see lots of benchmarks and performance figures over the coming weeks, and most will be Windows-based. When the Phoronix benchmarks come out, you should compare them against the Windows numbers to understand the issues. Imagine buying a 7900XT and having it perform like a 5700XT because, well, the drivers suck...
Maybe when Star Citizen releases its Linux version with the Vulkan renderer, there will be some heavy native game to test.
And if the 7900XT is better than the 6800XT, then why not. The drivers are already good and improving. The translation hit will still be around, so that should be taken into account.
Last edited by Shmerl on 4 November 2022 at 7:13 am UTC
Anyway, when I bought my 6800XT it was capable of 4K 60fps no problem for the majority of my games. The issue came when I ran those games under Linux and had perf hits of up to 40% in some cases, or other random graphical issues.
I generally don't buy things based on prototype driver scores; it's a bad way to do it.
Just gotta hope performance will eventually catch up, and I think it has quite a bit, though it's still not great in all things.
Bottom line is I'm not going to be looking at how well the 7900XTX performs under Linux... it's probably going to be all sorts of broken on release with the Linux drivers, or have performance issues.
Just gotta look at how well it does under stable windows drivers and hope for a better future under Linux... eventually.
Last edited by TheRiddick on 4 November 2022 at 7:14 am UTC
By that time they'll be marketing 16K may be, lol. But who cares ;)
Last edited by Shmerl on 4 November 2022 at 7:26 am UTC
I think in a couple more GPU generations 4K will become feasible with good framerates without upscaling.
It already is. Like I said, some games can't be fixed; they're broken and don't perform well at higher resolutions. Even KCD runs just fine at 4K with specially optimized presets on my 6800XT... but again, that's under Windows. Under Linux there seems to be a limit to what DXVK, or the drivers, can do.
That doesn't just apply to the potential target framerate at a given resolution. For instance, VRAM requirements on Linux are generally higher than on Windows in the case of vkd3d-proton.
Last edited by Shmerl on 4 November 2022 at 7:34 am UTC
I think in a couple more GPU generations 4K will become feasible with good framerates without upscaling.
4K runs just fine on current gen in most games; my 2070 Super can hold 60fps in most games easily, even without DLSS (though it's pretty borderline on many, so I tend to keep it at 1440p to maintain 120+ fps). The 3000/4000 series or 6800 series and up shouldn't have any issues for most games unless you've got some config problems or the game just has terrible optimization.
By that time they'll be marketing 16K may be, lol. But who cares ;)
VR with the Index is 2880x1600 but requires 144fps to not be vomit-inducing, so it actually takes a bit more power to render than 4K/60.
Last edited by raptor85 on 4 November 2022 at 9:13 am UTC
Cards should get a rating on how loud they are.
I wouldn't buy a reference card, noise being one of the reasons. AMD partners like Sapphire generally make better-cooled and quieter designs, since they need to differentiate in something.
That's exactly why I'd like that information ;-). Not only for the reference design but for all cards, since for me that can be an argument for a card. A big one, truth be told.
With climate change creeping up on us, current international political affairs affecting prices and availability of energy, and the market saying yes to devices like the Switch and the Deck, perhaps it's about time to demand these companies release things more aligned with what the world needs.
We need to use much less energy. Not the same, and definitely not more.
At least we won't see a 70 class card from AMD costing $900.
LTT has a well-researched video on AMD's announcement entitled 'Goodbye NVIDIA!'
Also, as pointed out in the video, the 4090 only supports DisplayPort 1.4a, while AMD's new offerings support 2.1.
Last edited by StalePopcorn on 4 November 2022 at 5:34 pm UTC
though it's pretty borderline on many so I tend to keep it at 1440p to maintain 120+ fps
That's exactly the point. Why pay for a 4K screen if you can't run many games at a high enough framerate and need to lower the resolution? To me it would feel like paying for it and not using it. That's why I think 2560x1440 is still the optimal resolution that a GPU can handle currently.
Last edited by Shmerl on 4 November 2022 at 4:40 pm UTC
I might also add airflow requirements for the cooler, plz? That would be nice to know. I build ITX boxes and getting enough air to these things can be a pain.
That's exactly the point. Why pay for a 4K screen if you can't run many games at a high enough framerate and need to lower the resolution? To me it would feel like paying for it and not using it. That's why I think 2560x1440 is still the optimal resolution that a GPU can handle currently.
Movies/streaming, work, there ARE other things we do on computers besides games you know :P
Also remember my card is two full generations behind and not even the highest end of its generation, and it can run 4K 99% of the time just fine. A 3xxx or 4xxx series or any newer Radeon should do 4K just fine at 60/120fps.
Movies/streaming, work, there ARE other things we do on computers besides games you know :P
I'm fine with 2560x1440 for that. You can't have a separate monitor for every use case, so I think that's still the optimum, while 4K is just ahead of the current generation of GPUs when it comes to gaming.
99% is not my experience. If I were getting ≥ 144 fps with my current GPU and resolution, then I'd consider a higher-res screen.
And as above, for games I care about I'm not even getting 100 fps so far.
Last edited by Shmerl on 4 November 2022 at 6:35 pm UTC
Am I the only one to think that even if AMD's new cards are better than Nvidia's on power consumption, 300/355 W TDP is still crazy high?
There are still 100 W and under cards widely available on the market; nobody's putting a gun to your head forcing you to get a higher-end card. An extra 200 watts is also not really a lot compared to most other things in a normal household: you can literally offset that usage by not watching TV while the computer is on.
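To put that 200 W delta in perspective, here's a back-of-the-envelope estimate of the extra energy it adds up to; the 2 hours per day of gaming is an assumption for illustration, not a figure from the article:

```python
# Extra annual energy from 200 W of additional GPU draw
extra_watts = 200
hours_per_day = 2          # assumed daily gaming time (hypothetical)
days_per_year = 365

kwh_per_year = extra_watts * hours_per_day * days_per_year / 1000
print(kwh_per_year)        # 146.0 kWh per year
```

Whether 146 kWh/year is "a lot" depends on local electricity prices and how it compares to the rest of the household's consumption, which is exactly the argument being had here.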