
Today at the AMD "together we advance_gaming" event, AMD revealed their new RDNA 3 architecture along with the RX 7900 XTX and RX 7900 XT GPUs. Both new cards will be available on December 13th, and during the event AMD threw plenty of shade at NVIDIA over power use and connector issues, talking up how "easy" it is to upgrade and noting the lower power draw.


Pictured: RX 7900 XTX

Specifications:

AMD Radeon RX 7900 XT
Memory: 20 GB GDDR6
Infinity Cache: 80 MB
Ray Accelerators: 84
Base Frequency: 1500 MHz
Boost Frequency: Up to 2400 MHz
Game Frequency: 2000 MHz
Connections: DisplayPort 2.1, HDMI 2.1, USB Type-C
Rendering: HDMI 4K Support, 4K H264 Decode, 4K H264 Encode, H265/HEVC Decode, H265/HEVC Encode, AV1 Decode, AV1 Encode
Typical Board Power (Desktop): 300 W
Minimum PSU Recommendation: 750 W
Length: 276 mm
Slot Size: 2.5 slots
Pricing: $899

AMD Radeon RX 7900 XTX
Memory: 24 GB GDDR6
Infinity Cache: 96 MB
Ray Accelerators: 96
Base Frequency: 1900 MHz
Boost Frequency: Up to 2500 MHz
Game Frequency: 2300 MHz
Connections: DisplayPort 2.1, HDMI 2.1, USB Type-C
Rendering: HDMI 4K Support, 4K H264 Decode, 4K H264 Encode, H265/HEVC Decode, H265/HEVC Encode, AV1 Decode, AV1 Encode
Typical Board Power (Desktop): 355 W
Minimum PSU Recommendation: 800 W
Length: 287 mm
Slot Size: 2.5 slots
Pricing: $999

They also teased FSR 3, due out next year, but didn't go into much detail on it. According to AMD, FSR 3 is "expected to deliver up to 2X more FPS compared to AMD FSR 2 in select games".

  • AMD RDNA 3 Architecture – Featuring an advanced chiplet design, new compute units and second-generation AMD Infinity Cache technology, AMD RDNA 3 architecture delivers up to 54% more performance per watt than the previous-generation AMD RDNA 2 architecture. New compute units share resources between rendering, AI and raytracing to make the most effective use of each transistor for faster, more efficient performance than the previous generation.
  • Chiplet Design – The world’s first gaming GPU with a chiplet design delivers up to 15% higher frequencies at up to 54% better power efficiency. It includes the new 5nm 306mm² Graphics Compute Die (GCD) with up to 96 compute units that provide the core GPU functionality. It also includes six of the new 6nm Memory Cache Dies (MCDs) at 37.5mm², each with up to 16MB of second-generation AMD Infinity Cache technology.
  • Ultra-Fast Chiplet Interconnect – Unleashing the benefits of second-generation AMD Infinity Cache technology, the new chiplets leverage AMD Infinity Links and high-performance fanout packaging to deliver up to 5.3TB/s of bandwidth.
  • Expanded Memory and Wider Memory Bus – To meet the growing requirements of today’s demanding titles, the new graphics cards feature up to 24GB of high-speed GDDR6 memory running at 20Gbps over a 384-bit memory bus.
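
As a quick sanity check of those memory numbers (my own back-of-the-envelope arithmetic, using only the figures quoted above), the 384-bit bus at 20Gbps works out to the XTX's raw memory bandwidth, and six 16MB MCDs account for its 96MB of Infinity Cache:

# Back-of-the-envelope check of the memory figures quoted above.
# Peak GDDR6 bandwidth = bus width (bits) x data rate (Gbps per pin) / 8 bits per byte.

def gddr6_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# RX 7900 XTX: 384-bit bus at 20 Gbps, as quoted in the press release bullet above.
print(gddr6_bandwidth_gb_s(384, 20.0))  # 960.0 GB/s

# Six MCDs with 16 MB of second-gen Infinity Cache each match the XTX's 96 MB total.
print(6 * 16)  # 96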

Based on the pricing, they seem like pretty great value to me. Having a flagship under $1K is a very good move when compared to what NVIDIA are offering. If the performance is in any way comparable, it should sell quite well.

From the press release: “These new graphics cards are designed by gamers for gamers. As we were developing the new cards, we not only incorporated feedback from our customers, but we built in the features and capabilities we wanted to use,” said Scott Herkelman, senior vice president & general manager, Graphics Business Unit at AMD. “We also realized that we needed to do something different to continue pushing the envelope of the technology, and I’m proud of what the team has accomplished with AMD RDNA 3 and the Radeon RX 7900 Series graphics cards. I can’t wait for gamers to experience the powerhouse performance, incredibly vivid visuals and amazing new features these new graphics cards offer.”

Full event can be seen below:

[Embedded YouTube video: the full AMD "together we advance_gaming" presentation]

Also, it's still fun to see the Steam Deck pictured at such events. AMD made the APU, so it's only natural for them to highlight it, but it's nice to see it again like this for a device that's doing so much for Linux gaming as a whole.

73 comments

TheRiddick Nov 4, 2022
Well, we are here gaming on Linux, aren't we? So Windows isn't my consideration :) I obviously evaluate it all from the perspective of Linux gaming and choose hardware with that in mind.

Well, many people do run Windows via VM passthrough on Linux.
The point I was trying to make is you should consider how well said game and hardware runs on drivers that aren't prototype/experimental before deciding on things (I'm talking about Windows drivers here).

Because RADV and AMDVLK are not perfect and will give you quite a skewed perspective on how said hardware should run. It also gives you an idea of how much further the Linux drivers need to come before the hardware meets its true potential under Linux.

You're going to see lots of benchmarks and performance figures over the coming weeks, and most will be Windows-based. When the Phoronix benchmarks come out, you should compare them in order to understand the issues. Imagine buying a 7900XT and having it perform like a 5700XT because, well, the drivers suck...
Shmerl Nov 4, 2022
Obviously Windows benchmarks don't tell the story of how it is on Linux. Ideally we would have native Linux games using Vulkan, but that's still a rarity. Most high-end games like that are Windows only. But I'm not going to start using Windows because of that :)

Maybe when Star Citizen releases a Linux version with its Vulkan renderer, there will be some heavy native game to test.

And if the 7900XT is better than the 6800XT, then why not. Drivers are already good and are improving. The translation hit will still be around, so that should be taken into account.


Last edited by Shmerl on 4 November 2022 at 7:13 am UTC
TheRiddick Nov 4, 2022
There are some native Linux Vulkan games with high-end graphics, but nothing new from memory. And some are using built-in translators, so it can be hard to tell unless Vulkan is an API choice under Windows as well, but even then under Windows they can just use DX11 translators to offer Vulkan.

Anyway, when I bought my 6800XT it was capable of 4K 60fps no problem for the majority of my games; the issue came when I ran those games under Linux and had performance hits of up to 40% in some cases, or other random graphical issues.

I generally don't buy things based on prototype driver scores, it's a bad way to do it.
Just gotta hope performance will eventually catch up, and I think it has quite a bit, though it's still not great in all things.

Bottom line is I'm not going to be looking at how well the 7900XTX performs under Linux... it's probably going to be all sorts of broken on release with the Linux drivers, or have performance issues.
Just gotta look at how well it does under stable Windows drivers and hope for a better future under Linux... eventually.


Last edited by TheRiddick on 4 November 2022 at 7:14 am UTC
Shmerl Nov 4, 2022
Well, when I buy GPUs / monitors I do it based on Linux performance, not Windows, and kind of estimate how certain games would work. That's why I'm not in a rush to buy 4K screens :)
TheRiddick Nov 4, 2022
Well, like I said, most times it's fine. 4K OLED 120Hz is a game changer, however. But yeah, the performance struggles with some games really can make it not worth it sometimes.
Shmerl Nov 4, 2022
I think in a couple more GPU generations 4K will become feasible with good framerates without upscaling.

By that time they'll be marketing 16K maybe, lol. But who cares ;)


Last edited by Shmerl on 4 November 2022 at 7:26 am UTC
TheRiddick Nov 4, 2022
I think in a couple more GPU generations 4K will become feasible with good framerates without upscaling.

It already is. Just like I said, some games can't be fixed; they're broken and don't perform well at higher resolutions. Even KCD runs just fine at 4K with specially optimized presets on my 6800XT... but again, that is under Windows; under Linux there seems to be a limit to what DXVK can do, or the drivers.
Shmerl Nov 4, 2022
I expect translated games to stay around. Maybe some native Linux games will appear, but many will still be Windows only. So possibly Linux considerations for hardware will always be different from Windows on some level.

That doesn't just apply to the potential target framerate at a given resolution. For instance, VRAM requirements on Linux are generally higher than on Windows in the case of vkd3d-proton.


Last edited by Shmerl on 4 November 2022 at 7:34 am UTC
raptor85 Nov 4, 2022
I think in a couple more GPU generations 4K will become feasible with good framerates without upscaling.

By that time they'll be marketing 16K maybe, lol. But who cares ;)
4K runs just fine on current gen in most games; my 2070 Super can hold 60fps in most games easily, even without DLSS (though it's pretty borderline on many, so I tend to keep it at 1440p to maintain 120+ fps). 3000/4000 series or 6800 series and up shouldn't have any issues for most, unless you've got some config problems or the game just has terrible optimization.

VR with the Index is 2880x1600 but requires 144fps to not be vomit-inducing, so it actually takes a bit more power to render than 4K/60.


Last edited by raptor85 on 4 November 2022 at 9:13 am UTC
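
A rough back-of-the-envelope version of that pixel-rate comparison, counting raw pixels per second only (which ignores VR-specific overhead like rendering two views, so treat it as a loose lower bound):

# Rough pixel-throughput comparison: Valve Index (2880x1600 combined at 144 Hz)
# versus a 4K display at 60 Hz. Raw pixel rate only; ignores per-frame overhead.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

index_vr = pixels_per_second(2880, 1600, 144)  # 663,552,000 pixels/s
uhd_60 = pixels_per_second(3840, 2160, 60)     # 497,664,000 pixels/s

print(index_vr / uhd_60)  # ~1.33, i.e. roughly a third more raw pixel work than 4K/60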
STiAT Nov 4, 2022
Cards should get a rating on how loud they are.

I wouldn't buy a reference card, noise being one of the reasons. AMD partners like Sapphire generally make better cooled and more silent designs since they need to differentiate in something.

That's exactly why I'd like that information ;-). Not only for the reference design but for all cards, since for me that can be an argument for a card. A big one, to be honest.
Arehandoro Nov 4, 2022
Am I the only one who thinks that, even if AMD's new cards are better than Nvidia's on power consumption, 300/355 W TDP is still crazy high?

With climate change creeping up on us, current international political affairs affecting prices and availability of energy, and the market saying yes to devices like the Switch and the Deck, perhaps it's about time to demand these companies release things more aligned with what the world needs.

We need to use much less energy. Not the same, and definitely not more.
lejimster Nov 4, 2022
I was watching Wendell's live stream from Level1Techs. They were at the launch event and there was a free bar. One of the tech reviewers came over for a chat and accidentally leaked that the cards had been designed to boost up to 3GHz. So I expect AIB cards to be clocked more aggressively.
Sakuretsu Nov 4, 2022
I couldn't care less about those stupid halo products but setting the price ceiling to $1000 is much better than what NVIDIA's doing.

At least we won't see a 70 class card from AMD costing $900.
StalePopcorn Nov 4, 2022
These cards look exciting but I'll have to wait for reviews because I'm already rocking a 6800XT that I'm quite happy with.

LTT has a well-educated video on AMD's announcement entitled 'Goodbye NVIDIA!'

Also, as pointed out in the video, the 4090 only supports DisplayPort 1.4a while AMD's new offerings support 2.1.


Last edited by StalePopcorn on 4 November 2022 at 5:34 pm UTC
Mountain Man Nov 4, 2022
With more and larger fans on these video cards, we're going to need to tie our PCs down to keep them from going airborne.
Shmerl Nov 4, 2022
though it's pretty borderline on many so I tend to keep it at 1440p to maintain 120+ fps

That's exactly the point. Why pay for a 4K screen if you can't run many games at a high enough framerate and need to lower the resolution? To me it would feel like paying for something and not using it. That's why I think 2560x1440 is still the optimal resolution that GPUs can handle currently.


Last edited by Shmerl on 4 November 2022 at 4:40 pm UTC
denyasis Nov 4, 2022
Cards should get a rating on how loud they are.

I wouldn't buy a reference card, noise being one of the reasons. AMD partners like Sapphire generally make better cooled and more silent designs since they need to differentiate in something.

That's exactly why I'd like that information ;-). Not only for the reference design but for all cards, since for me that can be an argument for a card. A big one, to be honest.

I might also add airflow requirements for the cooler, plz? That would be nice to know. I build ITX boxes and getting enough air to these things can be a pain.
raptor85 Nov 4, 2022
though it's pretty borderline on many so I tend to keep it at 1440p to maintain 120+ fps

That's exactly the point. Why pay for a 4K screen if you can't run many games at a high enough framerate and need to lower the resolution? To me it would feel like paying for something and not using it. That's why I think 2560x1440 is still the optimal resolution that GPUs can handle currently.
Movies/streaming, work, there ARE other things we do on computers besides games you know :P

Also remember my card is 2 full generations behind and not even the highest end of its generation, and it can run 4K 99% of the time just fine; a 3xxx or 4xxx series or any newer Radeon should do 4K just fine at 60/120fps.
Shmerl Nov 4, 2022
Movies/streaming, work, there ARE other things we do on computers besides games you know :P

I'm fine with 2560x1440 for that. You can't have a separate monitor for every use case, so I think that's still the optimum, while 4K is just ahead of the current generation of GPUs when it comes to gaming.

99% is not my experience. If I were getting ≥ 144 fps with my current GPU and resolution, then I'd consider a higher-res screen.

And as above, for games I care about I'm not even getting 100 fps so far.


Last edited by Shmerl on 4 November 2022 at 6:35 pm UTC
raptor85 Nov 4, 2022
Am I the only one who thinks that, even if AMD's new cards are better than Nvidia's on power consumption, 300/355 W TDP is still crazy high?

With climate change creeping up on us, current international political affairs affecting prices and availability of energy, and the market saying yes to devices like the Switch and the Deck, perhaps it's about time to demand these companies release things more aligned with what the world needs.

We need to use much less energy. Not the same, and definitely not more.
There are still cards at 100 watts and under widely available on the market; nobody's putting a gun to your head forcing you to get a higher-end card. An extra 200 watts is also not really a lot compared to most other things in a normal household; you can literally offset that usage by not watching TV while the computer is on.