
Today at the AMD "together we advance_gaming" event, AMD revealed their new RDNA3 architecture along with the RX 7900 XTX and RX 7900 XT GPUs. Both of these new cards will be available on December 13th, and AMD threw plenty of shade at NVIDIA over power use and connector issues during the event, talking about how "easy" it is to upgrade to the new cards.


Pictured: RX 7900 XTX

Specifications:

|  | AMD Radeon RX 7900 XT | AMD Radeon RX 7900 XTX |
| --- | --- | --- |
| Memory | 20 GB GDDR6, 80 MB Infinity Cache, 84 Ray Accelerators | 24 GB GDDR6, 96 MB Infinity Cache, 96 Ray Accelerators |
| Speed | Base 1500 MHz, Boost up to 2400 MHz, Game 2000 MHz | Base 1900 MHz, Boost up to 2500 MHz, Game 2300 MHz |
| Connections | DisplayPort 2.1, HDMI 2.1, USB Type-C | DisplayPort 2.1, HDMI 2.1, USB Type-C |
| Rendering | HDMI 4K support; 4K H264 decode/encode, H265/HEVC decode/encode, AV1 decode/encode | HDMI 4K support; 4K H264 decode/encode, H265/HEVC decode/encode, AV1 decode/encode |
| Power | Typical Board Power (Desktop) 300 W, minimum PSU recommendation 750 W | Typical Board Power (Desktop) 355 W, minimum PSU recommendation 800 W |
| Dimensions | Length 276 mm, 2.5 slots | Length 287 mm, 2.5 slots |
| Pricing | $899 | $999 |

They also teased FSR3, due out next year, though they didn't go into much detail on it. According to AMD, FSR3 is "expected to deliver up to 2X more FPS compared to AMD FSR 2 in select games".

  • AMD RDNA 3 Architecture – Featuring an advanced chiplet design, new compute units and second-generation AMD Infinity Cache technology, AMD RDNA 3 architecture delivers up to 54% more performance per watt than the previous-generation AMD RDNA 2 architecture. New compute units share resources between rendering, AI and raytracing to make the most effective use of each transistor for faster, more efficient performance than the previous generation.
  • Chiplet Design – The world’s first gaming GPU with a chiplet design delivers up to 15% higher frequencies at up to 54% better power efficiency. It includes the new 5nm 306mm² Graphics Compute Die (GCD) with up to 96 compute units that provide the core GPU functionality. It also includes six of the new 6nm Memory Cache Die (MCD) at 37.5mm², each with up to 16MB of second-generation AMD Infinity Cache technology.
  • Ultra-Fast Chiplet Interconnect – Unleashing the benefits of second-generation AMD Infinity Cache technology, the new chiplets leverage AMD Infinity Links and high-performance fanout packaging to deliver up to 5.3TB/s of bandwidth.
  • Expanded Memory and Wider Memory Bus – To meet the growing requirements of today’s demanding titles, the new graphics cards feature up to 24GB of high-speed GDDR6 memory running at 20Gbps over a 384-bit memory bus.
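Those last two bullet points pin down the memory subsystem numerically. As a quick back-of-the-envelope check, here is a minimal Python sketch using the figures above; note the aggregate bandwidth number is derived from the bus width and pin speed, not quoted by AMD:

```python
# Back-of-the-envelope check of the memory figures quoted above (RX 7900 XTX).
bus_width_bits = 384   # memory bus width, from the press release
pin_speed_gbps = 20    # GDDR6 per-pin data rate, from the press release

# Aggregate bandwidth: total bits per second across the bus, divided by 8 for bytes.
bandwidth_gb_s = bus_width_bits * pin_speed_gbps / 8
print(f"GDDR6 bandwidth: {bandwidth_gb_s:.0f} GB/s")   # 960 GB/s (derived, not quoted)

# Infinity Cache: six MCDs at up to 16 MB each matches the 96 MB in the spec table.
mcd_count, cache_per_mcd_mb = 6, 16
print(f"Infinity Cache: {mcd_count * cache_per_mcd_mb} MB")  # 96 MB
```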

Based on the pricing, they seem like pretty great value to me. Having a flagship under $1K is a very good move compared to what NVIDIA are offering. If the performance is in any way comparable, they should sell quite well.

From the press release: “These new graphics cards are designed by gamers for gamers. As we were developing the new cards, we not only incorporated feedback from our customers, but we built in the features and capabilities we wanted to use,” said Scott Herkelman, senior vice president & general manager, Graphics Business Unit at AMD. “We also realized that we needed to do something different to continue pushing the envelope of the technology, and I’m proud of what the team has accomplished with AMD RDNA 3 and the Radeon RX 7900 Series graphics cards. I can’t wait for gamers to experience the powerhouse performance, incredibly vivid visuals and amazing new features these new graphics cards offer.”

Full event can be seen below:

Video: the full "together we advance_gaming" event

Also, it's still fun to see the Steam Deck pictured at such events. AMD made the APU, so it's only natural for them to highlight it, but it's nice to see it again like this for a device that's helping to do so much for Linux gaming as a whole.

73 comments
raptor85 Nov 4, 2022
And as above, for games I care about I'm not even getting 100 fps so far.
Driver/Wayland issues maybe? (I'm using Xorg, as it's still where games perform best, and I'm not sold on Wayland yet; it still has way too many issues.) There's no reason your card, which is newer than mine, should be getting worse FPS in games; I'm often breaking 200fps at 1440p.


Last edited by raptor85 on 4 November 2022 at 6:48 pm UTC
Shmerl Nov 4, 2022
And as above, for games I care about I'm not even getting 100 fps so far.
Driver/wayland issues maybe? There's no reason your card, which is newer than mine, should be getting worse FPS in games.

No, games like Cyberpunk 2077 or Star Citizen are just very heavy / demanding. So I use this as a metric for target framerate / resolution balance, and 4K would clearly not be optimal for that yet :)

But I agree, a lot of that depends on the games. For less demanding ones it's less of an issue.


Last edited by Shmerl on 4 November 2022 at 6:51 pm UTC
raptor85 Nov 4, 2022
...
No, games like Cyberpunk 2077 or Star Citizen are just very heavy / demanding. So I use this as a metric for target framerate / resolution balance, and 4K would clearly not be optimal for that yet :)
...
Interesting, I'm more into Death Stranding, Elden Ring, FF15, SMT, Monster Hunter, Resident Evil, etc. as far as AAA games from the past two years go, and while they're all pretty demanding they all run great (Elden Ring and Monster Hunter I bump to 1440p; both have weird spots that lag, which sucks because 99% of the game works just fine at 4K :/). I've heard horror stories about Cyberpunk being poorly optimized though, so that's probably part of it :/
emphy Nov 4, 2022
...
There are still 100 watt and less cards widely available on the market; nobody's putting a gun to your head forcing you to get a higher end card.
...

I'm sure people in Pakistan are happy to know that nobody put a gun to our heads to buy goods with high energy consumption.
Valck Nov 5, 2022
Am I the only one who thinks that, even if AMD's new cards are better than Nvidia's on power consumption, a 300/355 W TDP is still crazy high?

With climate change creeping up on us, current international political affairs affecting the prices and availability of energy, and the market saying yes to devices like the Switch and the Deck, perhaps it's about time to demand these companies release things more aligned with what the world needs.

We need to use much less energy. Not the same, and definitely not more.
There are still 100 watt and less cards widely available on the market; nobody's putting a gun to your head forcing you to get a higher end card. An extra 200 watts is also not really a lot compared to most other things in a normal household; you can literally offset that usage by not watching TV while the computer is on.
Averaged over the past year and rounded up generously, my entire single-person household averaged 160 watts. And I am confident I could reduce that by another 25% without too much effort.
Peak usage is undeniably higher when the microwave/tea kettle/washing machine are on, but they aren't for most of the time. Top consumers are the PC and the fridge, with the PC sitting at a whopping 80 W right now, and hitting 200 W with heavy gaming, which I admittedly don't do that much these days; most of the games get by on 150ish or so.
I don't play the likes of CP77 or twitchy shooter type games, I had my fill of that crap when I was younger, but some of the sims I like do look pretty and have their respective cost in wall power.

So in summary, even "just" two hundred watts just for a GPU is insane, as is the thousand credits to pay for it, and nobody needs to commend AMD or Nvidia for not going higher still.
Consumers and marketing people alike need to get off that train (or rather, SUV), and fast. In fact, DO take the train next time.

ED: To be fair, that number doesn't include heating, which is still fossil gas and would probably amount to about the same 150ish figure again; bringing that down in any meaningful way would require a massive investment in insulation on the landlord's side. Which I'm fairly certain will be coming, if and when legislation eventually requires it. So we're talking 350ish watts instead, but still. That's one entire household's worth, just for the GPU.


Last edited by Valck on 5 November 2022 at 3:36 am UTC
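For scale, converting a steady average draw into annual energy is simple arithmetic. A minimal Python sketch, using the 160 W household figure above and, purely as a hypothetical worst case, the 7900 XTX's 355 W board power running around the clock:

```python
# Convert a continuous average power draw (watts) into energy used per year (kWh).
def annual_kwh(average_watts: float) -> float:
    return average_watts * 24 * 365 / 1000

print(annual_kwh(160))  # ~1402 kWh/year: the single-person household described above
print(annual_kwh(355))  # ~3110 kWh/year: a 355 W GPU flat out 24/7 (unrealistic worst case)
```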
raptor85 Nov 5, 2022
A full household at 160 watts? Unless you live like it's the Middle Ages I'm not sure how that's possible; the average household in the US uses 1200 watts when not at peak. At 150 watts you could literally run your entire house off a single small solar panel and a car battery... hell, you could run off just the car battery for 10 hours a day and swap it out daily (not even a full size panel, like a 1/4 size panel). Furnaces aren't "another 150 watts": the blower alone is around 800 watts (it can peak over 2000 during initial startup), assuming yours is gas. Do you not have any appliances? A refrigerator alone uses far more than that, a central fan uses almost 800, etc... The only way you average 160 is if you're offsetting your usage with home solar, or you do nothing but sit around reading books by candlelight all day.

No, I'm thinking you dropped a 0 and you mean 1600, which is closer to average.


Last edited by raptor85 on 5 November 2022 at 5:13 am UTC
pete910 Nov 5, 2022
A full household at 160 watts? Unless you live like it's the Middle Ages I'm not sure how that's possible...

No, I'm thinking you dropped a 0 and you mean 1600, which is closer to average.


To be frank, if my fridge/freezer used 200W I'd throw it in a skip, and if yours does I suggest you do the same! Hell, I have a 48 inch OLED that only uses 83W in use and 0.5W on standby.

Most houses during the day will not consume that much, as most people are out at work etc for the majority of the day; lights are off, so bar the fridge etc very little would be on by default.
Let's be fair here, the vast majority of households now use LED lighting, so it's not as if houses have 10 rooms lit constantly with 60W bulbs in each burning all evening/night.

OK, you could say "but the kettle/toaster". True, but they're on for 5 mins at most, not several hours like a computer you're using to play a game.

Don't get me wrong, people have the washers on etc and those use far more than 160W, but that's not all day every day for the most part.
F.Ultra Nov 5, 2022
Am I the only one who thinks that, even if AMD's new cards are better than Nvidia's on power consumption, a 300/355 W TDP is still crazy high?
...
So we're talking 350ish watts instead, but still. That's one entire household's worth, just for the GPU.

On the other hand, the card uses 300 watts at peak, not during idle. My RX480 is rated to draw 100W, but currently, browsing here with a YT video playing in the background, it draws about 7-10W, and it's very likely that the RX 7900 XTX would draw even less under the same load.


Last edited by F.Ultra on 5 November 2022 at 6:17 pm UTC
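That point is easy to quantify: what matters for the bill is the duty cycle, not the peak rating. A rough sketch, using the ~10 W idle figure from the comment above and the 355 W board power from the spec table; the split of hours is an assumption:

```python
# Daily energy for a GPU that idles low and only draws board power while gaming.
IDLE_W = 10      # roughly the idle draw reported in the comment above
GAMING_W = 355   # RX 7900 XTX typical board power, from the spec table

def daily_kwh(gaming_hours: float) -> float:
    idle_hours = 24 - gaming_hours
    return (IDLE_W * idle_hours + GAMING_W * gaming_hours) / 1000

print(daily_kwh(0))  # 0.24 kWh: idling all day
print(daily_kwh(3))  # ~1.28 kWh: three hours at full board power
```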
whizse Nov 6, 2022
A refrigerator uses far more than that alone,
Just FYI.

I have a (10 year old) 367 litre / 80 gallon fridge measured at an 18W average.

Also keep in mind that a smaller household (one or two people) will probably have an appliance half that size.
TheRiddick Nov 6, 2022
You can't really average your power consumption out over a day; your power sockets NEED to be able to handle the PEAK with everything on at once. But if you run a lot of things off gas then you can easily claim a lower electricity consumption.

Energy costs here in Australia (especially SA) are getting out of hand, and the government is starting to panic (you know, the guys loaded with cash who basically never need to worry about utility bills!).

I already pay 77 cents a day (about 50c USD) just to have the house connected to the grid; if I turned everything off and died, I'd still be paying 77c a day! (Gas is an entirely different per-day charge on top.)

The average kWh cost is 34c btw, which is reduced down from like 49c or something with some subsidies.


Last edited by TheRiddick on 6 November 2022 at 12:36 am UTC
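That tariff splits into a fixed supply charge plus a usage rate, so the daily bill works out as in the sketch below, using the cents figures from the comment; the 10 kWh example day is an assumption:

```python
# Daily electricity cost under the tariff described above (Australian cents).
SUPPLY_C_PER_DAY = 77   # fixed charge just for being connected to the grid
RATE_C_PER_KWH = 34     # per-kWh usage rate after subsidies

def daily_cost_aud(kwh_used: float) -> float:
    return (SUPPLY_C_PER_DAY + RATE_C_PER_KWH * kwh_used) / 100

print(daily_cost_aud(0))   # A$0.77: the floor, even with everything switched off
print(daily_cost_aud(10))  # A$4.17 for an example 10 kWh day (assumed usage)
```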
denyasis Nov 7, 2022
Usage also depends on season and weather. In the fall/spring I can keep the AC and heat off (like I have for the past 2 months). I'm sub 800w/h a day, and that's with a full family and 20-hour-a-day occupation/use. It doubles in the summer with HVAC and window AC use (older units).

But that also includes charging the car, which is electric. Even though it's 10 years old, it's still cheaper than gas; the equivalent is $1 for a gallon in terms of electrical cost.
Valck Nov 8, 2022
No, I'm thinking you dropped a 0 and you mean 1600, which is closer to average.
Usage also depends on season and weather. In the fall/spring I can keep the AC and heat off (like I have for the past 2 months). I'm sub 800w/h a day, and that's with a full family and 20-hour-a-day occupation/use. It doubles in the summer with HVAC and window AC use (older units).

But that also includes charging the car, which is electric. Even though it's 10 years old, it's still cheaper than gas; the equivalent is $1 for a gallon in terms of electrical cost.
So let's see... half a year at 800 plus half a year at double that, i.e. 1600, is (800+1600)/2 = 1200; divided by "a full family", I'm guessing four?, gives 1200/4 = 300 watts, averaged over a year. Including the car. Which goes to show that economy of scale works here as well, who would have thought.
No, I don't think I have to be living in a mediaeval hovel to get to the 300–350ish numbers I gave earlier; in fact, I'm surprised how high they are in comparison. I don't have a car, nor do I need one, living in a functioning city with public transport and shops within walking distance (how long those will continue to exist is another matter entirely, though).

And yes, of course, peak usage; it's still mind-boggling that using such a beast easily doubles your total average consumption *for that time*. Plus the CPU is also bound to get a good workout, so doubling isn't even enough...
denyasis Nov 8, 2022
And there are some differences in usage between a family home and a single occupant. Even in my case, one would expect lower usage during day (work) hours and at night.

I will add, my usage might be a bit low for a household. My region has very, very cheap fossil fuel prices; it costs less to use natural gas in many cases compared to electric. So all of my heat-generating appliances use gas: the HVAC, oven, water heater, even the clothes dryer.

Efficiency matters a lot too. My home was purchased needing some work, so all our appliances are new and more efficient (I hope). The microwave stands at 1600W (according to its label); the fridge is about 780W (according to math).

I will say the biggest change was this summer when I had my house insulated. I've never lived in an insulated house before and the change was dramatic! It chopped about 30% off our electrical consumption (mostly AC), not to mention the temperature in the house was more stable: whereas the upstairs bedrooms used to be about 10°C hotter in the summer, with insulation it stuck to 3-5°C. I think it's also why I'm able to keep the heat and AC off longer this fall (not to mention the very pleasant temps here this year).

By my math, the savings will balance out the cost in less than 8 years.