Today at the AMD "together we advance_gaming" event, AMD revealed their new RDNA3 architecture along with the RX 7900 XTX and RX 7900 XT GPUs. Both of these new cards will be available on December 13th, and AMD threw plenty of shade at NVIDIA over power draw and connector issues during the event, talking about how "easy" it is to upgrade to the new cards.
Specifications:

| | AMD Radeon RX 7900 XT | AMD Radeon RX 7900 XTX |
| --- | --- | --- |
| Memory | 20 GB GDDR6, 80 MB Infinity Cache, 84 Ray Accelerators | 24 GB GDDR6, 96 MB Infinity Cache, 96 Ray Accelerators |
| Speed | Base 1500 MHz, Boost up to 2400 MHz, Game 2000 MHz | Base 1900 MHz, Boost up to 2500 MHz, Game 2300 MHz |
| Connections | DisplayPort 2.1, HDMI 2.1, USB Type-C | DisplayPort 2.1, HDMI 2.1, USB Type-C |
| Rendering | 4K HDMI support; 4K H264 decode/encode; H265/HEVC decode/encode; AV1 decode/encode | 4K HDMI support; 4K H264 decode/encode; H265/HEVC decode/encode; AV1 decode/encode |
| Power | Typical board power 300 W, minimum PSU 750 W | Typical board power 355 W, minimum PSU 800 W |
| Dimensions | Length 276 mm, 2.5 slots | Length 287 mm, 2.5 slots |
| Pricing | $899 | $999 |
They also teased FSR3, due out next year, but didn't go into much detail on it. According to AMD, FSR3 is "expected to deliver up to 2X more FPS compared to AMD FSR 2 in select games".

Some highlights from the announcement:
- AMD RDNA 3 Architecture – Featuring an advanced chiplet design, new compute units and second-generation AMD Infinity Cache technology, AMD RDNA 3 architecture delivers up to 54% more performance per watt than the previous-generation AMD RDNA 2 architecture. New compute units share resources between rendering, AI and raytracing to make the most effective use of each transistor for faster, more efficient performance than the previous generation.
- Chiplet Design – The world’s first gaming GPU with a chiplet design delivers up to 15% higher frequencies at up to 54% better power efficiency. It includes the new 5nm 306mm² Graphics Compute Die (GCD) with up to 96 compute units that provide the core GPU functionality. It also includes six of the new 6nm Memory Cache Dies (MCD) at 37.5mm² each, with up to 16MB of second-generation AMD Infinity Cache technology.
- Ultra-Fast Chiplet Interconnect – Unleashing the benefits of second-generation AMD Infinity Cache technology, the new chiplets leverage AMD Infinity Links and high-performance fanout packaging to deliver up to 5.3TB/s of bandwidth.
- Expanded Memory and Wider Memory Bus – To meet the growing requirements of today’s demanding titles, the new graphics cards feature up to 24GB of high-speed GDDR6 memory running at 20Gbps over a 384-bit memory bus.
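Those memory bullets are easy to sanity-check: GDDR6 bandwidth is just the per-pin speed times the bus width, and the Infinity Cache total is six MCDs at 16MB each. A minimal sketch of my own back-of-the-envelope check (not from AMD's materials):

```python
# Back-of-the-envelope check of the RX 7900 XTX memory figures quoted above.
pin_speed_gbps = 20    # GDDR6 data rate per pin (from the announcement)
bus_width_bits = 384   # memory bus width (from the announcement)

# Bandwidth = per-pin rate x bus width, converted from gigabits to gigabytes.
bandwidth_gb_s = pin_speed_gbps * bus_width_bits / 8
print(f"GDDR6 bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 960 GB/s

# Second-generation Infinity Cache: six MCDs with up to 16 MB each.
print(f"Infinity Cache: {6 * 16} MB")  # 96 MB, matching the specs table
```

Note that the 5.3TB/s figure AMD quotes is the chiplet interconnect between the GCD and the MCDs, not the GDDR6 bus; the two numbers describe different links.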
Based on the pricing, they seem like pretty great value to me. Having a flagship under $1K is a very good move when compared to what NVIDIA are offering. If the performance is in any way comparable, it should sell quite well.
From the press release: “These new graphics cards are designed by gamers for gamers. As we were developing the new cards, we not only incorporated feedback from our customers, but we built in the features and capabilities we wanted to use,” said Scott Herkelman, senior vice president & general manager, Graphics Business Unit at AMD. “We also realized that we needed to do something different to continue pushing the envelope of the technology, and I’m proud of what the team has accomplished with AMD RDNA 3 and the Radeon RX 7900 Series graphics cards. I can’t wait for gamers to experience the powerhouse performance, incredibly vivid visuals and amazing new features these new graphics cards offer.”
Full event can be seen below:
Also, it's still fun to see the Steam Deck pictured at events like this. AMD made the APU, so it's only natural for them to highlight it, but it's nice to see it again for a device that's doing so much for Linux gaming as a whole.
Quoting: Shmerl
And as above, for games I care about I'm not even getting 100 fps so far.

Driver/Wayland issues maybe? (I'm using Xorg, as it's still where games perform best, and I'm not sold on Wayland yet; it still has way too many issues.) There's no reason your card, which is newer than mine, should be getting worse FPS in games; I'm often breaking 200fps at 1440p.
Last edited by raptor85 on 4 November 2022 at 6:48 pm UTC
Quoting: raptor85
Quoting: Shmerl
And as above, for games I care about I'm not even getting 100 fps so far.
Driver/Wayland issues maybe? There's no reason your card, which is newer than mine, should be getting worse FPS in games.
No, games like Cyberpunk 2077 or Star Citizen are just very heavy / demanding. So I use this as a metric for target framerate / resolution balance, and 4K would clearly not be optimal for that yet :)
But I agree, a lot of that depends on the games. For less demanding ones it's less of an issue.
Last edited by Shmerl on 4 November 2022 at 6:51 pm UTC
Quoting: Shmerl
...
No, games like Cyberpunk 2077 or Star Citizen are just very heavy / demanding. So I use this as a metric for target framerate / resolution balance, and 4K would clearly not be optimal for that yet :)
But I agree, a lot of that depends on the games. For less demanding ones it's less of an issue.

Interesting, I'm more into Death Stranding, Elden Ring, FF15, SMT, Monster Hunter, Resident Evil, etc. as far as AAA games from the past two years go, and while they're all pretty demanding they all run great (Elden Ring and Monster Hunter I bump to 1440p; both have weird spots that lag, which sucks because 99% of the game works just fine at 4K :/). I've heard horror stories about Cyberpunk being poorly optimized though, so that's probably part of it :/
Quoting: raptor85...
There's still 100 watt and less cards on the market widely available, nobody's putting a gun to your head forcing you to get a higher end card.
...
I'm sure people in Pakistan are happy to know that nobody put a gun to our heads to buy goods with high energy consumption.
Quoting: raptor85
Quoting: Arehandoro
Am I the only one to think that even if AMD's new cards are better than Nvidia's on power consumption, a 300/355 W TDP is still crazy high?
There's still 100 watt and less cards on the market widely available, nobody's putting a gun to your head forcing you to get a higher end card. An extra 200 watts is also not really a lot compared to most other things in a normal household, you can literally offset that usage by not watching TV while the computer is on.

Averaged over the past year and rounded up generously, my entire single person household averaged 160 Watts. And I am confident I could reduce that by another 25% without too much effort.
With climate change creeping up on us, current international political affairs affecting energy prices and availability, and the market saying yes to devices like the Switch and the Deck, perhaps it's about time to demand that these companies release things more aligned with what the world needs.
We need to use much less energy. Not the same, and definitely not more.
Peak usage is undeniably higher when the microwave/tea kettle/washing machine are on, but they aren't on most of the time. Top consumers are the PC and the fridge, with the PC sitting at a whopping 80 W right now, and hitting 200 W with heavy gaming, which I admittedly don't do that much these days; most games get by on 150ish W.
I don't play the likes of CP77 or twitchy shooter type games, I had my fill of that crap when I was younger, but some of the sims I like do look pretty and have their respective cost in wall power.
So in summary, even "just" two hundred watts for a GPU alone is insane, as is a thousand credits to pay for it, and nobody needs to commend AMD or Nvidia for not going higher still.
Consumers and marketing people alike need to get off that SUV, and fast. In fact, DO take the train the next time.
ED: To be fair, that number doesn't include heating, which is still fossil gas and would probably amount to about the same 150ish figure again, and to bring that down in any meaningful way would require a massive investment in insulation on the landlord's side. Which I'm fairly certain will be coming, if and when legislation eventually requires it. So we're talking 350ish Watts instead, but still. That's one entire household's worth, just for the GPU.
Last edited by Valck on 5 November 2022 at 3:36 am UTC
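To put the wattage figures in this comment and the replies below on a common footing: a continuous average draw in watts converts to annual energy by simple multiplication. A quick sketch using the numbers from the thread (the regional comparison afterwards is my own addition):

```python
# Convert a continuous average draw (W) into annual energy (kWh/year).
HOURS_PER_YEAR = 24 * 365

def annual_kwh(average_watts: float) -> float:
    return average_watts * HOURS_PER_YEAR / 1000

print(f"{annual_kwh(160):.0f} kWh/year")   # Valck's 160 W average -> ~1400 kWh/year
print(f"{annual_kwh(1200):.0f} kWh/year")  # raptor85's 1200 W US figure -> ~10500 kWh/year
```

For what it's worth, both figures are plausible for their respective regions: ~10,500 kWh/year is close to the commonly cited US household average, while ~1,400 kWh/year is a normal range for a single-person household in much of Europe, so the disagreement below is largely about what counts as "average".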
A full household at 160 watts? Unless you live like the middle ages I'm not sure how that's possible, the average household in the US uses 1200 watts when not at peak; at 150 watts you could literally run your entire house off a single small solar panel and a car battery... hell, you could run off just the car battery for 10 hours a day and just swap it out daily (not even a full size panel, more like a 1/4 size panel). Furnaces aren't "another 150 watts": the blower alone is around 800 watts (it can peak over 2000 during initial startup), assuming yours is gas. Do you not have any appliances? A refrigerator uses far more than that alone, a central fan uses almost 800, etc. The only way you average 160 is if you're offsetting your usage with home solar or you do nothing but sit around reading books by candlelight all day.

No, I'm thinking you dropped a 0 and you mean 1600, which is closer to average.
Last edited by raptor85 on 5 November 2022 at 5:13 am UTC
Quoting: raptor85
A full household at 160 watts? Unless you live like the middle ages I'm not sure how that's possible... A refrigerator uses far more than that alone, a central fan uses almost 800, etc...
No, I'm thinking you dropped a 0 and you mean 1600, which is closer to average.
To be frank, if my fridge/freezer used 200 W I'd throw it in a skip, and if yours does I suggest you do the same! Hell, I have a 48 inch OLED that only uses 83 W in use and 0.5 W on standby.
Most houses during the day will not consume that much, as most people are out the majority of the day at work etc.; lights are off, so bar the fridge and the like very little would be on by default.
Let's be fair here, the vast majority of households now use LED lighting, so it's not as if houses will have 10 rooms lit constantly with 60 W bulbs in each burning all evening/night.
OK, you could say "but the kettle/toaster". True, but they're on for 5 mins at most, not several hours like a computer you are using to play a game.
Don't get me wrong, people have washers on etc. that use far more than 160 W, but that's not all day every day for the most part.
Quoting: Valck
...
So in summary, even "just" two hundred watts for a GPU alone is insane, as is a thousand credits to pay for it, and nobody needs to commend AMD or Nvidia for not going higher still.
...
On the other hand, the card uses 300 watts at peak, not during idle. My RX480 is rated to draw 100 W, but browsing here right now with a YT video playing in the background it draws about 7-10 W, and it's very likely that the RX7900XTX would draw even less under the same load.
Last edited by F.Ultra on 5 November 2022 at 6:17 pm UTC
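F.Ultra's point generalises: for the electricity bill, what matters is the time-weighted average draw, not the peak board power. A tiny illustration; the idle and peak draws are taken from the comment above, but the hour split is purely an assumption for the sake of the example:

```python
# Time-weighted average draw over a day of mixed use.
usage = [
    (10, 4),   # ~10 W light desktop use (browsing, video); 4 h assumed
    (300, 2),  # ~300 W peak gaming load; 2 h assumed
    (0, 18),   # powered off or suspended the rest of the day
]
avg_watts = sum(watts * hours for watts, hours in usage) / 24
print(f"average draw: {avg_watts:.0f} W")  # ~27 W, despite the 300 W peak rating
```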
Quoting: raptor85A refrigerator uses far more than that alone,Just FIY.
I have a (10 year old) 367 litre / 80 gallon fridge measured at 18W average.
Also keep in mind that a smaller household (one or two people) will probably have an appliance half that size.
Energy costs here in Australia (especially SA) are getting out of hand, and the government is starting to panic (you know, the guys loaded with cash who basically never need to worry about utility bills!).
I already pay 77 cents a day (about 50c USD) just to have the house connected to the grid; if I turned everything off and died, I'd still be paying 77c a day! (Gas is an entirely different per-day charge on top.)
The average kWh cost is 34c btw, which is down from something like 49c thanks to subsidies.
Last edited by TheRiddick on 6 November 2022 at 12:36 am UTC
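Using TheRiddick's 34c/kWh tariff, it's easy to put a rough dollar figure on one of these cards. A sketch, where the three hours of full-load gaming a day is purely my own assumption:

```python
# Rough yearly running cost of a GPU at a given tariff.
def yearly_cost_aud(board_watts: float, hours_per_day: float, rate_per_kwh: float) -> float:
    kwh_per_year = board_watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# 355 W RX 7900 XTX board power, 3 h/day at full load (assumed), 0.34 AUD/kWh from the comment above
print(f"GPU: ~{yearly_cost_aud(355, 3, 0.34):.0f} AUD/year")  # ~132 AUD/year
print(f"Supply charge: ~{0.77 * 365:.0f} AUD/year")           # ~281 AUD/year just to stay connected
```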