
Today AMD formally revealed its next-generation Radeon GPUs powered by the RDNA 2 architecture, and it looks like they're going to give NVIDIA a thorough run for their money.

What was announced: the Radeon RX 6900 XT, Radeon RX 6800 XT and Radeon RX 6800, with the Radeon RX 6800 XT looking like a very capable GPU that sits right alongside NVIDIA's RTX 3080 while appearing to use less power. All three will support ray tracing as expected, with AMD adding a "high performance, fixed-function Ray Accelerator engine to each compute unit". However, we're still waiting on The Khronos Group to formally announce the finished vendor-neutral ray tracing extensions for Vulkan, which remain provisional (since March 2020), so for now DirectX Raytracing was all they mentioned.
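
Once drivers shipping the finalised extensions do arrive, you should be able to see whether your Vulkan driver exposes any ray tracing support from a terminal. A minimal sketch, assuming vulkaninfo (from the vulkan-tools package) is installed and that the final extensions keep the provisional VK_KHR_ray_tracing naming:

# List any ray tracing extensions the Vulkan driver reports
# (exact extension names are an assumption until Khronos finalises them)
$ vulkaninfo | grep -i ray_tracing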

Part of the big improvement in RDNA 2 comes from what they learned with Zen 3 and their new "Infinity Cache", a high-performance, last-level data cache they say "dramatically" reduces latency and power consumption while delivering higher performance than previous designs. AMD showed some of their own benchmarks during the event.

As always, it's worth waiting on independent benchmarks for the full picture as both AMD and NVIDIA like to cherry-pick what makes them look good of course.

Here are the key specifications:

                     RX 6900 XT    RX 6800 XT    RX 6800
Compute Units        80            72            60
Process              TSMC 7nm      TSMC 7nm      TSMC 7nm
Game clock (MHz)     2,015         2,015         1,815
Boost clock (MHz)    2,250         2,250         2,105
Infinity Cache (MB)  128           128           128
Memory               16GB GDDR6    16GB GDDR6    16GB GDDR6
TDP (Watts)          300           300           250
Price (USD)          $999          $649          $579
Available            08/12/2020    18/11/2020    18/11/2020

You shouldn't need to go buying a new case either: AMD say they had easy upgrades in mind, building these new GPUs for "standard chassis" with a length of 267mm and two standard 8-pin power connectors, designed to operate with existing enthusiast-class 650W-750W power supplies.

There was a big portion of the event dedicated to DirectX, which doesn't mean much for us, but what we've been able to learn from the benchmarks shown is that these are powerful cards that appear to match even NVIDIA's latest high-end consumer GPUs like the GeForce RTX 3080. So not only are AMD leaping over Intel with Ryzen 5000, they're also leaving NVIDIA out in the cold. Incredible to see how far AMD has surged in the last few years. This is what NVIDIA and Intel have needed: some strong competition.

How will their Linux support be? You're probably looking at the likes of Ubuntu 21.04 next April (or comparable distribution updates) for reasonable out-of-the-box support, thanks to newer Mesa drivers and an updated Linux kernel, but we will know a lot more once they actually release and can be tested.
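
If you want to check where your current install stands, the kernel and Mesa versions are easy to query. A minimal sketch, assuming glxinfo from the mesa-utils package is installed:

# Kernel version: initial RX 6000 series support landed in 5.9
$ uname -r
# Mesa version: the version string appears at the end of this line
$ glxinfo | grep "OpenGL version"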

As for what's next? AMD confirmed that RDNA 3 is well into the design stage, with GPUs powered by it expected before the end of 2022.

You can view the full event video on YouTube.


Additionally, if you missed it: AMD recently announced (October 27) that they will be acquiring chip designer Xilinx.

101 comments

mos Oct 29, 2020
Quoting: undeadbydawn
Quoting: mos
Cool. The only detail I'm currently interested in though is WHEN THE FECKING RDNA1 PRICES WILL GO DOWN

On an entirely selfish level, hopefully not before I sell my Red Devil
dreamer_ Oct 29, 2020
Quoting: Diable
I'll buy an AMD card when they have working Linux drivers available at launch.
I think you should qualify this statement: you will buy an AMD card *at launch* when the Linux drivers are available *at launch*. Otherwise, it doesn't make much sense.

Quoting: Diable
I'm not waiting six months for the Mesa guys to get their new cards working when NVIDIA has Linux drivers available for their new cards on day one. I'm not supporting a company that treats my OS of choice like a second-class citizen.
You've got it backwards: NVIDIA releases drivers on launch day because they don't care. If the driver is broken, they will release a fixed version after some time, perhaps, maybe. Right now the NVIDIA driver is broken on kernel 5.9, for example.

AMD does things correctly: it starts work on supporting new GPUs in the kernel early and improves it over time. Support for the RX 6000 series already landed in kernel 5.9 (released ~3 weeks ago) and Mesa 20.2 (released ~2 weeks ago). We can't say how good the support will be until the cards start showing up, but AMD seems to be gradually improving its process of working with open source. Also, RDNA 2 is a refresh of RDNA, not a whole new architecture.

BTW, if you use Ubuntu LTS, AMD releases a closed-source version of their driver to be used until the open source version is provided via an LTS point release. So this way you *do* have an option of using your GPU at launch.
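
For reference, installing that packaged driver has historically looked roughly like the below. This is a sketch from memory, so treat the archive name, version and installer script as assumptions and follow the instructions on AMD's driver download page instead:

# Unpack AMD's packaged Radeon driver for Ubuntu LTS (filename illustrative)
$ tar -xf amdgpu-pro-20.45-ubuntu-20.04.tar.xz
$ cd amdgpu-pro-20.45-ubuntu-20.04
# Run the bundled installer script
$ ./amdgpu-pro-install -y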
Creak Oct 29, 2020
Quoting: Diable
I'll buy an AMD card when they have working Linux drivers available at launch. I'm not waiting six months for the Mesa guys to get their new cards working when NVIDIA has Linux drivers available for their new cards on day one. I'm not supporting a company that treats my OS of choice like a second-class citizen.
In defense of the AMD devs, with each generation of GPU and CPU they are getting closer and closer to being ready at launch time. For instance here, if I'm not mistaken, Big Navi is supported in the latest Mesa code (though not yet perfectly, AFAIK). Now we're just missing the Mesa and Linux releases.

Personally, I don't really mind not having these GPUs ready at launch time since they are too expensive for me, but it means that the previous generation prices should go down a little bit! 😉

I doubt many companies really need day-one releases though, as most won't jump on the latest GPUs just as they come out of the oven. For instance, I don't think Google Stadia is really interested right now in getting its hands on the latest GPUs, even though its servers run on AMD (which is also proof that even in a private company, open source apparently matters too; otherwise they would have chosen NVIDIA, which has a better performance-per-dollar ratio).

Which reminds me that Microsoft and Sony are also using AMD GPUs in their consoles, so apparently it's not that bad 😉


Last edited by Creak on 29 October 2020 at 9:59 pm UTC
jarhead_h Oct 30, 2020
Quoting: TheRiddick
Quoting: jarhead_h
The 850 should handle the power draw of the 6900 XT just fine

Why do people think they need a 850W PSU for a 300W card?

Because it's always better to have more than you need than to need it and not have it. The RTX 3090 is supposed to be a 350W card. According to LTT it can pull almost 500W, and it caused them noticeable problems on their test system because that system only had an 850W PSU. Now, do you think I trust AMD's power rating? Do you? If so, why would you?


Last edited by jarhead_h on 30 October 2020 at 6:18 am UTC
Sojiro84 Oct 30, 2020
I am happy that AMD can finally compete with NVIDIA again. Man, that woman (and the team) is a genius. She turned that company around from the brink.

I always had an NVIDIA card, but now that I am on Linux, it's AMD all the way, and I am very pleased with my 5700 XT and my 580 before that. Runs like a dream and I get sweet new goodies like ACO first.

It will be a while before I need a new PC, but it seems it is time to start saving again so in 3 years I can do a big upgrade.
Valck Oct 30, 2020
Prompted by some of the comments in this thread regarding the majority of gamers, even on Linux, using NVIDIA, I found that I needed to take another look at this site's stats:

https://www.gamingonlinux.com/render_chart.php?id=920&type=stats
 
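# Sum the user counts of each vendor's five cards in the site's top ten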
$ echo -e "AMD: $((152+130+59+52+50))\nNvidia: $((114+112+83+75+61))"
AMD: 443
Nvidia: 445


Of course, I'm only looking at a single source that is most certainly biased, and even then only at the top ten graphics cards, but as of October 2020 the "majority" is indeed still green.



I will probably fall out of that top ten some time early next year (depending on how fast RDNA 2 can make its inroads), as I'm considering upgrading from my current "pole position" RX 580 to an RX 6800 (non-XT), which probably won't even see the top twenty for the first half of next year at least... once the initial hype is over and prices hopefully come down a little. USD 580 (EUR 500 at current exchange rates) likely means a store price of EUR 650 or possibly even more after duty and taxes, which means its actual cost is closer to about USD 750, and that is something I am definitely not willing to spend on any graphics card.


The same goes for board power: I can't say I'm comfortable with 250W, which is about 20% more than what I'd like to stay below. However, the RX 580 is still holding up most of the time even when throttled to about 50-75%, so I'd hope to see the higher efficiency of RDNA 2 make a definitive dent in the power curve. Summer temperatures the last couple of years meant no power-hungry games for a few months, and the choice between a 40°C room or only playing Tetris for a while is another thing I'm not comfortable with.

I'm definitely curious to see the idle power draw; the RX 580 reports 33W at minimum, which I find ridiculous to say the least, and my only hope is that it's in fact a display error from hwinf... although it really doesn't look like it: the whole system draws 70W at idle, where I'd reasonably expect maybe 40 at best, and no more than say 50 worst-case.
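
For what it's worth, on the open amdgpu driver the card's own power estimate can also be read straight from sysfs, independent of any monitoring tool. A minimal sketch, assuming the GPU is card0 (the hwmon number varies per system):

# amdgpu reports average board power in microwatts via hwmon
$ cat /sys/class/drm/card0/device/hwmon/hwmon*/power1_average
# The same value converted to watts
$ awk '{ printf "%.1f W\n", $1 / 1e6 }' /sys/class/drm/card0/device/hwmon/hwmon*/power1_average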


Last edited by Valck on 30 October 2020 at 9:41 am UTC
x_wing Oct 30, 2020
Quoting: jarhead_h
Because it's always better to have more than you need than to need it and not have it. The RTX 3090 is supposed to be a 350W card. According to LTT it can pull almost 500W, and it caused them noticeable problems on their test system because that system only had an 850W PSU. Now, do you think I trust AMD's power rating? Do you? If so, why would you?

Bear in mind that LTT's tests were made with a CPU with high power consumption as well (the GPU by itself went slightly over 450W at peak). So an 850W PSU may not be enough for such a configuration because you don't have enough current on the 12V rail for both components, but that is an extreme configuration.

IMO, going very high in PSU wattage is not always best for efficiency, as your computer will probably be idling more than 50% of the time (or more than 90% if you keep it always on... like me), and you will probably get better efficiency with a certified PSU whose wattage is more in line with your total system consumption.
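
To put rough numbers on the 12V side of that (illustrative figures, not measurements): a 300W GPU plus a 150W CPU both hang off the 12V rail, and a decent modern 750W unit delivers most of its rating on 12V, so the 12V amperage matters more than the headline wattage:

# ~450W of 12V load is only about 37A; a typical 750W unit is rated around 62A on 12V
$ echo "12V draw: $(( (300 + 150) / 12 ))A"
12V draw: 37A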


Last edited by x_wing on 30 October 2020 at 1:48 pm UTC
mos Oct 30, 2020
Quoting: Hori
Quoting: mos
Quoting: Guest
Quoting: sub
Please consider supporting AMD.

I don't really mind proprietary games, but as a Linux user
I clearly want my computer infrastructure being as open as possible.
Hardware, drivers and libs.

This is where AMD shines, if you're smart enough to value this.

Or just be a conscious customer and evaluate products properly instead of relying on ideologies only.
As if AMD's GPUs clearly suck compared to NVIDIA's value-wise. No, they don't, and the difference between them is mostly ideological to begin with: the latter abhors the public software model, while the former at least partially supports it. So the ideology starts with the vendor in this case, rather than with the consumer.
Let's not forget that until this generation, AMD was miles behind NVIDIA in terms of performance.
Not to mention that you're basically beta (or sometimes it's better said alpha, lol) testing their drivers for the first few months of use until they actually get to a point where they are OK.

Now don't get me wrong, I hope AMD does become a viable alternative and fierce opponent to NVIDIA, but they cannot win my trust overnight. It will be a while until then. I can't trust them in the GPU space just as I cannot trust Intel in the CPU space. They have a long history of mistakes, bad products and laziness (including over-rebranding old products).
I do expect them to make it right and heal their reputation, but until then, I will wait.

And no, I'm not an NVIDIA fanboy at all. I just want to go with the product that has the best chance of working well and offers the performance I need. Just as I used to choose Intel over AMD for CPUs in the past (for similar reasons) and eventually the situation flipped completely to the point I'd avoid Intel like the plague, this could also happen in the GPU space, but it's not yet the case.
I don't quite get what your gripe with ATI/AMD GPUs exactly is. They announced themselves as a competitive GPU maker with the Radeon 7-thousand-something (or was it 9k-something?) back in the day, and they've only been getting better since. Is that not enough to earn your "trust" (whatever that means)?
As for the "sins" you've mentioned, hasn't NVIDIA been up to roughly the same stuff?
Shmerl Oct 30, 2020
Quoting: x_wing
IMO, going very high in PSU wattage is not always best for efficiency, as your computer will probably be idling more than 50% of the time (or more than 90% if you keep it always on... like me), and you will probably get better efficiency with a certified PSU whose wattage is more in line with your total system consumption.

I got a 750W one and so far it has been enough. You can get a good quality, efficient PSU to reduce wasted power:

https://seasonic.com/prime-ultra-titanium
Avehicle7887 Oct 30, 2020
Time for some GPU Porn: https://phonemantra.com/gigabyte-and-sapphire-unveil-reference-radeon-rx-6800-and-rx-6800-xt/

Looks like I'm going with the Sapphire 6800XT all the way.