AMD came out of the gates swinging at Computex 2021 with new chips, new tech and plenty more, including: AMD 3D chiplet technology, AMD Ryzen 5000 G-Series desktop APUs, next-gen gaming laptops with the new AMD Radeon 6000M Series Mobile Graphics, and their DLSS competitor, FidelityFX Super Resolution.
There's quite a lot to unpack here and we're still going through it, so we will update the article if we missed anything vital. The big one is no doubt FidelityFX Super Resolution (FSR), an open source spatial upscaling technology comparable to NVIDIA DLSS (which is coming to Proton!). The open source part is quite exciting, although it isn't available yet: AMD said the code will arrive "in due course" under the GPUOpen branch and the MIT license.
AMD are betting big with FidelityFX Super Resolution, clearly firing shots at NVIDIA by making it fully cross-platform: it works across DirectX 11 & 12 and Vulkan, and even runs on NVIDIA GPUs. AMD say that when it's released, "FSR can be ported onto multiple platforms without restriction".
Direct Link
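To make the "spatial upscaling" idea a little more concrete, here's a rough toy sketch in Python. This is not FSR's actual algorithm (which hadn't been published at the time of writing) and the helper names are made up for illustration; it just shows the basic shape of a single-frame approach: upsample a low-resolution render to the target resolution, then apply a cheap sharpening pass, using only the current frame with no motion vectors or machine-learning model involved.

```python
# Toy illustration of single-frame ("spatial") upscaling. NOT FidelityFX Super
# Resolution's algorithm - just a sketch of the general idea: upscale the
# low-res frame, then sharpen, using only the current frame.
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: int) -> np.ndarray:
    """Upsample a 2D image by an integer factor using bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * scale)
    xs = np.linspace(0, w - 1, w * scale)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img: np.ndarray, amount: float = 0.25) -> np.ndarray:
    """Cheap unsharp-mask style pass: boost the difference from a local blur."""
    blurred = img.copy()
    blurred[1:-1, 1:-1] = (
        img[:-2, 1:-1] + img[2:, 1:-1] + img[1:-1, :-2] + img[1:-1, 2:] + img[1:-1, 1:-1]
    ) / 5.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

def spatial_upscale(low_res_frame: np.ndarray, scale: int = 2) -> np.ndarray:
    """Upscale a single frame and sharpen it, with no temporal history at all."""
    return sharpen(bilinear_upscale(low_res_frame, scale))

# Example: upscale a tiny stand-in "frame" (values in [0, 1]) by 2x.
frame = np.random.rand(1080 // 8, 1920 // 8)
output = spatial_upscale(frame, scale=2)
print(frame.shape, "->", output.shape)
```

The key contrast with DLSS is that a temporal, ML-based upscaler also feeds in previous frames and motion vectors, while a purely spatial approach like the sketch above only ever sees the current frame.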
AMD continue pushing the boundaries of their processor tech with the introduction of AMD 3D chiplet technology. What could be a real breakthrough in packaging technology combines AMD's chiplet architecture with 3D stacking, which they claim "provides over 200 times the interconnect density of 2D chiplets and more than 15 times the density compared to existing 3D packaging solutions", and which they've been collaborating on with TSMC. They showed it in a real-world application too, demonstrating the 3D bonding on a Ryzen 5000 Series processor prototype. AMD claim they're going to begin production of these 3D chiplets by the end of this year.
We're finally seeing AMD bring their next-generation APUs to the desktop for system builders too with the AMD Ryzen 5000 G-Series desktop APUs. They've split them between consumer models and business models; the consumer models are the ones we care about.
The AMD Ryzen 5000 G-Series desktop APUs will be available "later this year".
On top of that, AMD also announced the new AMD Radeon 6000M Series Mobile Graphics. Based on RDNA 2, they say it gives "up to 1.5x" higher performance than the original RDNA architecture, or "up to 43 percent" lower power at the same performance. It also brings AMD Infinity Cache and Ray Tracing to next-gen laptops.
Model | Compute Units & Ray Accelerators | GDDR6 | Game Clock | Memory Interface | Infinity Cache
AMD Radeon RX 6800M | 40 | 12 GB | 2300 MHz @ 145W | 192-bit | 96 MB
AMD Radeon RX 6700M | 36 | 10 GB | 2300 MHz @ 135W | 160-bit | 80 MB
AMD Radeon RX 6600M | 28 | 8 GB | 2177 MHz @ 100W | 128-bit | 32 MB
"At Computex, we highlighted the growing adoption of our high-performance computing and graphics technologies as AMD continues setting the pace of innovation for the industry," said Dr. Su. "With the launches of our new Ryzen and Radeon processors and the first wave of AMD Advantage notebooks, we continue expanding the ecosystem of leadership AMD products and technologies for gamers and enthusiasts. The next frontier of innovation in our industry is taking chip design into the third dimension. Our first application of 3D chiplet technology at Computex demonstrates our commitment to continue pushing the envelope in high-performance computing to significantly enhance user experiences. We are proud of the deep partnerships we have cultivated across the ecosystem to power the products and services that are essential to our daily lives."
If you want to catch the whole thing, you can watch it in the below video:
Direct Link
I think for an anti-feature, it's good enough to end DLSS. Because it works everywhere and will also get better over time. The other side of it - it's general purpose. While AI/ML is more limited to specific use cases. So it is better in some cases and worse in others. Everything is a trade off.
This sounds like it was written by someone who doesn't understand either of these features. Hardware-assisted AI/ML being "limited" to specific use cases while an upscaling tech being "general purpose"? Give me a break...
Honestly, you sound like you don't know how Machine Learning and Artificial Intelligence works. It's typically incredibly focused. Using ML/AI in an implementation of anything doesn't magically make it versatile or flexible.
However, that said, the point of this technology is to enable fast frame rates of complex scenes (potentially with ray tracing thrown in) at high resolutions. That's a pretty focused target. So I'm not really sure why an upscaler like FSR is somehow more "general purpose" than DLSS.
No, I know how ML works because I use it in my daily job. Don't try to lecture me how these things work.
I've liked your detailed responses that follow this post, because they explain how DLSS works. Thank you.
But this initial response was you dismissing other people's opinions snarkily. I've no time for that attitude. The air of superiority and mocking of other people's comments is infuriating. And you continue to do so. Language like "the other user" (they have a name), "Tried to pretend" , "gimme a break", etc. So condescending.
As for avoiding Gsync being silly - Toms Hardware ran an article pointing out that fully gsync-compatible monitors still require hardware to reach that certification. That hardware carries a premium, of course. Might be pennies, who knows? But I know I'm not going to use it, so I'm happy to avoid paying for it.
(source: https://www.tomshardware.com/uk/features/gsync-vs-freesync-nvidia-amd-monitor)
Personally, I think a more honest name for it would be Artificial Instinct. 'Cause like, instinct is this stuff animals do without actually being smart or thinking or figuring it out, where through evolution they became really good at some particular task they need to be good at to survive. Machine Learning AI seems to be this forced evolution thing where you let the algorithms survive that are good at some specific task, until you've evolved a black box that can do that specialized thing really well but has no general reasoning ability. So it's instinct, but for some (marketing) reason we call it "intelligence".
Ah, thanks for bringing up my favorite pet peeve (OK, one of many really) of the last ~5 years or so: the fact that we went from AI = "Skynet wakes up and takes over the world" to AI = "my coffeemaker can deduce how many sugars I take my coffee with and suggest new recipes for my morning coffee based off a list it downloaded from the coffeemaker vendor's server".
I wouldn't even call this "instinct", because instinct implies the subconscious ability to adapt to new situations by creating new responses on the fly. I'd just call it for what it is: highly specialized deterministic algorithms programmed by highly specialized people to do highly specialized tasks, with the ability to better adapt to these tasks by using a preprogrammed set of highly specialized criteria to perform a highly specialized form of reflection.
In other words the key differentiating factor when compared to normal software is "highly specialized", so kudos to the developers that have dedicated thousands of man-hours to make such a tech possible, but otherwise AI/ML is simply just another form of good ole programming. Nothing more, nothing less, despite what 21st century marketing-speak would have us believe.
"Instinct" doesn't imply that because in its nature, an instinct is a genetically encoded sequence of reactions to specific stimuli. It can be as simple as a plant reacting to the change of humidity and as complex as humans striving to preserve their race.
Sure, but instinct can evolve on its own and create new reactions and motives over time (and over generations), while our algorithms can't, for the simple reason that we don't adequately understand the natural mechanism that creates these new reactions and motives, so we can't perfectly imitate it (just as we can't imitate e.g. abstract thought and convert it into electrical signals - we simply don't know enough about how the brain works). As it stands, no matter how complex our "AI" appears to be, its limit to "create" is only what we've programmed it to be able to create, and no more. So the more "specialized" the person creating the AI is, the more "specialized" that AI can ultimately be.
People don't have to be "highly-specialized" to implement a neural-network or a genetics algorithm - as you said, they're just regular algorithms - "good ol' programming".
Well alright, if you say so - I was just trying to be gracious and not call them "good ole script kiddies" :P But joking aside, I'm not bashing the quality of these algorithms, or their usefulness, or their limitations, or how easy or hard it is to create them and/or train them; I'm only bashing the marketing-focused change in lingo: what we market today as AI/ML, and how we market it to the masses, is parsecs away from actual intelligence, let alone actual awareness, i.e. what "AI" originally used to mean. And IMHO we shouldn't be calling it instinct either because even instinctive reactions in nature are (or can be) infinitely more complex and more nondeterministic than what we can currently do with our AI/ML stuff.
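To ground the "they're just regular algorithms" point above, here's a deliberately tiny, hypothetical Python sketch of the "forced evolution" idea mentioned earlier: candidates that score well on one hard-coded task survive and get mutated copies. The target and parameters are made up purely for illustration; the point is that the result becomes very good at exactly this one task and nothing else, which is the sense in which commenters call it "specialized".

```python
# Minimal toy "genetic algorithm": evolve bit strings toward one fixed target.
# A made-up illustration, not any vendor's ML stack - the evolved result is
# only ever good at this single hard-coded task.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]  # arbitrary goal

def fitness(candidate):
    """Score a candidate purely by how well it matches the one target task."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.05):
    """Flip bits at random; the only source of 'new ideas' in this system."""
    return [1 - b if random.random() < rate else b for b in candidate]

def evolve(pop_size=50, generations=200):
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the better half ("survival"), refill with mutated copies.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
        if fitness(population[0]) == len(TARGET):
            break
    return max(population, key=fitness)

best = evolve()
print(best, "fitness:", fitness(best), "/", len(TARGET))
```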
Don't you think it's also very condescending and sassy to dismiss innovative technology as "something over-specialized and good-as-dead"?
Do you not draw a distinction between dismissing a technology and dismissing a person? If you are the literal personification of Machine Learning, such that an attack on it is an attack on you, I'm sure we all apologize, and I for one would like to express my starstruck feelings at finally meeting an actual Platonic Ideal.
Last edited by Purple Library Guy on 3 June 2021 at 4:26 pm UTC
DLSS just being Nvidia-only is good enough for me to fully get behind FSR. And the AMD showcase video for it was pretty impressive given how young the technology is (dunno what user "sub" was talking about above, claiming that FSR doesn't look as good - not only is there barely any difference, the whole point of these technologies is that they won't look as good, but you'll get 100%+ FPS out of them at high-res, and if you can only tell the difference in a side-by-side video, then that's clear "good enough").
I'm a strong AMD supporter and I absolutely hate the politics of Nvidia exploiting vendor-lockins.
Let's see how this does when it ships. I'd be more than happy to be proven wrong here,
as the open approach across vendors is the way to go, imho.
Yep, I'm with you. I was just surprised you found FSR to be poor quality. I was watching that video and thinking, holy cow, I can't tell the difference, but the framerates are 60%+ better! And the way you can choose quality or framerates, very nice. I hope it succeeds.
We'll see. I indeed find the tech demo rather washed out (compared to DLSS 2.0 tech demos I saw).
You'll notice that from left to right there are less and less high-contrast sharp edges in the scene itself.
I claim the choice of the tech demo and its segmentation was not by chance.
While it suggests "hey, it's the same scene you see" it looks to me that the scene was specifically crafted
to hide the problematic bits.
Yeah. You said "a partially emotional response was justified and expected" but you didn't give a rationale for why that should be the case. "You work in the field" is not the same as "You are individually under attack when anyone says anything about that field". I have a good friend who works for an oil company; he doesn't get defensive any time someone worries about pipeline spills--in fact, he's the first to point out that all pipelines leak.
Have you read the rest of my comment? If you did, then you have my answer.
And you said "It's not my job to educate the layman". Well, nice of you to go above and beyond, and condescend to correct the unwashed, I suppose.
Well, it's a demo, so you wouldn't expect otherwise, right? They'll be savvy to show the tech doing its best, naturally.
But my point is that side by side... ok, maybe you can see the difference. But I struggled to see any meaningful difference, so I can absolutely guarantee that when a) I'm immersed in game and b) there's no side-by-side comparison, I'm definitely not going to see any difference.
But I do care about framerate. I do definitely see when framerate starts to suffer. So any tech that can deliver solid framerates while still looking beautiful at 4K... that gets my vote.
I mean, for me, comparing FSR to DLSS is like those reviews that compare iPhone to Android. I just don't care. I'm never going to buy Nvidia, or Apple... so I admit that I'm extremely narrow-minded about this stuff. FSR isn't as good as DLSS? Okay, then. So what? I guarantee that FSR is better than not having FSR, and that's all that matters to me, personally!
Last edited by scaine on 3 June 2021 at 9:47 pm UTC
It's not about "dismissing a technology" and then "dismissing a person"
Yes it is. The standards for how it's OK to talk about technologies and about persons are fundamentally different. It is ethically OK to diss a technology or treat a technology with disrespect, even if you are wrong about it. There is no victim there because technologies are not persons.
So. A person treating a technology with disrespect, even if you think that person is wrong about that technology, is not a reason to treat that person with disrespect. There is no equivalence between those two things. If you do, they have done nothing wrong but you have. You don't seem to get this. But aside from its intuitive obviousness, it flows very directly from every ethical theory I'm aware of; you could friggin' prove it with formal logic. On a less formal level, refusing to get it is going to get you into a lot of angry situations due to you ragging on people for, as far as they and logic can tell, no reason.
Last edited by Purple Library Guy on 4 June 2021 at 5:25 pm UTC
So, basically, your position is that if someone says bad things about a technology you support, you are within your rights to diss them personally, because those are both the same. You are wrong and your arguments for it being the case are transparently mistaken. While you are no doubt an expert on certain technologies, I do not get the impression that you are particularly knowledgeable about ethical philosophy or the analysis of discourse. I do know something about these things, so by your standards (which amount to an extension of the "argument from authority" logical fallacy) perhaps you should be taking my word that you are wrong.
Yes it is. The standards for how it's OK to talk about technologies and about persons are fundamentally different ... There is no victim there because technologies are not persons.
That technology was created by brilliant engineers. When someone cusses their creation because of unrelated prejudice, those engineers become the victims (it has social and monetary effects) - and also this forum's users, because Fear + Uncertainty + Doubt is just a shady tactic and being the victim of this misinformation-warfare is bad for every layman because it might affect their daily lives. You seem to be very concerned about what the individual in question feels despite the fact that the individual dismissed the feelings and concerns of the experts. I didn't call out that user's opinion because they were not interested in a piece of technology but because their comment was made with prejudice - it was just the cherry on top that they didn't even understand the tech (same happened when they mentioned CUDA). This case is not like rightfully criticizing the oil pipelines from your example, but much closer to calling a new linux feature shit/DOA because you're an ms investor and use windows. Hiding behind the pretense that you're only attacking a piece of technology and you're not actively hurting anyone is nonsense. It wasn't constructive criticism created from professional insight - it was just propaganda and this is what I responded with:
This sounds like it was written by someone who doesn't understand either of these features. Hardware-assisted AI/ML being "limited" to specific use cases while an upscaling tech being "general purpose"? Give me a break...
As you can see, I didn't call anyone stupid but correctly inferred (in an annoyed tone!) that the individual doesn't understand the tech and just posted a hot-take. Later, they had the chance to explain themselves but they doubled down on this antagonistic behavior by attacking CUDA instead. If the user's goal was just to post their opinion (despite not being an expert) then they had the chance to do a bit of "research" and acquire knowledge. But as you all can see, it turned into something else: you all took offense instead and the user doubled down. I don't know if the user changed their mind and it's not like I expect them to start to publicly appreciate the tech, but none of the experts want to be the indirect victims of such foul campaigns. If you don't like certain business strategies of a company then you can call those out explicitly, no need to attack the others.
Similarly, when you and another user misunderstood our AI tech as "something less than instinct", you were technically incorrect and just fell into the trap of the common "true-AI" misconception, simply because of your incorrect ideas about artificial "intelligence" and "instinct" - you were actively dismissing our work and achievements, but I understood that you're doing it as some sort of generic self-defense from this world's endless marketing garbage (still not a good justification) and not because you have an actual motive. I've tried to correct you but with a different tone.
There are also standards about what is OK in a healthy/rational community when it comes to information-sharing - as I said, if you want to completely give up impartiality and be totally OK with becoming something shallow - like the phoronix.com or the r/PS5 forums - then it's your choice but the experience of the many is above the whims of the individual. If you start sharing your opinion about things you've no experience with and dismiss the struggle of the experts then that forum will quickly reach facebook-levels of anti-intellectualism and soon, the experts will think about that forum as a cesspool - which in turn will hurt everyone who likes to lurk there.
I predict this mistaken attitude is going to get you into a lot of fights. Indeed, no doubt it already has.
Edit: and they deleted their account so this comment might look a bit odd.
Last edited by Liam Dawe on 6 June 2021 at 1:46 pm UTC