During GDC, AMD announced the next big upgrade for AMD FidelityFX Super Resolution, with version 3.1 planned to launch in Q2 2024.
The main improvements include:
- Upscaling image quality improvements:
  - Improved temporal stability at rest and in movement – less flickering and/or shimmering and “fizziness” around objects in motion.
  - Ghosting reduction and better preservation of detail.
- Decoupling FSR 3 upscaling from frame generation:
  - Allows FSR 3.1 frame generation technology to work with other upscaling solutions.
- New AMD FidelityFX API:
  - Makes it easier for developers to debug and allows forward compatibility with updated versions of FSR (see the sketch after this list).
- Vulkan and Xbox Game Development Kit (GDK) support.
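The announcement doesn't go into implementation details, but a common way for a C-style API to stay forward compatible is to tag every descriptor structure with a type/version header and dispatch on it at runtime, so an updated library can accept old structures and reject unknown ones cleanly. Here's a minimal C++ sketch of that general pattern; every name in it (DescHeader, createContext, the constants) is a hypothetical illustration, not the actual FidelityFX API:

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical versioned-descriptor pattern (not the real FidelityFX API).
// Every descriptor starts with a type tag plus an extension chain, so a
// newer runtime can recognize old structures and safely reject unknown ones.
struct DescHeader {
    uint64_t structType;  // identifies which descriptor (and version) this is
    DescHeader* pNext;    // chain for future extensions
};

constexpr uint64_t DESC_TYPE_UPSCALE_CREATE_V1 = 0x0001;

struct UpscaleCreateDescV1 {
    DescHeader header;
    uint32_t renderWidth, renderHeight;  // input resolution
    uint32_t outputWidth, outputHeight;  // upscaled resolution
};

// The runtime dispatches on structType instead of assuming one layout, so a
// shipped game keeps working when the library is upgraded underneath it.
int createContext(const DescHeader* desc) {
    switch (desc->structType) {
    case DESC_TYPE_UPSCALE_CREATE_V1: {
        const auto* d = reinterpret_cast<const UpscaleCreateDescV1*>(desc);
        std::printf("upscaler: %ux%u -> %ux%u\n", d->renderWidth,
                    d->renderHeight, d->outputWidth, d->outputHeight);
        return 0;
    }
    default:
        return -1;  // unknown descriptor: fail cleanly instead of crashing
    }
}

int main() {
    UpscaleCreateDescV1 desc{{DESC_TYPE_UPSCALE_CREATE_V1, nullptr},
                             960, 540, 1920, 1080};
    return createContext(&desc.header);
}
```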
The first game announced to be getting AMD FSR 3.1 is Ratchet & Clank: Rift Apart. AMD included some example GIFs of the quality improvements expected from AMD FSR 3.1, like a reduction in ghosting, which you can see below first with AMD FSR 2.2 and then with AMD FSR 3.1:
AMD said those clips show Ratchet & Clank: Rift Apart running at 1080p using AMD FSR 2.2/3.1 in Performance mode. They also once again reminded people that a minimum of 60FPS before Frame Generation is needed to get the best experience in image quality and input latency.
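To put rough numbers on that advice: interpolation-based frame generation has to hold the newest rendered frame back for about one base frame time while the generated in-between frame is displayed, so the latency cost shrinks as the base framerate rises. A back-of-the-envelope sketch (my own simplified model, not AMD's figures):

```cpp
#include <cstdio>

int main() {
    // Interpolation displays a generated frame between two rendered frames,
    // so the newest rendered frame waits roughly one base frame time.
    const double baseFps[] = {30.0, 60.0, 120.0};
    for (double fps : baseFps) {
        double frameTimeMs = 1000.0 / fps;  // time between rendered frames
        std::printf("base %3.0f FPS -> presented %3.0f FPS, "
                    "~%4.1f ms extra hold-back latency\n",
                    fps, fps * 2.0, frameTimeMs);
    }
}
```

At a 30FPS base that's roughly 33ms of added delay on top of everything else in the pipeline; at 60FPS it drops to around 17ms, which is why the 60FPS floor keeps coming up.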
Various other games have been announced to be getting AMD FSR 3 on top of Ratchet & Clank: Rift Apart including Cyberpunk 2077, Dragon's Dogma 2, Dying Light 2 Stay Human, Frostpunk 2, Ghost of Tsushima Director’s Cut, NARAKA: BLADEPOINT, and Warhammer 40,000: Space Marine 2.
More in the full announcement.
More crap we don't need,
Instead of spending god knows how much on dlss/fsr spend it on better hardware
Quoting: pete910More crap we don't need,
Instead of spending god knows how much on dlss/fsr spend it on better hardware
Problem is that the better hardware doesn't always exist.
Technology doesn't always advance in sync. Monitors first get higher resolution or refresh rates, and then GPUs need to catch up. GPU features like ray tracing also raise demands on the GPU. Dropping resolution gets the framerate back to an acceptable level, but FSR and DLSS offer a good compromise.
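For reference, the compromise is a fixed per-axis scale factor in FSR's standard quality modes: 1.5x for Quality, 1.7x for Balanced, 2.0x for Performance, and 3.0x for Ultra Performance. A quick sketch of what that means for the internal render resolution at a 1080p output:

```cpp
#include <cstdio>

// Per-axis render scale factors of FSR 2/3's standard quality modes.
struct Mode { const char* name; double scale; };

int main() {
    const Mode modes[] = {
        {"Quality",           1.5},
        {"Balanced",          1.7},
        {"Performance",       2.0},
        {"Ultra Performance", 3.0},
    };
    const int outW = 1920, outH = 1080;  // 1080p output target
    for (const Mode& m : modes) {
        // The GPU renders at the reduced resolution; FSR upscales the rest.
        int renderW = static_cast<int>(outW / m.scale);
        int renderH = static_cast<int>(outH / m.scale);
        std::printf("%-17s renders %4dx%-4d -> upscaled to %dx%d\n",
                    m.name, renderW, renderH, outW, outH);
    }
}
```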
Now I wonder if AI can actually become useful here with the inclusion of AI processing in AMD & Intel chips. Perhaps they can work out where the HUD and other static elements are in any game you apply it to, to reduce shimmering/ghosting?
Even if it's just a silly experiment, I want to be able to use it via a Steam launch command on any game I want. It might end up really great on some titles.
*edit forgot to add. What I DON'T want to happen is for devs to use this technology as an excuse to reduce optimization on their already un-optimized titles to sh*t out more junky AAA cash-grab games.
Last edited by Lofty on 21 March 2024 at 7:09 pm UTC
Quoting: LoftyLooking forward to being able to add this to any game, even using emulators, to push games up to 240 FPS.
Frame generation that requires engine-provided motion vectors isn't suited to emulators, unless the emulator is able to perform matrix interpolation on its end (see RT64). At this stage, you might as well have the emulator render more frames by interpolating matrices, as opposed to using artifact-prone frame generation solutions.
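To illustrate what matrix interpolation means here (a generic sketch of the idea, not RT64's actual implementation): capture each object's transform on two consecutive game frames, split it into translation and rotation, and blend the two to synthesize in-between frames. All type and function names below are my own:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };     // translation component
struct Quat { double w, x, y, z; };  // rotation as a unit quaternion
struct Transform { Vec3 t; Quat r; };

Vec3 lerp(const Vec3& a, const Vec3& b, double u) {
    return {a.x + (b.x - a.x) * u, a.y + (b.y - a.y) * u,
            a.z + (b.z - a.z) * u};
}

// Spherical linear interpolation between two unit quaternions.
Quat slerp(Quat a, Quat b, double u) {
    double dot = a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z;
    if (dot < 0.0) {  // take the shorter arc
        b = {-b.w, -b.x, -b.y, -b.z};
        dot = -dot;
    }
    if (dot > 0.9995) {  // nearly identical: plain lerp, then normalize
        Quat q{a.w + (b.w - a.w) * u, a.x + (b.x - a.x) * u,
               a.y + (b.y - a.y) * u, a.z + (b.z - a.z) * u};
        double n = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
        return {q.w / n, q.x / n, q.y / n, q.z / n};
    }
    double theta = std::acos(dot);
    double sa = std::sin((1.0 - u) * theta) / std::sin(theta);
    double sb = std::sin(u * theta) / std::sin(theta);
    return {a.w * sa + b.w * sb, a.x * sa + b.x * sb,
            a.y * sa + b.y * sb, a.z * sa + b.z * sb};
}

// Transform for an in-between frame at fraction u in [0, 1].
Transform interpolate(const Transform& prev, const Transform& next, double u) {
    return {lerp(prev.t, next.t, u), slerp(prev.r, next.r, u)};
}

int main() {
    const double kRadToDeg = 180.0 / 3.141592653589793;
    Transform a{{0, 0, 0}, {1, 0, 0, 0}};  // at origin, no rotation
    Transform b{{2, 0, 0},                 // moved 2 units, rotated 90° about Y
                {std::sqrt(0.5), 0, std::sqrt(0.5), 0}};
    Transform mid = interpolate(a, b, 0.5);
    std::printf("halfway: t = (%.2f, %.2f, %.2f), rotated %.1f deg about Y\n",
                mid.t.x, mid.t.y, mid.t.z,
                2.0 * std::acos(mid.r.w) * kRadToDeg);
}
```

The emulator can then re-render the scene with the interpolated transforms, which sidesteps the occlusion artifacts that image-space frame generation tends to produce.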
Lossless Scaling's LSFG1 does frame generation without motion vectors pretty well, but it only works on Windows. AMD's AFMF also works, but is more finicky, as it turns itself off if too much motion occurs at once. It also requires games to use exclusive fullscreen, and it doesn't seem to work with OpenGL-based games or emulators from my testing. It's too bad, because AFMF's quality is generally more convincing than LSFG1's (when it doesn't turn itself off).
It'll probably take a while until we see a universal frame generation solution that works on Linux, given it needs to hook into low-level X11/Wayland stuff.
Generating more than one frame per input frame is something I'd like to see as well (outside of chaining Lossless Scaling and AFMF together...). This can already be viable for games that are hard-capped at 30 FPS, or at 60 FPS on a fast GPU at relatively low resolutions. However, there's currently no algorithm out there that can do this in real-time.
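The pacing math for that is at least straightforward: with N generated frames per rendered frame, the in-between frames land at even fractions of the base frame interval. A sketch, assuming interpolation between two rendered frames:

```cpp
#include <cstdio>

int main() {
    const double baseFps = 30.0;              // hard-capped game framerate
    const int generated = 3;                  // 3 generated per rendered frame
    const double frameMs = 1000.0 / baseFps;  // 33.3 ms between rendered frames
    const int total = generated + 1;          // presented frames per interval

    std::printf("presented %.0f FPS, one frame every %.2f ms\n",
                baseFps * total, frameMs / total);
    // Each generated frame interpolates at fraction i/total of the interval.
    for (int i = 1; i <= generated; ++i)
        std::printf("generated frame %d at +%5.2f ms (u = %d/%d)\n",
                    i, frameMs * i / total, i, total);
}
```

So a 30FPS-capped game would present at 120FPS; the hard part is producing three convincing interpolated frames within each 33ms window in real time.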
Last edited by Calinou on 21 March 2024 at 9:07 pm UTC
Quoting: AnzaQuoting: pete910More crap we don't need,
Instead of spending god knows how much on dlss/fsr spend it on better hardware
Problem is that the better hardware doesn't always exist.
Technology doesn't always advance in sync. Monitors first get higher resolution or refresh rates, and then GPUs need to catch up. GPU features like ray tracing also raise demands on the GPU. Dropping resolution gets the framerate back to an acceptable level, but FSR and DLSS offer a good compromise.
Plus this also works on Nvidia, so 30 series cards can use it, unlike the latest DLSS incarnation that needs a 40XX card.
Quoting: AnzaQuoting: pete910More crap we don't need,
Instead of spending god knows how much on dlss/fsr spend it on better hardware
Problem is that the better hardware doesn't always exist.
Technology doesn't always advance in sync. Monitors first get higher resolution or refresh rates, and then GPUs need to catch up. GPU features like ray tracing also raise demands on the GPU. Dropping resolution gets the framerate back to an acceptable level, but FSR and DLSS offer a good compromise.
Not really anything to do with monitors.
What I mean is either better hardware per tier, or find ways of making the hardware more affordable at a given tier, rather than £1000-odd for an 80-class card, which is just **** ridiculous.
I have the money for a 4090 but be damned if I'm feeding the greed of these companies!
Quoting: pete910or find ways of making the hardware more affordable at a given tier, rather than £1000-odd for an 80-class card, which is just **** ridiculous.
I have the money for a 4090 but be damned if I'm feeding the greed of these companies!
Oh God, tell me about it. I had originally planned to branch out a bit from portables and learn to build a desktop machine this year, but the prices, even on the more entry-level end of things (which is really all I'm aiming for), are utterly insane! I think I might have to put it off 'til next year, at this rate.
Quoting: pete910Quoting: AnzaQuoting: pete910More crap we don't need,
Instead of spending god knows how much on dlss/fsr spend it on better hardware
Problem is that the better hardware doesn't always exist.
Technology doesn't always advance in sync. Monitors first get higher resolution or refresh rates, and then GPUs need to catch up. GPU features like ray tracing also raise demands on the GPU. Dropping resolution gets the framerate back to an acceptable level, but FSR and DLSS offer a good compromise.
Not really anything to do with monitors.
What I mean is either better hardware per tier, or find ways of making the hardware more affordable at a given tier, rather than £1000-odd for an 80-class card, which is just **** ridiculous.
I have the money for a 4090 but be damned if I'm feeding the greed of these companies!
Upscaling should help in both scenarios. Though if money is the issue, previous-gen cards might offer close enough performance; not every generation brings features that are absolutely essential, and new and shiny things take time to be implemented in games. The downside is that the hardware becomes obsolete faster.
Another saving opportunity is to avoid playing the latest AAA titles, as indie games and older AAA titles are less demanding on hardware. Easier said than done though; I still upgraded to a beefier machine, even though I spend most of my time in indie games.
Quoting: PenglingQuoting: pete910or find ways of making the hardware more affordable at a given tier, rather than £1000-odd for an 80-class card, which is just **** ridiculous.
I have the money for a 4090 but be damned if I'm feeding the greed of these companies!
Oh God, tell me about it. I had originally planned to branch out a bit from portables and learn to build a desktop machine this year, but the prices, even on the more entry-level end of things (which is really all I'm aiming for), are utterly insane! I think I might have to put it off 'til next year, at this rate.
If a cheap desktop PC for office/multimedia use cases is your goal, I'd probably point towards prebuilt mini PCs nowadays, which are more cost-efficient than building one yourself (on top of being much smaller). These have laptop CPUs (like the 7840U or 7940HS), which are slower than high-end desktop CPUs but benefit from the same fast integrated GPUs you find in those laptops.
Last edited by Calinou on 26 March 2024 at 3:01 am UTC