
NVIDIA has revealed the first details of its third generation of RTX GPUs, built on the Ada Lovelace architecture, and DLSS 3 is coming with big improvements too.

Models announced so far include:

  • RTX 4090 - $1,599 (£1,679) - 24GB GDDR6X - 450W - October 12th
  • RTX 4080 16GB - $1,199 (£1,269) - GDDR6X - 320W - sometime in November
  • RTX 4080 12GB - $899 (£949) - GDDR6X - 285W - sometime in November


Some other features mentioned:

  • Streaming multiprocessors with up to 83 teraflops of shader power — 2x over the previous generation.
  • Third-generation RT Cores with up to 191 effective ray-tracing teraflops — 2.8x over the previous generation.
  • Fourth-generation Tensor Cores with up to 1.32 Tensor petaflops — 5x over the previous generation using FP8 acceleration.
  • Shader Execution Reordering (SER) that improves execution efficiency by rescheduling shading workloads on the fly to better utilize the GPU’s resources. As significant an innovation as out-of-order execution was for CPUs, SER improves ray-tracing performance up to 3x and in-game frame rates by up to 25% (there's a conceptual sketch of the idea after this list).
  • Ada Optical Flow Accelerator with 2x faster performance allows DLSS 3 to predict movement in a scene, enabling the neural network to boost frame rates while maintaining image quality.
  • Architectural improvements tightly coupled with custom TSMC 4N process technology result in an up to 2x leap in power efficiency.
  • Dual NVIDIA Encoders (NVENC) cut export times by up to half and feature AV1 support. The NVENC AV1 encode is being adopted by OBS, Blackmagic Design DaVinci Resolve, Discord and more.
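
To make the SER bullet a little more concrete, here's a conceptual sketch, and very much not NVIDIA's actual API (the real thing reorders threads in hardware): the idea is that grouping divergent ray-hit work by material before shading keeps adjacent work items on the same code path.

```python
from itertools import groupby

# Conceptual sketch only: real SER reorders GPU threads transparently.
# Here, "hits" are ray hits tagged with the material shader they need.
hits = [("glass", 7), ("metal", 2), ("glass", 3), ("skin", 9), ("metal", 5)]

# Reorder the shading workload so identical materials are adjacent...
hits.sort(key=lambda hit: hit[0])

# ...then shade one coherent batch per material, with no divergence
# inside a batch.
for material, batch in groupby(hits, key=lambda hit: hit[0]):
    print(material, [payload for _, payload in batch])
```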

During the event they showed off Cyberpunk 2077 and Microsoft Flight Simulator running with DLSS 3, and the performance uplift did seem pretty impressive. Oh, and Portal with RTX is coming as free DLC in November.


"DLSS is one of our best inventions and has made real-time ray tracing possible. DLSS 3 is another quantum leap for gamers and creators," said Jensen Huang, founder and CEO of NVIDIA. "Our pioneering work in RTX neural rendering has opened a new universe of possibilities where AI plays a central role in the creation of virtual worlds."

DLSS 3 will release alongside Ada Lovelace on October 12th, and these games and engines will support it:

  • A Plague Tale: Requiem
  • Atomic Heart
  • Black Myth: Wukong
  • Bright Memory: Infinite
  • Chernobylite
  • Conqueror's Blade
  • Cyberpunk 2077
  • Dakar Rally
  • Deliver Us Mars
  • Destroy All Humans! 2 - Reprobed
  • Dying Light 2 Stay Human
  • F1® 22
  • F.I.S.T.: Forged In Shadow Torch
  • Frostbite Engine
  • HITMAN 3
  • Hogwarts Legacy
  • ICARUS
  • Jurassic World Evolution 2
  • Justice
  • Loopmancer
  • Marauders
  • Microsoft Flight Simulator
  • Midnight Ghost Hunt
  • Mount & Blade II: Bannerlord
  • Naraka Bladepoint
  • NVIDIA Omniverse™
  • NVIDIA Racer RTX
  • PERISH
  • Portal With RTX
  • Ripout
  • S.T.A.L.K.E.R. 2: Heart of Chornobyl
  • Scathe
  • Sword and Fairy 7
  • SYNCED
  • The Lord of the Rings: Gollum
  • The Witcher 3: Wild Hunt
  • THRONE AND LIBERTY
  • Tower of Fantasy
  • Unity
  • Unreal Engine 4 & 5
  • Warhammer 40,000: Darktide

Oh, they also announced RTX Remix, a free modding platform built on NVIDIA Omniverse that they say allows people to make RTX mods for various games, including "enhanced materials, full ray tracing, NVIDIA DLSS 3, and NVIDIA Reflex". It will support DirectX 8 and DirectX 9 games with fixed-function graphics pipelines.

Those with a keen eye might spot a familiar bit of open source tech being used for it too:

That's DXVK, for those who don't quite get it: the same translation tech used in Proton to get Windows games' Direct3D running on Vulkan. So all mods made with RTX Remix will run on Vulkan!
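
As a side note for Linux folk (my example, not something from NVIDIA or the announcement): DXVK's DXVK_HUD environment variable overlays device and FPS info, which is a quick way to confirm a game's Direct3D calls really are being translated to Vulkan. A minimal sketch assuming a Wine-based launch, with a hypothetical game path:

```python
import os
import subprocess

# DXVK_HUD is a real DXVK environment variable; "devinfo,fps" shows the
# GPU/driver in use plus a frame counter. The game path is hypothetical.
env = dict(os.environ, DXVK_HUD="devinfo,fps")
subprocess.run(["wine", "C:/Games/Example/game.exe"], env=env)
```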


In other related NVIDIA GPU news, EVGA recently broke off from NVIDIA and will no longer make their GPUs. In a brief announcement on their official forum, they posted:

Hi all,

You may have heard some news regarding the next generation products from EVGA. Please see below for a message on future products and support:

 
  • EVGA will not carry the next generation graphics cards.
  • EVGA will continue to support the existing current generation products.
  • EVGA will continue to provide the current generation products.

EVGA is committed to our customers and will continue to offer sales and support on the current lineup. Also, EVGA would like to say thank you to our great community for the many years of support and enthusiasm for EVGA graphics cards.

EVGA Management

mahagr Sep 20, 2022
Those RTX 40 series cards look interesting, as does nVidia using DXVK to improve older DX9 games. Let's hope those also benefit Linux users.

As for the prices, all of the announced cards are too expensive for me to consider buying any of them. And that is even if I had the money -- I just cannot justify spending that much on the occasional gaming session, even if I had more time to play games. And for the kids.. NO WAY.

It also looks like nVidia will not drop RTX 3060 prices at all, which likely means the other models will not see much of a price reduction either. I guess that is a good (but nonetheless disappointing) thing, as I already happen to own RTX 30 cards.
randyl Sep 20, 2022
I wish US anti-trust regulators would dig into Nvidia because they're absolutely abusing their dominant market position to the detriment of consumers. That is the core trigger for the Sherman Act and other US antitrust law.

The base edition of this card is $200 more than the 3080FE (which MSRP'd at $699). This is absolutely bonkers.

I'm also waiting for RDNA3 and have hopes that something power efficient, which matters a lot to me, will be on the table from AMD. For now I'll stick with the 1660ti I have.
Lofty Sep 20, 2022
350 - 450W TDP. Summer is going to be real fun when running this alongside a 250W Intel CPU and all the accessories. Buying used isn't going to solve that a few years down the line; not to mention a used buyer is still going to need a high-quality power supply. It's not like you can bung one of these in a $50 case with a $50 PSU without issues.

I just bought a cheap mini PC for everyday tasks to save on power. Given the current and probably 'new normal' electricity prices, I considered it essential. To my surprise, in everything other than gaming it works wonders. It's completely silent, running standard Linux Mint 20.1 on Mesa drivers, and it plays YouTube all the way up to 4K 60FPS at 15W! Which is no mean feat compared to my thirsty older desktop PC, which can't even play 1080p60 without dropping a few frames (due to lack of hardware acceleration) and uses 115W when only watching a YT or Twitch stream!

Seriously, invest in a cheap power meter of some kind and be amazed at how much a desktop is costing you doing practically nothing. Not to mention other household appliances. I have an old TV we practically use as a heater in winter because it is CCFL backlit and uses a ton of power, and it's only 32"!

I think when I'm just running a mini PC and a Steam Deck in winter, I'm going to need an extra pair of socks while gaming.

That's a thought.. maybe you could use these new NVIDIA GPUs as a foot heater.. 🤔️
x_wing Sep 20, 2022
OMG, these prices... let's hope that Intel can create a good GPU and kick them all with an aggressive pricing strategy.
Purple Library Guy Sep 20, 2022
Quoting: DrMcCoy
lol, 450W. I'm sorry, but this is utterly ridiculous.
I'm waiting for cards to reach the point where you start playing a game, the graphics card starts drawing power, and the breaker blows.
Mohandevir Sep 20, 2022
Quoting: Purple Library Guy
Quoting: DrMcCoy
lol, 450W. I'm sorry, but this is utterly ridiculous.
I'm waiting for cards to reach the point where you start playing a game, the graphics card starts drawing power, and the breaker blows.

This reminds me of...

https://www.youtube.com/watch?v=inWKw8nqQlI

96 days... Lol!
denyasis Sep 20, 2022
I had to look..
In 2016, the MSRP of a 1080 Ti was $499 (that was harder to find than I thought it would be, so I may be a bit off). In 2022, that would be about $615-$620.
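
The arithmetic behind that estimate, as a quick sketch (the CPI values are rounded approximations I'm assuming, not official figures):

```python
# US CPI, annual averages, approximate: 2016 ~240, 2022 ~293.
msrp_2016 = 499.0
cpi_2016, cpi_2022 = 240.0, 293.0

# Scale the 2016 price by the ratio of the two price levels.
adjusted = msrp_2016 * (cpi_2022 / cpi_2016)
print(f"${adjusted:.0f} in 2022 dollars")  # about $609, same ballpark
```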

I'm pretty sure my last build cost as much as a 4090... And I'm relatively sure it can still run anything released now or for the next few years, lol.

Side note: how would you cool that? I have enough trouble keeping my 1070 Ti cool in my little ITX case....
fireplace Sep 20, 2022
Cool, but what about proper Wayland support? Or free kernel modules that are actually usable on desktop?
Jahimself Sep 20, 2022
Low budget is becoming the price of high end from a couple of years ago.
tfk Sep 20, 2022
1959 euros for a video card. Didn't know inflation was that bad.

I will wait and see what AMD will do.
redneckdrow Sep 20, 2022
Too little too late, in my case. I'm extremely happy with my new RX 6600!
iWeaker4You Sep 20, 2022
I just bought an RX 6700 XT and the truth is that I am very happy to have it. I am seeing that NVIDIA is inflating its prices a lot, to the point I doubt whether it will be worth acquiring one of its graphics cards at all.
iskaputt Sep 20, 2022
And here is me, contemplating when to get a new card in the ~200 watts, 400-500 Euros region.
Lofty Sep 20, 2022
Quoting: iskaputt
And here is me, contemplating when to get a new card in the ~200 watts, 400-500 Euros region.

My absolute limit for a PC is around 275 - 300W total, when you factor in a couple of monitors. Summers without AC mean I barely get to play a game for long before the temperature creeps up, especially in humid weather. I suppose it depends on the building and room size, but for me I can't imagine doubling that! Let alone the energy costs.

Steam Deck 2.0 FTW. I think that will be a go-to for lots of people's summer gaming; you can sit outside in the shade and still game. Things really are changing in the PC gaming world, and you would think, given the success of the Nintendo Switch powered by NVIDIA, they would have released/teased a Steam Deck equivalent by now instead of focusing on the wealthiest gamers and industrial customers.
KohlyKohl Sep 20, 2022
Quoting: redneckdrow
Too little too late, in my case. I'm extremely happy with my new RX 6600!

Mine is arriving tomorrow and it'll be the first time I've not had an NVidia GPU in my desktop.

After the eVGA announcement I decided I'm done with NVidia and bought this card.

It would be really nice if eVGA decided to manufacture AMD GPUs, and if other NVidia manufacturers moved away from NVidia and started making AMD and Intel GPUs instead.
CatKiller Sep 20, 2022
Quoting: Guest
You know why they implement RTX for those closed-space games or two-decade-old games? Because the performance hit won't be as big as doing it for open-world games with a dynamic day/night cycle, where raytracing would help the most to provide GI or at least global occlusion.
Nothing to do with day/night cycles. They do look great with ray tracing. See Q2RTX, for example. It's because open areas need more rays in order to adequately hit every surface. Just geometry.
gradyvuckovic Sep 20, 2022
Quoting: cookiEoverdose
Haha, every time - "quantum leap", they don't know what that actually means.

Yup, every time I see a company describe something as a 'quantum leap' I feel compelled to point out that the definition of quantum is literally 'the smallest amount or unit of something'.

Which means what they're saying is that DLSS 3 is the smallest possible improvement they could possibly make over DLSS 2? Did they fix a typo and increase the number?
gradyvuckovic Sep 21, 2022
Quoting: Guest
Quoting: CatKiller
Quoting: Guest
You know why they implement RTX for those closed-space games or two-decade-old games? Because the performance hit won't be as big as doing it for open-world games with a dynamic day/night cycle, where raytracing would help the most to provide GI or at least global occlusion.
Nothing to do with day/night cycles. They do look great with ray tracing. See Q2RTX, for example. It's because open areas need more rays in order to adequately hit every surface. Just geometry.
If there is no day/night cycle, then baking light and reflection data is possible, and there is not much need for realtime raytracing; that's why I mentioned it in the first place.

Exactly.

It's easy to take these old games from a decade or more ago, with their relatively simple lighting engines, and throw some raytracing on them for a noticeable difference. You could make a game like Portal look almost as good as the raytraced result by simply updating the game engine to a more recent version, increasing the resolution of the baked lightmaps, placing more light probes, and using more recent real time approximations of GI, such as screenspace GI, using more real time light sources with shadow mapping enabled, etc.

In the traditional boxy level designs of old first-person games, like Doom for example, you could easily set up a lightmap, some light probes, and reflection probes with parallax projection shapes, bake the result, and the outcome is going to look so close to the raytraced result that in many cases raytracing is simply not worth it.

Even outdoor scenes, if the lighting is static, such as a Counter-Strike or Half-Life level for example, can look fantastic with baked lighting. And this is how most game engines have worked for a long time and how most engines still work today.

This is exactly what I do in Blender: when I set up a scene to be rendered in Eevee instead of Cycles, I set up and bake lighting for the scene, bake reflection probes, etc, and the results I can get from a 2-second Eevee render are almost as good as 5-minute renders from Cycles, which is a full-blown path tracer.
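
For anyone curious, a minimal sketch of that Eevee bake step (Blender 3.x Python, run from Blender's own console; it assumes irradiance volumes and reflection probes are already placed in the scene):

```python
import bpy

# Switch the scene to Eevee, the rasterised engine mentioned above.
bpy.context.scene.render.engine = 'BLENDER_EEVEE'

# Bake indirect lighting into the scene's irradiance volumes and
# reflection cubemaps, so renders reuse the cached result.
bpy.ops.scene.light_cache_bake()
```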

The only issue is if there are dynamic changes in lighting, such as a dynamic day/night cycle, but even in those situations there are options, such as baking all the non-sky based lighting separately from the sky, and having different baked versions of the sky lighting if the day/night cycle is always the same with no weather variations, or other trickery to still fall back to baked results.

And sometimes, really, the difference is just not noticeable even if the baked result is technically 'wrong'. If it's close enough, most people would not be able to tell whether it is wrong by simply looking at it. They'd need a side-by-side comparison with the 'ground truth' raytraced result to know for sure where or how it's wrong.

There are more modern solutions like Godot's SDFGI which is real time too.

And if you mix those kinds of algorithms with something like screenspace raymarching for global illumination, which basically takes that approximated or baked result and 'corrects' it where possible with any information available in the screenspace, the result is almost perfect, at a fraction of the performance cost of raytracing.

Why use 10x the processing power to compute a result that looks barely any different to an approximated result?

There's a lot of ways in which real time raytracing doesn't make sense...

The places where real time raytracing makes the most sense are situations where the lighting setup is so dynamic there is basically no room for baking. But there aren't many games which tick that box: survival building/crafting games mostly, or games like No Man's Sky, where the 'game world' is procedural and huge, so baking lighting is just out of the question.


Last edited by gradyvuckovic on 21 September 2022 at 1:35 am UTC
Phlebiac Sep 21, 2022
I find this page to be a nice resource:
https://www.videocardbenchmark.net/gpu_value.html

I asked Phoronix about doing something similar (with Linux benchmarks, of course!), but I think he didn't want to get into tracking pricing (although it would probably help his funding via affiliate links).
CatKiller Sep 21, 2022
Quoting: Guest
Quoting: CatKiller
Quoting: Guest
You know why they implement RTX for those closed-space games or two-decade-old games? Because the performance hit won't be as big as doing it for open-world games with a dynamic day/night cycle, where raytracing would help the most to provide GI or at least global occlusion.
Nothing to do with day/night cycles. They do look great with ray tracing. See Q2RTX, for example. It's because open areas need more rays in order to adequately hit every surface. Just geometry.
If there is no day/night cycle, then baking light and reflection data is possible, and there is not much need for realtime raytracing; that's why I mentioned it in the first place.
I guess I misinterpreted. I thought you were lumping in the day/night cycle (no biggie, looks really nice) with the open world (needs lots of samples to work at all).