NVIDIA have revealed the GeForce RTX 4070 as the latest in their Ada Lovelace architecture lineup. Plus, there's an initial release of RTX Remix.
For the RTX 4070, NVIDIA claim it gives the ability to "run most modern games at over 100 frames per second at 1440p resolution" (although this is when using DLSS). Here's how the specifications compare with the 4070 Ti, highlighting the main differences between them (from here):
Specification | GeForce RTX 4070 Ti | GeForce RTX 4070
---|---|---
NVIDIA CUDA Cores | 7680 | 5888
Boost Clock (GHz) | 2.61 | 2.48
Base Clock (GHz) | 2.31 | 1.92
Standard Memory Config | 12 GB GDDR6X | 12 GB GDDR6X
Memory Interface Width | 192-bit | 192-bit
NVIDIA Encoder (NVENC) | 2x 8th Generation | 1x 8th Generation
Idle Power (W) | 12 | 10
Video Playback Power (W) | 20 | 16
Average Gaming Power (W) | 226 | 186
Total Graphics Power (W) | 285 | 200
Required System Power (W) | 700 | 650
NVIDIA say it is "on average 2.6x faster with DLSS 3" when compared with the RTX 2070 SUPER, and "on average 1.4x faster" than the GeForce RTX 3080 with DLSS 3. NVIDIA also said that in games not using Ray Tracing or DLSS, the RTX 4070 is "on par" with the RTX 3080 while "running at nearly half the power — and offering an additional 2GB of memory", plus they've given it 36MB of L2 cache.
When will you be able to get one? The GeForce RTX 4070 will be available from tomorrow priced at $599 / £589.
At least on pricing, it seems a fair bit more reasonable than what NVIDIA has been putting out previously, although it still feels quite inflated. Let us know in the comments what you think of its value. Is it worth it to you?
NVIDIA also recently tagged the first public release of RTX Remix, their software for adding path tracing to classic games. It's available under the MIT license and makes use of their own fork of DXVK. You can read more about how it works on their GitHub Wiki; it's what they used to create Portal with RTX.
It looks crazy now that a desktop GPU's _idle_ power is as much as the Steam Deck playing a not-so-basic game.
Last edited by Beamboom on 12 April 2023 at 8:06 pm UTC
Quoting: Beamboom
I think it sounded to be a really good price? I paid twice that for my 3080 back then, and that was long after launch (although during the chip shortage period).
Several reviews show it outperformed by the 3080 Ti. I guess you get DLSS 3 and better RT cores as a trade-off.
I'm considering switching to team red with a 6950. It seems similar cost-wise for more performance, excluding RT, at least while you can still get one.
Last edited by rcrit on 12 April 2023 at 8:55 pm UTC
Quoting: ShabbyX
Never had I paid so much attention to power until the Steam Deck, and specifically just a few days ago when I started replaying Shadow of Mordor. I went in with the expectation that I should set everything to low, ended up setting everything to Ultra and it still runs the game at 40fps without hitches and consumes just 12 watts.
It looks crazy now that a desktop GPU's _idle_ power is as much as the Steam Deck playing a not-so-basic game.
I'm watching some Steam Deck playthroughs. I'm always amazed at how they're able to use a mix of high/ultra settings and get 30-60fps on the Deck. Sure, it's 720p (or a bit lower with FSR), but AFAIK the frametime at 40fps (25ms) sits exactly midway between 30fps (33.3ms) and 60fps (16.7ms), so it's not so bad.
It makes me wonder when monitor manufacturers will offer a 40Hz mode for their monitors, which could help people on lower end hardware.
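The 40fps point comes down to simple frame-time arithmetic, which can be sketched in a few lines of Python (the helper name `frametime_ms` is just for illustration):

```python
# Frame time: how many milliseconds each frame stays on screen at a given fps.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 40, 60):
    print(f"{fps} fps -> {frametime_ms(fps):.1f} ms per frame")

# 40fps gives 25.0 ms per frame, which sits exactly midway between
# 30fps (33.3 ms) and 60fps (16.7 ms), so it feels much closer to 60fps
# than the raw fps numbers alone suggest.
```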
Last edited by Lofty on 13 April 2023 at 5:28 pm UTC
It's always fixes or support for new Linux technologies, but never performance improvements.
If someone can correct me on that, I would appreciate it.
Last edited by habernir on 13 April 2023 at 9:50 am UTC
Quoting: ShabbyX
Never had I paid so much attention to power until the Steam Deck, and specifically just a few days ago when I started replaying Shadow of Mordor. I went in with the expectation that I should set everything to low, ended up setting everything to Ultra and it still runs the game at 40fps without hitches and consumes just 12 watts.
It looks crazy now that a desktop GPU's _idle_ power is as much as the Steam Deck playing a not-so-basic game.

I think NVIDIA are taking a particularly wrong turn energy-wise with the Lovelace lineup.
Let's hope AMD is able to react wisely, as their latest low-energy offerings have been disappointing with their PCIe bandwidth. What we need is a low-power PCIe x16 GPU to take the 75W crown, and nobody delivers in that category...