The GPU race continues once again, as NVIDIA have now officially announced the GeForce RTX 2000 series of GPUs, launching in September.
This new series will be based on their Turing architecture and their RTX platform. The new RT Cores will "enable real-time ray tracing of objects and environments with physically accurate shadows, reflections, refractions and global illumination", which sounds rather fun.
They will start off with three models to succeed their current top of the line:
- RTX 2070 with 8GB GDDR6, available in October
- RTX 2080 with 8GB GDDR6, available in September
- RTX 2080 Ti with 11GB GDDR6, available in September
Naturally, for a brand new series they won't be cheap!
The "Founders Edition" NVIDIA are offering will be £1,099/$1,199 for the RTX 2080 Ti, £749/$799 for the RTX 2080 and £569/$599 for the RTX 2070. From what I've seen, these editions will have a higher clock boost over the normal editions.
The normal "Reference" editions will be cheaper of course, with the RTX 2080 Ti at $999, RTX 2080 at $699 and RTX 2070 at $499. Unsure on the UK prices for the normal editions, as I can't see them listed currently but you get the idea.
NVIDIA generally have good support for new GPUs on Linux, so I'm sure a brand new driver is already on the way to be released soon.
See more on the official NVIDIA site, their announcement blog post and this post as well.
Will you be picking one up, will you be waiting for the normal edition or will you wait and see what AMD have to offer?
Anyway, I think I know what's going on: compare the Mad Max benchmarks of the two links and you will see two things. First, the Vega cards now have slower performance for some reason (a regression), and second, the 1070 Ti has had its performance increased.
This could be CPU related; I dunno if the results were done on similar hardware. The recent results were done at 5GHz, so we should have seen the CPU mostly eliminated from the results, since it won't struggle at 5GHz.
Anyway, I prefer to go by the latest bench results, and they clearly indicate Vega has some issues. Maybe Phoronix has some configuration issues, I dunno.
Last edited by TheRiddick on 21 August 2018 at 4:55 am UTC
Quoting: TheRiddick
1920x1080 results are not overly reliable; the best way to do comparisons is 1440p and 4K, because it actually stresses the GPU.
Not really. With current hardware, nothing handles 4K well, let alone at 144Hz, so such tests are of low value. Current GPUs just haven't caught up to such monitors yet. Maybe the next generation will be more applicable.
Quoting: TheRiddick
Anyway, I think I know what's going on: compare the Mad Max benchmarks of the two links and you will see two things. First, the Vega cards now have slower performance for some reason (a regression), and second, the 1070 Ti has had its performance increased.
This tells me that such benchmarks are obscuring the actual hardware, since bottlenecks happen somewhere in the driver, and regressions or improvements can occur there.
Last edited by Shmerl on 21 August 2018 at 4:59 am UTC
It's about stressing the GPU, not whether the benchmark performs at a playable FPS (most are over 60fps at 1440p, and some at 4K, btw).
Anyway, I've mentioned the regression in results on my Phoronix link; maybe someone will know what's going on.
Last edited by TheRiddick on 21 August 2018 at 5:01 am UTC
I guess full ray-tracing engines are still a thing of the future.
Last edited by TheRiddick on 21 August 2018 at 5:09 am UTC
Believe me when I say that I spend 80% or more of my gaming time on PC. Still, I can't convince myself that a $500 video card is worth it when you can buy an entire console gaming system for $300. Come on.
Anyway, when the XBOX2 and PS5 hit, things might change in that respect, but I believe the next-gen consoles might also be $100 more than the previous launch price due to the world's failing economy.
On another note:
I'm still on a Radeon HD 7950 alongside a Phenom II X4 940.
And it surprisingly works well for me (playing DOOM and Wolf2 in FHD).
When I was much younger I burnt too much money on "fresh" hardware, reading reviews over every 2% performance advantage, which in the end cost a disproportionate fortune compared to something much cheaper that would have served me equally well.
That's also why I couldn't care less about a performance lead for Nvidia - a company hostile towards open standards whenever they realize they can implement and rigorously exploit a vendor lock-in (see CUDA <-> OpenCL support).
Next card will be AMD again. For sure.
The open driver ecosystem on Linux has come a long way and it's working great.
People should be reminded that ray-tracing compute relies heavily on FP16 performance.