Ready for your next upgrade? NVIDIA think you might be, and they're teasing what is most likely the GeForce RTX 3000 launch at the end of this month.
We don't know what they're actually going to call the new cards, although they will be based on the already-revealed Ampere architecture announced back in May. It's probably safe to say RTX 3000 for now, going by the last two generations being 1000 and 2000, but NVIDIA may go for something fancier this time.
So what's going on? On Twitter, the official NVIDIA GeForce account tweeted the hashtag "UltimateCountdown" along with an 8-second teaser showing some sort of sci-fi space explosion, set to the sound of a ticking clock. Additionally, their Twitter header image is now this:
Their first GeForce release was on August 31, 1999, so it's pretty clear what's coming on August 31, 2020, don't you think? It would be weird if it wasn't. Wccftech actually managed to grab some info from secret sources giving a roadmap of NVIDIA's planned releases, and it seems the cards will launch in September.
We've put it on our calendar to keep an eye on.
Quoting: WorMzy
I'm going to go out on a limb and guess they aren't going to announce that they're open sourcing their Linux driver and/or co-operating fully with the nouveau developers to make the open source driver feature complete.

Would be nice to be surprised....
Sadly, with Huang in charge, actual open source collaboration seems unlikely.

Back on topic: if Ampere is successful, maybe Big Navi won't come out at all and AMD will hold off for RDNA 3 next year.
Last edited by mrdeathjr on 10 August 2020 at 6:45 pm UTC
Quoting: Comandante Ñoñardo
I repeat: We need a third player in the GPU market.

Intel have been working on Xe for quite a while now. Actual real-world performance, and how much it's directed towards gaming rather than machine learning, are still to be determined, of course.
Quoting: Comandante Ñoñardo
They should calm down and release new GPUs every three years and not every year.

I'll freely admit that I find AMD's products and release schedule confusing, but on the Nvidia side:
Fermi 2010
Kepler 2012
Maxwell 2014
Pascal 2016
Turing 2018
Consumer Ampere? 2020
You don't have to buy a new card every generation if you don't want to.
Last edited by CatKiller on 10 August 2020 at 7:51 pm UTC
Quoting: CatKiller
Intel have been working on Xe for quite a while now. Actual real-world performance, and how much it's directed towards gaming rather than machine learning, are still to be determined, of course.

I miss the days when Matrox still had competitive cards... they had great hardware, stable drivers, and their G series were mostly open source! Then they just couldn't keep up with the insane chase for frames per second, said screw it, and focused on cards that can control many screens...

Oh well, the few times I have delved into AMD land, I have left disappointed.
Quoting: CatKiller
Intel have been working on Xe for quite a while now. Actual real-world performance, and how much it's directed towards gaming rather than machine learning, are still to be determined, of course.

Also, Intel have had problems switching to a smaller process node for their CPUs, which I assume will affect their GPUs too.
Last edited by Shmerl on 10 August 2020 at 7:47 pm UTC
Quoting: CatKiller
You don't have to buy a new card every generation if you don't want to.
If you skip a generation or two, you'll get a larger performance boost compared to what you had before. Not every generation is going to double your FPS, but if you wait as long as I have, that is likely going to happen. I still have a GTX 970...
Quoting: Anza
If you skip a generation or two, you'll get a larger performance boost compared to what you had before. Not every generation is going to double your FPS, but if you wait as long as I have, that is likely going to happen. I still have a GTX 970...
I change my main video card every 4 years.

I buy the most powerful video card I can; in November 2015 I bought a GTX 970, and in September 2019 I bought an RTX 2060 SUPER.

The performance gap between the two cards is huge!
Quoting: Comandante Ñoñardo
I change my main video card every 4 years. I buy the most powerful video card I can; in November 2015 I bought a GTX 970, and in September 2019 I bought an RTX 2060 SUPER. The performance gap between the two cards is huge!

For a long time my computer upgrades revolved around whether or not I could run everything in the MAME library. It wasn't until a few iterations back, when I could finally run the Gauntlet games that used a 3DFx Voodoo chip, that I stopped... but now my upgrade cycle is determined by whether VR runs well, so sadly I have actually not been able to skip a generation since the 980 GTX. But I think I may have to with the RTX 3xxx, as I don't feel like shelling out the money I'd likely need for the Ti version this time (I usually get the non-Ti, as they are a decent chunk less). I sold off my 1080 to a friend, but still have my 980 GTX sitting in an external case for my portable VR setup... which apparently is too weak for the Index, but works well for the original Vive.
Been waiting too long for something that can do a decent job at 4k
Quoting: Anza
If you skip a generation or two, you'll get a larger performance boost compared to what you had before. Not every generation is going to double your FPS, but if you wait as long as I have, that is likely going to happen. I still have a GTX 970...
I still have a GTX 780.