In what could be rather exciting news for AMD enthusiasts, AMD has officially revealed the AMD Radeon VII at CES 2019. On top of that, 3rd generation Ryzen desktop processors are coming.
Getting ahead of the curve a little here, the Radeon VII is built on a 7nm process, which makes it the first consumer-level GPU to use one. AMD say it's built on an "enhanced second-generation AMD ‘Vega’ architecture" and it looks to be a decent boost over the current Radeon RX Vega 64.
When compared directly with the RX Vega 64, AMD said it performed up to 27% higher in Blender, up to 27% higher in DaVinci Resolve, and up to 62% higher in the OpenCL LuxMark compute benchmark.
Some more specs:
- 60 compute units
- 3840 stream processors running at up to 1.8GHz
- 16GB of HBM2 memory (second-generation High-Bandwidth Memory)
- 1 TB/s memory bandwidth
- 4,096-bit memory interface
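As a quick sanity check, the 1 TB/s figure follows directly from the interface width; a minimal sketch in Python, assuming an effective per-pin data rate of 2 Gbit/s (not stated in the article, but the rate implied by the other two numbers):

```python
# Rough memory bandwidth estimate for a 4,096-bit HBM2 interface.
# The 2 Gbit/s effective per-pin rate is an assumption chosen to
# match the quoted 1 TB/s figure.
bus_width_bits = 4096   # memory interface width
data_rate_gbps = 2.0    # assumed effective Gbit/s per pin

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8  # bits -> bytes
print(f"{bandwidth_gbs:.0f} GB/s")  # 1024 GB/s, i.e. roughly 1 TB/s
```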
When it comes to gaming, that was of course mentioned too. It's nice to see Vulkan mentioned alongside DirectX! Naturally, they're only going for big Windows games right now, but they did say it offered "35 percent higher performance in Battlefield V, and up to 42 percent higher performance in Strange Brigade" over the Vega 64, which is quite impressive.
The Radeon VII will be available February 7, 2019 for around $699 USD.
Additionally, they've teamed up with Google to power Project Stream, Google's new cloud gaming service, using their Radeon Pro GPUs.
On top of that, 3rd generation Ryzen desktop processors are coming. They will also be built on 7nm tech, based on the Zen 2 core architecture, and AMD say they're the "world's first" to support PCIe 4.0 connectivity. It sounds like it's going to be a beast, as they previewed it against an Intel Core i9-9900K, where the Ryzen processor came out on top while also using around 30% less power.
They're launching the AMD Ryzen 3000 series sometime in the middle of 2019.
For notebook/laptop users, they also revealed the 2nd Gen AMD Ryzen Mobile processor with Radeon Vega Graphics coming to a range of devices from companies like Acer, Asus, Dell, HP, Huawei, Lenovo and Samsung throughout 2019.
You can see their CES 2019 video here.
Quoting: cRaZy-bisCuiT
7nm but only RTX 2080 performance? Why? For the same price?
It's only nice because we have an open, well-performing driver on Linux for AMD. Still, I'm not much impressed.
Halving the structure size (14nm -> 7nm) cannot bring you double the performance, because there are negative physical effects working against you (read a physics book for details).
And therefore I must correct you: the improvements are impressive. You could read the article, though it mixes CPU and GPU information a bit. From a scientific point of view, the same architecture on 7nm giving both CPU and GPU a boost of roughly 33%+ while at the same time cutting power consumption by roughly 33%+ is more than impressive, and very realistic. The GPU also doubles the memory, and the impressive bandwidth (the most important thing of all, e.g. for higher-resolution gaming and better VR resolutions and textures) makes it future-proof.
(If you can't get the textures into memory fast enough, it doesn't matter how fast the GPU can compute in theory.)
If you don't care about your energy bill and have water cooling, sure, overclock it and go for your 80% performance boost. But why? Better to save power, keep temperatures low and noise down, and still have plenty of performance!
AMD struck a good balance there, guarding against bottlenecks rather than just impressing on paper!
Last edited by lelouch on 9 January 2019 at 10:12 pm UTC
Quoting: lelouch
Quoting: cRaZy-bisCuiT
7nm but only RTX 2080 performance? Why? For the same price?
It's only nice because we have an open, well-performing driver on Linux for AMD. Still, I'm not much impressed.
Halving the structure size (14nm -> 7nm) cannot bring you double the performance, because there are negative physical effects working against you (read a physics book for details).
Integration density ideally scales quadratically with the inverse of the structure size.
Hence, half the structure size should roughly result in a 4 times higher integration density.
Yet there are many sources of loss that prevent performance from scaling linearly with integration density.
Please correct me if I'm wrong. :)
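The scaling argument above can be put into numbers; a minimal sketch (the loss mechanisms named in the comments are general examples, not figures from the article):

```python
# Ideal geometric scaling: density goes with the inverse square of the
# structure size, so halving the feature size (14nm -> 7nm) ideally
# fits roughly 4x as many transistors in the same area.
old_node_nm = 14
new_node_nm = 7

density_gain = (old_node_nm / new_node_nm) ** 2
print(density_gain)  # 4.0

# Real performance gains are far smaller (the article quotes roughly
# 27-62% over Vega 64, depending on the benchmark), because effects
# like leakage, interconnect delay and power/thermal limits eat into
# the ideal figure.
```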
Quoting: Guest
AMD kills their own marketing so much.
Hey, we have these third-gen CPUs, but the fourth gen is already being worked on.
That's like if Sony said: hey, we are launching the PS4 today, but the PS5 is already being worked on.
Why would anyone buy the current gen when the next gen is already happening?
Also, great, a $700 USD GPU that will cost extra once you get next month's power bill.
APU master race!
This card is also senseless.
It's the same price as the 2080 and just as fast.
The only difference is 16GB vs 8GB + RT.
No game needs 8GB, so why would I need 16?
If this card isn't much, much cheaper than a 2080, I don't see one reason to buy it.
Quoting: mylka
This card is also senseless.
It's the same price as the 2080 and just as fast.
The only difference is 16GB vs 8GB + RT.
No game needs 8GB, so why would I need 16?
If this card isn't much, much cheaper than a 2080, I don't see one reason to buy it.
It's surely not a mainstream card. We should wait for something more reasonably priced while still high-end, I think.
https://www.pcworld.com/article/3332205/amd/amd-ceo-lisa-su-interview-ryzen-raytracing-radeon.html
Quote:
“Some people may have noticed on the package some extra room,” she said with a chuckle. “There is some extra room on that package and I think you might expect we will have more than eight cores.” — Lisa Su.
So more than 8 core Ryzens are going to come out eventually.
Quoting: Shmerl
Quoting: mylka
This card is also senseless.
It's the same price as the 2080 and just as fast.
The only difference is 16GB vs 8GB + RT.
No game needs 8GB, so why would I need 16?
If this card isn't much, much cheaper than a 2080, I don't see one reason to buy it.
It's surely not a mainstream card. We should wait for something more reasonably priced while still high-end, I think.
That's not my point.
Both cards cost $700, and both have roughly the same performance according to AMD.
Would you take 16GB, or 8GB + RT + DLSS?
I don't see AMD winning here. At least Nvidia has a competitor again, and I hope both lower the prices for the new generation soon.
Quoting: mylka
I don't see AMD winning here. At least Nvidia has a competitor again, and I hope both lower the prices for the new generation soon.
They don't aim to. They can't beat Nvidia in power consumption yet, for example, due to still using GCN. But they need to offer something competitive, so they are doing it.
If you expect something more drastic, that will only happen with a new architecture.