Raja Koduri from Intel recently put out a bit of a teaser on Twitter for their upcoming dedicated GPU.
The Twitter post, which was retweeted by the official Intel Graphics Twitter account, contained the below image with the date of June 2020 on the license plate. Not exactly cryptic, it's a pretty clear teaser towards a release date for the Intel Xe, or whatever they actually end up calling it once it's out. That's pure speculation on my part of course, but it would line up given who sent the tweet and Intel previously saying the Xe series will be out in 2020.
We've yet to see any solid information on exactly how powerful they will be. What we do know, though, is that they should get first-class Linux support, as Intel has been working through their drivers on Linux. They've talked openly before about their commitment to open source and their focus on Linux gaming too, so it's quite exciting.
NVIDIA and AMD could use more GPU competition, as the more we have the more it should hopefully push them to improve both their hardware and prices for future generations.
I don't care what AMD says, by the time their big Navi comes out it's going to be way too late and cost WAY too much. That's how it's been for quite some time now and I see no reason why it would change.
Quoting: TheRiddickI don't care what AMD says, by the time their big Navi comes out it's going to be way too late and cost WAY too much. That's how it's been for quite some time now and I see no reason why it would change.
How aren't Nvidia 2080 cards too much, though? They are already crazily priced, not something I'm interested in paying for a GPU, and I doubt they'll lower those prices. So if AMD also prices their cards that way (like they did with the Radeon VII), it's not likely I'm going to get them either.
Quoting: TheRiddickHonestly the only reason to go NVIDIA is for the TOP END card performance at 4K; if you're a 1080p or 1440p gamer then there is absolutely no reason to only consider them
IMHO, for monitors a higher refresh rate (matched with framerate in games) is more valuable than a higher resolution, so 4K is quite a red herring used to justify such expensive cards. I.e. I'd take something like 2560x1440 at over 100 fps over 4K with a lower framerate. And even the top-end cards would struggle to push a high framerate at 4K on max settings in demanding games. It will probably take a few more generations of GPUs for 4K to become usable at high framerates, so no rush there.
Last edited by Shmerl on 7 October 2019 at 11:49 pm UTC
Quoting: GustyGhostIf Intel plans on getting into the consumer dGPU space, they very likely will be strong-armed by Hollywood and friends to implement a "secure content path", all for "the benefit of the users". Get ready for more firmware- or hardware-level DRM.
OK, that comment flew over my head.
A bit of context, please?
Quoting: razing32A bit of context, please?
You cannot use your current GPU without loading a proprietary blob firmware which uses encryption to make sure that the GPU ultimately obeys somebody else that isn't you. Part of this (mis)functionality is required for other "protection" schemes such as HDCP. Who would pressure for such (mis)functionality to be integrated into hardware and firmware?
Quoting: GustyGhostQuoting: razing32A bit of context, please?
You cannot use your current GPU without loading a proprietary blob firmware which uses encryption to make sure that the GPU ultimately obeys somebody else that isn't you. Part of this (mis)functionality is required for other "protection" schemes such as HDCP. Who would pressure for such (mis)functionality to be integrated into hardware and firmware?
Plus, aren't those companies trying to use hard locks to diversify their platforms?
Providing some features only on one (premium) product and not on others allows them to use (almost) the same silicon for a wide range of products, which can significantly lower costs.
I guess this is also one argument against a fully open stack, for at least some companies - looking at you in particular, Nvidia.
Guess I learned something new :)