
Raja Koduri from Intel recently put out a bit of a teaser on Twitter for their upcoming dedicated GPU.

The Twitter post, which was retweeted by the official Intel Graphics Twitter account, included the below image with the date June 2020 on the license plate. Not exactly cryptic: it's a pretty clear teaser of a release date for the Intel Xe, or whatever they actually end up calling it once it's out. That's pure speculation on my part of course, but it would line up given who sent the tweet and Intel previously saying the Xe series will be out in 2020.

We've yet to see any solid information on exactly how powerful these cards will be. What we do know is that they should get first-class Linux support, as Intel has been working through their drivers on Linux. They've talked openly before about their commitment to open source and their focus on Linux gaming too, so it's quite exciting.

NVIDIA and AMD could use more GPU competition, as the more we have the more it should hopefully push them to improve both their hardware and prices for future generations.

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
The comments on this article are closed.
18 comments

TheRiddick Oct 7, 2019
Apparently NVIDIA will have their next series of cards out in Q1 2020, so it will be interesting. Honestly, the only reason to go NVIDIA is for top-end card performance at 4K; if you're a 1080p or 1440p gamer, there is absolutely no reason to consider only them. Intel and AMD are releasing mid-range cards, there will be no top end.

I don't care what AMD says; by the time their big Navi comes out it's going to be way too late and cost WAY too much. That's how it's been for quite some time now and I see no reason why it would change.
Shmerl Oct 7, 2019
Quoting: TheRiddickI don't care what AMD says, by the time their big NAVI comes out its going to be way too late and cost WAY too much. That's how its been for quite some time now and I see no reason why it would change.

How aren't NVIDIA 2080 cards too much, though? They are already crazily priced, not something I'm interested in paying for a GPU, and I doubt they'll lower those prices. So if AMD prices theirs the same way (like they did with the Radeon VII), it's not likely I'm going to get those cards either.

Quoting: TheRiddickHonestly the only reason to go NVIDIA is for the TOP END card performance at 4k, if your 1080p or 1440p gamer then there is absolutely no reason to only consider them

IMHO, for monitors a higher refresh rate (matched with the framerate in games) is more valuable than a higher resolution, so 4K is quite a red herring to justify such expensive cards. I.e. I'd take something like 2560x1440 at over 100 fps over 4K with a lower framerate. Even the top-end cards would struggle to push high framerates at 4K on max settings in demanding games. It will probably take a few more generations of GPUs for 4K to become usable at high framerates, so no rush there.


Last edited by Shmerl on 7 October 2019 at 11:49 pm UTC
GustyGhost Oct 8, 2019
If Intel plans on getting into the consumer dGPU space, they will very likely be strong-armed by Hollywood and friends into implementing a "secure content path", all for "the benefit of the users". Get ready for more firmware- or hardware-level DRM.
Arten Oct 8, 2019
Why is that license plate on a Tesla? Do they want to compete with the NVIDIA Tesla in datacenters?
razing32 Oct 8, 2019
Quoting: GustyGhostIf Intel plans on getting into the consumer dGPU space, they will very likely be strong-armed by Hollywood and friends into implementing a "secure content path", all for "the benefit of the users". Get ready for more firmware- or hardware-level DRM.

OK, that comment flew over my head.
A bit of context, please?
GustyGhost Oct 11, 2019
Quoting: razing32A bit of context please ?

You cannot use your current GPU without loading a proprietary firmware blob which uses encryption to make sure that the GPU ultimately obeys somebody other than you. Part of this (mis)functionality is required for other "protection" schemes such as HDCP. Who would pressure for such (mis)functionality to be integrated into hardware and firmware?
sub Oct 12, 2019
Quoting: GustyGhost
Quoting: razing32A bit of context please ?

You cannot use your current GPU without loading a proprietary firmware blob which uses encryption to make sure that the GPU ultimately obeys somebody other than you. Part of this (mis)functionality is required for other "protection" schemes such as HDCP. Who would pressure for such (mis)functionality to be integrated into hardware and firmware?

Plus, aren't those companies trying to hard-lock the diversification of their platforms?
Providing some features only on one (premium) product and not on others allows them to use (almost) the same silicon for a wide range of products, which can significantly lower costs.

I guess this is also one argument against a fully open stack, for at least some companies. Looking in particular at you, NVIDIA.
razing32 Oct 13, 2019
Thanks.
Guess I learned something new :)