With Intel's brand new dedicated GPU due next year, they've begun talking up their efforts to get Linux support in early.
On Twitter, their team highlighted their work:
Our journey toward a new visual computing experience is underway, and that includes a commitment to the #OpenSource community. Local memory implementation is the first of many steps toward robust Linux support for our future discrete graphics solutions.
The post links to this set of patches which reads:
In preparation for upcoming devices with device local memory, introduce the concept of different memory regions, and a simple buddy allocator to manage them. At the end of the series are a couple of HAX patches which introduce a fake local memory region for testing purposes. Currently smoke tested on a Skull Canyon device.
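The patch series itself isn't reproduced here, but the core idea it mentions, a simple buddy allocator managing a memory region, can be sketched in plain C. This is an illustrative toy under my own assumptions, not Intel's actual i915 code; all the names (`buddy_alloc`, `buddy_free`, the sizes) are made up. A buddy allocator hands out power-of-two-sized blocks, splitting larger free blocks on allocation and merging a freed block with its "buddy" (the adjacent block of the same size) on free:

```c
#include <string.h>

#define MAX_ORDER 5              /* region = 2^MAX_ORDER blocks */
#define NBLOCKS (1u << MAX_ORDER)

/* One free list per order; entries are block indices into the region. */
static int free_head[MAX_ORDER + 1];   /* -1 = empty list */
static int next_free[NBLOCKS];         /* singly linked free lists */
static int block_order[NBLOCKS];       /* order of the allocation at idx */

static void push_free(int order, int idx) {
    next_free[idx] = free_head[order];
    free_head[order] = idx;
}

static int pop_free(int order) {
    int idx = free_head[order];
    if (idx >= 0)
        free_head[order] = next_free[idx];
    return idx;
}

static void remove_free(int order, int idx) {
    int *p = &free_head[order];
    while (*p != idx)
        p = &next_free[*p];
    *p = next_free[idx];
}

void buddy_init(void) {
    for (int o = 0; o <= MAX_ORDER; o++)
        free_head[o] = -1;
    memset(block_order, -1, sizeof(block_order));
    push_free(MAX_ORDER, 0);     /* whole region starts as one free block */
}

/* Allocate 2^order contiguous blocks; returns start index or -1. */
int buddy_alloc(int order) {
    int o = order;
    while (o <= MAX_ORDER && free_head[o] < 0)
        o++;                     /* find the smallest big-enough block */
    if (o > MAX_ORDER)
        return -1;
    int idx = pop_free(o);
    while (o > order) {          /* split down to the requested size */
        o--;
        push_free(o, idx + (1 << o));   /* give the upper half back */
    }
    block_order[idx] = order;
    return idx;
}

void buddy_free(int idx) {
    int order = block_order[idx];
    block_order[idx] = -1;
    while (order < MAX_ORDER) {
        int buddy = idx ^ (1 << order);  /* sibling at this order */
        int found = 0;
        for (int p = free_head[order]; p >= 0; p = next_free[p])
            if (p == buddy) { found = 1; break; }
        if (!found)
            break;               /* buddy still in use: stop merging */
        remove_free(order, buddy);
        idx &= ~(1 << order);    /* merged block starts at lower buddy */
        order++;
    }
    push_free(order, idx);
}
```

The appeal for device-local memory is that splitting and merging keep free space in large contiguous chunks, which matters when the GPU wants big physically contiguous allocations.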
Intel have traditionally been pretty great with their Linux support, so this isn't exactly surprising. Even so, it's very pleasing to see them hype this up so we know we're getting first-class support.
It's exciting; we've long needed another horse to enter the race. 2020 is certainly going to be interesting. We've no idea what the target audience will be, though. Hopefully the price will be reasonable.
Could you see yourself buying an Intel discrete GPU?
I'm still rocking my NVIDIA 980 Ti which, thankfully, still has a good amount of time left. I've been considering an AMD GPU for a while, but it seems waiting another year might be worth it.
Quoting: Linas
I bet these will not be for gaming, but rather scaled-up versions of their integrated GPUs. Something that is good for compute and workstation tasks. Will probably see these mostly on business workstations. Pure speculation, though.
Right now I am very happy with AMD. The drivers keep getting better, so I think we have not seen the full potential of what AMD can do on Linux yet.
That's 110% likely going to be the case. Despite us consumers certainly making the GPU manufacturers money, the fact is that the computer/tech industry is primarily focused on the enterprise. Servers alone make up the large majority of Intel's revenue, and Nvidia makes hundreds of thousands, if not millions, per sale on its enterprise products (machine learning, A.I., deep learning, etc.) over its RTX cards. All these products have release cycles that begin with a "workstation" prototype, e.g. Nvidia's Volta GPUs; everything else is basically a cut-down version of it. We get the leftovers, so to speak.
Intel is very, very late to the deep learning/A.I. market, which is the current booming market, but they're sure to start there. This isn't a "more FPS than your competitor" battle.
Quoting: Xaero_Vincent
If GVT-g is supported with their consumer discrete cards like their iGPUs now, then I'll definitely consider one in my next build.

This is EXACTLY what came to mind at first. Let's hope Intel doesn't decide to segment the market and limit the product's capabilities like Nvidia does with the Titan, which is marketed as a research GPU but has none of the features most researchers would need. Give us full control of the hardware and I'm sold.
Last edited by sneakeyboard on 21 February 2019 at 7:53 pm UTC
Quoting: Linas
I bet these will not be for gaming, but rather scaled-up versions of their integrated GPUs. Something that is good for compute and workstation tasks.

On the plus side, Linux has a lot more presence in those spaces than in desktops and gaming. Not so surprising, then, if drivers for Linux work fairly well.
Quoting: appetrosyan
Finally!

I wouldn't say there's a power vacuum. People like to say that, but AMD has always had cards equal to Nvidia's top. Well, usually not the tip top, but the top. Bad pun. The RX Vega 64 was pretty comparable to a 1080 and the Radeon VII is pretty close to the 2080. That leaves Nvidia with one card at the very high end, but I definitely wouldn't consider Vega or the Radeon VII middle of the market.
Intel has had an open source graphics stack for far longer than AMD/ATI. If they can fill the power vacuum left by AMD at the high end, and not let Ngreedya fill it with more overpriced rock, I'm all for it.
I'd still much prefer a RISC-V-style solution from a company that has never abused FOSS licenses and has no proprietary drivers, but Intel is good enough.
Last edited by Scoopta on 22 February 2019 at 1:30 am UTC
Having only two choices is not good for consumers...
There's no such thing as, e.g., an AMD Vega ITX-compatible card; the partners just sell these HUGE monolithic brick cards. Kinda disappointing. We don't all want house-sized PC cases... lol
Quoting: TheRiddick
The primary reason why I haven't bothered with AMD for the past few years is because they don't really do much AIB at the high end anymore.

For me, the fact that their drivers are FOSS is infinitely more important than that, although in fairness I have a full tower, so size is really not of consequence to me.