As Intel ready their dedicated desktop graphics cards for launch, they have released fresh details on the specifications, so here's what's about to arrive. The details came in a short Q&A video; I'll spare you all of it right here, but the video is below if you wish to listen in.
Just like their CPUs, they will be split across different tiers: Intel Arc 3, 5, and 7 graphics, with 7 being "High Performance Gaming", 5 being "Advanced Gaming" and 3 being "Enhanced Gaming". Here are the specs:
When it comes to the higher end though, they're split between 8GB and 16GB. Intel said that partners will mostly ship with 8GB, but their Limited Edition GPU doubles it to 16GB. That's about the only difference they've revealed in the line-up. They also said that each card is fully featured with HDR, variable refresh rate, XeSS and video encoding for AV1 and other popular codecs.
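As a side note on AV1: if you want to check whether a GPU's hardware AV1 encoder is actually exposed on Linux, one way is to look at the VA-API profiles it advertises. Here's a minimal sketch, assuming vainfo (from libva-utils) is installed; the exact profile and entrypoint strings follow typical libva output and may differ by driver, so treat it as illustrative rather than definitive.

```python
# Minimal sketch: check for AV1 hardware encode via VA-API.
# Assumes `vainfo` (from libva-utils) is installed; profile and
# entrypoint names follow typical libva output and may vary by driver.
import subprocess

def has_av1_encode() -> bool:
    try:
        out = subprocess.run(
            ["vainfo"], capture_output=True, text=True, check=True
        ).stdout
    except (OSError, subprocess.CalledProcessError):
        return False  # vainfo missing or no usable VA-API driver
    # Encode entrypoints show up as Enc* next to the profile name.
    return any(
        "VAProfileAV1" in line and "Enc" in line
        for line in out.splitlines()
    )

if __name__ == "__main__":
    print("AV1 hardware encode:", "yes" if has_av1_encode() else "not reported")
```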
You can see their full video below:
Direct Link
Of course, for me personally there are other things that are important now. How would this work with VR on Linux? Would it support async reprojection for VR? Will it support VRR in the same seamless manner as my NVIDIA GPU does G-SYNC now?
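For the VRR question at least, on X11 you can already see whether a driver advertises variable refresh on a connector. A minimal sketch, assuming xrandr is present and the driver exposes a "vrr_capable" output property (amdgpu does today; whether Intel's driver will use the same name is an open question):

```python
# Minimal sketch: see whether a connected display advertises VRR on X11.
# Assumes `xrandr` is available and the driver exposes the "vrr_capable"
# connector property (amdgpu does; other drivers may name it differently).
import subprocess

def vrr_capable_outputs() -> list[str]:
    out = subprocess.run(
        ["xrandr", "--props"], capture_output=True, text=True, check=True
    ).stdout
    capable, current = [], None
    for line in out.splitlines():
        if " connected" in line:
            current = line.split()[0]   # output name, e.g. DP-1
        elif "vrr_capable: 1" in line and current:
            capable.append(current)
    return capable

if __name__ == "__main__":
    print("VRR-capable outputs:", vrr_capable_outputs() or "none reported")
```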
Quoting: Guest
The only way Intel will get my attention is if
#1 their pricing is "pre-pandemic." I'm not paying $1000 for a card that I could buy for $250 two years ago. This is going to be a rough time for Intel, because their costs will be higher and they are going to want to recoup all that R&D and material overhead bump somewhere.
#2 they offer a decent number of outputs. I run 12 screens on my machine and the only GPU that comes close to my needs is a WX6800 with a piddly 6 outs, i.e. I'd still need TWO, and they cost like $6K!
#3 Obviously they need to be on par with other brands' performance. This will probably be a pipe dream given my use case, which means I and others like me are probably not going to care about this, and it won't be competitive in a way that brings choice or better prices to the market as a whole.
#4 The drivers will have to be hella good. Right now, because I run multi-GPU, I can delegate work with no disruption: watching videos on one GPU, surveillance and logs on another, and playing Tarkov on the main GPU, max performance no BS... For a single GPU to catch my eye, it will need to juggle many different video demands with no hiccups.
I definitely agree with your first point, however, while I can't say I disagree, points #2 - #4 feel way out of proportion for a company's first foray into graphics cards. You're talking about running TWELVE screens and splitting workflow between those screens... a task you say can only be handled by two of one specific card. There's no way ARC could come close to that out of the proverbial gate (I'd be floored if wrong). You are most definitely not the "market as a whole" but, rather, a very niche customer.
Like I said, though, I very much agree with your first point. Price-wise, ARC definitely needs to target their pricing very aggressively. Intel is a big company, to be sure, but they are a serious underdog in the graphics market. NVIDIA has decades on Intel, and so does AMD to be fair. Furthermore, AMD, I feel, is heftily outperforming Intel in integrated graphics as well, and Intel has been doing IG for some time.
At minimum, I feel ARC has to be able to handle 1080p, 60Hz, high quality at a cost of $50 to $100 less than either AMD or NVIDIA's comparable cards, and maybe double or triple that price gap at 4K. Intel is fighting a serious uphill battle and their best bet is to target the more budget-conscious gamers willing to take the risk on new hardware.
That said... I remain very much a 1080p/60Hz gamer myself, and still, I probably wouldn't consider ARC for a generation or two.
It does not matter how good the hardware is; if the drivers are not up to par then it's pretty useless. They are very far behind the NVIDIAs and AMDs of this world right now on all three of their mentioned tiers of gaming, so tbh I think waiting till the next round might be smarter.
Last edited by Bumadar on 9 September 2022 at 6:58 pm UTC
I would assume a good part of that is that their integrated graphics aren't exactly targeting anything bleeding edge, but from the comments above, it doesn't seem they are necessarily targeting the high end right away (is there a business reason for that? It doesn't make sense to me).
Either way, I'm interested in seeing more. If they can match the performance of cards released in the last 3-4 years and it's relatively problem-free, I think that would be a huge success.
Quoting: denyasis
from the comments above, it doesn't seem they are necessarily targeting high end right away (is there a business reason for that? It doesn't make sense to me).
I think more of a technical reason: That shit is really, really fucking hard to do. You need to build up expertise over years.
Quoting: Purple Library Guy
Quoting: denyasis
from the comments above, it doesn't seem they are necessarily targeting high end right away (is there a business reason for that? It doesn't make sense to me).
I think more of a technical reason: That shit is really, really fucking hard to do. You need to build up expertise over years.
Apart from that, those higher-end GPUs would take up production resources, because of their huge dies (with extra high failure rates on top), that could be better spent increasing the install base. A higher number of users is important to gain better dev support.
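To put some rough numbers on the die-size point: under a simple Poisson yield model, the number of good dies per wafer drops quickly as dies get bigger. A small sketch with entirely hypothetical defect density and die sizes (not Intel's actual process data):

```python
# Illustrative sketch of why big dies hurt: a simple Poisson yield model
# (yield = exp(-defect_density * die_area)). Numbers are hypothetical,
# not Intel's actual process data.
import math

WAFER_AREA_CM2 = math.pi * (30.0 / 2) ** 2  # 300 mm wafer, edge loss ignored
DEFECTS_PER_CM2 = 0.1                        # assumed defect density

def good_dies(die_area_cm2: float) -> float:
    dies_per_wafer = WAFER_AREA_CM2 / die_area_cm2
    yield_rate = math.exp(-DEFECTS_PER_CM2 * die_area_cm2)
    return dies_per_wafer * yield_rate

for area in (2.0, 4.0, 6.0):  # small, mid, large die in cm^2
    print(f"{area:.0f} cm^2 die -> ~{good_dies(area):.0f} good dies per wafer")
```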
My R9 380X is really showing its age, not because of the games I play (mostly CS and some older games like Gothic/Risen), but because of the lack of ray tracing and compute.
AMD's ROCm sadly isn't working, and I believe that's partly due to the more closed-source approach there. Intel's compute libraries with oneAPI, on the other hand, are more open source, but I am not sure about the support of third-party libraries there. Support in e.g. PyTorch and Blender would be great.
And I would just love to experiment with ray tracing for some of my own game engine experiments.
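For what it's worth, Intel's route into PyTorch goes through the intel_extension_for_pytorch package, which registers Intel GPUs as an "xpu" device. Here's a minimal, untested sketch of what using it might look like; the package name and device type are the ones Intel documents, but I haven't run this on Arc hardware:

```python
# Minimal sketch: run a small tensor op on an Intel GPU through PyTorch.
# Assumes the intel_extension_for_pytorch package, which exposes Intel
# GPUs to PyTorch as the "xpu" device type; falls back to CPU otherwise.
import torch
import intel_extension_for_pytorch  # noqa: F401  (registers the "xpu" device)

device = "xpu" if torch.xpu.is_available() else "cpu"
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
print(f"matmul on {device}:", (a @ b).sum().item())
```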
Quoting: KohlyKohl
I just read some rumors that Intel cancelled their Arc dedicated GPUs already. Would be disappointing if they gave up already as I was looking forward to more competition in the GPU market.
Intel already pushed back on this rumor.
I would like them as a competitor in the GPU market, but they are far from becoming one.
I think we'll probably see cards able to compete in the mid-to-higher tiers in five years or further in the future.
Let's see if they have the will to make that investment.