
Intel Arc A770 GPU releases October 12th

Along with announcing their new Raptor Lake desktop CPUs, Intel has also finally announced the release date of their Arc A770 GPU: October 12th. This is their top-end GPU, which they claim "delivers 65% better peak performance versus competition on ray tracing".

Coming in at $329, it's priced in line with the RTX 3060, which they previously compared it against. They're facing some fierce competition, with NVIDIA and AMD already well established in the market, although given the insane prices on NVIDIA's Ada Lovelace cards, perhaps more people will be looking at alternatives?

Very little else was really said about it that I can find, beyond what they've previously revealed, which is pretty much everything: all the specifications are already available on their website. As a refresher, their Limited Edition is mainly the card with 16GB of GDDR6, and they previously said most of their partners will do it in 8GB.

You can find out more on the Intel website. The XeSS SDK is also now available on GitHub.
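
For anyone wanting to poke at XeSS itself, the SDK's sample flow is simple enough: create a context against your D3D12 device, initialise it with the output resolution and a quality preset, then each frame hand it the low-resolution colour, motion vectors and the camera's sub-pixel jitter, and it writes the upscaled image. Here's a minimal sketch based on my reading of the public SDK samples; treat the exact struct fields and names as approximate rather than authoritative:

```cpp
// Rough sketch of XeSS D3D12 integration, following the flow in the
// public SDK samples. Names are taken from the SDK headers as best I
// can tell -- check xess/xess_d3d12.h before relying on them, and check
// each call's return against XESS_RESULT_SUCCESS in real code.
#include <xess/xess_d3d12.h>

xess_context_handle_t xess_ctx = nullptr;

void init_xess(ID3D12Device* device, uint32_t out_w, uint32_t out_h)
{
    // One context per device; re-init if the output resolution changes.
    xessD3D12CreateContext(device, &xess_ctx);

    xess_d3d12_init_params_t init_params = {};
    init_params.outputResolution = { out_w, out_h };             // final presented size
    init_params.qualitySetting  = XESS_QUALITY_SETTING_BALANCED; // render-scale preset
    xessD3D12Init(xess_ctx, &init_params);
}

void upscale_frame(ID3D12GraphicsCommandList* cmd_list,
                   ID3D12Resource* color, ID3D12Resource* velocity,
                   ID3D12Resource* output,
                   uint32_t in_w, uint32_t in_h,
                   float jitter_x, float jitter_y)
{
    // Per-frame: low-res colour + motion vectors in, high-res image out.
    xess_d3d12_execute_params_t exec_params = {};
    exec_params.pColorTexture    = color;
    exec_params.pVelocityTexture = velocity;
    exec_params.pOutputTexture   = output;
    exec_params.jitterOffsetX    = jitter_x;  // this frame's sub-pixel camera jitter
    exec_params.jitterOffsetY    = jitter_y;
    exec_params.inputWidth       = in_w;
    exec_params.inputHeight      = in_h;
    xessD3D12Execute(xess_ctx, cmd_list, &exec_params);
}
```

Part of why this matters for a three-vendor market: XeSS runs on Arc's XMX units but also has a DP4a fallback path, so it isn't locked to Intel hardware.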

I'm keen to see how Intel do with their first proper lot out, as a third vendor like this has been needed in the GPU space. With open source drivers too, there's a lot for Linux users to like about it.

25 comments

ElamanOpiskelija Sep 27, 2022
I want Intel to succeed so bad.
BTW, off-topic, but it's just impressive how bad Fortnite and Cyberpunk are as games. And Cyberpunk, at least, has the excuse of beautiful graphics.
ElectricPrism Sep 27, 2022
I'm thinking of buying one for a server or something. I very much hope the GPU war of the future is AMD vs Intel and Nvidia can go away (mostly) -- which, since EVGA is no longer making NVIDIA cards, seems likely, as they were #1.

I've phased out nearly all Intel CPUs as I've been in a 'Thank You AMD' phase, so this is a weird turn for me -- I just acknowledge Intel has had an open source driver since forever.

Open source drivers dictate a much higher probability that I will buy something.


Last edited by ElectricPrism on 27 September 2022 at 8:13 pm UTC
Purple Library Guy Sep 27, 2022
I can tell I have an instinctive thing for the underdog. Seeing the two headlines about Intel one atop the other: on the graphics card one, where Intel is the small player, I was thinking "Go Intel!", but on the CPU one, where Intel is the big player, I was thinking "Boo, Intel!" It felt kind of weird to whiplash between those two in one second flat like that.
kit89 Sep 27, 2022
I wish they had stuck it in the $200-250 price bracket.
pageround Sep 27, 2022
XeSS is a sweet name, good job whoever came up with that. (I'm choosing to pronounce it as 'excess').
I think the price may be okay given good driver support; I'll keep an eye out for benchmarks.


Last edited by pageround on 27 September 2022 at 9:26 pm UTC
dpanter Sep 27, 2022
Intel entering the graphics card market at what is literally the worst possible time in the history of the market, without properly working drivers or software, no actual USP, pricing that has no chance against the price drops from the competition's next-gen lineups releasing around the same time, and the presumed flood of mining GPUs... what can go wrong?

I want Intel to succeed, but I'm sadly expecting disaster.
STiAT Sep 27, 2022
I am all in for Intel as another competitor.

But I think their hardware will need a few generations to really close the gap.

As will the drivers. They do not have a good track record there; I hope this initiative will get them there.

Nvidia opened up considerably, AMD is great, and pressure from the Intel side will certainly help.
sarmad Sep 27, 2022
> I want Intel to succeed so bad.
> BTW, off-topic, but it's just impressive how bad Fortnite and Cyberpunk are as games. And Cyberpunk, at least, has the excuse of beautiful graphics.

and Fortnite has the excuse of supporting split screen in an age where game developers think that nobody should be making physical friends anymore :(
Or maybe they think everyone should just buy a Steam Deck 🤔
denyasis Sep 28, 2022
I'm definitely interested. But I want to see some good, solid benchmarks. I know ray tracing is all the rage, but I'd really like to see how it compares to cards from the last few years in general gaming.

If it's a good step up from, say, my 1070 Ti and has a decent cooler, at that price, that's an upgrade I can afford!


Last edited by denyasis on 28 September 2022 at 12:52 am UTC
Xaero_Vincent Sep 28, 2022
VKD3D-Proton is pretty much unusable under ANV at this time, so that will put the brakes on any Linux adoption for AAA gaming.


Last edited by Xaero_Vincent on 28 September 2022 at 1:11 am UTC
redman Sep 28, 2022
> I'm thinking of buying one for a server or something. I very much hope the GPU war of the future is AMD vs Intel and Nvidia can go away (mostly) -- which, since EVGA is no longer making NVIDIA cards, seems likely, as they were #1.
>
> I've phased out nearly all Intel CPUs as I've been in a 'Thank You AMD' phase, so this is a weird turn for me -- I just acknowledge Intel has had an open source driver since forever.
>
> Open source drivers dictate a much higher probability that I will buy something.

Nvidia is not going anywhere... They rule the AI and data science world with CUDA; AMD is second class there, so to speak, though at least it has something. Perhaps Intel and AMD can take the gamer market and the low cost market, but where the big bucks are spent is on those A100s.

Just my humble opinion!
psycho_driver Sep 28, 2022
> I wish they had stuck it in the $200-250 price bracket.

I think this is where the 580 will eventually settle? I agree, though; this used to be the sweet spot for price/performance in GPUs, and since the bitcoin mining craze it's been a dead zone.


Last edited by psycho_driver on 28 September 2022 at 4:25 am UTC
Phlebiac Sep 28, 2022
> Someone, please explain that Intel math in the first picture, where 45 and 47 stand next to each other and 47 is somehow 25% over 45.

I think they were trying to say that their benchmark score for that game improved by 25% in their latest beta drivers. In other words, their performance was terrible in that scenario, and they figured out how to fix it, rather than dropping it off the comparison list. ;-)
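The arithmetic does work out for that reading, at least: if 47 is the post-fix score and represents a 25% gain, the earlier drivers would have scored roughly 47 / 1.25 ≈ 37.6, well below the competitor's 45.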
jordicoma Sep 28, 2022
I would like to see an Imagination (the old PowerVR) desktop card. I don't know how good the Linux drivers are.
https://www.imaginationtech.com/products/gpu/img-cxt-gpu/
They invented the hardware ray tracing accelerator, and if it works for mobile, it could probably scale up to desktop with good power efficiency.
llorton Sep 28, 2022
Keeping an eye on the A380, I wonder how it compares to my RX560 and if it could be an upgrade.
Phlebiac Sep 28, 2022
> I would like to see an Imagination (the old PowerVR) desktop card.

They tried that years ago; it was a total flop. Of course, the same could be said for Intel...
sarmad Sep 28, 2022
> I'm definitely interested. But I want to see some good, solid benchmarks. I know ray tracing is all the rage, but I'd really like to see how it compares to cards from the last few years in general gaming.
>
> If it's a good step up from, say, my 1070 Ti and has a decent cooler, at that price, that's an upgrade I can afford!

I would say don't count on ray tracing just yet. I was trying Quake the other night and was getting around 500fps without ray tracing on my RTX 3060 laptop. After switching on ray tracing, that went down to around 50fps. This was on an FHD screen. On a WQHD (3440x1440) screen, that number went down to around 25fps. So the penalty you pay for RT is still pretty large, and can probably only be afforded on the RTX 40 series or something. If you have a big screen (a big TV, for example), then 4K resolution + a higher frame rate gives you a better payoff than ray tracing.
denyasis Sep 29, 2022
> I'm definitely interested. But I want to see some good, solid benchmarks. I know ray tracing is all the rage, but I'd really like to see how it compares to cards from the last few years in general gaming.
>
> If it's a good step up from, say, my 1070 Ti and has a decent cooler, at that price, that's an upgrade I can afford!
>
> I would say don't count on ray tracing just yet. I was trying Quake the other night and was getting around 500fps without ray tracing on my RTX 3060 laptop. After switching on ray tracing, that went down to around 50fps. This was on an FHD screen. On a WQHD (3440x1440) screen, that number went down to around 25fps. So the penalty you pay for RT is still pretty large, and can probably only be afforded on the RTX 40 series or something. If you have a big screen (a big TV, for example), then 4K resolution + a higher frame rate gives you a better payoff than ray tracing.

Thanks for the info! I must offer my apologies as well; I realize my post was not written well. I'm not really interested in ray tracing, but more in general purpose gaming benchmarks. I should have been clearer about what I'm looking at. Ray tracing is neat and all, but like you indicated, the tech doesn't really seem ready for general use.
Matombo Sep 29, 2022
XeSS. I'm gonna call it Xtreme-SS! Oh wait ... I shouldn't do that ...
syylk Sep 29, 2022
> I'm thinking of buying one for a server or something. [...]
>
> Nvidia is not going anywhere... They rule the AI and data science world with CUDA; AMD is second class there, so to speak, though at least it has something. Perhaps Intel and AMD can take the gamer market and the low cost market, but where the big bucks are spent is on those A100s.
>
> Just my humble opinion!
Glad someone mentioned it.

Nvidia is printing money in the tensor core field. They absolutely dominate the HPC/Top500 rankings, and on their homepage "AI" is far more prominent than games. If you check it right now, besides the GTC announcements, the "Solutions" and "For You" dropdowns all suggest what nV is concentrating on.

At this point in time, their gaming products are just a side-effect of their main focus, not the other way around.