
The Palit GeForce GTX970 Jetstream 4GB Is A Monster

I finally decided which graphics card to upgrade to: the Palit GeForce GTX970 Jetstream 4GB, and wow, what a card performance-wise.

I rarely upgrade my PC, but with the increasing number of AAA games it was becoming necessary, especially as my 560 Ti has been giving me lower frame-rates than I want for a smooth experience. Unity games in particular are quite resource hungry, so this should do me for quite some time, I hope.

A shot of the new beast sitting in the box:
[image]

Here's a look at frame-rates across a few select games: the FPS ranges (lowest to highest) I seem to get at different scenes.

These aren't official, automated benchmarks, just a general look at how good the card is in my real-world testing: proper use of the card to see how it performs for a gamer.

Tested together with an Intel i5 4670K @ 3.4GHz, 8GB DDR3 RAM and Linux Mint 17.1 Cinnamon 64-bit, using the 346.35 Nvidia driver.

Borderlands 2
1920x1080, High settings, 4X AF

560ti: 37-51
970: dipped to 38 for a split second, then mostly ~55-130

Metro 2033 Redux
Detail level set to High.

560ti: 30-40
970: 113-160

The Witcher 2
1920x1080, High Settings, V-Sync Off

560ti: 23-37FPS (Noticeably sluggish too)
970: 53-99 (HOORAY! I can finally play it!)

Unigine Heaven
This was only tested on the 970 due to time and impatience on my part (I want to play games!).

On Ultra settings, Extreme tessellation and 8X AA it managed a minimum of 30.5 FPS and a maximum of 79.5. I think that's amazing for something so demanding, and it again shows how good the card really is.

Annoyingly, I had to buy a new power supply unit to go with it, as the 970 made my original 500W Cooler Master squeal like a pig on helium (coil whine), which was extremely annoying. I initially thought the new graphics card was making all the noise, but after listening carefully while taking the PSU out of the case I found it was the culprit. So this became a rather expensive upgrade, but it should make my Livestreams and game-play videos much smoother.

It also seems to have lights on it, so that's something. Not sure why anyone wants fancy lights and stuff like that, but it was an interesting surprise:
[image]
Yes, I know my cables are a mess, and I'm okay with that.

Verdict: Bloody buy it. Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly came back to check on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
The comments on this article are closed.
24 comments

Plintslîcho Jan 18, 2015
I'm looking forward to the GTX960.
The bigger cards consume too much power.
minty Jan 18, 2015
Hi Liam,

I was just wondering what steps you took to install the latest driver in Linux Mint?

Thanks
Styromaniac Jan 18, 2015
It's good to know the coil whine is coming from the PSU being stressed. I plan on a new PSU anyway. I put too much faith in someone knowing their hardware brands, and ended up finding out that the PSU I got in the hastened case swap with them was an unreliable brand.
Liam Dawe Jan 18, 2015
Quoting: mintyHi Liam,

I was just wondering what steps you took to install the latest driver in Linux Mint?

Thanks

I've been using this PPA
https://launchpad.net/~mamarley/+archive/ubuntu/nvidia
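For anyone following along, installing from a Launchpad PPA like that one generally looks like the commands below. This is just a sketch: the exact driver package name depends on what the PPA ships for your release, so check the PPA page first (here I assume `nvidia-346`, matching the driver version used in the article).

```shell
# Add the PPA (the short form of the Launchpad URL above)
sudo add-apt-repository ppa:mamarley/nvidia

# Refresh the package lists so the new PPA is picked up
sudo apt-get update

# Install the driver package -- the name may differ per PPA/release
sudo apt-get install nvidia-346
```

A reboot afterwards is the safest way to make sure the new kernel module is actually loaded.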
Beamboom Jan 18, 2015
How do you get more than 60 frames in The Witcher? It looks to be capped at 60 for me; even indoors it stops at 60 spot on. And I've got a screen with a refresh rate of 120, so that should be no bottleneck.

Also, a framerate of 160?! What kind of screen have you got to handle that high a refresh rate? Or are refresh rate and fps not related? You can't get more frames per second than the max refresh rate of the monitor, can you?
MayeulC Jan 18, 2015
Quoting: HamishAnd here I am still thinking of my recently acquired Sapphire Radeon HD 6870 as a massive monster, but then of course my demands are much smaller than yours... ;)

http://www.sapphiretech.com/presentation/product/?pid=1270

Yeah, exactly the same here, I chose a 6870 (not Sapphire, though) to replace my 4850. I still get abysmal performance with radeon drivers in Serious Sam 3, though.

Quoting: Beamboomare not refresh rate and fps related? You can't get a higher frame per second than the max refresh rate of the monitor, can you?

Not entirely. You can have your graphics card drawing frames faster than the monitor refreshes; that's not a problem, but it can in some cases produce tearing.
It is often advised (depending on the user's preferences) to enable v-sync, or at least limit the framerate to the monitor's refresh rate, to lower your energy consumption/noise/temperature/whatever. This framerate limit is often enabled by default (at least on open source drivers).
lave Jan 19, 2015
Quoting: BeamboomAlso, a framerate of 160?! What kind of screen have you got to handle that high refresh rate? Or are not refresh rate and fps related?
Yep, they are not related. The refresh rate of your monitor and the frames per second your PC can pull off are two entirely different things that by default don't affect each other at all.
Now there is vsync, a GPU feature that delays and caps fps to your monitor's Hz to force it to display one full frame each refresh (the effect that happens without vsync is called tearing). Vsync however has major drawbacks: it adds a significant amount of input lag and decreases fps just by itself. Plus, if your rig cannot handle a stable 1:1 fps-to-Hz ratio (120fps in your case), then the input lag gets even worse.

If you own a 120 or 144Hz monitor then just disable vsync completely, as tearing is not much of an issue on those monitors. Also don't cap your fps, as higher fps will always(!) improve your game, even if it's just input lag. (And if for some reason you have to cap, pick 120.)
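For what it's worth, on the proprietary NVIDIA driver you can also override vsync per game from the command line, without touching in-game settings, via the driver's documented `__GL_SYNC_TO_VBLANK` environment variable (the game binary name below is just a placeholder):

```shell
# Disable driver-level vsync for a single game launch (NVIDIA proprietary driver)
__GL_SYNC_TO_VBLANK=0 ./game-binary

# Or force it on, if a game tears and has no vsync option of its own
__GL_SYNC_TO_VBLANK=1 ./game-binary
```

In Steam the same thing can go in a game's launch options as `__GL_SYNC_TO_VBLANK=0 %command%`.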
oldrocker99 Jan 19, 2015
My 750ti will have to do for the time being :D .
Kallestofeles Jan 19, 2015
Congratulations on the purchase, seems like an excellent buy!
Now that the 9xx series is out, I feel kind of sad that I got my 770GTX 4GB earlier this year... though performance on that one has not disappointed me yet - besides Unity engine games... but those are crap to run with any specs anyways.
Beamboom Jan 19, 2015
Thanks a lot for your replies, lave and MayeulC! Now I understand. Also, I'm pretty sure I have vsync turned on in The Witcher - I've always turned it on, as I knew the relation between refresh rate and screen tearing - but now I'll pump up the refresh rate and turn off vsync instead :).
Also, the Nvidia drivers always set the resolution to "auto", and it appears that sets the refresh rate to 60 - which explains why The Witcher seems to be capped at 60 for me.

Thanks again for that info guys, I've always wondered about that stuff.

(I'm on an ASUS GeForce GTX 680 2GB btw. Enjoying The Witcher on high settings @ HD res with a decent framerate (typically 40ish, up to the mentioned 60 "cap").)
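If you want to check what refresh rate X is actually running at, and force 120Hz instead of the driver's "auto" pick, `xrandr` can do it. A sketch below: the output name `DP-0` is just an example, yours may be `HDMI-0`, `DVI-I-1`, etc. (the first command lists them).

```shell
# List connected outputs and their available modes;
# the currently active mode is marked with a *
xrandr --query

# Force 1920x1080 at 120Hz on a specific output
# (replace DP-0 with the output name from the query above)
xrandr --output DP-0 --mode 1920x1080 --rate 120
```

Alternatively, the same mode/rate choice can be made in nvidia-settings instead of leaving the resolution on "auto".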