BioShock Infinite has been released for Linux, and it brings a promise of some excellent FPS action with it.

To get this out of the way first: the port was done by Virtual Programming, so it uses their "eON" technology. The arguments for and against it have been done to death, so let's leave them at the door, please.

This is the third port from Virtual Programming, and it's my favourite of the games they have put out for Linux!

Once Steam decides I'm worthy of a download I will detail my findings, so until then you can either wait, or risk it and buy it.

Intel graphics and the open source drivers are not currently supported for this title, so you have been warned!

RECOMMENDED GRAPHICS DRIVERS
NVIDIA: 340.65
AMD: fglrx 14.12 (currently only Radeon 7xxx and greater series cards are supported)

About the game (Official)
Indebted to the wrong people, with his life on the line, veteran of the U.S. Cavalry and now hired gun, Booker DeWitt has only one opportunity to wipe his slate clean. He must rescue Elizabeth, a mysterious girl imprisoned since childhood and locked up in the flying city of Columbia. Forced to trust one another, Booker and Elizabeth form a powerful bond during their daring escape. Together, they learn to harness an expanding arsenal of weapons and abilities, as they fight on zeppelins in the clouds, along high-speed Sky-Lines, and down in the streets of Columbia, all while surviving the threats of the air-city and uncovering its dark secret.

Check out BioShock Infinite on Steam now, but you may want to hold off for our official report. Article taken from GamingOnLinux.com.
Tags: FPS, Steam
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I kept checking in on its progress until Ubuntu appeared on the scene, which helped me to really love it. You can reach me easily by emailing GamingOnLinux directly.
The comments on this article are closed.
72 comments

dubigrasu Mar 18, 2015
CPU/GPU/RAM usage and benchmarks on SteamOS:

Overall score while recording: 86.55
Overall score running normally: 115.69
Overall score running fullscreen @1920x1080: 71.84

View video on youtube.com
BabaoWhisky Mar 18, 2015
Hum...

BioShock Infinite is a D3D11 game that was ported to OpenGL 4 for Linux, right?
So, if The Witcher 3 really is coming to Linux, I think it will be Virtual Programming who makes the port.
What do you think?
dubigrasu Mar 18, 2015
Quoting: berillions
Hum...

BioShock Infinite is a D3D11 game that was ported to OpenGL 4 for Linux, right?
So, if The Witcher 3 really is coming to Linux, I think it will be Virtual Programming who makes the port.
What do you think?
If they do, I think they'd better crank up their engine for W3 and put some more mojo into it.
No matter how well BI or W2 run now, from what I've seen The Witcher 3 will be a much more demanding game than those two.
EKRboi Mar 19, 2015
Steam finally allowed me to install it today, and I just got around to testing it out. I'm pretty damn impressed. It's not perfect and needs some work, but I'm still impressed.

Intel i7-5930K, 16 GB RAM, GTX 970

When the game is not streaming in textures, it is parked at 60 FPS and utilizing 50-75% of my GPU on the Ultra preset. There is an annoying stutter when it streams in textures; it happens very often, and when it does it tanks the FPS for a second or two.

Now on to my theory of why it is happening: it is hardly loading anything into VRAM. With the game running on one monitor, Steam on another, and a terminal (htop) and nvidia-settings open on the third, I have not seen it use more than 900 MB of my VRAM (200 MB of which is the desktop). It should be far more than that at Ultra settings. So I don't think it is pre-loading textures, or even keeping textures from prior scenes around, and it has to pull those textures from disk every time a scene really changes.

If that is the case, and it's not that nvidia-settings is for some crazy reason reporting VRAM usage wrong for this game only, then I can't imagine they did this on purpose; it has to be a bug that slipped in not long ago. So hopefully it will get fixed quickly. If they can fix that, then I think it is going to be rock solid 60 FPS all day long, and some people (myself included) probably have some words to eat concerning eON...

*I would still prefer a native port, obviously.
dubigrasu Mar 19, 2015
Quoting: EKRboi
When the game is not streaming in textures, it is parked at 60 FPS and utilizing 50-75% of my GPU on the Ultra preset. There is an annoying stutter when it streams in textures; it happens very often, and when it does it tanks the FPS for a second or two.

Now on to my theory of why it is happening: it is hardly loading anything into VRAM. With the game running on one monitor, Steam on another, and a terminal (htop) and nvidia-settings open on the third, I have not seen it use more than 900 MB of my VRAM (200 MB of which is the desktop). It should be far more than that at Ultra settings. So I don't think it is pre-loading textures, or even keeping textures from prior scenes around, and it has to pull those textures from disk every time a scene really changes.

If that is the case, and it's not that nvidia-settings is for some crazy reason reporting VRAM usage wrong for this game only, then I can't imagine they did this on purpose; it has to be a bug that slipped in not long ago. So hopefully it will get fixed quickly.
Well, some of the issues reported for the Linux port I've also seen (and can still see) reported by Windows users, for example the black screen at first boot, occasional hangs, and stuttering, so I'm not sure VP can do anything about those.
I personally never had the hang issue on Windows, but I did have the black screen and the stuttering.

About VRAM usage:
I googled a bit about it, and it seems to boil down to increasing the PoolSize in the "XEngine.ini" file. The default for me was "400" and I raised it to 3000.
(It should be a number around 600 MB lower than your card's total VRAM.)
Previously I never saw the VRAM usage go past 1000 MB; now it at times goes to 2700 MB, but not higher than that. The textures are flushed every time a new level gets loaded.
The stuttering (still present) is also reduced.

It seems that this must be used in conjunction with adding the:
-ReadTexturePoolFromIni
parameter to the game's launch options.
Supposedly it reduces stutter, but I'm still undecided about this one.

There are a bunch of other tweaks (some old) floating around, but "PoolSize" is the only one I've seen have some positive effect.
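If you'd rather script that edit than hunt for the line by hand, here is a minimal, hypothetical Python sketch. The ini path below is an assumption (installs differ), so point it at wherever your copy keeps XEngine.ini before uncommenting the write:

```python
import re
from pathlib import Path

# Assumed location; adjust to wherever your install keeps XEngine.ini.
INI_PATH = Path.home() / ".steam/steam/steamapps/common/BioShock Infinite/XGame/Config/XEngine.ini"

def set_pool_size(ini_text: str, size_mb: int) -> str:
    """Rewrite the PoolSize=... line, leaving everything else untouched."""
    return re.sub(r"(?m)^PoolSize=\d+", f"PoolSize={size_mb}", ini_text)

# To apply, uncomment once INI_PATH points at your real file:
# INI_PATH.write_text(set_pool_size(INI_PATH.read_text(), 3000))  # default is 400
```

Back up the file first; the game may regenerate or overwrite it after an update.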

Edit:
Found some interesting info about how memory is managed (in the DefaultEngine.ini file):
Quote:
[TextureStreaming]
UseTextureFileCache=TRUE
; We now auto calculate the texture pool size on PC.
; The equation is basically TexturePoolSize = Detected video memory - size of frame buffers - estimate for other resource useage like vertex buffers.
; TexturePoolSizeReductionMB is the estimate of how much we'll need for resources than the frame buffers
TexturePoolSizeReductionMB=40
;On PC, pool size only gets used if -ReadTexturePoolFromIni is passed in on the commandline. Otherwise it is auto calculated based on your video card memory.
PoolSize=400
; hard coded "safe" max texture pool size if running in low or very low
LowPCTexturePoolSizeMB=256
EKRboi Mar 19, 2015
Quoting: dubigrasu
Well, some of the issues reported for the Linux port I've also seen (and can still see) reported by Windows users, for example the black screen at first boot, occasional hangs, and stuttering, so I'm not sure VP can do anything about those.
I personally never had the hang issue on Windows, but I did have the black screen and the stuttering.

About VRAM usage:
I googled a bit about it, and it seems to boil down to increasing the PoolSize in the "XEngine.ini" file. The default for me was "400" and I raised it to 3000.
(It should be a number around 600 MB lower than your card's total VRAM.)
Previously I never saw the VRAM usage go past 1000 MB; now it at times goes to 2700 MB, but not higher than that. The textures are flushed every time a new level gets loaded.
The stuttering (still present) is also reduced.

It seems that this must be used in conjunction with adding the:
-ReadTexturePoolFromIni
parameter to the game's launch options.
Supposedly it reduces stutter, but I'm still undecided about this one.

There are a bunch of other tweaks (some old) floating around, but "PoolSize" is the only one I've seen have some positive effect.

Edit:
Found some interesting info about how memory is managed (in the DefaultEngine.ini file):
Quote:
[TextureStreaming]
UseTextureFileCache=TRUE
; We now auto calculate the texture pool size on PC.
; The equation is basically TexturePoolSize = Detected video memory - size of frame buffers - estimate for other resource useage like vertex buffers.
; TexturePoolSizeReductionMB is the estimate of how much we'll need for resources than the frame buffers
TexturePoolSizeReductionMB=40
;On PC, pool size only gets used if -ReadTexturePoolFromIni is passed in on the commandline. Otherwise it is auto calculated based on your video card memory.
PoolSize=400
; hard coded "safe" max texture pool size if running in low or very low
LowPCTexturePoolSizeMB=256

Ha, I remember that now from when the game first came to Windows and it wasn't properly auto-calculating the pool size for some people. The same thing must be happening here. I changed the pool size to 3000, which is more than enough for this game, and used -ReadTexturePoolFromIni in the launch options, and it is now using 1600 MB of VRAM at the beginning of the game (about right; it uses 2.5 GB on Windows running 3-monitor Surround on Ultra), and the MEGA stutter and subsequent MEGA FPS hit are no longer there. There is still a little stutter in some places, but I would consider it playable now. With the way it was acting last night, there would be no way I could play through it.

Thanks for the reminder!
dubigrasu Mar 19, 2015
Great :)

Edit: Still a bit undecided/confused. According to:
Quote:
;On PC, pool size only gets used if -ReadTexturePoolFromIni is passed in on the commandline. Otherwise it is auto calculated based on your video card memory.
I need to use the -ReadTexture... parameter if I want a custom PoolSize.

However, if I do that, the usage never goes beyond 1000 MB for me, despite the PoolSize being set to 3000.
While if I only set a custom (3000) PoolSize without using the -ReadTexture... parameter, the usage easily goes beyond 2000 MB by the end of the level.

I played one level a few times using both methods, and it always seemed somehow better without the -ReadTexture parameter; even the stutter seems to go away.

Hm
adolson Mar 19, 2015
OK dubigrasu, you'll be happy to know that I'm supporting VP by buying directly on Steam. Why, if I dislike VP/eON and didn't care for the game itself? Because of: a) the sale, b) my need to collect Linux games, and c) ultimately I decided that having my purchase count as a Linux sale matters more than anything else.
dubigrasu Mar 19, 2015
Power to you man!
Yeah, I have that need to collect Linux games too. Must be a virus of some sort.
Had no idea about the sale, thanks.
Aule Mar 19, 2015
It's on sale on Steam right now. 75% off!

"Instabuy"! ; - )
EKRboi Mar 20, 2015
Quoting: dubigrasu
Great :)

Edit: Still a bit undecided/confused. According to:
Quote:
;On PC, pool size only gets used if -ReadTexturePoolFromIni is passed in on the commandline. Otherwise it is auto calculated based on your video card memory.
I need to use the -ReadTexture... parameter if I want a custom PoolSize.

However, if I do that, the usage never goes beyond 1000 MB for me, despite the PoolSize being set to 3000.
While if I only set a custom (3000) PoolSize without using the -ReadTexture... parameter, the usage easily goes beyond 2000 MB by the end of the level.

I played one level a few times using both methods, and it always seemed somehow better without the -ReadTexture parameter; even the stutter seems to go away.

Hm

I think I can concur. Like you said, the notes in the ini and what is actually happening seem to conflict. I am getting more VRAM usage (and less stuttering) by simply setting the pool size to 3000 than with the pool size at 3000 AND using the -ReadTexturePoolFromIni launch option.

Sadly, nothing I've tried allows me to run the game on 3 monitors in Linux. I know performance wouldn't be good even on one GTX 970, but I had hoped it would work for the day Nvidia makes SLI on Linux actually work right, or in 4 years when I can run it on 3 monitors with a single GPU. ;(
dubigrasu Mar 20, 2015
Quoting: EKRboi
Quoting: dubigrasu
Great :)

Edit: Still a bit undecided/confused...

I think I can concur. Like you said, the notes in the ini and what is actually happening seem to conflict. I am getting more VRAM usage (and less stuttering) by simply setting the pool size to 3000 than with the pool size at 3000 AND using the -ReadTexturePoolFromIni launch option.
From the horse's mouth (Chris Kline, Tech Director at Irrational):
http://forums.2k.com/showthread.php?222666-Possible-solutions-for-known-issues