Latest Comments by F.Ultra
A developer from Bohemia Interactive wants to know your interest in the Arma 3 Linux port
6 August 2018 at 4:55 pm UTC

Quoting: x_wing
Quoting: Guest
Quoting: F.Ultra
So Mesa should just focus on being bug for bug compatible with nVidia?

You're focusing on bashing Nvidia again. Let me make it clear.

The Nvidia binary driver on Linux has this behavior
AMD's Catalyst driver on Linux has this behavior
All of the OS X OpenGL drivers - for AMD, Nvidia and Intel have this behavior.
Metal on OS X has this behavior.

MESA is the odd one out.

You can argue semantics all you like, but if the situation was reversed e.g. Nvidia was "following spec" and MESA was "doing the popular thing", you'd be screaming for Nvidia to "fix it".

The question here is: what do the OpenGL specs say about this behavior? Is Marek wrong in his statement? There is no point saying that other drivers work to a project and devs that are very pedantic about what is stated in the papers (you're allowed to quote Torvalds on this behavior :p ).

Is there any way to optimize the workaround they have in radeonsi? It sounds like we have a solution, but it hits performance hard.

According to https://lists.freedesktop.org/archives/mesa-dev/2017-June/160362.html the GLSL spec (which is where this is happening) says that "derivatives are undefined after non-uniform discard", so you could argue that this use is forbidden (if you act like a GCC developer) or that it's a grey area.
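To illustrate why that spec wording matters, here is a conceptual sketch in Python (not driver code, and the lane layout is simplified): GPUs compute dFdx/dFdy per 2x2 pixel quad by differencing neighbouring lanes. Under the DX-style behaviour a discarded lane keeps running as a "helper" so its neighbours still get a valid derivative; under a strict reading of the GL wording, the discarded lane's value is simply undefined.

```python
# Conceptual sketch (not driver code): how a 2x2 pixel quad computes
# a screen-space derivative, and why a discarded lane matters.

def dfdx(quad, helper_lanes=True):
    """Return the x-derivative for lane 0 of a 2x2 quad.

    quad: list of 4 dicts with 'value' and 'discarded' per lane.
    helper_lanes=True  -> DX-style: discarded lanes keep computing,
                          so the neighbour's value is still valid.
    helper_lanes=False -> strict-GL reading: derivatives are
                          undefined after a non-uniform discard.
    """
    left, right = quad[0], quad[1]
    if not helper_lanes and (left["discarded"] or right["discarded"]):
        return None  # undefined per the GLSL spec wording
    return right["value"] - left["value"]

quad = [
    {"value": 1.0, "discarded": False},
    {"value": 3.0, "discarded": True},   # this pixel called discard
    {"value": 1.5, "discarded": False},
    {"value": 3.5, "discarded": False},
]

print(dfdx(quad, helper_lanes=True))   # DX-style behaviour: 2.0
print(dfdx(quad, helper_lanes=False))  # strict GL reading: None
```

A game shader compiled through a D3D-to-GL translation layer can easily end up relying on the first behaviour without anyone noticing, as long as it is only tested on drivers that provide it.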

A developer from Bohemia Interactive wants to know your interest in the Arma 3 Linux port
6 August 2018 at 4:46 pm UTC

Quoting: jens
Quoting: F.Ultra
Quoting: Guest
Quoting: F.Ultra
That it works on the nVidia driver is proof of exactly nothing. nVidia is known for adding support for broken behaviour and the problem for Mesa here is if they are to be following the OpenGL specifications or if they are to allow nVidia to dictate the OpenGL specifications.

Unfortunately many game devs use nVidia so they don't know that they are not following the specs since it "just works".

Yawn. You completely missed the part where I said it works on OS X's GL drivers too then. On ALL GPU's there. Also it worked on Catalyst. Nice one in turning it into an opportunity to bash Nvidia though.

Just so you know.. my main Linux box has an AMD card in it.

If we look at the changelog from Marek when this particular drirc override was implemented, it looks quite clear that this is a bug in all the other drivers that accept this behaviour for OpenGL, yes:

Quote
Rocket League expects DirectX behavior for partial derivative computations after discard/kill, but radeonsi implements the more efficient but stricter OpenGL behavior and that will remain our default behavior. The new screen flag forces radeonsi to use the DX behavior for that game.

I don't see the word "bug" anywhere in your quote. From my own experience (not with graphics APIs though), like @mirv already pointed out, complex specifications will always contain some minor grey areas where differences of interpretation may occur. Combine that with lots of moving targets and the time a specification needs to stabilise during development, and it is rather the exception that something "just" works according to spec ;). I don't know all the details here, but sometimes there is no right or wrong. I guess the Mesa devs came to a similar conclusion, since they implemented a switch for this behavior. I have an NVidia card, but I'm happy that Mesa has got into such good shape recently and that lots of AMD people can enjoy e.g. Feral, VP and other ports.

The switch was added because, due to the DX->OpenGL translation, it was easier to implement the workaround via a drirc setting than to change the behaviour in Rocket League (where the problem that led to the setting originated). The main problem here is that there is a genuine difference in how DX and OpenGL handle this; aka Marek was just being a nice guy here.

You don't see the word "bug" explicitly, but the wording is still "radeonsi implements the ... OpenGL behavior", which cannot be read in any other way than this not being according to the OpenGL specifications. nVidia probably handles it like DX due to similar code paths in their driver. No one is accusing nVidia of deliberately implementing it this way to create problems for mesa/radeon.
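For the curious, per-application overrides of this kind live in Mesa's drirc files. A sketch of what such an entry looks like (the option and executable names below match the Rocket League case as I understand it, but check the drirc shipped with your Mesa version for the exact entry):

```xml
<driconf>
    <device driver="radeonsi">
        <application name="Rocket League" executable="RocketLeague">
            <!-- force the DX-style derivative behaviour for this game -->
            <option name="glsl_correct_derivatives_after_discard" value="true"/>
        </application>
    </device>
</driconf>
```

The point is that the stricter OpenGL behaviour stays the default; only binaries matched by the entry get the DX-style behaviour.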

A developer from Bohemia Interactive wants to know your interest in the Arma 3 Linux port
5 August 2018 at 12:03 pm UTC Likes: 2

Quoting: Guest
Quoting: F.Ultra
That it works on the nVidia driver is proof of exactly nothing. nVidia is known for adding support for broken behaviour and the problem for Mesa here is if they are to be following the OpenGL specifications or if they are to allow nVidia to dictate the OpenGL specifications.

Unfortunately many game devs use nVidia so they don't know that they are not following the specs since it "just works".

Yawn. You completely missed the part where I said it works on OS X's GL drivers too then. On ALL GPU's there. Also it worked on Catalyst. Nice one in turning it into an opportunity to bash Nvidia though.

Just so you know.. my main Linux box has an AMD card in it.

If we look at the changelog from Marek when this particular drirc override was implemented, it looks quite clear that this is a bug in all the other drivers that accept this behaviour for OpenGL, yes:

Quote
Rocket League expects DirectX behavior for partial derivative computations after discard/kill, but radeonsi implements the more efficient but stricter OpenGL behavior and that will remain our default behavior. The new screen flag forces radeonsi to use the DX behavior for that game.

A developer from Bohemia Interactive wants to know your interest in the Arma 3 Linux port
5 August 2018 at 11:53 am UTC Likes: 1

Quoting: Leopard
Quoting: F.Ultra
Quoting: Guest
Quoting: sonic
You say that it is a Mesa bug, but from the Mesa devs' response I am not so sure. And let's be honest, this is mainly caused by the d3d->ogl translation layer.

It is, and the proof is that it does not happen on the Nvidia driver, nor does it happen under MacOS under OpenGL. D3D->GL has nothing to do with it.

That it works on the nVidia driver is proof of exactly nothing. nVidia is known for adding support for broken behaviour and the problem for Mesa here is if they are to be following the OpenGL specifications or if they are to allow nVidia to dictate the OpenGL specifications.

Unfortunately many game devs use nVidia so they don't know that they are not following the specs since it "just works".

Can we stop this please?

Just because of that nonsense attitude, AMD users are still suffering when they try to play games like Dying Light.

Is your goal running all games you have without issues as a gamer or caring about driver hacks etc?

So Mesa should just focus on being bug-for-bug compatible with nVidia? I agree that it's frustrating as a gamer to not have your game work because Mesa refuses to implement a bug that just happens to exist in nVidia's driver, but at the same time the real problem lies in the game: if games were also tested with Mesa on AMD, then a lot of bugs in games could be fixed before launch (which would produce a much higher quality product in the end).

A developer from Bohemia Interactive wants to know your interest in the Arma 3 Linux port
4 August 2018 at 6:27 pm UTC Likes: 4

Quoting: Guest
Quoting: sonic
You say that it is a Mesa bug, but from the Mesa devs' response I am not so sure. And let's be honest, this is mainly caused by the d3d->ogl translation layer.

It is, and the proof is that it does not happen on the Nvidia driver, nor does it happen under MacOS under OpenGL. D3D->GL has nothing to do with it.

That it works on the nVidia driver is proof of exactly nothing. nVidia is known for adding support for broken behaviour and the problem for Mesa here is if they are to be following the OpenGL specifications or if they are to allow nVidia to dictate the OpenGL specifications.

Unfortunately many game devs use nVidia so they don't know that they are not following the specs since it "just works".

Facepunch are no longer selling the Linux version of the survival game Rust (updated)
30 July 2018 at 8:35 pm UTC Likes: 1

Quoting: iiari
I have zero horse in this race, but find it hilarious that the dev implies that:

1) Only the Linux community is toxic (eyeroll). Has he been in, well, almost any other Windows gaming forum at all for the sophomoric, hateful, entitled views there? Hahahaha....

2) Only Linux is a broken platform (eyeroll again). There are so many broken games, compromised engines, etc etc on Windows and, well, every other platform that to pretend it's only us is absurd.

This is because Garry is a Windows guy; he codes on Windows and he uses only Windows. So when he sees toxic comments from Windows people he just doesn't register them, but when the same comes from a Linux user it hits his brain with "well look at that, another toxic Linux luser...".

This is just how the human brain works. If you are a meat-eater and you see a single vegan doing something crazy -> all vegans are crazy. And if you are a die-hard Linux fan, then you can experience one bug every single day without registering it, but try Windows/macOS once, hit a single bug, and "I knew it, this whole system is FUBAR".

Facepunch are no longer selling the Linux version of the survival game Rust (updated)
30 July 2018 at 7:49 pm UTC

Quoting: burningserenity
Quoting: F.Ultra
Quoting: Egonaut
Quoting: nox
Quoting: Egonaut
That's why you don't support that toxic guy Garry and everyone who's working with him.
Meh, he is straight forward, blunt and sarcastic. Toxic isn't really the right word.
Calling Linux (and by that their users) a second class isn't toxic? Yeah sure, defend that guy further if you want, I don't.

But he is correct, the Linux version is a second-class citizen for them. There is nothing toxic about that; "second-class citizen" in computer speak only means that it's a lower priority, it's not a derogatory term.

Here's an idea: by the dev's admission, only 17 people use Linux. So why not just give us the source code? No one else can be bothered to compile it. Hell, if we give him $240 he'll even get to keep the money from the lost sales!

It's not like there is a source for The Linux Version that is also not The Windows Version. I get what you are saying, but this will never happen; there are several examples of publishers that would rather see zero sales than earn some money by selling a license or the source code.

Facepunch are no longer selling the Linux version of the survival game Rust (updated)
27 July 2018 at 5:28 pm UTC Likes: 4

Quoting: Egonaut
Quoting: nox
Quoting: Egonaut
That's why you don't support that toxic guy Garry and everyone who's working with him.
Meh, he is straight forward, blunt and sarcastic. Toxic isn't really the right word.
Calling Linux (and by that their users) a second class isn't toxic? Yeah sure, defend that guy further if you want, I don't.

But he is correct, the Linux version is a second-class citizen for them. There is nothing toxic about that; "second-class citizen" in computer speak only means that it's a lower priority, it's not a derogatory term.

The CTO of Croteam has written up a post about 'The Elusive Frame Timing'
26 July 2018 at 10:21 pm UTC Likes: 1

Quoting: Dunc
From the article:
Quote
Video games have been running at 60 fps since the days of first arcade machines, back in the ‘70s. Normally, it was expected that the game runs at exactly the same frame rate that the display uses. It wasn’t until the popularization of 3D games that we first started accepting lower frame rates.
That's actually a bit misleading. '80s home computers (and consoles) were slow, and they were connected to standard TVs, which obviously used interlaced displays. Games did (usually) run at the full frame rate (i.e., one screen update every two interlaced “fields”), but that was 30 fps in NTSC country, and 25 in PAL-land. 60 (or 50) fps games existed, but they were rare enough for it to be a selling point sometimes. I grew up on 25 fps. Maybe that's why I'm less obsessed with absolute framerate than most people seem to be these days. :) Stuttering drives me nuts, though.

Quote
Way back in the ‘90s, when “3D cards” (that was before we started calling them “GPUs”) started to replace software rendering, people used to play games in 20 fps and considered 35 fps speeds for serious competitive netplay. I’m not kidding.
This is usually the point where I mention that the original F1 Grand Prix ran at a fixed 8 fps on the Amiga. When people started upgrading their machines, they had to apply a third-party patch to unlock the framerate. I still never managed to get it much above 20, mind you. :S:

That said, in general the Amiga was as smooth as silk, at least in 2D. The PC architecture has always seemed to struggle with maintaining a consistent framerate without stuttering. And I'm sure Alen from Croteam is right about the reason. Those older machines' display hardware was much more closely integrated with the display. They were built for NTSC or PAL. Programmers, the programs they wrote, and the hardware that code ran on always knew exactly when the frame was being drawn. They wrote to the vblank, going to crazy lengths sometimes to ensure that the entire game loop ran between frames. Even those 30/25 fps games ran at a consistent 30/25 fps, without tearing or dropped frames.

I hope the fix is as simple as he seems to think.

Exactly this. Back in the day we used to measure everything we coded for demos/games in the amount of raster time it took. On the Amiga I remember that we used to switch on a bitplane before a function and switch it off again after (or if it was a colour in the bitplane, my memory is a bit rusty), to see the actual amount of pixels as a visual benchmark when doing performance measurements.
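As a rough back-of-the-envelope (a Python sketch, assuming standard PAL timing rather than any particular machine): a PAL display runs at 50 fields per second with 312.5 scanlines per field, so one raster line of border colour in that visual benchmark represents about 64 µs of CPU time.

```python
# Back-of-the-envelope: how much time one "raster line" of the old
# visual benchmark represents on a PAL machine (assumed timings).

FIELD_RATE_HZ = 50        # PAL field rate
LINES_PER_FIELD = 312.5   # 625 interlaced scanlines / 2 fields

line_time_us = 1_000_000 / (FIELD_RATE_HZ * LINES_PER_FIELD)
print(f"one raster line ≈ {line_time_us:.1f} µs")  # ≈ 64.0 µs

# So a routine that lights up, say, 20 raster lines of border colour
# took roughly:
routine_lines = 20
print(f"routine ≈ {routine_lines * line_time_us:.0f} µs")  # ≈ 1280 µs
```

Which is why the technique worked so well: the display itself was the stopwatch, with a resolution of one scanline.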

The Paradox Launcher is now available on Linux
20 July 2018 at 10:34 pm UTC

Quoting: Eike
Quoting: F.Ultra
Quoting: Eike
Quoting: F.Ultra
Which is why I included the fact that all agreements you have to sign with Valve in order to publish on Steam is public in my previous post.

What's the share Valve takes off the Steam price?

Paradox claims that it's 30% in their annual reports, which matches what others have said the cut is. It seems to be a magic number, since that is what Google/Apple et al also seem to take in their stores.

If we've got full access to all agreements you have to sign with Valve in order to publish on Steam, shouldn't we have that information first-hand?

What I'm getting at: It seems we don't have that full access. And I wonder if they're under NDA, because it's not like a lot of game makers are telling us clearly what the share is.

Ok, yes I can agree with you on that. It's clearly covered by an NDA, because you have to sign the NDA as part of signing up with the program (and you can see the whole NDA yourself if you click to become a publisher). However, AFAIK there is nothing else that you have to sign, so you are not signing away any publish-in-other-places rights or don't-set-another-price-elsewhere restrictions.

Speculating here, but my guess is that the actual percentage is secret either due to big publishers like EA not paying the full 30%, or due to Valve simply wanting to keep it a trade secret (however well that now works, since everybody is talking about it being 30%).

I've heard that GOG also takes 30% but could not find any information at all on gog.com; in fact there seems to be no public information at all on how publishing works on GOG, but it might also be me that hasn't looked hard enough.

Bottom line is yes, you are correct in that not every single detail is public, but at least every agreement that you have to sign is, and that is what I find more important, since it's only in agreements that you can be legally forced to do certain things.