The developer of VK9, another rather interesting compatibility layer, has announced another completed milestone.
Much like DXVK, it aims to translate Direct3D over to Vulkan; while DXVK focuses on D3D11 and D3D10, the VK9 project targets D3D9.
Writing about hitting the 29th milestone, the developer said this on their blog:
VK9 has reached its 29th milestone. Reaching this milestone required further shader enhancements including support for applications which set a vertex shader but no pixel shader. In addition to this there are a number of fixes including proper NOOVERWRITE support which fixed the graphical corruption in UT99. This release also no longer depends on the push descriptor extension so VK9 should now be compatible with the closed source AMD driver. New 32bit and 64bit builds are available on GitHub.
It's always fun to watch these projects progress; give it another year and it will be exciting to see what can be done with it. They have quite a few milestones left to achieve, which you can find on the Roadmap.
Find it on GitHub.
Quoting: Cybolic
P.S. A fun anecdotal video is one from Linus Tech Tips (4K Gaming is Dumb) where even some of their people couldn't spot any difference between 60/144/240Hz whilst playing Doom 2016.
Totally agree. Refresh rate > resolution. And optimum today is indeed 2560 x 1440 / 144 Hz (with adaptive sync and LFC). Hopefully Linux will support that soon.
Last edited by Shmerl on 14 January 2019 at 4:04 am UTC
Quoting: Shmerl
Quoting: Cybolic
P.S. A fun anecdotal video is one from Linus Tech Tips (4K Gaming is Dumb) where even some of their people couldn't spot any difference between 60/144/240Hz whilst playing Doom 2016.
Totally agree. Refresh rate > resolution. And optimum today is indeed 2560 x 1440 / 144 Hz (with adaptive sync and LFC). Hopefully Linux will support that soon.
I'm running 100Hz with G-Sync enabled (on 3440x1440) and it's working fine for the most part; a few games need a helping hand, but it's mostly smooth sailing. Are you referring more to an available open / AMD solution or is 100Hz really the limit?
Quoting: Cybolic
I'm running 100Hz with G-Sync enabled (on 3440x1440) and it's working fine for the most part; a few games need a helping hand, but it's mostly smooth sailing. Are you referring more to an available open / AMD solution or is 100Hz really the limit?
GPUs simply aren't powerful enough to run games at something like 144 Hz at 4K. The video above makes this point quite well. 2560 x 1440 / 144 Hz matches current generation hardware, at least if we are talking about a single-GPU solution.
Quoting: Shmerl
Quoting: Cybolic
I'm running 100Hz with G-Sync enabled (on 3440x1440) and it's working fine for the most part; a few games need a helping hand, but it's mostly smooth sailing. Are you referring more to an available open / AMD solution or is 100Hz really the limit?
GPUs simply aren't powerful enough to run games at something like 144 Hz at 4K. Video above makes this point quite well. 2560 x 1440 / 144 Hz matches current generation hardware, at least if we are talking about single GPU solution.
Absolutely true for 4K, but 2.5K (or whatever one chooses to call 3440x1440) at 100Hz is manageable for many games (though 144Hz would be pushing it), as long as they're not too graphically/computationally expensive; luckily, the only one that's far from the mark in my library is "Total War: Warhammer", which doesn't absolutely require 100Hz anyway.
Quoting: Cybolic
though 144Hz would be pushing it
That's where adaptive sync should be helpful. Say your monitor's sync range is 40 - 144 Hz; anything in the range of 40 - 144 fps should then run smoothly.
And then there is also LFC (low framerate compensation). You should try getting monitors with that. It basically means that when the frame rate drops below the sync range, the monitor repeats frames at a multiple of the frame rate to prevent tearing. I.e. say your game produces 30 fps, which is below the adaptive sync range: the monitor will run at 60 Hz (which is already in range). So that covers framerates between 20 and 40 fps; anything below 20 fps is unplayable anyway, so not an issue.
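The frame-multiplication idea behind LFC can be sketched roughly like this (a toy model for illustration only, not any driver's actual logic; the function name and the 40 - 144 Hz defaults are made-up assumptions):

```python
import math

def lfc_refresh_hz(fps: float, sync_min: float = 40, sync_max: float = 144) -> float:
    """Hypothetical model of the refresh rate an LFC-capable monitor would pick."""
    if fps >= sync_min:
        # Frame rate is already inside the adaptive sync window:
        # the panel simply refreshes once per frame (capped at the panel maximum).
        return min(fps, sync_max)
    # Below the window: repeat each frame enough times to land back in range.
    multiplier = math.ceil(sync_min / fps)
    return fps * multiplier

# A 30 fps game on a 40-144 Hz panel gets each frame shown twice -> 60 Hz.
print(lfc_refresh_hz(30))   # 60
print(lfc_refresh_hz(100))  # 100
print(lfc_refresh_hz(15))   # 45 (each frame shown three times)
```

This is why the quoted 20 - 40 fps band is covered: doubling any rate in that band lands between 40 and 80 Hz, inside the sync range.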
Here is a good table I've found recently: https://www.amd.com/en/products/freesync-monitors
Filter by resolution, sync range and LFC.
Hopefully all this will be supported on Linux this year.
Last edited by Shmerl on 14 January 2019 at 4:50 pm UTC
Quoting: Shmerl
Quoting: Cybolic
though 144Hz would be pushing it
That's where adaptive sync should be helpful. Let's say you have monitor sync range 40 - 144 Hz. So anything in the range of 40 - 144 fps should be running smoothly.
[...]
Hopefully all this will be supported on Linux this year.
G-Sync already seems to do this quite well for me; I was actually referring to the FPS, not the Hz - sorry for the mess-up :)
Quoting: Cybolic
Quoting: Hubro
Quoting: Cybolic
It's highly subjective. In general, humans perceive anything over 25/30 FPS as "continuous" and anything over 60 FPS as "smooth" but most can distinguish between 30 and 60 FPS and quite a few can recognise changes between 60 and 120 FPS. Above that, things get extremely subjective and most people can't see any difference.
Dude no, that's absolutely not true. If you're talking about watching movies you might be right, but the extra responsiveness and smoothness you get from higher frame rates when gaming is *extremely* noticeable. The difference between 60hz and 120hz when gaming is MASSIVE. I can say that from personal experience and the testimony of everyone I know who's tried a 120hz monitor. If you disagree, just try playing Counter Strike on a PC with a mouse and moving your crosshair back and forth quickly. If you honestly can't tell the difference at that point then you must have some kind of medical condition, or just terrible eyesight. I would consult a doctor (or optician, respectively.)
I found the jump from 120hz to 165hz very noticeable as well, although less so than 60 to 120. In my uneducated opinion, the difference in smoothness in some situations (like quickly turning 180 degrees in a first person shooter) will probably be somewhat noticeable up to around 240hz, maybe even further. I'd have to try it myself to be sure.
(Also if your entire comment was about *seeing* a difference, not *feeling* a difference while gaming, then I apologize in advance. A high frame rate is much less important when you're just watching the screen and not interacting in any way.)
I'm not speaking for myself. As I said in the first sentence, it's highly subjective; I can absolutely both see and feel the difference between 60 and 120Hz/FPS. If you look around at blind tests and consumer reports on monitors however, you'll see that a surprising number of people don't notice any difference between a 60Hz and a 120Hz monitor, even in gaming tests.
Again "quite a few can recognise changes between 60 and 120 FPS", you and I included, but it's not everyone who can.
P.S. A fun anecdotal video is one from Linus Tech Tips (4K Gaming is Dumb) where even some of their people couldn't spot any difference between 60/144/240Hz whilst playing Doom 2016.
I'm sure you can get people to say they don't notice a difference in *certain circumstances* - A picture of a gray rock will look very similar whether or not the image has color. If you set up blind tests using a picture of a gray rock, I'm sure you'd get numbers saying lots of people don't notice the difference between grayscale and color images. Playing certain games (like a slow paced strategy game or playing with a controller) will feel very similar on a 60hz and a 120hz monitor.
But put somebody in front of a sufficiently strong PC with a mouse and a first person shooter and the difference between 60hz and 120hz will be as stark as the difference between a grayscale and color image of a carnival.