The CTO of Croteam has now written up an interesting post, aimed at the more casual reader, about 'The Elusive Frame Timing'.
The issue, which has been around for years, is that a game may run at 60FPS and yet still stutter. It's not a problem with one particular developer either; many games suffer from the same issue. The post is written to go along with the talk they did at GDC this year (slides here). What's interesting is how long they've been researching this and trying to solve it.
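To see why an average frame rate can hide the problem, here's a tiny illustration of my own (nothing to do with the article's code): a run of frames that averages out to 60 FPS while individual frames swing well past the ~16.7 ms budget, which is exactly what you feel as stutter.

```c
/* Minimal sketch (not from the article): the average FPS can look fine
 * while individual frame intervals vary wildly, which reads as stutter. */
#include <stdio.h>

int main(void)
{
    /* Hypothetical per-frame durations in milliseconds: they average
     * out to ~16.7 ms (60 FPS), but alternate between short and long. */
    double frame_ms[6] = { 12.0, 21.4, 12.0, 21.4, 12.0, 21.4 };
    double total = 0.0, worst = 0.0;

    for (int i = 0; i < 6; ++i) {
        total += frame_ms[i];
        if (frame_ms[i] > worst)
            worst = frame_ms[i];
    }

    double avg_fps = 1000.0 / (total / 6.0);
    printf("average: %.1f FPS\n", avg_fps);   /* ~60 FPS */
    printf("worst frame: %.1f ms\n", worst);  /* well over the 16.7 ms budget */
    return 0;
}
```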
They show two videos to highlight the problem: the first is this one, which should run perfectly smoothly, and this one shows what they're calling "Heartbeat" stutter. Having two monitors next to each other and playing both at the same time made it really easy to see. To make it easier to spot, they did another video with the two spliced together, the smooth version on top, and you can easily notice the difference, especially as it looks like it's tearing in the middle.
It seems Vulkan is a bit ahead of the game here with the VK_GOOGLE_display_timing extension, which Croteam have been experimenting with, although support for it is currently quite limited.
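For the curious, the extension essentially lets an application ask how long one display refresh cycle takes, learn when previously queued images actually reached the screen, and request an earliest presentation time for the next image. Below is a minimal sketch of that shape, assuming a VkDevice, VkSwapchainKHR and VkPresentInfoKHR were set up elsewhere with the extension enabled; it is not Croteam's code, just an illustration of the extension's documented entry points.

```c
/* Sketch of the shape of VK_GOOGLE_display_timing (an illustration,
 * not Croteam's code). Assumes `device`, `swapchain` and a filled-in
 * `presentInfo` already exist and the extension is enabled. */
#include <vulkan/vulkan.h>

static void schedule_with_display_timing(VkDevice device,
                                         VkSwapchainKHR swapchain,
                                         VkPresentInfoKHR *presentInfo,
                                         uint32_t presentID,
                                         uint64_t desiredPresentTimeNs)
{
    /* Extension entry points are fetched through vkGetDeviceProcAddr. */
    PFN_vkGetRefreshCycleDurationGOOGLE getRefresh =
        (PFN_vkGetRefreshCycleDurationGOOGLE)
        vkGetDeviceProcAddr(device, "vkGetRefreshCycleDurationGOOGLE");
    PFN_vkGetPastPresentationTimingGOOGLE getPastTimings =
        (PFN_vkGetPastPresentationTimingGOOGLE)
        vkGetDeviceProcAddr(device, "vkGetPastPresentationTimingGOOGLE");

    /* How long one refresh cycle of the display takes, in nanoseconds. */
    VkRefreshCycleDurationGOOGLE refresh;
    getRefresh(device, swapchain, &refresh);
    (void)refresh; /* refresh.refreshDuration would drive the pacing logic */

    /* Feedback: when recently queued images actually reached the screen. */
    uint32_t count = 0;
    getPastTimings(device, swapchain, &count, NULL);
    VkPastPresentationTimingGOOGLE timings[16];
    if (count > 16) count = 16;
    getPastTimings(device, swapchain, &count, timings);
    /* timings[i].actualPresentTime vs. timings[i].desiredPresentTime tells
       the game how far off its timing assumptions were. */

    /* Ask that the next image is shown no earlier than a given time, so
       frames land on the refresh cycle the game intended. */
    VkPresentTimeGOOGLE when = {
        .presentID = presentID,
        .desiredPresentTime = desiredPresentTimeNs,
    };
    VkPresentTimesInfoGOOGLE timesInfo = {
        .sType = VK_STRUCTURE_TYPE_PRESENT_TIMES_INFO_GOOGLE,
        .swapchainCount = 1,
        .pTimes = &when,
    };
    presentInfo->pNext = &timesInfo;
    /* ...followed by vkQueuePresentKHR(queue, presentInfo) as usual. */
}
```

The important part is the feedback loop: the actual presentation times reported for past frames tell the game how its pacing assumptions compare with what the display really did.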
A really interesting read, if still a bit over my head overall. You can read the full post on Medium.
I'm just waiting for Freesync to be working okay on the open source drivers, to pull the trigger on an LG Ultrawide monitor.
Shall we tell Feral to use VK_GOOGLE_display_timing to compute frame timing on all of their Vulkan ports, or are they already doing it?
I personally prefer enabling V-Sync since I'd rather have a little stuttering than a torn image (in other words, I prefer a solid 30 fps to an irregular and out-of-sync 45 fps).
And that's where Freesync (or G-Sync for NVIDIA customers) comes into play. The display shows the image the GPU sends with a much looser constraint on frame rate, since the display's refresh rate is now dynamic and the GPU can deliver its rendering anywhere within a frame rate range rather than at one fixed frame rate.
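To make the comment's numbers concrete, here is a toy illustration of my own (not from the article or any driver): with fixed V-Sync on a 60 Hz panel, a frame that needs ~22 ms has to wait for the next whole refresh and effectively drops to 30 fps, while an adaptive-sync display can simply present it after 22 ms, provided that interval falls inside the panel's supported range (the 48–144 Hz range below is an assumption for the example).

```c
/* Toy illustration of the comment above (not real driver code):
 * with fixed V-Sync the presentation interval snaps up to the next
 * multiple of the refresh period, while an adaptive-sync display can,
 * within its supported range, match the time the GPU actually needed. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double refresh_ms = 1000.0 / 60.0;   /* 60 Hz panel */
    const double vrr_min_ms = 1000.0 / 144.0;  /* assumed VRR range */
    const double vrr_max_ms = 1000.0 / 48.0;

    double render_ms = 22.0;  /* the GPU needed ~45 FPS worth of time */

    /* Fixed V-Sync: wait for the next whole refresh -> 33.3 ms (30 FPS). */
    double vsync_ms = ceil(render_ms / refresh_ms) * refresh_ms;

    /* VRR: show the frame as soon as it's ready, clamped to the range. */
    double vrr_ms = fmin(fmax(render_ms, vrr_min_ms), vrr_max_ms);

    printf("fixed V-Sync interval: %.1f ms (%.0f FPS)\n", vsync_ms, 1000.0 / vsync_ms);
    printf("VRR interval:          %.1f ms (%.0f FPS)\n", vrr_ms, 1000.0 / vrr_ms);
    return 0;
}
```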
Quote: "Video games have been running at 60 fps since the days of first arcade machines, back in the ‘70s. Normally, it was expected that the game runs at exactly the same frame rate that the display uses. It wasn’t until the popularization of 3D games that we first started accepting lower frame rates."

That's actually a bit misleading. 80's home computers (and consoles) were slow, and they were connected to standard TVs, which obviously used interlaced displays. Games did (usually) run at the full frame rate (i.e., one screen update every two interlaced “fields”), but that was 30 fps in NTSC country, and 25 in PAL-land. 60 (or 50) fps games existed, but they were rare enough for it to be a selling point sometimes. I grew up on 25 fps. Maybe that's why I'm less obsessed with absolute framerate than most people seem to be these days. :) Stuttering drives me nuts, though.
Quote: "Way back in the ‘90s, when “3D cards” (that was before we started calling them “GPUs”) started to replace software rendering, people used to play games in 20 fps and considered 35 fps speeds for serious competitive netplay. I’m not kidding."

This is usually the point where I mention that the original F1 Grand Prix ran at a fixed 8 fps on the Amiga. When people started upgrading their machines, they had to apply a third-party patch to unlock the framerate. I still never managed to get it much above 20, mind you. :S:
That said, in general the Amiga was as smooth as silk, at least in 2D. The PC architecture has always seemed to struggle with maintaining a consistent framerate without stuttering. And I'm sure Alen from Croteam is right about the reason. Those older machines' video hardware was much more tightly integrated with the display. They were built for NTSC or PAL. Programmers, the programs they wrote, and the hardware that code ran on always knew exactly when the frame was being drawn. They wrote to the vblank, going to crazy lengths sometimes to ensure that the entire game loop ran between frames. Even those 30/25 fps games ran at a consistent 30/25 fps, without tearing or dropped frames.
I hope the fix is as simple as he seems to think.
Quoting: Scoopta "I would have expected that, at least with vsync and OpenGL, when the buffer swap returns that's when the frame has been displayed."

You'd think so. But, as far as I can make out, what Alen's saying is that this isn't necessarily the case. Or at least, relying on this method for timing is misleading.
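For illustration only, this is roughly the kind of loop the quote is talking about: timing frames from the moment the buffer swap returns. The GLFW usage here is my own choice for brevity, not anything from the article; the point of the reply is that the swap call returning tells you little about when the frame actually appeared on screen, so these measured deltas can mislead.

```c
/* Sketch of the naive timing approach under discussion: measure frame
 * time from the return of the buffer swap. GLFW is used for brevity.
 * The swap may return before, at, or after the moment of actual display,
 * so these apparent frame times are not a reliable pacing signal. */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit()) return 1;
    GLFWwindow *win = glfwCreateWindow(640, 480, "timing", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);  /* V-Sync on */

    double prev = glfwGetTime();
    while (!glfwWindowShouldClose(win)) {
        /* ... render the frame here ... */
        glfwSwapBuffers(win);  /* returning != "the frame is now on screen" */
        double now = glfwGetTime();
        printf("apparent frame time: %.2f ms\n", (now - prev) * 1000.0);
        prev = now;
        glfwPollEvents();
    }
    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```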
Quoting: Ketil "Having watched both the good and bad video a lot of times at 1080p60 quality I must conclude that the playback of either video doesn't feel consistent. Sometimes the bad one feels perfect although most of the time it seems a little off. On the other hand, the good one sometimes feels bad, although never as bad as the bad one."

Something must be wrong with the way your system renders the videos then, because for me the difference is extremely obvious. One is smooth as silk, and the other is consistently choppy, like advertised.