
In an interesting write-up aimed at the more casual reader, the CTO of Croteam has published a post titled 'The Elusive Frame Timing'.

An issue that has been around for years is that while a game may report a steady 60FPS, you might still experience stuttering. It's not a problem with one particular developer either; many games suffer from the same issue. The post was written to go along with the talk they gave at GDC this year (slides here). What's interesting is how long they've been researching this problem and trying to solve it.

They made two videos to highlight the problem: the first is this one, which should run perfectly smoothly, and this one shows what they're calling "Heartbeat" stutter. Playing both at the same time on two monitors next to each other made it really easy to see. To make it even easier to spot, they did another video with the two spliced together, the smooth version on top, and you can easily notice the difference, especially as it looks like it's tearing in the middle.

It seems Vulkan is a bit ahead of the game here with the VK_GOOGLE_display_timing extension, which Croteam have been experimenting with, although support for it is currently quite limited.
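
For the technically curious, here's a rough sketch of how the extension's pieces fit together, going purely by its public API: you ask the display for its refresh cycle, read back when past frames actually reached the screen, and attach a desired presentation time to each present call. Everything around it (device, swapchain, semaphores) is assumed to already exist, and in real code the GOOGLE entry points would be loaded via vkGetDeviceProcAddr rather than called directly.

    // Sketch only, not Croteam's code: the VK_GOOGLE_display_timing API surface.
    #include <vulkan/vulkan.h>
    #include <vector>

    void present_with_timing(VkDevice device, VkQueue queue,
                             VkSwapchainKHR swapchain, uint32_t imageIndex,
                             VkSemaphore renderDone, uint32_t frameId)
    {
        // 1. How long is one refresh cycle? (refreshDuration is in nanoseconds,
        //    roughly 16.67 ms on a 60 Hz display.)
        VkRefreshCycleDurationGOOGLE refresh{};
        vkGetRefreshCycleDurationGOOGLE(device, swapchain, &refresh);

        // 2. When did past frames *actually* hit the screen? This feedback is
        //    what lets an engine animate against real display time.
        uint32_t count = 0;
        vkGetPastPresentationTimingGOOGLE(device, swapchain, &count, nullptr);
        std::vector<VkPastPresentationTimingGOOGLE> past(count);
        vkGetPastPresentationTimingGOOGLE(device, swapchain, &count, past.data());
        // Comparing past[i].actualPresentTime with past[i].desiredPresentTime
        // tells you whether a frame slipped.

        // 3. Request when this frame should be shown (0 = no preference).
        VkPresentTimeGOOGLE when{};
        when.presentID = frameId;
        when.desiredPresentTime = 0; // a real engine would schedule a target here

        VkPresentTimesInfoGOOGLE times{VK_STRUCTURE_TYPE_PRESENT_TIMES_INFO_GOOGLE};
        times.swapchainCount = 1;
        times.pTimes = &when;

        VkPresentInfoKHR present{VK_STRUCTURE_TYPE_PRESENT_INFO_KHR};
        present.pNext = &times;
        present.waitSemaphoreCount = 1;
        present.pWaitSemaphores = &renderDone;
        present.swapchainCount = 1;
        present.pSwapchains = &swapchain;
        present.pImageIndices = &imageIndex;
        vkQueuePresentKHR(queue, &present);
    }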

A really interesting read, if still a bit over my head overall. You can read the full post on Medium.

M@GOid Jul 26, 2018
Yep, some games have an acute case of this. For me, on Linux, it mostly happens in the SCS truck simulators.

I'm just waiting for Freesync to work properly on the open source drivers before I pull the trigger on an LG ultrawide monitor.
TheRiddick Jul 26, 2018
War Thunder and Tomb Raider have this issue at times, at least when using OpenGL mode.
MayeulC Jul 26, 2018
Now, that was an interesting read. I love Croteam!

Shall we tell Feral to use VK_GOOGLE_display_timing to compute frame timing on all of their Vulkan ports, or are they already doing it?

Creak Jul 26, 2018
@raneon It could, actually. That's probably not the only reason, but it's possible that the GPU drops a frame because the display isn't ready for the next one yet. The result is that if you disable V-Sync you get tearing, and if you enable V-Sync you get stuttering (there's no perfect solution).

I personally prefer enabling V-Sync, since I'd rather have a little stuttering than a sliced image (in other words, I prefer a solid 30 fps to an irregular, out-of-sync 45 fps).

And that's where Freesync (or G-Sync for NVIDIA customers) comes into play. The display shows each image as the GPU sends it, with much looser constraints on timing: the display's refresh rate is now dynamic, so the GPU can deliver its rendering anywhere within a frame rate range rather than at one fixed rate.
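
For what it's worth, that trade-off maps pretty directly onto Vulkan's present modes. Here's a minimal sketch of picking one; the mode names come from the spec, while the fallback policy is just an example:

    // The V-Sync trade-off as Vulkan present modes: FIFO never tears but can
    // stutter, IMMEDIATE never blocks but can tear, and MAILBOX (where
    // supported) replaces queued frames instead of blocking.
    #include <vulkan/vulkan.h>
    #include <vector>

    VkPresentModeKHR pick_present_mode(VkPhysicalDevice gpu,
                                       VkSurfaceKHR surface, bool vsync)
    {
        uint32_t n = 0;
        vkGetPhysicalDeviceSurfacePresentModesKHR(gpu, surface, &n, nullptr);
        std::vector<VkPresentModeKHR> modes(n);
        vkGetPhysicalDeviceSurfacePresentModesKHR(gpu, surface, &n, modes.data());

        for (VkPresentModeKHR m : modes) {
            if (!vsync && m == VK_PRESENT_MODE_IMMEDIATE_KHR)
                return m; // may tear, never blocks
            if (vsync && m == VK_PRESENT_MODE_MAILBOX_KHR)
                return m; // no tearing, lower latency than FIFO
        }
        return VK_PRESENT_MODE_FIFO_KHR; // classic V-Sync; always available
    }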

Dunc Jul 26, 2018
From the article:
Quote: Video games have been running at 60 fps since the days of first arcade machines, back in the ‘70s. Normally, it was expected that the game runs at exactly the same frame rate that the display uses. It wasn’t until the popularization of 3D games that we first started accepting lower frame rates.
That's actually a bit misleading. '80s home computers (and consoles) were slow, and they were connected to standard TVs, which obviously used interlaced displays. Games did (usually) run at the full frame rate (i.e., one screen update every two interlaced “fields”), but that was 30 fps in NTSC country, and 25 in PAL-land. 60 (or 50) fps games existed, but they were rare enough for it to be a selling point sometimes. I grew up on 25 fps. Maybe that's why I'm less obsessed with absolute framerate than most people seem to be these days. :) Stuttering drives me nuts, though.

Quote: Way back in the ‘90s, when “3D cards” (that was before we started calling them “GPUs”) started to replace software rendering, people used to play games in 20 fps and considered 35 fps speeds for serious competitive netplay. I’m not kidding.
This is usually the point where I mention that the original F1 Grand Prix ran at a fixed 8 fps on the Amiga. When people started upgrading their machines, they had to apply a third-party patch to unlock the framerate. I still never managed to get it much above 20, mind you. :S:

That said, in general the Amiga was as smooth as silk, at least in 2D. The PC architecture has always seemed to struggle with maintaining a consistent framerate without stuttering. And I'm sure Alen from Croteam is right about the reason. Those older machines' display hardware was much more closely integrated with the display. They were built for NTSC or PAL. Programmers, the programs they wrote, and the hardware that code ran on always knew exactly when the frame was being drawn. They wrote to the vblank, going to crazy lengths sometimes to ensure that the entire game loop ran between frames. Even those 30/25 fps games ran at a consistent 30/25 fps, without tearing or dropped frames.

I hope the fix is as simple as he seems to think.
Scoopta Jul 26, 2018
I would have expected that, at least with vsync and OpenGL, when the buffer swap returns that's when the frame has been displayed.

Dunc Jul 26, 2018
Quoting Scoopta: I would have expected that, at least with vsync and OpenGL, when the buffer swap returns that's when the frame has been displayed.
You'd think so. But, as far as I can make out, what Alen's saying is that this isn't necessarily the case. Or at least, relying on this method for timing is misleading.
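
As far as I can tell, the suspect pattern is the usual one below: the animation delta is derived from when the swap call returns, but that only tells you when the driver accepted the frame, not when the display showed it. A rough sketch, with SDL used purely for illustration (update_game and render_frame are hypothetical hooks):

    // The naive loop the article argues against: dt is measured on the CPU
    // around the swap, yet the driver may return early (frames still queued)
    // or late (pipeline full), so dt can disagree with actual on-screen time.
    #include <SDL.h>
    #include <chrono>

    void update_game(double dt); // hypothetical engine hook
    void render_frame();         // hypothetical engine hook

    void naive_loop(SDL_Window* window)
    {
        auto last = std::chrono::steady_clock::now();
        for (;;) {
            auto now = std::chrono::steady_clock::now();
            double dt = std::chrono::duration<double>(now - last).count();
            last = now;

            update_game(dt); // animates using the CPU-measured delta...
            render_frame();
            SDL_GL_SwapWindow(window); // ...but returning here != frame on screen
        }
    }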

Ketil Jul 26, 2018
Having watched both the good and bad videos a lot of times at 1080p60 quality, I must conclude that neither video's playback feels consistent. Sometimes the bad one feels perfect, although most of the time it seems a little off. On the other hand, the good one sometimes feels bad, although never as bad as the bad one.
kit89 Jul 26, 2018
Looks like they're using a variable delta time for their rendering subsystem. I had a similar problem with my own rendering system; I mitigated it by implementing a fixed timestep that doesn't accumulate elapsed time, so there's no need to 'catch up' on missed frames. I found it more important to have consistent frames.
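
Roughly the loop below, in case it helps anyone; the 60 Hz step and the hook names are just for illustration:

    // Fixed timestep without an accumulator: every rendered frame advances
    // the simulation by exactly one step, so motion stays even. A late frame
    // briefly slows the game instead of producing an uneven catch-up jump.
    #include <chrono>
    #include <thread>

    void step_game(double dt); // hypothetical simulation hook
    void render_frame();       // hypothetical renderer hook

    void fixed_step_loop()
    {
        using namespace std::chrono;
        constexpr double STEP = 1.0 / 60.0; // match a 60 Hz display
        auto next = steady_clock::now();

        for (;;) {
            step_game(STEP); // always exactly one step per frame
            render_frame();

            next += duration_cast<steady_clock::duration>(duration<double>(STEP));
            if (next < steady_clock::now())
                next = steady_clock::now(); // missed time is dropped, not replayed
            std::this_thread::sleep_until(next);
        }
    }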
tuubi Jul 26, 2018
Quoting Ketil: Having watched both the good and bad videos a lot of times at 1080p60 quality, I must conclude that neither video's playback feels consistent. Sometimes the bad one feels perfect, although most of the time it seems a little off. On the other hand, the good one sometimes feels bad, although never as bad as the bad one.
Something must be wrong with the way your system renders the videos then, because for me the difference is extremely obvious. One is smooth as silk, and the other is consistently choppy, as advertised.