
In an interesting article aimed at the more casual reader, the CTO of Croteam has written up a post titled 'The Elusive Frame Timing'.

An issue that has been around for years is that a game may run at 60FPS and yet still stutter, even though the frame rate looks good. It's not a problem with one particular developer either; many games suffer from the same issue. The post is written to go along with the talk they gave at GDC this year (slides here). What's interesting is how long they've been researching this and trying to solve it.

They showed two videos to highlight the problem: the first is this one, which should run perfectly smoothly, and this one, showing what they're calling "heartbeat" stutter. With two monitors next to each other, playing both at the same time made it really easy to see. To make it easier to spot, they did another video with the two sliced together, the smooth version on top, and you can easily notice the difference, especially as it looks like it's tearing in the middle.

It seems Vulkan is a bit ahead of the game here with the VK_GOOGLE_display_timing extension, which Croteam have been experimenting with, although support for it is currently quite limited.
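For the curious, here is a minimal sketch of what the query side of that extension looks like. This is my own illustration based on the extension's public API, not Croteam's code, and it assumes a VkDevice and VkSwapchainKHR already exist with the extension enabled:

    /* Sketch: ask VK_GOOGLE_display_timing when past frames actually
       reached the display. Error handling omitted; assumes the extension
       was enabled at device creation. */
    #include <vulkan/vulkan.h>
    #include <stdio.h>

    void report_past_frames(VkDevice device, VkSwapchainKHR swapchain)
    {
        /* Extension entry points are loaded dynamically. */
        PFN_vkGetPastPresentationTimingGOOGLE getPastTiming =
            (PFN_vkGetPastPresentationTimingGOOGLE)
            vkGetDeviceProcAddr(device, "vkGetPastPresentationTimingGOOGLE");

        uint32_t count = 0;
        getPastTiming(device, swapchain, &count, NULL);
        if (count > 64) count = 64;

        VkPastPresentationTimingGOOGLE timings[64];
        getPastTiming(device, swapchain, &count, timings);

        for (uint32_t i = 0; i < count; ++i) {
            /* actualPresentTime is when the frame really hit the
               display -- exactly the number games normally can't get. */
            printf("frame %u: desired %llu ns, actual %llu ns\n",
                   timings[i].presentID,
                   (unsigned long long)timings[i].desiredPresentTime,
                   (unsigned long long)timings[i].actualPresentTime);
        }
    }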

A really interesting read, if still a bit over my head overall. You can read the full post on Medium.


Scoopta Jul 26, 2018
Quoting: Dunc
Quoting: Scoopta
I would have expected that, at least with vsync and OpenGL, when the buffer swap returns that's when the frame has been displayed.
You'd think so. But, as far as I can make out, what Alen's saying is that this isn't necessarily the case. Or at least, relying on this method for timing is misleading.
Yeah, that's what he seems to be saying, which just seems weird to me. I could understand it with vsync off, but with vsync on, if the buffer swap doesn't return when the frame is displayed, then when exactly does it return? It's supposed to block until vblank, so the only alternative I can see is that it returns late, after the swap has already happened. Either way it's an interesting problem.
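One way to poke at this empirically (my own sketch, not anything from the article) is to log the interval between buffer-swap returns with vsync on; with SDL2 and OpenGL that could look like this:

    /* Sketch: log swap-to-swap intervals with vsync requested. If the
       swap reliably blocked until the frame was scanned out, these
       would be near-exact multiples of the refresh period; in practice
       drivers may return earlier or later than that. */
    #include <SDL.h>
    #include <stdio.h>

    void swap_and_log(SDL_Window *window)
    {
        static Uint64 last = 0;
        if (last == 0)
            SDL_GL_SetSwapInterval(1);  /* request vsync once, up front */

        SDL_GL_SwapWindow(window);      /* may or may not block to vblank */

        Uint64 now = SDL_GetPerformanceCounter();
        if (last != 0) {
            double ms = (double)(now - last) * 1000.0
                      / (double)SDL_GetPerformanceFrequency();
            printf("swap-to-swap: %.2f ms\n", ms);
        }
        last = now;
    }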
F.Ultra Jul 26, 2018
Quoting: Dunc
From the article:
Quote: Video games have been running at 60 fps since the days of first arcade machines, back in the '70s. Normally, it was expected that the game runs at exactly the same frame rate that the display uses. It wasn't until the popularization of 3D games that we first started accepting lower frame rates.
That's actually a bit misleading. '80s home computers (and consoles) were slow, and they were connected to standard TVs, which obviously used interlaced displays. Games did (usually) run at the full frame rate (i.e., one screen update every two interlaced "fields"), but that was 30 fps in NTSC country, and 25 in PAL-land. 60 (or 50) fps games existed, but they were rare enough for it to be a selling point sometimes. I grew up on 25 fps. Maybe that's why I'm less obsessed with absolute framerate than most people seem to be these days. :) Stuttering drives me nuts, though.

Quote: Way back in the '90s, when "3D cards" (that was before we started calling them "GPUs") started to replace software rendering, people used to play games in 20 fps and considered 35 fps speeds for serious competitive netplay. I'm not kidding.
This is usually the point where I mention that the original F1 Grand Prix ran at a fixed 8 fps on the Amiga. When people started upgrading their machines, they had to apply a third-party patch to unlock the framerate. I still never managed to get it much above 20, mind you. :S

That said, in general the Amiga was as smooth as silk, at least in 2D. The PC architecture has always seemed to struggle with maintaining a consistent framerate without stuttering. And I'm sure Alen from Croteam is right about the reason. Those older machines' display hardware was much more closely integrated with the display. They were built for NTSC or PAL. Programmers, the programs they wrote, and the hardware that code ran on always knew exactly when the frame was being drawn. They wrote to the vblank, going to crazy lengths sometimes to ensure that the entire game loop ran between frames. Even those 30/25 fps games ran at a consistent 30/25 fps, without tearing or dropped frames.

I hope the fix is as simple as he seems to think.

Exactly this. Back in the day we used to measure everything we coded for demos/games in the amount of raster time it took. On the Amiga I remember that we used to switch on a bitplane before a function and switch it off again afterwards (or it may have been a colour in the bitplane, my memory is a bit rusty) to see the actual amount of pixels as a visual benchmark when doing performance measurements.
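For readers who never saw this trick: in C on real Amiga hardware it might have looked something like the sketch below. The COLOR00 register address is the genuine one from the OCS chipset, but do_game_logic is a hypothetical stand-in and the whole thing is only an illustration of the idea:

    /* Sketch of the classic raster-time trick: change the background
       colour before the work and restore it after, so the height of the
       coloured band on screen shows how many raster lines the work took. */
    #define COLOR00 (*(volatile unsigned short *)0xDFF180)

    extern void do_game_logic(void);   /* hypothetical workload */

    void measured_update(void)
    {
        COLOR00 = 0x0F00;   /* background turns red: work starts   */
        do_game_logic();    /* band height = raster lines consumed */
        COLOR00 = 0x0000;   /* back to black: work done            */
    }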
MayeulC Jul 26, 2018
Quoting: Creak
@raneon It could actually. That's probably not the only reason, but it's possible that the GPU drops a frame because the display isn't ready yet for the next frame. And the result is: if you disabled V-Sync you'll get tearing, and if you enabled V-Sync you'll get stuttering (no perfect solution).

I personally prefer enabling V-Sync, since I'd rather have a little stuttering than a sliced image (in other words, I prefer a solid 30 fps to an irregular and out-of-sync 45 fps).

And that's where FreeSync (or G-Sync for NVIDIA customers) comes into play. The display will show the image the GPU sends with a much looser consideration for the frame rate, since the display's refresh rate is now dynamic and the GPU can deliver its rendering within a frame rate range rather than at one fixed rate.
Well, you can have triple buffering as well; that smooths things out a bit at the expense of latency, though it wouldn't work well at a steady 45 FPS, where you would still get quite a bit of judder. FPS limiting can help here too, but adaptive sync is indeed the correct answer: by dynamically adjusting the refresh rate [or rather, pushing images as they come], frames no longer have to stay on screen for 16.67 ms each. There is still a maximum time an image can be displayed on a given screen (a lower bound on frame rate), so you also need Low Framerate Compensation when the FPS dips too much.
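To make that 45 FPS judder concrete, here's a small worked example (mine, not from the comment): quantizing a steady 45 fps stream to 60 Hz vblanks makes on-screen frame durations repeat in a 33.3/16.7/16.7 ms pattern, which is exactly the uneven cadence the eye picks up:

    /* Worked example: a steady 45 fps render stream on a fixed 60 Hz
       display. Each frame becomes visible at the first vblank after it
       is ready, so on-screen durations repeat as 33.3, 16.7, 16.7 ms. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double frame  = 1000.0 / 45.0;  /* render interval, ms */
        const double vblank = 1000.0 / 60.0;  /* refresh period, ms  */
        double prev_shown = 0.0;

        for (int i = 1; i <= 9; ++i) {
            double ready = i * frame;                     /* frame finished   */
            double shown = ceil(ready / vblank) * vblank; /* next vblank slot */
            printf("frame %d shown at %6.1f ms (on screen for %4.1f ms)\n",
                   i, shown, shown - prev_shown);
            prev_shown = shown;
        }
        return 0;
    }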

Juddering is touched upon briefly in the Medium article, but that's not the main issue here. And indeed, nothing is more irritating than stuttering while you are at a rock solid 60 FPS. Those frametime graphs remind me of quantization noise, they might be due to batching GPU commands together before submitting and retrieving them. And I am afraid that if you don't compute frame time accurately, and use frametime to compute displacement, nothing can help you.

What I am not sure of is where this frametime is needed. The game engine internally keeps track of every moving object, and you have to sample this movement to display it to the user.
If my understanding is correct, the problem is the following: if you take a naive approach and sample the location of every object before starting to draw, a variable frame rate can introduce stutter. Indeed, if a car moves at a fixed speed while a couple of frames are drawn for 16, 8 and 20 ms respectively, the car would travel half as far in the second frame.
That said, for smooth movement, you can't really sample the current position and display that, as it would (ironically) stutter if you don't have a smooth framerate: that car would travel at, say, N px/s, then N px/s...
Nope, can't figure it out; the calculations I made give me the same speed. I don't quite see what the problem with naive sampling is (unless you must send multiple frames in advance, but I may just be tired). Feel free to enlighten me if you do.

Quoting: tuubi
Quoting: Ketil
Having watched both the good and bad video a lot of times at 1080p60 quality, I must conclude that the playback of either video doesn't feel consistent. Sometimes the bad one feels perfect, although most of the time it seems a little off. On the other hand, the good one sometimes feels bad, although never as bad as the bad one.
Something must be wrong with the way your system renders the videos then, because for me the difference is extremely obvious. One is smooth as silk, and the other is consistently choppy, as advertised.

Yup, it's stated multiple times in the article that video playback can suffer from the same issue. I had far less stuttering on my desktop (specs on my profile: R9 Fury) than on a laptop with an Intel iGPU. Strangely, I had more stuttering at 720p. Time to dig into Firefox's renderer?

Quote (from the Medium article): Now open the previous stuttering video (the "heartbeat") again, pause the video and use the . (dot) key in the YouTube player to move frame by frame.
What is this sorcery? It seems to only work with a US keyboard layout (reported).


Last edited by MayeulC on 26 July 2018 at 11:41 pm UTC
qptain Nemo Jul 27, 2018
Quoting: MayeulC
What I am not sure of is where this frametime is needed. The game engine internally keeps track of every moving object, and you have to sample this movement to display it to the user.
If my understanding is correct, the problem is the following: if you take a naive approach and sample the location of every object before starting to draw, a variable frame rate can introduce stutter. Indeed, if a car moves at a fixed speed while a couple of frames are drawn for 16, 8 and 20 ms respectively, the car would travel half as far in the second frame.
That said, for smooth movement, you can't really sample the current position and display that, as it would (ironically) stutter if you don't have a smooth framerate: that car would travel at, say, N px/s, then N px/s...
Nope, can't figure it out; the calculations I made give me the same speed. I don't quite see what the problem with naive sampling is (unless you must send multiple frames in advance, but I may just be tired). Feel free to enlighten me if you do.
It's not a matter of speed, but position. You don't change the speed of moving objects for external reasons, of course, but in trying to be precise you use the frametime as the time objects spent travelling. Now, if the frametime is reported or measured incorrectly or misleadingly, and doesn't represent the actual time passed since the last frame was displayed, your calculation of distance will be off as far as natural-looking motion is concerned: things will move too far or not far enough (not sure if the latter also normally happens, but whatever, same principle). To use your example: if you see the frametime was 8 ms you move a thing by 10 pixels, if it was 16 you move it by 20 pixels (twice the time, twice the distance travelled, right?), and if it was 32 you move it by 40 pixels. If it was actually 16 all along, the result will look incorrect. The motion should have been even, not compensating for "time skipping".
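A toy sketch of that point (my own numbers, matching the 8/16/32 ms example above), showing how moving by measured frametime lurches when every frame actually stayed on screen for 16 ms:

    /* Toy illustration: the game advances a car by the frame time it
       *measured*, but what the player perceives depends on how long each
       frame actually stayed on screen. With measurements of 8/16/32 ms
       against a real 16 ms per frame, the perceived speed lurches even
       though the car's speed never changed. */
    #include <stdio.h>

    int main(void)
    {
        const double speed      = 1.25;              /* px per ms, constant */
        const double measured[] = { 8.0, 16.0, 32.0 };  /* what the timer said */
        const double displayed  = 16.0;              /* what the screen did */

        for (int i = 0; i < 3; ++i) {
            double moved = speed * measured[i];      /* naive frametime step */
            printf("frame %d: moved %4.1f px, perceived %5.2f px/ms\n",
                   i, moved, moved / displayed);     /* should stay at 1.25  */
        }
        return 0;
    }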

Also I suspect this issue could have a severe impact on the nausea-inducing factor and general comfort in VR.


Last edited by qptain Nemo on 27 July 2018 at 6:19 am UTC
dmacofalltrades Jul 27, 2018
I think this is the problem I had with Borderlands 2: even with decent framerates, I kept seeing this stutter while playing. In the end, I had trouble enjoying the game because of it. It's been a while, though, so hopefully it's been fixed by now.
Dunc Jul 27, 2018
Quoting: F.Ultra
Exactly this. Back in the day we used to measure everything we coded for demos/games in the amount of raster time it took. On the Amiga I remember that we used to switch on a bitplane before a function and switch it off again afterwards (or it may have been a colour in the bitplane, my memory is a bit rusty) to see the actual amount of pixels as a visual benchmark when doing performance measurements.
Yes, I'd forgotten about that! (Not that I ever did any serious Amiga coding, especially not at the low level, but I dabbled.) The screen border on 8-bit machines was often used in much the same way. And still is; you can run a simple set-one-colour-set-another-wait-for-vblank loop, even in BASIC, to check if an emulator has correct timing.