Update: You can now do this more easily with the NVIDIA control panel. See this newer article for info.
Thanks to a few different people (xpander for the initial advice on a script, and HeavyHDx on Twitter), I have finally found a way to stop screen tearing with the Nvidia proprietary drivers.
I have been struggling with this issue for months across all the different desktop environments I tried (KDE, GNOME, Cinnamon, Unity), and it has caused me a fair amount of headaches and stress, so I am pleased to finally find a solution. It's not perfect and it's slightly annoying, but it's quite useful too.
You have probably heard of ForceFullCompositionPipeline before, and that is what I am using. I have two scripts set up on keyboard shortcuts, depending on the resolution I am using (4K or 1080p). Why both? I hear you ask. It's simple: performance in a lot of games at 4K is terrible, and some games have tiny, unreadable text, so I run certain games at 1080p.
Here's where the confusion came from...
The problem with ForceFullCompositionPipeline is that when you play a game whose fullscreen mode changes your desktop resolution, instead of stretching a fullscreen window, ForceFullCompositionPipeline gets reset back to disabled. If you have noticed screen tearing returning at times even while using ForceFullCompositionPipeline, that could well be your issue too. Like me, if you didn't know that, it was probably bugging you a lot. This is also why simply putting it in an xorg config file will not 100% solve it, whereas with this method you can just re-run the script any time you need to.
So, here are the two very simple scripts I run. They are both saved as plain text files and made executable (right click -> properties -> permissions -> tick "Allow executing file as program").
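If you prefer doing it from a terminal, marking the files executable with chmod does the same job (the file names below are just examples, name yours whatever you like):
# make both scripts executable
chmod +x ~/force-composition-4k.sh ~/force-composition-1080p.sh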
First up is for the 4K resolution (I have this set to run at start-up so I don't have to mess with xorg stuff directly):
nvidia-settings --assign CurrentMetaMode="DP-4:3840x2160_60 +0+0 { ForceFullCompositionPipeline = On }, DVI-I-1:1920x1080_60 +3840+0 { ForceFullCompositionPipeline = On }"
And for 1080p resolution:
nvidia-settings --assign CurrentMetaMode="DP-4:1920x1080_60 +0+0 { ForceFullCompositionPipeline = On }, DVI-I-1:1920x1080_60 +1920+0 { ForceFullCompositionPipeline = On }"
If you only have one monitor, you won't need the additional part after the comma.
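For example, a single-monitor version of the 4K command would simply be (swap in your own output name and mode, as described below):
nvidia-settings --assign CurrentMetaMode="DP-4:3840x2160_60 +0+0 { ForceFullCompositionPipeline = On }"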
You can run the script at any time. Your monitor(s) will blink, and then come back all sorted.
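If your desktop doesn't make binding a script to a key easy, xbindkeys is one alternative. A minimal sketch of ~/.xbindkeysrc, assuming the example file names above and an arbitrary choice of key combinations:
# run the 4K script with Super+F11
"~/force-composition-4k.sh"
  Mod4 + F11
# run the 1080p script with Super+F12
"~/force-composition-1080p.sh"
  Mod4 + F12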
You will of course need to change things like "DP-4" and "DVI-I-1" to the connections your monitor is using (or monitors, in my case, as I have two). You can find them by running the "xrandr" command in a terminal. It will give you a list of entries like this:
DP-4 connected primary 3840x2160+0+0 (normal left inverted right x axis y axis) 621mm x 341mm
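If the list is long, you can filter it down to just the connected outputs:
xrandr | grep " connected"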
I hope this helps someone else, as it has been driving me nutty. They are pretty safe scripts to use (I have been testing switching between them constantly), but don't blame me if you blow your computer up.
These two little scripts have literally changed my gaming life on Linux for the better.
Where it becomes even more useful
A nice side-effect of the script: games like RunningWithRifles have poor multi-monitor support, and that game actually turns off my main monitor. Hitting the desktop shortcut I set for the script brings that monitor back, while still allowing me to play the game. So not only do you get zero tearing, you get your normal multi-monitor experience back.
Feel free to share what methods you're using on your favourite desktops. Let's see if we can help each other in the comments.
With
Option "TearFree" "on"
in xorg.conf it works for me with vsync off :P
(btw yeah, I disabled vsync and stuff, but I have a 144Hz monitor)
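For anyone wanting to try the same thing: TearFree is an option provided by the open-source DDX drivers (xf86-video-intel, xf86-video-amdgpu and similar), not by the proprietary Nvidia driver the article is about, so a minimal Device section would look something like this (a sketch only, the Driver and Identifier lines depend on your hardware):
Section "Device"
    Identifier "GPU0"
    Driver "intel"
    Option "TearFree" "true"
EndSection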
Quote: The compositor of Xfwm 4.12 has a vsync option, but it doesn’t really work.
One of the developers said he had "tear-free compositing working here with GL support on proprietary NVidia driver with current code". So likely a misconfiguration or a new bug needs to be reported.
Last edited by N30N on 13 May 2016 at 4:43 pm UTC
Some people use triple buffering, but one thing that needs more caution is the VRAM usage.
Quote: I did make a bug report to the Cinnamon developers back in January with no reply: https://github.com/linuxmint/Cinnamon/issues/4990
Your bug report doesn't include enough information to be helpful, no mention of hardware, drivers, etc…
You could also try reporting it upstream, in your case that'd be: https://github.com/linuxmint/muffin
Quote: Oh, yes. Well at least you know a fix is coming (and/or where to get it). ;)
Quote: One of the developers said he had "tear-free compositing working here with GL support on proprietary NVidia driver with current code". So likely a misconfiguration or a new bug needs to be reported.
They are talking about unreleased code, post 4.12.
Last edited by N30N on 13 May 2016 at 6:11 pm UTC
Noob question. How could I run the script at boot in Ubuntu?
Thanks!
Quote: I did make a bug report to the Cinnamon developers back in January with no reply: https://github.com/linuxmint/Cinnamon/issues/4990
Quote: Your bug report doesn't include enough information to be helpful, no mention of hardware, drivers, etc…
While true, still no one replied to even ask for more info.
Quote: You could also try reporting it upstream, in your case that'd be: https://github.com/linuxmint/muffin
As for "upstream", well Cinnamon is the overall upstream for all of it, since it affects Cinnamon directly.
In my case, I set the nvidia driver to use "Performance Mode" by making a .desktop file to add to the autostart folder with this command [ /usr/bin/nvidia-settings -a '[gpu:0]/GPUPowerMizerMode=1' ].
I was using KDE Plasma 5 some time ago; to avoid screen tearing I set the KWIN_TRIPLE_BUFFER variable to true with [ export KWIN_TRIPLE_BUFFER=1 ], which I put in .bashrc in my home directory, as I only have one user and it doesn't need special permissions.
For GNOME I only had screen tearing in Firefox after configuring the nvidia driver for performance mode; to fix tearing in fullscreen Firefox I found this command [ gdbus call --session --dest org.gnome.Shell --object-path /org/gnome/Shell --method org.gnome.Shell.Eval 'Meta.disable_unredirect_for_screen(global.screen);' ], which I also made into a .desktop file to put in the autostart folder.
I hope this helps someone else who's having the same problem.
Last edited by TheRiddick on 13 May 2016 at 9:29 pm UTC
Quote: Unfortunately for owners of laptops with PRIME (like me), there is yet no vsync support at all: https://devtalk.nvidia.com/default/topic/775691/linux/vsync-issue-nvidia-prime-ux32vd-with-gt620-m-/7
Have you tried this script to see if it helps at all?
Yes, I have tried these settings, but there is no way it can work on a laptop with PRIME, because PRIME involves an integrated Intel card together with a dedicated Nvidia one, and the way they interact with each other is different from a single card. There is an Nvidia developer assigned to solve this, but no good news yet.
Last edited by gqmelo on 14 May 2016 at 1:19 am UTC
# /etc/X11/xorg.conf.d/20-nvidia.conf
Section "Device"
    Identifier "Device0"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    Option "TripleBuffer" "true"
    Option "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
EndSection
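As a quick sanity check after restarting X, you can confirm the metamode actually picked up the option with:
nvidia-settings -q CurrentMetaMode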
This forces triple buffering, so performance does take a hit, but the end result is worth it IMHO. I also disable vsync in all my games; I don't know if it affects performance, but I simply do it as it's redundant to have both the system and the application doing vsync. I also have the Rendering Backend set to OpenGL 3.1 under System Settings > Display and Monitor > Compositor, but I don't think it makes any difference.
This is hands down the smoothest experience I have ever had on linux. Simply amazing.
PS: Pro tip if you are using Plasma 5: under System Settings > Desktop Behavior > Desktop Effects > Blur, un-check "Save Intermediate Results" to fix flickering in popups when OpenGL applications are being overlaid by popups (like the volume control).
Hi there.
Quote: Noob question. How could I run the script at boot in Ubuntu? Thanks!
Make a script and add it to your startup programs.
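In case a concrete example helps: on Ubuntu you can either use the "Startup Applications" tool or drop a .desktop file into ~/.config/autostart yourself. A minimal sketch, assuming the script is saved at /home/youruser/force-composition-4k.sh (Exec needs the full path, ~ is not expanded there):
[Desktop Entry]
Type=Application
Name=Force Full Composition Pipeline
Exec=/home/youruser/force-composition-4k.sh
Save it as something like ~/.config/autostart/force-composition.desktop and it will run when you log in.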
I have had terrible tearing on scrolling for a very long time. I disabled fullscreen composition in the Cinnamon settings, and Sync to VBlank is on in the Nvidia settings, but no luck.
Thanks in advance.
Quote: Thanks for the great hint! No more tearing in fullscreen, but do you guys know any ways of turning vsync on when you're not in fullscreen? Like Firefox browsing and scrolling in Linux Mint Cinnamon. I have had terrible tearing on scrolling for a very long time. I disabled fullscreen composition in the Cinnamon settings, and Sync to VBlank is on in the Nvidia settings, but no luck. Thanks in advance.
This fixes desktop tearing too.
Quote: This forces Triple Buffering so the performance does take a hit
Triple buffering is supposed to improve performance compared to double buffering. Well, the framerate should improve, not the latency.
I think the issue where it takes a hit is when the game itself has triple buffering enabled as well.
Some games get a really jumpy framerate when it's in xorg.conf; that's my experience.
Just ForceCompositionPipeline fixes all tearing for me, with no need for other tweaks.
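For reference, the plain (non-"Full") variant is applied the same way as in the article, just with the other attribute name, e.g. (output name and mode taken from the article, adjust to yours):
nvidia-settings --assign CurrentMetaMode="DP-4:3840x2160_60 +0+0 { ForceCompositionPipeline = On }"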
That said, I'm not getting a whole lot of tearing right now using Debian Sid on my 55" screen. I haven't done much to the configuration either, beyond installing the nVidia drivers. I really need to fix my laptop though so I can actually USE the nVidia side of things. What would have been nice is if they could come up with a chipset that had a really nice powersaving mode so we wouldn't need this PRIME/Optimus nonsense in the first place! Maybe their 10x0 cards will?
Quote: This forces Triple Buffering so the performance does take a hit
Quote: Triple buffering is supposed to improve performance compared to double buffering. Well, the framerate should improve, not the latency. I think the issue where it takes a hit is when the game itself has triple buffering enabled as well. Some games get a really jumpy framerate when it's in xorg.conf; that's my experience. Just ForceCompositionPipeline fixes all tearing for me, with no need for other tweaks.
Did you try to disable your compositor while running games?
I know the auto-detection doesn't work well, since a lot of games now play in a borderless window. It's hard for them to detect fullscreen and then auto-disable the compositor, and that's what I experienced when I switched from XFCE (without a compositor) to KDE (with a compositor) two years ago. So I chose to do it myself when needed.
KDE provides a shortcut key to disable it manually; the default is Ctrl+Alt+F12 IIRC, and it can be changed to any key you want (I personally use the Pause key). While the compositor doesn't hit performance in "low demanding" games, it can cripple FPS in "high demanding" games like Feral or Aspyr ones. So I press my "magic key" to improve FPS. :P
To be honest, I find it hard to believe that triple buffering hurts performance, since its whole benefit is to improve performance and provide a steady FPS, at the cost of higher VRAM usage. Your problem probably comes from somewhere else.
See the Nvidia documentation or simply the Wikipedia article.
Enable:
sh -c "nvidia-settings --assign CurrentMetaMode=\"$(nvidia-settings -t -q CurrentMetaMode |tr -d "\n"|sed 's/ViewPortIn=/ForceFullCompositionPipeline=On, ViewPortIn=/g'|sed 's/.*:://'|sed 's/^ *//;s/ *$//')\""
Disable:
sh -c "nvidia-settings --assign CurrentMetaMode=\"$(nvidia-settings -t -q CurrentMetaMode |tr -d "\n"|sed 's/.*:://'|sed 's/^ *//;s/ *$//')\""
However, forcing ForceFullCompositionPipeline is just like running a compositor (it IS a simple compositor), except you lose shadows and other eye candy. See these benchmarks running the simple "teapot" mesa demo:
FPS | Window manager | Compositor
----|----------------|---------------------------------
978 | openbox        | none
976 | kwin           | none
907 | openbox        | Compton (OpenGL)
905 | kwin           | kwin (OpenGL 3.1)
905 | kwin           | ForceFullCompositionPipeline=on
902 | openbox        | ForceFullCompositionPipeline=on
...that is the 7-8% performance hit I'd expect when running a compositor on my GTX 470 (it is negligible when overall FPS are low, but more evident on simple demos like teapot, and on newer cards the hit should be even less evident).
So, if you managed to get vsync working with userspace compositors like kwin, compton, mutter and so on, I think it is better to use them instead of ForceFullCompositionPipeline.
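As one concrete example of that route, on a lightweight window manager like openbox a standalone compositor can be started with compton's GLX backend and one of its vsync methods (a sketch; pick whichever vsync method works best on your system):
compton --backend glx --vsync opengl-swc -b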
Last edited by kokoko3k on 17 May 2016 at 9:28 am UTC
Also, the nvidia driver refers to displays by other names (in my case it is DPY-2); that's why I suggested a general script (see my previous comment).
Anyway, you're already running a compositor; I don't think it is a good idea to stack one on top of another, as you will lose twice the performance in the best case.
BTW:
The fact that the performance drop is the same as running a compositor should REALLY be mentioned in the main article; there's no voodoo magic here.
Last edited by kokoko3k on 19 May 2016 at 2:44 pm UTC