Support us on Patreon to keep GamingOnLinux alive. This ensures all of our main content remains free for everyone. Just good, fresh content! Alternatively, you can donate through PayPal. You can also buy games using our partner links for GOG and Humble Store.
We do often include affiliate links to earn us some pennies. See more here.

Kubuntu Focus launch the 5th-gen M2 Laptop


Need a high-powered laptop and love KDE Plasma? Well, the team at Kubuntu Focus just announced the fifth-generation M2 Laptop. Buying from Kubuntu Focus also helps to support KDE development, as the company has been a KDE Patron since April 2023.

Starting at $1,895, it comes packed with an Intel i9-13900HX (24 cores), 16GB of 4800MHz RAM, an NVIDIA RTX 4060, a 500GB NVMe drive and a 15.6" 2560x1440 240Hz IPS display. Overall, that certainly sounds good, and it ships with KDE Plasma 5.24 LTS on Kubuntu 22.04 LTS.

You can also upgrade it to 64GB of RAM, an NVIDIA RTX 4070 and 2 x 2TB NVMe drives. Even at the entry level, you could pretty easily replace a desktop with this absolute unit.

Plenty of ports too including HDMI 2.0b, Mini DisplayPort, 2 x USB-C 3.2 (with Thunderbolt, DisplayPort, and Power-In), 2 x USB-A 3.2, Headset Audio Jack, Mic + S/PDIF Audio Jack and 2.5 Gigabit RJ-45 Ethernet.

They offer free shipping within the USA / Canada, but buyers outside those countries need to email them to arrange it.

Check out the M2 Laptop here.

Article taken from GamingOnLinux.com.
About the author -
I am the owner of GamingOnLinux. After discovering Linux back in the days of Mandrake in 2003, I constantly checked on the progress of Linux until Ubuntu appeared on the scene and it helped me to really love it. You can reach me easily by emailing GamingOnLinux directly. You can also follow my personal adventures on Bluesky.
The comments on this article are closed.
11 comments

pleasereadthemanual May 19, 2023
I do like laptops with a lot of ports, but I have to wonder why they chose to go with NVIDIA. As someone with an NVIDIA card, the only reason I keep it on Linux is the hope that DaVinci Resolve will one day support H.264/AAC decoding/encoding in their Studio version. It's a pretty awful experience otherwise compared to even my integrated Intel graphics. And NVIDIA Optimus laptops are the absolute worst. I owned one of those, and it takes a considerable amount of work for newbies to get graphics switching set up. The only reasonable way to do it, in my opinion, is with prime-run. I would hope Kubuntu does some of the work setting this up, as it's sold by a Linux company.
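For anyone curious, prime-run is essentially a wrapper around NVIDIA's PRIME render offload environment variables, so you can get the same effect by hand; a rough sketch, assuming the standard offload variables from the NVIDIA driver:

```shell
# Run a single application on the discrete NVIDIA GPU via PRIME render
# offload, while the desktop itself stays on the integrated GPU.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"

# Vulkan applications only need the offload variable, not the GLX one:
__NV_PRIME_RENDER_OFFLOAD=1 vkcube
```

This keeps the dGPU powered down for everything else, which is the whole point of the Optimus setup.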

NVIDIA works a little better on Wayland than X.org for me, in some cases, but worse in others. I'd really like to see NVIDIA properly support Wayland compositors sometime in the near future, but I'm not holding out hope. I think I'll get an AMD GPU in a few years, because NVENC and CUDA are not worth the trouble.

I would probably buy this laptop with 32GB of RAM if I were purchasing.

Interesting naming, by the way.

I took a look at the sales page:

Furthermore, it is even stronger in OptiX and ML performance.


So, yes, they chose NVIDIA for machine learning. It's possible on AMD, but not easy, I've heard. It also suggests users might use the laptop for Blender.

In my opinion, I would not purchase a laptop with a discrete GPU anymore. It kills battery life, especially on Linux:

Big battery: The 80 Wh battery provides up to 4 hours of on-the-go computing in power-save mode.
Ardje May 19, 2023
A laptop with NVidia is a waste of money for someone who works with Linux.
ahoneybun May 19, 2023
I would hope Kubuntu does some of the work setting this up, as it's sold from a Linux company.

Speaking for System76, we do a lot of work to get switching working correctly, so I know it can be done for this system as well.

A laptop with NVidia is a waste of money for someone who works with Linux.

Depending on the use case this is incorrect; it can be helpful for AI/ML work, video work and gaming.
pleasereadthemanual May 19, 2023
I would hope Kubuntu does some of the work setting this up, as it's sold from a Linux company.

Speaking for System76 we do a lot of work to get switching to work correctly so I know it can be done for this system as well.

I hope it's not a terrible experience for users. In my experience, it will try to run the compositor on NVIDIA by default, which is a terrible idea: on X.org it will stutter and kill your battery life, while on Wayland it probably won't stutter, but it'll break resume from suspend on GNOME, for example. So the first step is to switch to using Intel primarily, and then switch to NVIDIA for specific situations like hardware decoding.

This is my experience from 1-2 years ago, mind you—things may have improved. At that time, you had to restart your entire desktop session if you wanted to switch to your discrete GPU completely—from the sounds of it, some distributions now have a graphical utility allowing you to switch GPUs on the fly. But it still doesn't switch automatically, right?

A laptop with NVidia is a waste of money for someone who works with Linux.

Depending on the case this is incorrect, it can be helpful for AI/ML work, video work and gaming.

I don't know about video editing work particularly. A video editor's only real professional options on Linux are DaVinci Resolve Studio, which doesn't support AAC decoding/encoding (the audio codec used with the most popular video codec in the world), and Lightworks. Sure, with DaVinci Resolve Studio you get to decode H.264 if you have an NVIDIA card; that's true. That's still only half the way there, because you need to transcode the AAC audio in your .mp4 container to something else before you can ingest it. And then after you render, you need to transcode it again back to AAC. At that point, the time you're saving with NVENC on the render is probably negligible. If you're not working with H.264, I can understand this. However, even if you don't have codec problems, I've seen that DR seems to be quite... buggy on Linux. It also doesn't support display scaling.
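Concretely, the usual workaround is to rewrap the file before ingest, keeping the video stream untouched and transcoding only the audio; something like this with ffmpeg (filenames are placeholders):

```shell
# Rewrap an .mp4 for DaVinci Resolve on Linux: stream-copy the H.264 video
# as-is and transcode only the AAC audio to uncompressed PCM in a .mov.
ffmpeg -i input.mp4 -c:v copy -c:a pcm_s16le output.mov
```

Because the video is stream-copied, this step is fast and lossless for the picture, but you still have to do the reverse trip back to AAC after rendering.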

Lightworks does take advantage of NVENC, but the workflow is so alien to me, I never got far into it. Cinelerra-GG takes advantage of NVENC too, but there are other bottlenecks slowing it down with the CPU, from my understanding. And, well, Kdenlive still has that stability problem to work out.

Now, if you're talking about VFX, that might be true, as this is a field I know Linux is used widely in.
Purple Library Guy May 19, 2023
Depending on the case this is incorrect, it can be helpful for AI/ML work, video work and gaming.
Gaming, eh?
Well . . . this website is called GamingonLinux. The people here are all about gaming, on Linux. Every. Single. Time an article shows up about some Linux-oriented hardware company releasing a laptop, including System76 ones, it seems to use NVidia. And Every. Single. Time. the comments are mostly not about the specific virtues of the laptop, but about "Why won't they use AMD? Why isn't there at least an AMD option?" And the opinion seems to run 90% in the direction of "NVidia is bad for gaming on Linux" and 10% "Well, NVidia is actually a decent/workable option". Very often many of these comments run along the lines of "I'd buy one of these if it had AMD instead of NVidia, too bad."

So, the Linux gaming community seems to disagree with you. Maybe they're totally wrong and it's just that everyone here is a bunch of idiots; certainly the point I'm making is technically a logical fallacy along the "appeal to authority" lines. But, balance of probabilities, I think it would be worth your while at System76 to investigate the possibility that your customer base could have a point. Even if you are sure they're a batch of fools, it might be worth thinking about selling them what they want to buy.

One thing to keep in mind is that people's problems with NVidia seem to be less about benchmarks and more about quality of life issues that crop up over time during use, something it's a little harder for a manufacturer to test for; sometimes you just have to take your customers at their word.


Last edited by Purple Library Guy on 19 May 2023 at 4:22 pm UTC
ahoneybun May 19, 2023
Depending on the case this is incorrect, it can be helpful for AI/ML work, video work and gaming.
Gaming, eh?
Well . . . this website is called GamingonLinux. The people here are all about gaming, on Linux. Every. Single. Time an article shows up about some Linux-oriented hardware company releasing a laptop, including System76 ones, it seems to use NVidia. And Every. Single. Time. the comments are mostly not about the specific virtues of the laptop, but about "Why won't they use AMD? Why isn't there at least an AMD option?" And the opinion seems to run 90% in the direction of "NVidia is bad for gaming on Linux" and 10% "Well, NVidia is actually a decent/workable option". Very often many of these comments run along the lines of "I'd buy one of these if it had AMD instead of NVidia, too bad."

So, the Linux gaming community seems to disagree with you. Maybe they're totally wrong and it's just that everyone here is a bunch of idiots; certainly the point I'm making is technically a logical fallacy along the "appeal to authority" lines. But, balance of probabilities, I think it would be worth your while at System76 to investigate the possibility that your customer base could have a point. Even if you are sure they're a batch of fools, it might be worth thinking about selling them what they want to buy.

One thing to keep in mind is that people's problems with NVidia seem to be less about benchmarks and more about quality of life issues that crop up over time during use, something it's a little harder for a manufacturer to test for; sometimes you just have to take your customers at their word.

I think the bad experience that bad NVIDIA drivers gave people (and can still give from time to time) has left a mark on folks, so they won't try it again. It's easy to forget that AMD once had just as bad drivers (still true if you want/need the Pro driver, as it only supports certain kernels and distros). I'm of course going to be biased, but I think once Pop!_OS came with the drivers built in and Ubuntu followed, things have been much better for NVIDIA + Linux. I use an NVIDIA GPU with Pop!_OS every day at work with no issues.
ahoneybun May 19, 2023
I would hope Kubuntu does some of the work setting this up, as it's sold from a Linux company.

Speaking for System76 we do a lot of work to get switching to work correctly so I know it can be done for this system as well.

I hope it's not a terrible experience for users. In my experience, it will try to run the compositor with NVIDIA by default, which is a terrible idea on both X.org because it will stutter and kill your battery life, while on Wayland it probably won't stutter, but it'll break resume from suspend on GNOME for example. So the first step is to switch to using Intel primarily, and then switch to NVIDIA for specific situations like hardware decoding.

This is my experience from 1-2 years ago, mind you—things may have improved. At that time, you had to restart your entire desktop session if you wanted to switch to your discrete GPU completely—from the sounds of it, some distributions now have a graphical utility allowing you to switch GPUs on the fly. But it still doesn't switch automatically, right?

A reboot is still needed from what I have seen as some of the magic is kernel parameters but system76-power sure makes it a smooth process in my experience.
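For reference, that switching goes through the system76-power CLI; roughly, the usual commands look like this (mode names per the current tool):

```shell
# Query the current graphics mode
system76-power graphics

# Switch modes (integrated, nvidia, hybrid, or compute); a reboot is
# still required for the change to take effect.
sudo system76-power graphics hybrid
```

The reboot requirement is exactly the kernel-parameter "magic" mentioned above: the mode change rewrites boot configuration rather than hot-switching the GPU.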

Purple Library Guy May 19, 2023
I think the bad experience that bad NVIDIA drivers had (and can still have from time to time) has left a mark on folks so they won't try it again.
And the rest of what you say suggests the plan is to ignore those folks and hope they go away instead of buying your kit. That plan is likely to be successful, but I would have thought not really optimal from the point of view of a hardware sales company.


Last edited by Purple Library Guy on 19 May 2023 at 7:23 pm UTC
ahoneybun May 19, 2023
I think the bad experience that bad NVIDIA drivers had (and can still have from time to time) has left a mark on folks so they won't try it again.
And the rest of what you say suggests the plan is to ignore those folks and hope they go away instead of buying your kit. That plan is likely to be successful, but I would have thought not really optimal from the point of view of a hardware sales company.

That's not true at all; there has been movement to offer AMD options for laptops like the Pangolin, and the desktops (Thelio) have had AMD options from day 1.
Purple Library Guy May 19, 2023
I think the bad experience that bad NVIDIA drivers had (and can still have from time to time) has left a mark on folks so they won't try it again.
And the rest of what you say suggests the plan is to ignore those folks and hope they go away instead of buying your kit. That plan is likely to be successful, but I would have thought not really optimal from the point of view of a hardware sales company.

That's not true at all; there has been movement to offer AMD options for laptops like the Pangolin, and the desktops (Thelio) have had AMD options from day 1.
Good to hear.
Holzkohlen May 21, 2023
In my opinion, I would not purchase a laptop with a discrete GPU anymore. It kills battery life, especially on Linux

I wish Thunderbolt was fast enough to properly use eGPUs. Maybe Thunderbolt 5 will provide a decent little boost.