
A little break in the gaming news for a moment to mention that Zed, a "high-performance, multiplayer code editor from the creators of Atom and Tree-sitter", now has a Linux version available.

It's something many have been wanting, given how popular the editor is. Previously it was only officially available for macOS, and the developers are not yet providing official builds for Windows either (though it can be built from source).

It was announced on their blog in the "Linux when? Linux now." post, which includes a nice thank-you note about how vital the community has been in pushing along the rapid development of the Linux build.

It has some of the fanciest-sounding features I've ever read about for a code editor. It's built like a video game, as they put it: "Zed’s breakthrough performance starts with our GPUI framework, a new way to build 2D user interfaces. GPUI rasterizes the entire window on the GPU, just like a 3D video game. The result? Fast, reliable, and smooth delivery of pixels on every frame."

Not just that, it's built for collaboration too, with a virtual office system offering different channels, shared documents for notes, audio and text-based chat, and so on. In a way, it sounds like an open-source Discord for programmers.

See more on the official website.

29 comments

devland Jul 12
FYI it has opt-out telemetry.

https://zed.dev/docs/telemetry

That's a big NO for me.
Either Xed or Zed should probably rename
kokoko3k Jul 14
A perfect example of https://en.wikipedia.org/wiki/Overengineering and inefficient coding.

Need a gpu to accelerate the drawing of...text? Is it that slow?
TheSHEEEP Jul 14
Quoting: kokoko3k
A perfect example of https://en.wikipedia.org/wiki/Overengineering and inefficient coding.

Need a gpu to accelerate the drawing of...text? Is it that slow?
The opposite is the case.
Making use of the GPU for drawing is vastly more efficient than using the CPU.

Is it necessary? No, of course not.
But advantageous.


Last edited by TheSHEEEP on 14 July 2024 at 7:36 am UTC
kokoko3k Jul 14
Quoting: TheSHEEEP
Quoting: kokoko3k
A perfect example of https://en.wikipedia.org/wiki/Overengineering and inefficient coding.

Need a gpu to accelerate the drawing of...text? Is it that slow?
The opposite is the case.
Making use of the GPU for drawing is vastly more efficient than using the CPU.

Is it necessary? No, of course not.
But advantageous.

Where does your statement come from?
I would not be that sure about that.

Mine comes from the fact that if you feel the need to accelerate the drawing of text with a modern or even aging cpu, then you're doing something very wrong.
wvstolzing Jul 14
Quoting: kokoko3k
Where does your statement come from?
I would not be that sure about that.

With all due respect, that's because you seem to be pretty clueless as to how a gpu is better at calculating pixels on a 2d surface.

Quote:
Mine comes from the fact that if you feel the need to accelerate the drawing of text with a modern or even aging cpu, then you're doing something very wrong.

Take a look at the following articles for an overview:
https://sw.kovidgoyal.net/kitty/performance/
https://tomscii.sig7.se/2020/11/How-Zutty-works
TheSHEEEP Jul 14
Quoting: kokoko3k
Where does your statement come from?
I would not be that sure about that.
What wvstolzing said, basically.
GPU stands for graphics processing unit - it should be fairly self-explanatory what it is good at, drawing things and making calculations used in drawing things (and nowadays a bunch more like de- & encoding, raytracing, etc.).

CPUs are for general calculations. You can render with a CPU, but there's a reason you usually don't - it is pretty inefficient.
Another advantage is the rendering precision; it is very easy to spot a difference - normal CPU-rendered text can be a bit "rough", especially at really large resolutions.
Finally, rendering speed is also a factor. I'm sure we've all seen applications using normal CPU rendered UI that have been minimally sluggish when redrawing entire sections/entire screen - if this is done on the GPU, I would not expect any such small delay.
One can argue whether you really need the boost in performance from GPU text rendering, but one thing that cannot be denied is that it frees the CPU up for other things, as well as being straight up better at the job.

I wouldn't go out of my way to create a GPU pipeline for text rendering in my own projects, but if I was using a toolkit that offered either CPU or GPU rendering, I'd always go for the GPU variant - there's just no reason not to. And then have CPU rendering as a failsafe (not all machines have a GPU, etc.).

As I said, it is an advantage and just makes sense to do, but a strict necessity it is not.


Last edited by TheSHEEEP on 14 July 2024 at 1:01 pm UTC
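
As a rough sense of the numbers behind the redraw-speed point above (the window size and refresh rate here are example figures I'm assuming, not anything Zed documents), here is a quick back-of-envelope sketch in Rust:

// Back-of-envelope arithmetic: pixels touched per second if the whole
// window is redrawn every frame. The window size and refresh rate are
// illustrative assumptions only.
fn main() {
    let (width, height) = (3840u64, 2160u64); // e.g. a 4K window
    let fps = 120u64;                         // e.g. a high-refresh display
    let pixels_per_frame = width * height;
    let pixels_per_second = pixels_per_frame * fps;
    println!("{} pixels/frame, {} pixels/second", pixels_per_frame, pixels_per_second);
    // ~8.3 million pixels per frame, ~1 billion per second: trivial for a GPU's
    // parallel shader cores, but a lot of serial per-pixel work for a CPU.
}
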
wvstolzing Jul 14
Quoting: TheSHEEEP
Quoting: kokoko3k
Where does your statement come from?
I would not be that sure about that.
What wvstolzing said, basically.
GPU stands for graphics processing unit - it should be fairly self-explanatory what it is good at, drawing things and making calculations used in drawing things (and nowadays a bunch more like de- & encoding, raytracing, etc.).

CPUs are for general calculations. You can render with a CPU, but there's a reason you usually don't - it is pretty inefficient.
Another advantage is the rendering precision; it is very easy to spot a difference - normal CPU-rendered text can be a bit "rough", especially at really large resolutions.
Finally, rendering speed is also a factor. I'm sure we've all seen applications using normal CPU rendered UI that have been minimally sluggish when redrawing entire sections/entire screen - if this is done on the GPU, I would not expect any such small delay.
One can argue whether you really need the boost in performance from GPU text rendering, but one thing that cannot be denied is that it frees the CPU up for other things, as well as being straight up better at the job.

I wouldn't go out of my way to create a GPU pipeline for text rendering in my own projects, but if I was using a toolkit that offered either CPU or GPU rendering, I'd always go for the GPU variant - there's just no reason not to. And then have CPU rendering as a failsafe (not all machines have a GPU, etc.).

As I said, it is an advantage and just makes sense to do, but a strict necessity it is not.

Also, many different techniques exist for rendering text via the gpu; I've no idea what zed is using, but the techniques range from texture atlases for prerendered fonts (just copying, in an extremely fast manner -- bcs. parallelized -- between video memory locations; a 40+ year old technique), to really sophisticated methods for scaling fonts accurately on the fly *without* calculating all the geometry information anew: https://www.reddit.com/r/opengl/comments/bzh2b4/msdfgl_gpuaccelerated_multichannel_distancefield/ also: https://wdobbie.com/post/gpu-text-rendering-with-vector-textures/ ... with a very impressive demo: https://wdobbie.com/pdf/

GPUs are better at graphics rendering because they're designed for massively parallel vector multiplications -- to calculate, in parallel, *at the same time*, info about pixels that you see on the screen *at the same time*.

That's also why they're key to other applications that require parallel vector multiplications on a massive scale, which is what the current AI hype is built upon. Text & graphics created by people are 'vectorized' a certain way, and then a vast collection of vectors are compared, & classified, and relationships between them get established in accordance with sophisticated formulas that require parallel computations on those vector values at a massive scale.

(& it's clear why the AI hype is inseparable from data mining ... 'generative AI' is all about formulaically generating stuff from patterns found in data created by actual intelligences ... e.g., people, whose work gets hoarded by the mass-scale data miners)


Last edited by wvstolzing on 14 July 2024 at 1:31 pm UTC
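
To make the texture-atlas approach mentioned above a bit more concrete, here is a minimal sketch in Rust (Zed's own language, though this is not Zed's actual code; all names, sizes and the monospaced layout are illustrative assumptions): glyphs are rasterized once into a shared texture, and each frame the CPU only emits one textured quad per character for the GPU to fill in parallel.

use std::collections::HashMap;

// Minimal sketch of the texture-atlas approach: glyphs are rasterized once
// into a single texture, and each frame the CPU only emits one textured quad
// per glyph. The GPU then fills in the pixels in parallel.
// All names, sizes and the monospaced layout are illustrative assumptions.

#[derive(Clone, Copy, Debug)]
struct UvRect { u0: f32, v0: f32, u1: f32, v1: f32 } // glyph's cell in the atlas texture

#[derive(Debug)]
struct Quad { x: f32, y: f32, w: f32, h: f32, uv: UvRect } // one on-screen rectangle per glyph

// Turn a line of text into quads referencing the glyph atlas. A real renderer
// would upload these to a vertex buffer and draw them in a single call.
fn layout_line(text: &str, atlas: &HashMap<char, UvRect>,
               glyph_w: f32, glyph_h: f32, origin: (f32, f32)) -> Vec<Quad> {
    let (mut x, y) = origin;
    let mut quads = Vec::new();
    for ch in text.chars() {
        if let Some(&uv) = atlas.get(&ch) {
            quads.push(Quad { x, y, w: glyph_w, h: glyph_h, uv });
        }
        x += glyph_w; // monospaced advance, for simplicity
    }
    quads
}

fn main() {
    // Pretend atlas: each lowercase letter gets a cell in a 16x16 grid.
    let mut atlas = HashMap::new();
    for (i, ch) in ('a'..='z').enumerate() {
        let (col, row) = ((i % 16) as f32, (i / 16) as f32);
        atlas.insert(ch, UvRect {
            u0: col / 16.0, v0: row / 16.0,
            u1: (col + 1.0) / 16.0, v1: (row + 1.0) / 16.0,
        });
    }
    let quads = layout_line("zed", &atlas, 10.0, 18.0, (0.0, 0.0));
    println!("{} quads for the GPU to rasterize: {:?}", quads.len(), quads);
}

A real renderer would upload those quads to a vertex buffer and draw the whole line in a single call, which is where the GPU's parallel rasterization pays off.
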
kokoko3k Jul 14
Quoting: wvstolzing
Quoting: kokoko3k
Where does your statement come from?
I would not be that sure about that.

With all due respect, that's because you seem to be pretty clueless as to how a gpu is better at calculating pixels on a 2d surface.

Quote:
Mine comes from the fact that if you feel the need to accelerate the drawing of text with a modern or even aging cpu, then you're doing something very wrong.

Take a look at the following articles for an overview:
https://sw.kovidgoyal.net/kitty/performance/
https://tomscii.sig7.se/2020/11/How-Zutty-works

Thank you for the effort, but it's obvious that drawing using the gpu can be faster, if done right.
What is not obvious is that the implementation has to be right for it to actually be faster.

If you follow the discussion, you'll see that I'm talking about inefficient coding.

There should really be no need for a text editor to use gpu rendering to draw text; of course you can do it, but advertising it as a feature smells like the underlying code "needs" it to perform well.

Given that, they should also advertise what gpu this text editor needs to perform as they expect, if there is really something to expect :)


Last edited by kokoko3k on 14 July 2024 at 5:43 pm UTC