Support us on Patreon to keep GamingOnLinux alive. This ensures all of our main content remains free for everyone. Just good, fresh content! Alternatively, you can donate through PayPal. You can also buy games using our partner links for GOG and Humble Store.
We do often include affiliate links to earn us some pennies. See more here.

Good things come to those who wait, like a fine Wine. Today the Wine team has officially released the next stable version, Wine 3.0 [Official Site].

After around a year of development during the 2.x cycle, Wine 3.0 brings in some major changes towards better game and application support for those of you wanting to run Windows-only stuff on Linux. It's nowhere near perfect, but it's a massive advancement for the Wine project and provides a good base for them to continue from.

Here are a few highlights from the mailing list announcement sent today:

  • Direct3D 10 and 11 support which includes:
    • Compute shaders
    • Hull and domain (tessellation) shaders
    • A large number of shader model 4 and 5 shader instructions
    • Cube-map arrays
    • Mip-map generation
    • And lots more
  • The Direct3D command stream, which is disabled by default. 
  • Support for OpenGL core contexts in Direct3D is improved. If you're using Mesa, you shouldn't need to set the "MaxVersionGL" registry key to enable Direct3D 10 and 11 support.
  • The Android graphics driver.
  • Improved DirectWrite and Direct2D support.
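As the note above says, Mesa users shouldn't need "MaxVersionGL" any more, but on other drivers it can still be set by hand. A minimal sketch, assuming the default prefix; "MaxVersionGL" is the real Wine Direct3D registry value from the announcement, while the file path and the value 0x00040005 (which encodes OpenGL 4.5) are example choices:

```shell
# Write a registry file that raises Wine's OpenGL version cap so
# Direct3D 10/11 can be exposed. dword:00040005 encodes OpenGL 4.5.
cat > /tmp/maxgl.reg <<'EOF'
REGEDIT4

[HKEY_CURRENT_USER\Software\Wine\Direct3D]
"MaxVersionGL"=dword:00040005
EOF

# Import it into the default prefix (requires Wine to be installed):
# wine regedit /tmp/maxgl.reg
cat /tmp/maxgl.reg
```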

There's absolutely tons more; those are just a few bits I cherry-picked that I found quite interesting from this big release. For the next development cycle, we can look forward to things like Direct3D 12 and Vulkan support, OpenGL ES support to enable Direct3D on Android, and plenty more.

You can find the brief official announcement here.

Article taken from GamingOnLinux.com.
Tags: Wine
The comments on this article are closed.

Kimyrielle 18 Jan 2018
Great that it's out now. I am looking forward to seeing continued improvements to DX11. I really hope they will concentrate on getting DX11 right, and not open another front with DX12. Who needs DX12 at this point anyway? Barely any game is using it, and from what I have heard it may become a fairly big flop. Developers seem reluctant to even want to use low-level graphics APIs and would rather stick to DX11, which is deemed "good enough".
razing32 18 Jan 2018
I wonder when we're gonna see this included in Crossover..
Sometimes I wonder why people pay so much money for a computer game, but not for such a useful tool like Crossover. Crossover 17 is great, I can't wait for the next release.

Was curious about that.
Is the support good?
mrdeathjr 19 Jan 2018
Great that it's out now.

I am looking forward to seeing continued improvements to DX11.

I really hope they will concentrate on getting DX11 right, and not open another front with DX12.

Who needs DX12 at this point anyway? Barely any game is using it, and from what I have heard it may become a fairly big flop.

Developers seem reluctant to even want to use low-level graphics APIs and would rather stick to DX11, which is deemed "good enough".

Very good point. There are other things to polish too, for example the software vertex shaders needed for some old titles.

But now, thanks to projects like VKDX, Wine must begin the transition to Vulkan, because it runs on every OS with every driver type: closed and open source.

Vulkan will be more interesting.

^_^
Shmerl 19 Jan 2018
Yeah, for Wayland the biggest obstacle is the inertia of the sheer quantity of existing games, directly captured in this simple question:

does Wayland run all the games that run on X, and does it have the support of game developers?

As far as I know, only some older versions of Unity use X directly. Newer ones should be using SDL. The same goes for other engines and games on Linux, which use SDL or GLFW or the like to abstract the display server and input specifics.

http://www.glfw.org

So basically they don't need to worry about whether it's X or Wayland.
I wonder when we're gonna see this included in Crossover..
Sometimes I wonder why people pay so much money for a computer game, but not for such a useful tool like Crossover. Crossover 17 is great, I can't wait for the next release.

Was curious about that.
Is the support good?

When you ask the support team questions, they answer.
And most supported games have an install script (like POL).



Or you could get it for free, it's called Wine, and just DIY your games and applications.

The problem with Wine by itself is that it is difficult to configure...

I installed, with a couple of clicks, Perception
![](https://pbs.twimg.com/media/DQVFS6gX0AAI1NV.jpg)
![](https://pbs.twimg.com/media/DQVFUolXkAIpBsY.jpg)


and Shadow Warrior 2


![](https://pbs.twimg.com/media/DQaAv4YVwAADyfj.jpg)
![](https://pbs.twimg.com/media/DQaA5rSUIAAVFw-.jpg)
![](https://pbs.twimg.com/media/DQaAy-lUIAA42A7.jpg)
![](https://pbs.twimg.com/media/DQaA9OnVwAAWxLj.jpg)

without any problem on CodeWeavers' Crossover 17... (well, there is a little choppy audio in both games and Shadow Warrior 2 has some small graphical glitches, but the games are playable)
slaapliedje 19 Jan 2018
Yeah, for Wayland the biggest obstacle is the inertia of the sheer quantity of existing games, directly captured in this simple question:

does Wayland run all the games that run on X, and does it have the support of game developers?

As far as I know, only some older versions of Unity use X directly. Newer ones should be using SDL. The same goes for other engines and games on Linux, which use SDL or GLFW or the like to abstract the display server and input specifics.

http://www.glfw.org

So basically they don't need to worry about whether it's X or Wayland.

Am I missing something? Wayland under Nvidia has worked for quite some time. I don't use Wayland because they gimped the secondary clipboard, which I use all the time.
14 19 Jan 2018
KDE devs don't care about nvidia and nvidia doesn't care about xwayland

Another reason to ditch Nvidia. In a few years, Nvidia will be barely used on Linux. See the trend on the same page you linked to.
Is your Nvidia decline prediction based on any information outside of your own preference and this website? I'm wondering what your sphere of influence is.

Another thought is if GNU desktop usage gets more Windows conversions, the Nvidia stats won't go down. Not everyone is going to feel compelled to use pure OSS right off the bat. Some people just want to get off Windows.

Even if Nvidia does end up declining in the GNU desktop world, I don't predict that happening in the Linux server world, where GPUs are used for machine learning and data science tasks.
Shmerl 19 Jan 2018
Is your Nvidia decline prediction based on any information outside of your own preference and this website?

It's quite simple. Nvidia will never reach AMD's level of integration, because they have no interest in opening and upstreaming their driver, and AMD has already caught up to Nvidia in performance. So once they also catch up in hardware (Vega 2 and Navi), Nvidia will have only disadvantages on Linux, so there will be an accelerating switch away from it.

In machine learning and servers AMD has advantages over Nvidia as well. Their hardware supports asynchronous compute, while Nvidia's doesn't. Also, Khronos is pushing a new converged API for graphics and compute that will combine Vulkan and OpenCL. That would basically undermine CUDA and the grip Nvidia has over the compute market, because there will be zero benefit in using CUDA vs the new portable API. AMD is on the right track to unseat Nvidia from these markets.


Last edited by Shmerl on 19 Jan 2018 at 6:17 am UTC
cRaZy-bisCuiT 19 Jan 2018
@1050 Ti users: If you had claimed nVidia GPUs and drivers were more competitive two years ago, I'd totally agree.


Nowadays you have the Raven Ridge desktop APU right around the corner, which should match the TDP of a 1050 Ti and CPU combined while still having enough performance. Also, 7 nm will be a thing soon, and this time I expect AMD to be quicker than nVidia.


Combined with the fact that AMD's drivers are much better integrated into the Linux world, I'd definitely stick with AMD, and would go for AMD if I owned an nVidia GPU.

As you can see in Phoronix benchmarks, the AMD driver's performance is now competitive with nVidia's, and more and more devs actively support the AMD Mesa drivers. As does Valve.
beniwtv 19 Jan 2018
Guys, don't forget Nvidia is a corporation - and they make their own driver, on their own. There is no community to help out. Right now, Nvidia has no business case for supporting Wayland or XWayland in their driver.

Their corporate Linux customers are staying on X for a few more years, consumer Linux distros have only just begun shipping usable versions of Wayland and DEs supporting Wayland, and not all consumers have switched or want to switch yet.

(Even though I think Wayland is pretty great myself and I am using it ;)

Once Wayland becomes mainstream, and X gets less and less attention, things will only start to work on Wayland, and eventually, everyone will need to switch (or compile their own old versions of X). I am sure that at that point, Nvidia will support Wayland and XWayland just fine.
crt0mega 19 Jan 2018
*cough* nouveau *cough*
Pecisk 19 Jan 2018
My next GPU will be AMD because they have put their words into action: they have made AMDGPU the official driver, redid the binary one for legacy purposes, worked with the community, opened their Vulkan driver, and the community developers are kick-ass and keep pushing performance up.

Yes, Nvidia might have gigantic marketing and brand recognition, and there's no shame in people picking it. I'm not a fan of binary drivers though (although Nvidia hasn't caused a lot of pain for me with that) and I want that to change.

That said, the GTX 760 has served me very, very well over these years.

As for Wine 3.0: impressive work, everybody involved (clap). I have tried to debug and report multiple games and it looks really hopeful. There's a chance Wine 3 can handle even more challenging 3D games with some tweaks. There's still an incredible amount of work required for Wine, it feels like a never-ending story, which makes the amount of software running on it all the more amazing.


Last edited by Pecisk on 19 Jan 2018 at 12:39 pm UTC
mrdeathjr 19 Jan 2018
@1050 Ti users: If you had claimed nVidia GPUs and drivers were more competitive two years ago, I'd totally agree.

Nowadays you have the Raven Ridge desktop APU right around the corner, which should match the TDP of a 1050 Ti and CPU combined while still having enough performance.

Also, 7 nm will be a thing soon, and this time I expect AMD to be quicker than nVidia.

Combined with the fact that AMD's drivers are much better integrated into the Linux world, I'd definitely stick with AMD, and would go for AMD if I owned an nVidia GPU.

As you can see in Phoronix benchmarks, the AMD driver's performance is now competitive with nVidia's, and more and more devs actively support the AMD Mesa drivers. As does Valve.

Going by CU count, the Raven Ridge APU in the 2400G has 11 CUs (640 shaders*).

*That puts it more or less at RX 550 level, and that GPU sits at GT 1030 level; both of those GPUs are around 50% slower than a GTX 1050, with the GTX 1050 Ti higher still.

RTG has serious trouble in the power consumption** area compared with Nvidia, and sadly for an iGPU, consumption costs RTG a lot.

**The GT 1030, with 50% fewer shaders, a 50% narrower memory bus and 30 W of consumption, has more or less the same performance as the RX 550, which has almost double the shaders and around 55 W of consumption.

And don't forget that the more important APU, the Ryzen 2200G, only has 8 CUs (512 shaders); from current information it will be slower than the RX 550.

Given the above, it could be really difficult for RTG to get an iGPU to non-Ti GTX 1050 level; RX 550 level is very possible.

Another interesting comparison: the Vega 56 has 3584 shaders and is close to the GeForce GTX 1070 Ti with 2432 shaders.

Putting that together, it appears Vega needs a lot more shaders (around 33% more) compared with Pascal.

Sadly AMD doesn't offer a 1024-shader iGPU with the Ryzen 2200G; with 1024 shaders it could try to compete with the non-Ti GTX 1050.***

***However, competing with 128-bit GDDR5 at 7000 MHz (about 110 GB/s) will be difficult, especially in titles with intensive memory use.

Regarding drivers, they have many things to do: compatibility, freezing, complete OpenGL support including AZDO extensions with conformance tests, the lack of a GUI, more support to prevent AMD cards going unsupported, and errors like broken shaders that don't get fixed (Observer, Doors and others).

For now many Nvidia users are waiting for RTG to improve, but on the hardware side Volta is getting closer.

^_^


Last edited by mrdeathjr on 19 Jan 2018 at 1:06 pm UTC
Avehicle7887 19 Jan 2018
Just tested Race Driver: GRID; as I suspected, setting the graphics details to high is just as crash-happy as always. It seems the way Wine works causes higher virtual memory usage compared to Windows, and once 32-bit games reach 4 GB of virtual memory usage they crash.

If anyone has any idea what I'm doing wrong I would love to solve this mystery; here's a video showing the issue: https://drive.google.com/file/d/12ImU1RuMSGF9Jpen964cHntRVG9gJ-l5/view?usp=sharing

As you can see, the game had only been active for 13 minutes before it crashed due to running out of memory.
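The 4 GB ceiling described above is the 32-bit address-space limit; one way to confirm a game is hitting it is to watch its VmSize in /proc. A small sketch, using the shell's own PID as a stand-in for the game's:

```shell
# Print a process's current and peak virtual memory from /proc.
# Substitute the game's PID (e.g. from pidof) for $$ here.
pid=$$
grep -E 'VmPeak|VmSize' "/proc/$pid/status"

# For a 32-bit process, VmSize approaching 4 GiB (4194304 kB)
# means the address space is nearly exhausted.
```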


Last edited by Avehicle7887 on 19 Jan 2018 at 2:31 pm UTC
mrdeathjr 19 Jan 2018
Just tested Race Driver: GRID; as I suspected, setting the graphics details to high is just as crash-happy as always.

It seems the way Wine works causes higher virtual memory usage compared to Windows, and once 32-bit games reach 4 GB of virtual memory usage they crash.

If anyone has any idea what I'm doing wrong I would love to solve this mystery; here's a video showing the issue:

https://drive.google.com/file/d/12ImU1RuMSGF9Jpen964cHntRVG9gJ-l5/view?usp=sharing

As you can see, the game had only been active for 13 minutes before it crashed due to running out of memory.

Good data: Skyrim reportedly has the same issue with many mods (in my case I only tested a few and it works OK).

In my case it works, but I use the custom settings shown in the video and set Windows XP as the Windows version to imitate in winecfg.

However, I use Xfce; do you use GNOME?

Did you report this behaviour to the Wine devs as a bug?

And with other settings, does the game still crash?

Looking at the video, did the error cite OpenAL?

In my case I use 32-bit Wine, compiled* with all the dependencies OpenAL needs satisfied:

![](https://i.imgur.com/TLs377D.jpg)

*In your case it seems you use a complete WoW64 build; I use only i386.

I only manage two prefixes: one for older 32-bit apps (wine i386) and another for 64-bit apps (wine64).

Maybe you can try with i386-only Wine.
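The two-prefix setup described here can be reproduced with Wine's standard WINEARCH/WINEPREFIX environment variables. A dry-run sketch (the prefix paths are example choices; drop the echo to actually create them, which requires Wine installed):

```shell
# Print the commands that would create one 32-bit and one 64-bit prefix.
echo 'WINEARCH=win32 WINEPREFIX=$HOME/.wine32 wineboot -u'
echo 'WINEARCH=win64 WINEPREFIX=$HOME/.wine64 wineboot -u'
```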

^_^


Last edited by mrdeathjr on 19 Jan 2018 at 3:25 pm UTC
Avehicle7887 19 Jan 2018
Good data: Skyrim reportedly has the same issue with many mods (in my case I only tested a few and it works OK).

In my case it works, but I use the custom settings shown in the video and set Windows XP as the Windows version to imitate in winecfg.

However, I use Xfce; do you use GNOME?

Did you report this behaviour to the Wine devs as a bug?

And with other settings, does the game still crash?

Looking at the video, did the error cite OpenAL?

In my case I use 32-bit Wine, compiled* with all the dependencies OpenAL needs satisfied:

![](https://i.imgur.com/TLs377D.jpg)

*In your case it seems you use a complete WoW64 build; I use only i386.

Maybe you can try with i386-only Wine.

^_^

I use the MATE desktop; just tried 32-bit-only Wine and the game crashes at the first race. Could you please record a video with a window showing the virtual memory usage as you play?

EDIT:

I've tested the game on my laptop: Debian 9 running Xfce, Intel HD Graphics. The game behaves exactly as on my desktop and runs out of memory just as fast.


Last edited by Avehicle7887 on 19 Jan 2018 at 5:51 pm UTC
14 19 Jan 2018
Is your Nvidia decline prediction based on any information outside of your own preference and this website?

It's quite simple. Nvidia will never reach AMD's level of integration, because they have no interest in opening and upstreaming their driver, and AMD has already caught up to Nvidia in performance. So once they also catch up in hardware (Vega 2 and Navi), Nvidia will have only disadvantages on Linux, so there will be an accelerating switch away from it.

In machine learning and servers AMD has advantages over Nvidia as well. Their hardware supports asynchronous compute, while Nvidia's doesn't. Also, Khronos is pushing a new converged API for graphics and compute that will combine Vulkan and OpenCL. That would basically undermine CUDA and the grip Nvidia has over the compute market, because there will be zero benefit in using CUDA vs the new portable API. AMD is on the right track to unseat Nvidia from these markets.
You're right, so the only remaining piece for Nvidia is marketing and vendor ties, which counts for something.

Disclosure: I will very likely build my next machine as all or partly AMD. I just like to play devil's advocate so I can see a confident statement backed up. I've heard people say things like, "Amazon is ruined now." Big businesses can often take more than one hit.

Again, I'm not trying to be pro big business, just trying to be realistic.
Shmerl 19 Jan 2018
You're right, so the only remaining piece for Nvidia is marketing and vendor ties, which counts for something.

Sure, but the main point is that AMD is now mostly competitive (except for TDP and high-end card availability, which will have to wait until 2019), so competition will be on fair terms and technical merits. And with Wayland and Mesa, AMD is a clear winner for Linux gamers.


Last edited by Shmerl on 19 Jan 2018 at 5:56 pm UTC
mrdeathjr 19 Jan 2018
I use the MATE desktop; just tried 32-bit-only Wine and the game crashes at the first race. Could you please record a video with a window showing the virtual memory usage as you play?

EDIT:

I've tested the game on my laptop: Debian 9 running Xfce, Intel HD Graphics. The game behaves exactly as on my desktop and runs out of memory just as fast.

Hi, I finished the test; it didn't pass 3.8 to 3.9 GB of virtual memory and the race finished without issues.

Once the video finishes uploading I'll put the link here.

This link has my GRID configuration, maybe you can test it:

https://mega.nz/#!3MNzBYBb!CwyChpcbbwsU-pOE70cx0UrqoQyRaFV-CDNX3uq4VOA

I almost forgot: I disable the post-process effects (I don't like them) by renaming effects.xml to effects.disable and effects_override.xml to effects_override.disable in the postprocess folder in the game directory.
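The rename trick above is just two mv commands; a demo against a scratch directory standing in for the real GRID install path:

```shell
# Recreate the layout described above in a scratch directory, then
# rename the post-process config files so the game ignores them.
demo=$(mktemp -d)
mkdir -p "$demo/postprocess"
touch "$demo/postprocess/effects.xml" "$demo/postprocess/effects_override.xml"

mv "$demo/postprocess/effects.xml" "$demo/postprocess/effects.disable"
mv "$demo/postprocess/effects_override.xml" "$demo/postprocess/effects_override.disable"

ls "$demo/postprocess"
```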

^_^


Last edited by mrdeathjr on 19 Jan 2018 at 6:09 pm UTC
Audi 19 Jan 2018
I am curious whether this will improve running Killing Floor 2 through Wine. Right now, DX11 has to be disabled via a startup switch. Graphics then don't look as good and the Steam Overlay does not work. I plan to give the update a try this weekend.