Latest Comments by elmapul
Collabora partnered with Microsoft to get OpenGL and OpenCL on DirectX
25 March 2020 at 9:33 am UTC

Quoting: TheRiddick
AMD's OGL driver under Linux is pretty decent (not quite as good as NVIDIA's, but close), but their OGL driver under Windows is still using old trash code. If AMD ever decides to open-source their Windows driver, they can port over the Linux OGL code and get an immediate benefit.

I'm not sure why it's hard to open-source drivers under Windows; maybe Microsoft says no?

I don't know how to write a driver, but I think they must use some Windows APIs that are deeply integrated with the kernel and other proprietary stuff, so exposing the driver code might make it easier to reverse engineer Windows, and Microsoft won't allow that.

Even Valve has an NDA for its APIs; I'm not sure why.

Collabora partnered with Microsoft to get OpenGL and OpenCL on DirectX
25 March 2020 at 9:27 am UTC

Quoting: Guest
Quoting: Leopard
No, actually, Windows at this point has nothing to do with the OGL situation on Windows.

The reason OGL is still a relevant API these days (for professional use cases like CAD apps etc.) is Nvidia: Nvidia's OGL driver is pretty solid on both platforms.

OGL on D3D won't be as featureful as the NV driver, or even AMD's very slow OGL driver, and that's before counting the many, many OGL driver app profiles that exist for somewhat broken but important apps. So I think it is not as beneficial as it might seem. There are vendors who can already deliver solid OGL implementations. If they somehow join this trend and start abandoning their own OGL drivers, it might give D3D12 a critical edge over Vulkan, which is what I fear most.

https://twitter.com/_Humus_/status/1018846492273119233?s=19

Funny story about how AMD screwed up their OGL driver, btw.

AMD actually poured a lot of effort into ATi's drivers, and was probably the driving force behind trying to unify the Windows and Linux GL drivers. Those decisions are not taken lightly, or without understanding the risks.

And don't worry, Vulkan dominates (or is trending that way) in the mobile space. Microsoft can't touch that, and it's a big space. Then don't forget that perhaps some of this work might encourage OpenGL on Vulkan, simplifying the GNU/Linux driver space. If nothing else, it might get others thinking about it.

Not that I trust Microsoft with this one tiny bit, however. They've done their best to deserve such ire.
The issue is, mobile games suck, so it doesn't matter too much...
And Microsoft can still enter that market with xCloud, or Nvidia could enter with GeForce Now, which seems like a great deal...

Collabora partnered with Microsoft to get OpenGL and OpenCL on DirectX
25 March 2020 at 9:17 am UTC

Quoting: Alm888
Well, it might technically be beneficial for Microsoft®, considering the sore state of Windows' built-in OpenGL implementation (I think it is something like "OpenGL 1.1"), and could make some vendor-locked-in devices (like "Microsoft Surface") somewhat useful. Or help AMDGPU-powered devices s**k a little less in OpenGL applications (it would be nice to have a comparison with the official AMD OpenGL stack)…

But, honestly, using "DirectX 12" is like betting on a dead horse. I mean, who needs DX12 when we have Vulkan? On the other hand, DX12 is all Microsoft® has, so they are out of options… :S:

Anyone who develops for Xbox needs DirectX 12. And before someone says "PS4/Switch run Vulkan": do they really? And is it a first-class citizen, or do they treat it the way Microsoft treated OpenGL?
"Yes, you can use it to port your multiplatform game to our system, but if you want to use the full potential you should use our proprietary API."

Collabora partnered with Microsoft to get OpenGL and OpenCL on DirectX
25 March 2020 at 9:09 am UTC Likes: 1

Quoting: Guest
So it's either this support in order to have OpenGL and OpenCL on these machines, or nothing at all....

I think you didn't understand what's happening here.

If Microsoft launches hardware (e.g. Surface, Xbox, etc.) that doesn't support OpenGL, they won't be removing value from OpenGL; they'll be removing value from their own product!

They can't afford to launch a device that doesn't support applications such as World of Warcraft; they tried that in the past, and it was a disaster.

So, what are they doing?

Creating a runtime that runs OpenGL applications on top of DirectX 12 (see the sketch below). This all but ensures that the performance of OpenGL applications will be worse than that of DirectX 12 applications, which reduces the incentive to develop using OpenGL.
But since most applications that use it are old, they should run fine (and fast) on those devices: fast enough for an old game, but not fast enough for a modern game developed in OpenGL (a game with tons of shaders, polygons, 3D models, etc.).
This won't affect indies, but triple-A developers won't have an OpenGL backend.
(Not that they did anyway, but this kind of thing just ensures it.)
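
For the curious, here is a toy sketch of why a layered OpenGL is unlikely to match a native driver. Every name in it is made up for illustration (the real work is Mesa's OpenGL front-end on a new D3D12 backend, which is vastly more involved); the point is only that the app keeps calling GL entry points while the layer re-records each call onto a D3D12-style command list, and that extra translation hop is where the performance gap comes from:

```c
/* Toy sketch of a GL-on-D3D12 layer. All names are hypothetical;
 * this is not the actual Mesa/Collabora code. */
#include <stdio.h>

/* Stand-in for the backend command list the layer records into. */
typedef struct {
    const char *backend_name;
} cmdlist;

/* Stand-in for the backend draw; on real D3D12 the counterpart would be
 * ID3D12GraphicsCommandList::DrawInstanced(). */
static void backend_draw_instanced(cmdlist *cl, unsigned vertex_count,
                                   unsigned instance_count,
                                   unsigned first_vertex,
                                   unsigned first_instance)
{
    printf("[%s] DrawInstanced(%u, %u, %u, %u)\n", cl->backend_name,
           vertex_count, instance_count, first_vertex, first_instance);
}

/* The GL entry point the application keeps calling, unchanged.
 * A real layer would first translate GL state (shaders, blend modes,
 * vertex layout) into backend state; that work is the overhead. */
static void shim_glDrawArrays(cmdlist *cl, int first, int count)
{
    backend_draw_instanced(cl, (unsigned)count, 1u, (unsigned)first, 0u);
}

int main(void)
{
    cmdlist cl = { "D3D12" };
    shim_glDrawArrays(&cl, 0, 3); /* one triangle's worth of vertices */
    return 0;
}
```

The state translation hinted at in the comments (GL shaders and pipeline state turned into their D3D12 equivalents) is the expensive part, and it sits on every call path, which is why a translated GL can be good enough for old apps yet slower than a native driver.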

The next step could be to do the same thing with Vulkan...

So, no, they aren't adding value to OpenGL. Big companies (like triple-A game developers or game engine developers) won't use it anymore, since Vulkan and DX12 are better options, and small companies usually don't make their own engines, so this won't add value to OpenGL either.
It may even create some Frankenstein hybrid applications of OpenGL and DirectX 12...

Collabora partnered with Microsoft to get OpenGL and OpenCL on DirectX
25 March 2020 at 5:30 am UTC Likes: 3

"Not all Windows-powered devices have consistent support for hardware-accelerated OpenCL and OpenGL. "
That is actually a bad thing: now hardware vendors won't think twice before dumping OpenGL completely, and if they don't make a driver for Windows, I don't see them making one for Linux either.
The ideal solution would be to translate it to Vulkan.

The big Stadia round-up from the 'Google for Games Keynote' - Splash Damage exclusive, open source and more
25 March 2020 at 5:23 am UTC

Quoting: Guest
So in watching the "Bringing Destiny to Stadia" video, about 10 minutes in, I find it interesting how many people they put onto the Stadia port over 6 months. And 6 months, that's an insanely short amount of time. Bungie really had to invest in it, and honestly I don't think they would have ever remotely considered such a porting investment were it not for Stadia.

I also noted in the video (still watching) that there was mention of Unreal Engine having Vulkan support. Last I heard, that support wasn't the greatest on desktop. Maybe with a push on Stadia, then that might just give some incentive to change, which would definitely be beneficial to desktop GNU/Linux.

--extra: "Google has technical certification requirements", relating to frame rates. I suspected as much. You can't just slap any old game on Stadia; Google require the game to run well.

I didn't see the number of employees; can you say what it was?
They showed a picture, but I didn't count.

The big Stadia round-up from the 'Google for Games Keynote' - Splash Damage exclusive, open source and more
25 March 2020 at 5:22 am UTC

Quoting: Shmerl
Quoting: Guest
So in watching the "Bringing Destiny to Stadia" video, about 10 minutes in, I find it interesting how many people they put onto the Stadia port over 6 months.

The video claims that the development environment for Stadia uses Windows. That's already fishy. If it's just an option and they can use Linux to develop (it would be weird if they required Windows to develop for Linux), then it's not an issue.

I hope it's optional. It's very likely that their game engine, or other middleware they used, only had a Windows version but could export to Linux.

The big Stadia round-up from the 'Google for Games Keynote' - Splash Damage exclusive, open source and more
25 March 2020 at 5:18 am UTC

Quoting: Purple Library Guy
I have to say, so far things are working out roughly as I hoped. That is, it isn't dying, but it shows no signs of taking over. I've always said I want it to succeed enough to make the idea of developing games to include a Linux target more mainstream, give Vulkan a boost and so on--but I don't want it to succeed enough to make streaming the dominant form of gaming, because I really hate that idea.
So far, so good.


The issue is, we haven't yet seen what GeForce Now and xCloud will do...
Maybe developers will give up on Stadia and focus on those instead, since consumers aren't likely to buy their games twice and those services don't require a port; and if consumers get used to playing over streaming, we get the worst of both worlds...

Not to mention that Capcom has already removed its logo from the list of companies that support Stadia...

The big Stadia round-up from the 'Google for Games Keynote' - Splash Damage exclusive, open source and more
24 March 2020 at 11:29 am UTC

I'm very worried about the low viewership on Stadia's official channel, even for these keynotes; the most-viewed video was something like 66% for Android developers and just 33% for Stadia. That's reasonable, since it's aimed at developers; what worries me most is the channel for gamers...
And considering that a lot of YouTubers, and the media in general, are trash-talking Stadia, it will be a rough start.
The good news is: while everyone is trying to kill Stadia, Google is showing that it will not give up, contrary to what most people feared or expected. Even better, Google is doubling down on its promises while Stadia is struggling, to prove them wrong.
As they say:
"when you are at the bottom of the wheel, the only path is upward"
If Google can climb out of this hole, it will show consumers that they have no reason to fear: "since it didn't give up before, when Stadia was doing badly, now that it's doing fine there is nothing to fear."

The question is: will this be enough to make developers support Linux (I mean offline, on Steam etc.) and to raise our market share?
I'm afraid not. PC enthusiasts aren't likely to give up all their tools for modding, 3D modeling, etc. just so they can play on Linux; unless more tools get ported, the market shouldn't change much, and if console players didn't migrate before, I don't see why they would now.


Not to mention that Nvidia's GeForce Now just made a pact with the devil, and Microsoft's Play Anywhere (including xCloud) seems better to many people.

NVIDIA talk up bringing DirectX Ray Tracing to Vulkan
21 March 2020 at 3:36 pm UTC

Quoting: Guest
Quoting: elmapul
Quoting: Guest
Quoting: elmapul
Quoting: Shmerl
Quoting: Eike
It's just a matter of time.

That said, I avoided buying a GTX 2000, because at the moment, it feels more like an expensive gimmick.

It is a gimmick. More of a marketing tool than a really useful feature. To achieve good-quality real-time ray tracing, you need really powerful hardware, and what can fit in a single GPU gives at best some minor enhancement to the lighting; as I said above, it naturally comes at the cost of everything else.

Wtf?
Ray tracing is the holy grail of computer graphics.
Maybe RTX, their dedicated cores, is a gimmick, but ray tracing?
The lack of real-time ray tracing is precisely why the computer graphics industry had to use countless other gimmicks. What Nvidia did was a miracle, later followed by others. Sure, it's not as good as ray tracing the entire frame, the same way Eevee (in Blender) is not as good as Cycles, but it's close enough.

Rendering in 16 ms (one frame at 60 fps) what usually takes hours on a much better machine is no small deal. Sure, it's not as good, but it's impressive nonetheless.

One thing I hate about gamers in general is how clueless they are. I don't give a fuck about 4K; ray tracing is a serious technology, 4K is just a gimmick. But when they realized they would have to give up 4K to play with ray tracing, what did they do? They trash-talked the technology, and that is the reason it didn't sell as it should have.
Sure, there are other factors too, like games that aren't really optimized for it, but seeing the reception this technology got just disgusts me.



I always love it when clueless people call others clueless.... It is funny.

No, Nvidia performed no such "miracle". Nvidia just caught up with AMD's hardware architecture after many years. Nvidia's GPUs for the better part of this decade were lagging behind in technology. They lacked async compute (VERY important; if games actually utilized it we would be seeing superior games), they lacked shading power and relied on geometry and brute force, they over-tessellated everything just to win benchmarks, etc.

RTX is just some CUDA shaders that perform raytracing effects. That is why Nvidia after some months enabled the 1000 series to have RTX too.... It was just software. And guess what: architecturally, VEGA and NAVI from AMD could run RTX just as efficiently, if Nvidia allowed their shaders to be translated to AMD hardware legally.... Oops, I guess now they did.

4K is not a "gimmick". Alongside HDR, they can enhance graphical fidelity considerably. They do cost a lot of resources. But if i had the choice between 4K/HDR and RTX on 1080p, i would pick 4K, every single time. Why? Because most effects RTX performs can be done with traditional techniques and be quite good looking, while 4K and HDR color literally upgrade the detail level of the whole screen. so yeah, RTX is a gimmick.

First off, I don't know the hardware details; Nvidia may lag behind on that front, but from a software point of view, real-time ray tracing is impressive (even more so if they did it on hardware as bad as you describe).
It doesn't matter whether it was a hardware innovation or a software innovation; it's impressive in any case!

saying "It was just software" ,"Its just a shader", donot explain how they could make such a shader, if it was so easy, people would have figured it out before.

"if Nvidia allowed their shaders to be translated to AMD hardware legally.... Oops, i guess now they did."
the fact that nvidia hold the patent for that proves that they did it first.
why didnt amd did it before? or any game developer? because it isnt easy to do.


And second, 4K is no guarantee of quality.
If I want to make a 4K game, I can just make a game with Atari graphics and it will run on a PS3, a PS2, maybe even an N64 (if you ignore that those platforms don't have the proper output capabilities).

The issue with 4K is that you are giving up something else to increase the resolution.
Nowadays you can run old games at high resolution in some emulators: pick an N64 game in 4K vs. a movie in full HD; which has better graphics?
Until games look as good as movies (with fur and so on) I don't see the reason to increase the resolution. Full HD is good enough and 4K is bullshit; I would rather double the poly count than double the resolution (and doubling both the x and y resolution, 1920×1080 to 3840×2160, actually quadruples the pixel count).


"Because most effects RTX performs can be done with traditional techniques and be quite good looking,"
That is why ray tracing is such a revolution: now developers don't have to use those techniques (those gimmicks); they can just focus on making the game.
Instead of baking the lights and then making objects immovable so you can't break the illusion that the shadows are real, you just simulate light.
Baking shadows is a gimmick.

In current-gen games it's hard to show the true advantage, because games were designed with those limitations in mind, but now developers can get rid of them.

This video explains the issue better than I can:
https://www.youtube.com/watch?v=lyfTPG-dwOE

First of all, it would be nice if you didn't comment on things you have no knowledge about. Sorry to be harsh, but I am tired of wasting time responding to people who clearly haven't done their homework.... Mostly kids and teenage gamers...

"saying "It was just software" ,"Its just a shader", donot explain how they could make such a shader, if it was so easy, people would have figured it out before."

You realize raytracing is a very old technique, right? Raytracing is not new, it is decades old, and you can google it and educate yourself. People had already figured it out; the issue has always been that there were not enough transistors on personal hardware to make it feasible in real-time video games.... The reason modern GPUs began implementing it is that we are at 7nm these days, there is no rush to reach higher resolutions (4K is more than enough for the next decade), and there are no new raster techniques that need extra juice. So they can use any extra transistors to begin transitioning to raytracing this decade.

Nvidia didn't invent anything, really. They just use GPU shaders to perform some effects. Even the dedicated hardware they use on some models is really just specialized shaders. It is not some newfound "magic" Nvidia invented, despite their marketing claiming it is. That is what I am combating, this ridiculous Nvidia propaganda. I hate Nvidia with a passion because they are a detriment to the hardware industry; they enjoy a dominant position in the GPU market while selling pure air and marketing hype.

As for 4K, sorry to burst your bubble, but any 1080p game will have significantly worse visuals than 4K on a proper screen. You clearly haven't gamed at 4K.... It is a great improvement all around. On the other hand, right now video games are not going to be fully ray-traced; raytracing is mostly used for some effects, it REPLACES some effects. The difference is there, raytracing is definitely more realistic, but for the most part the difference is barely noticeable during gameplay. So if someone has to choose between slightly more realism at 1080p that he won't really notice while gaming, and a little less realism at 4K, I think the vast majority of people are going to pick 4K. It just adds more visual quality.

Obviously, N64 games at 4K won't look significantly better than at 1080p, because their textures are tiny; even by 1995 standards N64 games had crappy textures due to RAM restrictions. You can't notice the difference in detail on flat 1995 textures.... But we are going to play games like Cyberpunk 2077 at 4K, with 8 GB+ of VRAM.... If you think you can display all the detail of games made in 2020 and beyond at 1080p, you are delusional. We are not talking Super Mario 64 here....

I am sorry, but you are clearly an Nvidia fanboy here. I can smell it from a mile away; you just want to support Nvidia propaganda. When next-gen AAA games arrive, people will be disabling raytracing in droves to gain FPS and the ability to play at higher resolutions, unless developers deliberately stop implementing the same effects with traditional techniques in order to push raytracing adoption....


You realize raytracing is a very old technique, right? Raytracing is not new, it is decades old, and you can google it and educate yourself. People had already figured it out; the issue has always been that there were not enough transistors on personal hardware to make it feasible in real-time video games...

W.R.O.N.G.
They didn't brute-force it: they take a few samples and extrapolate from them (see the toy sketch below).
We still don't have enough power to do it in real time without some tricks.
And the fact that it can work on older hardware proves it's not brute force in hardware, but code, a clever algorithm.
Even if it were purely hardware-based, they would deserve some credit for making 7nm hardware; that isn't easy either.
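
To make the "few samples, then extrapolate" point concrete, here is a toy sketch (entirely my own illustration, not NVIDIA's actual pipeline): a path tracer estimates each pixel by averaging random ray samples, and Monte Carlo error only shrinks with the square root of the sample count, so real-time renderers stop at a handful of samples per pixel and hand the noisy result to a denoiser instead of paying for thousands of samples.

```c
/* Toy Monte Carlo pixel estimator. trace_one_sample() stands in for
 * "trace one random ray and return its contribution"; here the true
 * pixel value is 0.5 and each sample is randomly 0 or 1. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

static double trace_one_sample(void)
{
    return (rand() / (double)RAND_MAX) < 0.5 ? 1.0 : 0.0;
}

/* Average `spp` samples, as a path tracer does per pixel. */
static double estimate_pixel(int spp)
{
    double sum = 0.0;
    for (int i = 0; i < spp; i++)
        sum += trace_one_sample();
    return sum / spp;
}

int main(void)
{
    const int budgets[] = { 1, 4, 64, 4096 }; /* samples per pixel */
    srand(42);
    for (int i = 0; i < 4; i++) {
        double est = estimate_pixel(budgets[i]);
        printf("%5d spp -> estimate %.3f (true 0.500, error %.3f)\n",
               budgets[i], est, fabs(est - 0.5));
    }
    return 0;
}
```

At one sample per pixel the estimate is basically a coin flip, while thousands of samples converge on the true value; the gap between those two budgets is exactly what real-time denoisers paper over.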


" N64 games had crappy textures due to ram restrictions. "
wrong again, the issue was to make the textures fit into the catridge.

"you are delusional. We are not talking Super Mario 64 here...."
That was just an example of how resolution means nothing in practice.
People are criticizing Doom Eternal for not being 4K (on Xbox or Stadia), but they would have to sacrifice something more important to render it in 4K, like disabling some shaders. They made the right choice; the 1800p resolution won't kill anyone, but consumers are stupid. (See the arithmetic below.)
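
For scale, assuming "1800" means a 3200×1800 render target (my reading of the comment), the pixel-count math works out as:

\[
\frac{3840 \times 2160}{3200 \times 1800} = \frac{8\,294\,400}{5\,760\,000} = 1.44
\]

so native 4K would mean shading 44% more pixels every frame, and that budget has to come from somewhere.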

"I am sorry, but you are clearly an Nvidia fanboy here."
I'm not an Nvidia fanboy; they did a lot of shit in the past:
https://www.youtube.com/watch?v=ZcF36_qMd8M
But that doesn't mean everything they do is shit. I recognize when they do something impressive, and to say it's just brute force is quite dumb.
Real-time ray tracing is impressive; if it were not a big deal, CryEngine, Microsoft and AMD would not be rushing to implement it in their own technology (especially after Nvidia proved it was possible).

RTX may be bullshit, since it was proven that the specific dedicated hardware was not necessary, but ray tracing is not.