Valve have officially put SteamVR for Linux up in Beta form, and they are keen to stress that this is a development release.
You will need to run the latest Steam Beta Client for it to work at all, so be sure to opt in if you want to play around with it.
VR on Linux will exclusively use Vulkan, so it should give Vulkan a pretty good push if VR becomes more popular. Vulkan is likely one of the pieces of the puzzle that held VR on Linux back, since Vulkan itself and its drivers are still so new.
On NVIDIA, you need the 375.27.10 "Developer Beta Driver", which can be found here. There's also this PPA for Ubuntu users. It likely needs some newer Vulkan extensions not found in the current stable drivers.
For AMD GPU owners, you need a very recent build of the open source radv driver (Mesa); Valve provide a pre-release build on their GitHub page.
Intel GPUs are not supported, and it will probably be a long time until they are, since VR generally requires some beefy hardware to run smoothly. They may work in future, but I imagine the Intel 'anv' Vulkan driver needs more work first.
Also, you will likely need some updated udev rules, but all of that and more can be read about on the project's GitHub page. A rough idea of what such a rule looks like is sketched below.
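To give you a rough idea only (this is an illustration, not Valve's actual file; the vendor IDs are the usual HTC and Valve ones, and the file name is made up, so grab the real rules from their GitHub page):

```
# /etc/udev/rules.d/60-steamvr-example.rules  (hypothetical file name)
# Grant the logged-in user access to the headset's HID devices instead of requiring root.
KERNEL=="hidraw*", SUBSYSTEM=="hidraw", ATTRS{idVendor}=="0bb4", TAG+="uaccess"
KERNEL=="hidraw*", SUBSYSTEM=="hidraw", ATTRS{idVendor}=="28de", TAG+="uaccess"
```

After dropping a rules file in place, running 'sudo udevadm control --reload-rules' and replugging the headset should apply it.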
It's exciting to finally see VR on Linux becoming a reality. I just wish the hardware wasn't so damn expensive. It will likely be a long time before I can afford a headset myself to review, but hopefully someone can send us a review unit to hold onto.
Thanks for tweeting it to me, Dennis.
Quoting: Shmerl
Can it work as a drop-in replacement for OpenVR (the API that SteamVR uses)? Also, strangely, it seems to be using OpenGL, not Vulkan.
I don't believe it's meant as a drop-in replacement yet. Yes, it's using OpenGL. I unfortunately know little about it, other than knowing that it will be a way to use VR devices without being dependent upon Valve.
Traditionally, to achieve stereoscopic rendering you needed the video card to process 2 renders per frame, one for each eye. That is no longer required, as proven by NVIDIA and some VR helmet makers. (Yes, I don't think it's NVIDIA exclusive anymore, and I heard of Pimax doing some sort of frame-buffer trick, but I don't know much more about their helmet than what I have read.)
That makes 4K rather easier to achieve. Plus, remember that a lot of VR games are optimized towards framerate much more strictly than traditional games, so a traditional game like Deus Ex might only give you 40fps; if it was made for VR they could easily double that with optimizations and reductions where possible.
If your VR game runs like crap and you have decent hardware, then chances are it's the game that's to blame.
Last edited by TheRiddick on 22 February 2017 at 12:12 pm UTC
Quoting: TheRiddick
Traditionally, to achieve stereoscopic rendering you needed the video card to process 2 renders per frame, one for each eye. That is no longer required, as proven by NVIDIA and some VR helmet makers.
Do you have a link for that? No matter what, you need 2 pictures/views/renders for stereoscopic vision, and of course they have to be different from each other. It doesn't work any other way. Maybe you mean that the engines have made improvements in sharing as much processing and resources as possible for those 2 pictures/views/renders?
Here is a link to the headset; you can investigate it yourself if interested.
http://www.gearbest.com/pc-headset/pp_436489.html
I probably would have bought one of those if it had positional tracking...
Last edited by TheRiddick on 22 February 2017 at 12:56 pm UTC
Quoting: TheRiddick
Here is a link to the headset; you can investigate it yourself if interested.
The headset doesn't have much to do with how the 2 pictures are rendered. Every headset is "just" a display (or two). The computer renders the images, and to have stereoscopic vision you need 2 images. If you render only one view, you don't have any 3D effect.
Last edited by Doc Angelo on 22 February 2017 at 1:28 pm UTC
*faints*
... Open eyes, sees headline and
*faints again*
Dare we hope for a full-fledged VR experience before summer? I think we do! Time to start saving some SERIOUS cash!
Quoting: TheRiddick
You can get a 4K VR headset from Pimax, I think it's called; they also intend to release a bigger product in the next few months that has two 4K screens in it and a FULL tracking setup. It will take time before their drivers get good, but 4K with full tracking might be enough to convince me to get one.
BTW the 4K Pimax one is like $400 USD, so not bad. However, it only has directional tracking :(
I still laugh at people with this concept that computers can't do 4K VR; it's like listening to OLD people comment about tech...
From what I hear, Pimax only does 30 FPS at the full 4K resolution due to limitations of HDMI 1.4, making it pretty much unusable at that resolution, so I don't think it's a great example to cite as proof of 4K VR being possible...
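As a rough sanity check on the HDMI 1.4 point (back-of-the-envelope only; the ~340 MHz figure is the commonly quoted HDMI 1.4 TMDS limit, and the 10% blanking overhead is an assumption rather than the panel's real timings):

```python
# Rough check of why 4K tops out around 30 Hz over HDMI 1.4.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.10):
    """Approximate pixel clock, padding the visible resolution for blanking."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

HDMI_1_4_LIMIT_MHZ = 340  # approximate maximum TMDS character rate for HDMI 1.4

for hz in (30, 60, 90):
    clock = pixel_clock_mhz(3840, 2160, hz)
    verdict = "fits" if clock <= HDMI_1_4_LIMIT_MHZ else "exceeds"
    print(f"4K @ {hz} Hz needs roughly {clock:.0f} MHz, which {verdict} the ~{HDMI_1_4_LIMIT_MHZ} MHz limit")
```

Only the 30 Hz case fits within the limit, which lines up with the 30 FPS behaviour described above.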
Quoting: Doc Angelo
Quoting: TheRiddick
Here is a link to the headset; you can investigate it yourself if interested.
The headset doesn't have much to do with how the 2 pictures are rendered. Every headset is "just" a display (or two). The computer renders the images, and to have stereoscopic vision you need 2 images. If you render only one view, you don't have any 3D effect.
I see why you put "just" in quotes, but to be even more fair, a VR HMD has a few other boxes it needs to tick besides being "just" a display.
They have a driver and software for their headset that is doing the work for them. You actually need to look deeper, and I won't be holding your hand; if you don't believe me then fine, whatever, I don't care.
Quoting: TheRiddick
They have a driver and software for their headset that is doing the work for them. You actually need to look deeper, and I won't be holding your hand; if you don't believe me then fine, whatever, I don't care.
I don't need you to hold my hand. But if you want others to take your science-defying statement more seriously, you should have a better source than some random electronics company selling their products with bullshit bingo marketing.
"Blue Laser Harm Protection" ...?
Quoting: TheRiddick
They have a driver and software for their headset that is doing the work for them. You actually need to look deeper, and I won't be holding your hand; if you don't believe me then fine, whatever, I don't care.
Sorry, but I just won't believe a random person on the internet over my own research unless you provide some proof for what you're claiming. Meanwhile, the information that real 4K is only available at 30Hz is all over the place if you just do a simple search, although the device manufacturer doesn't like to advertise this fact, for a reason I think we can all understand.
I think it's best to inform people so that they won't be disappointed if they get a Pimax with the false assumption that it's just going to render at the native resolution, which is strongly implied by the marketing for this device. They only enabled 4K @ 30 FPS in a later update, after it was requested, too.
Last edited by badber on 22 February 2017 at 2:17 pm UTC
That was just to get you started so you'd know which headset I'm talking about. There are reviews and videos about the headset, and some talk about the rendering system they are using; I can't remember exactly which link.
There is not much difference between the image for the left eye and the right, and in a 3D space you can render a slightly wider (ever so slightly) image and offset it depending on the eye that sees it. This means you can just render a single scene and image.
It's not rocket science! People have been doing it since the 60s, if I remember correctly, with aerial photography. Why people treat it like some magical beast of burden is beyond me!
Last edited by TheRiddick on 22 February 2017 at 2:18 pm UTC
Quoting: TheRiddick
There is not much difference between the image for the left eye and the right, and in a 3D space you can render a slightly wider (ever so slightly) image and offset it depending on the eye that sees it. This means you can just render a single scene and image.
"Not much different" still means different. You can't just pan two section of a single image and get a 3D effect. That is not possible.
Maybe this company claims to have created an algorithm which tries to recreate the 3D scene and generate two different-looking images from one. But... why would we want that? There are already companies who have claimed that, and if I remember correctly, it looked like artificial shit.
I say again, that is how stereoscopic aerial photography works: you have two cameras in slightly different positions, which allows the 3D effect to happen. The work is done by the brain. S3D on PC is done the same way; it just comes down to how each slightly varied image is presented to the viewer (such as red/blue, shutter glasses, or in the VR sense, direct LCD separation).
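For what it's worth, both points are compatible: the scene can be submitted once, but it is still rendered from two slightly different viewpoints. A minimal sketch (not any particular engine's or OpenVR's API; the 64 mm IPD is just a typical value) of how one head pose becomes two eye views:

```python
import numpy as np

def eye_view_matrices(head_view, ipd_m=0.064):
    """Turn one 4x4 world-to-head view matrix into two per-eye view matrices.

    Each eye sits half the interpupillary distance (IPD) to the side of the
    head, so the world-to-eye transform adds a small translation along the
    head's local X axis. Two different matrices -> two different renders.
    """
    half_ipd = ipd_m / 2.0

    def view_for(eye_offset_x):
        head_to_eye = np.eye(4)
        head_to_eye[0, 3] = -eye_offset_x   # move the world opposite to the eye offset
        return head_to_eye @ head_view

    return view_for(-half_ipd), view_for(+half_ipd)  # (left eye, right eye)

head_view = np.eye(4)                        # head at the origin for simplicity
left, right = eye_view_matrices(head_view)
print(left[0, 3], right[0, 3])               # 0.032 and -0.032: two distinct views
```

Optimisations like NVIDIA's single-pass stereo reduce how much of the pipeline has to run twice, but the two eye views still differ by that small offset.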
Well, I had many problems. First it didn't detect my lighthouses; after I restarted, they were detected. The room setup wizard crashed while I was configuring it, but worked the next time. The tutorial doesn't start, and Destinations hangs after some time. Very frustrating :P
View video on youtube.com
Last edited by bubexel on 22 February 2017 at 3:14 pm UTC
Quoting: TheRiddick
I say again, that is how stereoscopic aerial photography works: you have two cameras in slightly different positions, which allows the 3D effect to happen. The work is done by the brain. S3D on PC is done the same way; it just comes down to how each slightly varied image is presented to the viewer (such as red/blue, shutter glasses, or in the VR sense, direct LCD separation).
I absolutely agree. Every single one of those things uses two images. With just one, it wouldn't work.
So... comparing what someone has read about the specs a company claims, versus someone who already owns a Vive and has tried one of the heaviest games, Elite: Dangerous, tweaking supersampling to get smooth framerates and an acceptable display resolution: I can say with pretty good authority that it is currently not possible to drive that game onto dual 4K screens without yacking.
Specs of my machine:
Asus STRIX 1080GTX OC
32GB DDR4 RAM
Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz
It's not the 'slightly different images' that drives VR; that's for stereoscopic display. Hell, my Note 4 can handle that, and does pre-rendered scenes quite nicely. But try adding all the other things that make the Vive great: tracking, interaction, etc. You need to be able to drop straight to the floor without the visual output stuttering for even a second; any stutter is unacceptable performance. Hell, I was trying to watch a movie without much movement of my head, and Bigscreen froze, and it was disorienting as hell!
Last edited by slaapliedje on 23 February 2017 at 1:08 am UTC
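To put some rough numbers on that (an illustrative sketch; it ignores the oversized render targets used for lens distortion and any supersampling multiplier, both of which make the real cost higher):

```python
# Compare the raw pixel throughput of the Vive's panels with a hypothetical dual-4K headset.
def pixels_per_second(width, height, eyes, refresh_hz):
    return width * height * eyes * refresh_hz

vive    = pixels_per_second(1080, 1200, eyes=2, refresh_hz=90)  # Vive: 1080x1200 per eye at 90 Hz
dual_4k = pixels_per_second(3840, 2160, eyes=2, refresh_hz=90)  # two 4K panels at the same 90 Hz

print(f"Vive:    {vive / 1e6:.0f} Mpx/s")
print(f"Dual 4K: {dual_4k / 1e6:.0f} Mpx/s (about {dual_4k / vive:.1f}x the work)")
print(f"Frame budget at 90 Hz: {1000 / 90:.1f} ms")
```

Roughly six times the pixel work in the same 11 ms frame budget is why dual 4K at 90 Hz is such a stretch for a single GTX 1080, whatever the marketing says.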
I dunno how Elite performs at 4K, but the Vive expects 90fps, so if you can't hit that standard with the GTX 1080 then yeah, it's an issue. It's not really designed around VR, so I don't even know if it allows for the use of the NVIDIA VR boosting tech.
The problem is that NOTHING is using the NVIDIA VR boosting tech, outside of perhaps NVIDIA's VR Funhouse. Which is actually really fun and a good example of how to optimize things.
While I was personally pointing out to an ex-coworker that even the GPUs are implementing direct features to help with VR support, it doesn't mean anything is going to adopt proprietary interfaces for doing so.
Don't expect as much performance from games that are not built for VR as from titles on the Oculus store and the like that say VR ONLY.
That said, the Serious Sam VR game should be fine at 4K resolution, I would think... even though the screen is 1200p, IMO you'd be better off upscaling to 1440p.