Thanks to a new patent that went public on June 17, we can see a little more of the behind-the-scenes work on what Valve have planned for their next VR kit, with what could be a Valve Index 2. The news and speculation come thanks to a video from SadlyItsBradley.
The patent itself was actually filed back in December 2019, so it's not exactly new. However, it only went public this month, so now we're able to dive in and see what Valve were thinking about for their next steps. It goes to show that they were clearly already thinking about the next generation as the Valve Index was releasing back in June 2019.
Wireless VR is the next true step to make the experience even better. As an owner of a Valve Index (and it's awesome), I can safely say it would be far nicer without the big thick wire attached to it. It gets in the way, you can easily step on it and unplug it, and it's just another part that can break. Part of the problem with wireless or standalone VR kits, as Valve say in the patent, is that they can be heavy and hot due to doing all of the rendering. Some of them skimp on the power to get around this, but then you get less of an experience. So how do you deal with those and other issues?
What we can see from the patent is that Valve wanted a split rendering system, where a PC would do some of the rendering and send it wirelessly over to an HMD (head-mounted display), which would then do some of the work too.
Working together, it seems, Valve think this would solve current issues with latency and graphical power for VR. The patent goes into some depth on the ins and outs of how it would work, including that the HMD looks to be almost a full computer (see above picture). Part of how they want to solve it using this split rendering system is to have the HMD attempt to correct any errors, which it would be able to do if it has enough power inside it, since it wouldn't need to render everything by itself.
Not only does it talk about the details of the split-rendering mode, it also mentions how it could be used as a standalone device for less graphically intensive games, movies and more. So we could be looking at a new VR kit from Valve that doubles as a standalone unit and one that can connect to a PC either wired or wirelessly (it mentions both).
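To make the idea a little more concrete, here is a very rough sketch of the flow the patent describes (PC renders the heavy scene, HMD corrects for the pose change and adds local content). All names, fields and the single-axis pose are made up for illustration; this is not Valve's actual design.

```python
# Illustrative sketch of split rendering: the PC renders the full scene
# for a predicted head pose, and the HMD applies a cheap local correction
# plus its own locally-rendered layers. Names here are invented.
from dataclasses import dataclass

@dataclass
class FramePacket:
    pixels: str          # stand-in for the encoded video frame
    render_pose: float   # head yaw (degrees) the PC rendered for
    depth: list          # per-pixel depth would ride along in practice

def pc_render(predicted_pose: float) -> FramePacket:
    """PC side: render the full scene for the pose it predicts."""
    return FramePacket(pixels="scene", render_pose=predicted_pose, depth=[])

def hmd_composite(packet: FramePacket, actual_pose: float) -> dict:
    """HMD side: reproject for the pose error and add local content."""
    pose_error = actual_pose - packet.render_pose
    return {
        "scene": packet.pixels,
        "reprojected_by_degrees": pose_error,  # cheap local correction
        "local_layers": ["player_model"],      # rendered on the HMD itself
    }

packet = pc_render(predicted_pose=30.0)
frame = hmd_composite(packet, actual_pose=31.5)
```

The key point is that the correction step is far cheaper than re-rendering, which is why a modest mobile chip in the headset could plausibly handle it.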
Complicated stuff but this is all incredibly exciting to make VR more accessible than it currently is!
If this turns into a real product, along with a possible handheld SteamPal, Valve would be well positioned to keep pulling in future gamers of all kinds.
Quoting: Guest
It would be great if you could have the choice. Either use it wireless or plug it in for higher quality, or rather less input lag and such. So you can decide for yourself how you wanna use it.

It most likely works the same way or similarly to HTC's wireless addon, in which case, yeah, you can use it wireless or plugged in.
There was a guy on YouTube talking about this being a thing rather than the 'SteamPal', and that the other patents that were filed were about either a wireless adapter for the Index or a potential Index 2. We'd need better video cards for an Index 2.
Quoting: CatKiller
So, I think the idea is to use the same kind of tech as game streaming; the computer renders the scene based on position information from the headset, and sends essentially a 360° video stream to the headset, which can be freely navigated using the headset. If there's enough bandwidth, that could be quite interesting.
Ya know, if they included depth information per pixel, it would be possible to not only correctly correct the 360 image to adjust for head rotation, but also give proper depth displacement to the 360 video to account for head movement with near zero latency as well. Would hide a lot of latency or frame rate issues in most circumstances.
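The parallax idea above can be sketched in a few lines: with per-pixel depth, a pixel can be lifted to a 3D point and re-projected for a translated head position. This is a deliberately simplified 2D top-down toy model, not how a real VR compositor works.

```python
# Toy illustration of why per-pixel depth enables parallax correction:
# depth lets you lift a pixel to a 3D point, move the camera, and
# re-project, instead of only rotating a flat 360 image.
import math

def reproject(px_angle_deg, depth_m, head_shift_m):
    """Lift a pixel (given as a view angle) to a point using its depth,
    shift the camera sideways, and return the new view angle."""
    # position of the surface the pixel sees (2D top-down view)
    x = depth_m * math.sin(math.radians(px_angle_deg))
    z = depth_m * math.cos(math.radians(px_angle_deg))
    # angle to the same surface after the head moves by head_shift_m
    return math.degrees(math.atan2(x - head_shift_m, z))

near = reproject(px_angle_deg=0.0, depth_m=0.5, head_shift_m=0.05)
far = reproject(px_angle_deg=0.0, depth_m=10.0, head_shift_m=0.05)
# A near object shifts far more than a distant one; that differential
# shift is exactly the parallax a flat 360 video cannot reproduce.
```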
Quoting: gradyvuckovic
Ya know, if they included depth information per pixel, it would be possible to not only correctly correct the 360 image to adjust for head rotation, but also give proper depth displacement to the 360 video to account for head movement with near zero latency as well. Would hide a lot of latency or frame rate issues in most circumstances.

Pretty sure that tech is OLD. Like Atari Jaguar old (with z-depth). Unless you're talking about something else? Kind of required for most 3D stuff in general.
Quoting: gradyvuckovic
Ya know, if they included depth information per pixel, it would be possible to not only correctly correct the 360 image to adjust for head rotation, but also give proper depth displacement to the 360 video to account for head movement with near zero latency as well. Would hide a lot of latency or frame rate issues in most circumstances.
So, looking at the patent, that seems to be something that's also included in the stream. The headset sends movement data, and the computer guesstimates where the headset will be at render time based on that. The stream it sends to the headset includes pixel data (so, the video stream) as well as pose data, depth data, motion vector data, parallax occlusion data, and "extra pixel data." So if the next frame is late, or the guesstimate is a bit wrong, the headset has enough data to correct for it. Could be neat.
Also, the computer seems to render the scene, but the headset renders the player model, which should also reduce any latency discrepancy between what the player's doing and what they're seeing.
Last edited by CatKiller on 21 June 2021 at 4:46 pm UTC
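As a rough illustration of how the extra stream data listed above (depth, motion vectors, pose) could let a headset cover for a late frame, here is a minimal Python sketch. The flat list-of-pixels model and all field names are invented for illustration; none of this is taken from the patent.

```python
# Sketch of late-frame cover-up: instead of showing a stale image when
# the next frame hasn't arrived, advance each pixel along its per-pixel
# motion vector. Purely illustrative; names are made up.

def extrapolate(last_frame, frame_interval_ms, late_by_ms):
    """Advance each 'pixel' along its motion vector to the current time."""
    t = (frame_interval_ms + late_by_ms) / frame_interval_ms
    return [
        {"color": p["color"], "pos": p["pos"] + p["motion"] * t}
        for p in last_frame
    ]

# One 'pixel' moving right at 2 units per frame interval.
frame = [{"color": "red", "pos": 100.0, "motion": 2.0}]
# The next frame is a full interval late, so we extrapolate twice as far.
predicted = extrapolate(frame, frame_interval_ms=10.0, late_by_ms=10.0)
```

A real compositor would do this per pixel on the GPU and combine it with the depth-based reprojection, but the principle of hiding a missed frame is the same.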
Quoting: CatKiller
So, I think the idea is to use the same kind of tech as game streaming; the computer renders the scene based on position information from the headset, and sends essentially a 360° video stream to the headset, which can be freely navigated using the headset. If there's enough bandwidth, that could be quite interesting.

That doesn't sound efficient at all.
It means that the host needs to render 360 degrees instead of around 100, so 3.6 times as much on the turn alone, and then a lot more again for up and down.
However, if it would compile the scene into simple objects, do a lot of the z-axis calculations up front, and compile/precalculate a lot of the textures, it would make things a lot easier for the HMD.
Whatever they do, I want to know. I was discussing scene compilation/simplification and texture downloading to the HMD to get a split rendering system without a lot of bandwidth usage (like X11 did before Chrome came along, or network OpenGL) a few years ago. I am at least glad that Valve proves I was not a fool ;-).
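The rough arithmetic above can be checked quickly. The numbers are approximations (a headset view of roughly 100°×100° against a full 360°×180° sphere, counting raw angular area rather than true solid angle):

```python
# Back-of-the-envelope: how much extra does a full 360 stream cost
# compared with rendering only what the headset actually shows?
horizontal_ratio = 360 / 100       # 3.6x just for the horizontal sweep

# Counting the vertical range too (180 degrees of sphere versus a
# headset's roughly 100 degree vertical field of view):
full_sphere = 360 * 180            # angular "area" of the whole sphere
headset_view = 100 * 100           # angular "area" the headset displays
total_ratio = full_sphere / headset_view
```

Even this naive count suggests several times the rendering and bandwidth cost, which is why overscan plus reprojection (rather than a literal full sphere) is the more plausible approach.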
Quoting: Hori
The index cable is very fragile. Mine broke recently. Fortunately they do provide a free replacement for it, at least. But after that I am now constantly worried that it will break again suddenly.

Where did it break at? The connector to the splitter that goes to the back of the PC always seemed a bit flimsy to me and I always worry when I unplug / move it around. Glad they replaced it for free!
But this does sound like it could maybe even connect to a SteamPal then... interesting. I wonder if it will run Android or regular Linux on the headset itself though.
Quoting: Julius
Wouldn't have to be 360, something with a reasonable overscan (like 150°-180° maybe) would be sufficient since the screen refreshes that quickly anyways. But this does sound like it could maybe even connect to a SteamPal then... interesting. I wonder if it will run Android or regular Linux on the headset itself though.

I'm betting Linux. None of the SoCs for ARM seem to have open source video drivers, which has caused heaps of problems getting updates onto the things. Though likely it'll be pretty embedded.
Quoting: slaapliedje
I'm betting Linux. None of the SoCs for ARM seem to have open source video drivers, which has caused heaps of problems getting updates onto the things. Though likely it'll be pretty embedded.
This is a bit outdated: the Panfrost drivers for Mali GPUs actually work fine and are fully open source. In fact, most ARM GPUs have open-source OpenGL drivers these days, but their support status differs.
However, Vulkan support is not as great yet, so that might make things difficult.
Overall though, I could also imagine Valve going for an Android solution, as making a standalone-capable headset and then having no games for it would not be very wise for business. Whereas if it runs Android, most Oculus Quest games should be quite easy to port over. But if Valve would sponsor development of Anbox (a compatibility layer to run Android apps on GNU/Linux), that would also be cool :)
That moves things in an interesting direction, towards eventually rendering everything on the headset itself, but I guess the hardware needs to get small enough first while still being powerful. Current GPUs required for VR use a ton of power and need serious heat dissipation.
Last edited by Shmerl on 21 June 2021 at 7:11 pm UTC
Quoting: Shmerl
So instead of being essentially a fancy display, the headset will have its own GPU?

Standalone VR headsets already exist with their own GPU. The Oculus Quest uses the Adreno 540 and the Quest 2 uses the Adreno 650. In terms of graphics power they're pretty weak, but going stronger risks melting your face. The idea of this is that you can offload some of the face melting stuff to a different machine, but have enough local processing power that the latency doesn't make you hurl.
Quoting: Hori
Honestly not sure. I'm almost sure it is the trident/splitter, but I didn't yet try it with the new tether, so I cannot confirm.

Yeah, that's the part I'd have assumed broke, that connects into the Power/DP/USB cable. It seemed really flimsy even when I first opened mine. You'd think they'd have used some hardy DIN connector or something thick, not the equivalent of a DisplayPort cable that was slimmed down!
I think it broke something related to the USB wiring, since lsusb stopped showing anything except for the cameras (which still worked in Cheese).
I used to use the Index in such a way that the cable took a sharp corner behind my TV and went through the wall into the PC on the other side. This is why I think it is the trident that broke, since that was the part under stress...
Either that, or the connector between the trident and the tether, since I had to replug it for every session, and after digging for info on the internet, people say it is a very fragile connector that should *not* be regularly replugged.
Now with the new cable I just use it normally: no going through the wall, no hiding, and no replugging the trident connector. I just unplug the whole thing at the PC end and that is it. Once the extension cables I ordered (DP and USB extensions) arrive, I can cable-manage those through the wall instead of the proprietary, expensive and fragile Index cable.
I mean, it is just two small screens, isn't it? With sensors of course, but the computing and rendering, I thought, is done on the PC.
The tracking and stuff is done with the lighthouses, and that data is computed on the PC so that the game knows about it.
Quoting: Julius
This is a bit outdated, the Panfrost drivers for Mali GPUs actually work fine and are fully open source, in fact most ARM GPUs have open-source OpenGL drivers these days, but their support status differs.

If that were accurate, attempts at making open source phones would have more platforms to choose from, and it seems to me the only ones they do tend to pick are the Mali GPUs. I'd have to look into it again, it has actually been a while. Still waiting on my Librem 5 phone...
Quoting: slaapliedje
If that were accurate, attempts at making open source phones would have more platforms to choose from, and it seems to me the only ones they do tend to pick are the Mali GPUs. I'd have to look into it again, it has actually been a while. Still waiting on my Librem 5 phone...
See for example this: https://youtu.be/ZYCGVzkSIpg
But finding a suitable ARM SoC for a Linux phone is way more complicated than "just" having open GPU drivers for it.
Quoting: Julius
But finding a suitable ARM SoC for a Linux phone is way more complicated than "just" having open GPU drivers for it.

Yeah, that's just one of the big show stoppers, well that and being able to update Android, as they only build the closed source GPU drivers for specific kernels.