AR & VR for Linux
One area of computing that is evolving rapidly and growing in popularity is Virtual Reality and Augmented Reality. Thanks to Valve, there is full support for the HTC headsets under Linux, and thanks to community developers, work is also happening to support the Oculus headsets. In this interview, Christian Schaller speaks with Jan Schmidt, who is reverse engineering a driver for the Oculus Rift VR headset. They talk about the general state of VR under Linux, Jan’s use of Fedora Workstation, and his work on the driver.
Christian Schaller: Hi Jan, tell us a little about yourself: what do you do for a living, and how did you get into Linux and Fedora Workstation?
Jan Schmidt: I work on the GStreamer multimedia framework, which I’ve been involved in since 2003.
Together with some friends, I run a consultancy called Centricular, where we help people and companies build multimedia projects and products. We use that consulting as a way of funding development upstream on GStreamer, and other open source projects depending on our personal interests. GStreamer is a big and lively project used all over the world, so helping to build that is very rewarding.
I first installed Linux in 1995 while at university. It’s been my primary operating system since then. I’ve used lots of distributions over the years, but settled on Fedora Linux for my main workstation. It integrates (and often drives) projects that are changing the way the Linux desktop works – like systemd and PipeWire – and strikes a good balance: I stay up to date on those developments without having to deal with the instability of compiling everything myself.
Christian Schaller: So tell us a little about your Fedora Workstation setup. Are you using any special tools or configurations?
Jan Schmidt: I use a fairly standard Fedora Workstation. It has RPM Fusion enabled, because I need the multimedia pieces that Fedora can’t distribute itself, and I have a handful of other things compiled from source – packages for VR development and machine learning that aren’t packaged for Fedora yet.
Christian Schaller: So you are leading the development of a Linux driver for the Oculus Rift headset. What made you decide to take on that project?
Jan Schmidt: I always want to be able to do all the things on my Linux computer that I can on other platforms. I got the original Oculus Rift CV1 toward the end of 2016. At the time, people were still holding out hope that Oculus would release Linux support, but that possibility disappeared the longer things went on. So, Linux people did what we’ve always done in that situation, and started reverse-engineering driver support without help from the vendor.
Philipp Zabel and others in the OpenHMD project had been working on support for the earlier Oculus DK1 and DK2 development headsets, and had extended that to also include support for the consumer CV1 – but the driver was 3DOF (meaning it only supports looking around from a fixed head position), and didn’t include the (at the time) new Touch hand controllers. Philipp did a bunch of USB protocol inspection and worked out the details of talking to the devices.
I got involved in early 2019 and started by taking Philipp’s code and using it to add 3DOF support for the Touch controllers to the existing OpenHMD driver. After that, I started looking into what it takes to have full positional tracking and have been working on that as my weekend passion project ever since. I really didn’t know what I was getting into!
I took a short side-track in 2020 and did my own stint staring at USB packet traces when the newer Oculus Rift S headset came out, documenting the protocol and implementing a 3DOF driver for that headset.
Christian Schaller: Tell us a little about the big picture of VR/AR support under Linux at the moment.
Jan Schmidt: It’s a really interesting area with a lot going on. At an obvious level, there are a few headsets that you can use out of the box with Linux and SteamVR – the HTC Vive and Valve Index hardware. Beyond that, there are projects aiming to implement open drivers for a wider range of hardware, and to let us do more than just play games (although of course people want to do that too!)
At some point, I’d like support for Virtual Reality and Augmented Reality headsets to be as integral on Linux as 2D desktops are right now. Windows 10 already has that built-in for Windows Mixed Reality headsets. On Linux, there are multiple projects working toward that goal, but largely in the background – they haven’t surfaced in distros yet.
It’s a huge undertaking, with changes required across the entire software stack from kernel up.
For example, to get positional tracking working with more than one camera, I had to fix a bug in the USB 3 stack that miscalculated the available bandwidth. The kernel has had to learn to recognize when a connected display belongs to a head-mounted display, so that the X server and Wayland compositors don’t try to extend the desktop onto it as if it were a new monitor.
Those display servers gained the ability to ‘lease’ out display devices – to hand off a connected display so that VR/AR software can take control and output to it directly. That’s needed for the precise timing required to drive VR/AR headsets with low latency and accurate positioning.
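As a rough sketch of how software can spot such a display, a program can read the connector’s ‘non-desktop’ property through libdrm – the kernel sets it to 1 for head-mounted display panels. The device path and error handling below are simplified assumptions, not code from any of the projects mentioned:

#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    /* Open the first DRM device; a real compositor would enumerate all of them. */
    int fd = open("/dev/dri/card0", O_RDWR);
    if (fd < 0)
        return 1;

    drmModeRes *res = drmModeGetResources(fd);
    for (int i = 0; res && i < res->count_connectors; i++) {
        uint32_t conn = res->connectors[i];
        drmModeObjectProperties *props =
            drmModeObjectGetProperties(fd, conn, DRM_MODE_OBJECT_CONNECTOR);

        for (uint32_t j = 0; props && j < props->count_props; j++) {
            drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[j]);
            /* "non-desktop" set to 1 tells compositors this output is an HMD,
               so it should be left alone (and offered for leasing instead). */
            if (prop && !strcmp(prop->name, "non-desktop") && props->prop_values[j])
                printf("connector %u is a head-mounted display\n", conn);
            drmModeFreeProperty(prop);
        }
        drmModeFreeObjectProperties(props);
    }
    drmModeFreeResources(res);
    close(fd);
    return 0;
}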
In that arena, there is the Monado project, which provides an OpenXR runtime and compositor. That’s really important as a standard API for cross-platform VR/AR applications. Think of it as an X server for VR headsets.
Projects like OpenHMD, libsurvive and psmoveapi are implementing the drivers to talk to different VR headsets. That’s where I’m working on the positional tracking support for CV1.
xrdesktop and SimulaVR are working on the 3D desktop experience, to take traditional apps and use them inside a 3D display.
There are the StardustXR, Project NorthStar and ILLIXR projects working on the Augmented Reality pieces – like understanding the surrounding world in order to merge computer-generated elements into it.
At the application level, the Blender 3D modelling software and the Godot open source game engine have support for OpenXR output.
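To give a feel for what targeting OpenXR looks like from the application side, here is a minimal, hypothetical sketch in C that creates an OpenXR instance and asks whichever runtime is installed (Monado, for example) for a head-mounted system. A real application would go on to create a session and submit rendered frames, and would handle errors properly:

#include <stdio.h>
#include <string.h>
#include <openxr/openxr.h>

int main(void)
{
    /* Describe the application to whatever OpenXR runtime is active. */
    XrInstanceCreateInfo create_info = { .type = XR_TYPE_INSTANCE_CREATE_INFO };
    strncpy(create_info.applicationInfo.applicationName, "hello-xr",
            XR_MAX_APPLICATION_NAME_SIZE - 1);
    create_info.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance;
    if (XR_FAILED(xrCreateInstance(&create_info, &instance)))
        return 1;

    /* Ask the runtime for a head-mounted display system (for example a Rift
       CV1 driven by Monado/OpenHMD, or a Vive driven by SteamVR). */
    XrSystemGetInfo system_info = {
        .type = XR_TYPE_SYSTEM_GET_INFO,
        .formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY,
    };
    XrSystemId system_id;
    if (XR_SUCCEEDED(xrGetSystem(instance, &system_info, &system_id)))
        printf("found an XR system, id %llu\n", (unsigned long long)system_id);

    xrDestroyInstance(instance);
    return 0;
}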
Christian Schaller: Gotten any feedback from Oculus/Facebook about your work? I mean I remember signing up for their Kickstarter where they did promise Linux support, but that went quiet after a while.
Jan Schmidt: Not at all. I suspect that the Rift S is the last PC headset we’ll see from Oculus. There is much more value for them in standalone headsets like the Quest, where they can control and monitor the entire user experience, and I think that’s where their effort will be in the future. With the Quest Link cable and an increasing ability to wirelessly stream VR from a PC, they can still support higher-end rendering without losing that control.
Christian Schaller: What is the status of your work on VR?
Jan Schmidt: I maintain a fork of OpenHMD at https://github.com/thaytan/OpenHMD/ where I’m working on the positional tracking support for the CV1 headset. Right now, that tracking is “generally working” – you can use the headset and the controllers to play games in SteamVR via the SteamVR-OpenHMD wrapper or with any OpenXR application via Monado.
What I’m focused on right now is fixing tracking glitches on the controllers. Depending on how you hold the controllers, the computer vision algorithm can misinterpret their pose, or it can get confused about which controller is which and your hands end up swapped in the VR world.
When I’m happy with how well the CV1 is working, I want to tackle positional tracking support for inside-out headsets like the Oculus Rift S and Windows Mixed Reality (WMR) headsets. I also own an HP Reverb G2 (which is a WMR headset) and have done some work on that one to get a 3DOF display with correct rendering.
Inside-out tracking contrasts with the outside-in tracking I’ve been working on. Instead of fixed cameras watching the user move around, inside-out headsets have cameras on the headset that observe the world as the user moves, and the challenge is to use those observations to calculate the user’s position and pose using SLAM (simultaneous localization and mapping) and VIO (visual-inertial odometry) algorithms. The second part is tracking the hand controllers using those same moving cameras. For that part, I hope to build on top of the controller tracking I’ve already written for the CV1.
Christian Schaller: If people want to help out with VR on Linux, what are some easy and simple ways to get involved? For instance, are there packages that would make your life easier if they were in Fedora? And are there any specific IRC channels or similar where you and other members of the VR community hang out?
Jan Schmidt: I test with SteamVR mostly right now, because it’s a mature platform with access to many of the VR games from Windows via the Proton/WINE compatibility layer. The applications I really want to get to – augmented reality interactions and virtual desktop environments – are still coming along.
In terms of what Fedora can do to be ready for those, packaging things like the xr-hardware udev ruleset and the Monado dependencies would be a good basis.
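As an illustration of what such a ruleset contains, a single udev line is enough to let a regular desktop user talk to a headset over USB. The line below is a simplified example using the Oculus USB vendor ID (2833); the actual rules shipped by the xr-hardware project are more extensive:

# Tag Oculus USB devices so the logged-in user gets access (illustrative example).
SUBSYSTEM=="usb", ATTR{idVendor}=="2833", TAG+="uaccess"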
For developer discussions, the #openhmd IRC channel on libera.chat and the Monado Discord server are good places to find the people working on these things.
Christian Schaller: Got any favourite VR applications or games?
Jan Schmidt: It’s a bit cliché – but I really like testing with Beat Saber. It’s a robust test of how well tracking works under really fast motion. To test fine motion, I use Half-Life: Alyx, which has lots of interactions that require a steady hand.
This interview was conducted by Christian F.K. Schaller. Be sure to follow me on Twitter for more Linux updates and interviews. Also make sure to follow Jan Schmidt on Twitter to learn about his latest work on VR and GStreamer.
Luya Tshimbalanga
Note that OpenXR is available in the main Fedora repository. The Monado team was very helpful in improving the support.
Renich
I really like nerdy interviews! Thank you. And thank you, too, Jan, for the awesome work! 🙂