
Movement SDK
Summary
The MovementSDK project is primarily a test project utilizing Oculus's Movement SDK template, combined with the new hardware and software integrated into the recently released Meta Quest Pro, which allows full facial-feature tracking (including eyes) and limited body IK tracking.
The idea is that if we combine this with our work in the Avatar Recorder project and similar projects, we will be able to fully track and digitize someone in a virtual reality space, with fully accurate body and positional tracking, facial expression tracking, and even a limited form of lip sync.
Getting Started
Meta Oculus Quest Pro
Official Meta Movement SDK
The biggest hurdle is acquiring a Quest Pro, which is a more professional-grade VR headset (in terms of pricing and technology), currently running about $1,000 (cut to this figure from the original asking price of ~$1,500 due to poor sales).
After that, it's a matter of following the guide on the official SDK page and installing/updating the relevant Unity OpenXR and Oculus Integration packages.
Note: This is important, as I ran into quite a few issues and a lot of confusion early in this project due to features not working or not appearing at all. Ultimately, I had to re-install and update all the Unity VR/XR plugins and Oculus Integration packages, as well as migrate everything to a new Unity project, to get everything working correctly.
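As a quick way to rule out the "wrong backend loaded" class of problem, a minimal sketch like the one below can log which XR loader actually initialized on the device. It assumes the project already uses Unity's XR Plug-in Management package; the XRLoaderReport name is just illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Hypothetical helper: logs which XR loader is actually running at startup,
// which makes it obvious whether the OpenXR or the Oculus backend was picked up.
public class XRLoaderReport : MonoBehaviour
{
    void Start()
    {
        var settings = XRGeneralSettings.Instance;
        if (settings == null || settings.Manager == null)
        {
            Debug.LogWarning("XR Plug-in Management settings not found.");
            return;
        }

        var loader = settings.Manager.activeLoader;
        Debug.Log(loader != null
            ? $"Active XR loader: {loader.name}"
            : "No XR loader is active - check XR Plug-in Management in Project Settings.");
    }
}
```

Dropping this on any object in the startup scene and checking the logcat output makes it clear which backend actually came up.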
After running through the guide and getting everything set up in a fresh Unity 2021 project, the primary feature we were after worked. It took a bit of messing around with settings and checkboxes to get the Face Tracking feature working, along with allowing access to face tracking on the headset itself (a security and privacy safeguard).
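As a sanity check that expression data is actually streaming (and not just that the checkboxes are ticked), something along these lines can sit next to the Oculus Integration's OVRFaceExpressions component. This is only a sketch: the FaceTrackingProbe name is ours, and the exact OVRFaceExpressions member and enum names should be double-checked against the SDK version in use.

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Rough sanity check for face tracking: confirms the runtime permission and logs
// one expression weight per second. The OVRFaceExpressions members used here are
// from the Oculus Integration and may differ slightly between SDK versions.
[RequireComponent(typeof(OVRFaceExpressions))]
public class FaceTrackingProbe : MonoBehaviour
{
    OVRFaceExpressions _face;
    float _nextLog;

    void Start()
    {
        _face = GetComponent<OVRFaceExpressions>();

#if UNITY_ANDROID
        // Face tracking is gated behind an Android runtime permission on the headset.
        if (!Permission.HasUserAuthorizedPermission("com.oculus.permission.FACE_TRACKING"))
            Debug.LogWarning("Face tracking permission has not been granted on the headset.");
#endif
    }

    void Update()
    {
        if (Time.time < _nextLog) return;
        _nextLog = Time.time + 1f;

        if (_face.TryGetFaceExpressionWeight(OVRFaceExpressions.FaceExpression.JawDrop, out var weight))
            Debug.Log($"JawDrop weight: {weight:F2}");
        else
            Debug.Log("No valid face expression data yet.");
    }
}
```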
In the image on the left, you can see working finger tracking (which was relatively easy to do with previous Quests), now combined with facial tracking on the character.
I won't bother with the details about how "Body Tracking" and "Eye Tracking" were constantly "failing to start", or the myriad of forum posts I sifted through trying to figure it out, manually locating and editing AndroidManifest.xml files, and so on. Frankly, I'm not sure how I broke it in the first place, but it ultimately boiled down to having an updated Unity OpenXR framework and an updated Oculus Integration plugin, and making sure the Oculus plugin is set to use the OpenXR backend rather than its own OculusXR backend. That last part is the kicker: in all our previous projects we used the OculusXR backend, because that was what was recommended and what worked, but here it doesn't. Though I suppose that's on me, as the documentation does specifically say the OVR plugin needs to be using the OpenXR backend.
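For reference, the per-feature runtime permissions that have to end up in the merged AndroidManifest.xml look roughly like this. The Oculus project-setup tooling and the OVRManager/project-config checkboxes normally inject them for you; the exact <uses-feature> entries vary by SDK version, so only the permission names are shown here.

```xml
<!-- Excerpt of what the merged AndroidManifest.xml should contain once
     face/eye/body tracking are enabled. These are usually injected by the
     Oculus project setup tooling rather than written by hand. -->
<uses-permission android:name="com.oculus.permission.FACE_TRACKING" />
<uses-permission android:name="com.oculus.permission.EYE_TRACKING" />
<uses-permission android:name="com.oculus.permission.BODY_TRACKING" />
```

If any of these are missing from the APK's merged manifest, the matching tracking subsystem tends to show exactly the "failing to start" behaviour described above.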
Moving on from that, once everything was in place, all the features worked as expected: Face Tracking, Eye Tracking, and Body Tracking, examples of which you can see in some of the SDK's sample scenes. Specifically, we were most interested in the Aura sample, which provides a floating, stylized alien girl's head with floating hands, plus a mirrored duplicate so you can see yourself almost as if in a mirror. Video below: