Collab VR

My final project before leaving the Purdue Envision Center was a collaborative VR teaching environment for viewing, manipulating, discussing, and annotating 3D datasets.

Features & Implementation

VR multiplayer – The goal was to create an interactive forum for discussion, so multiplayer was essential. The project was initially prototyped using Normcore, but the many limitations we encountered pushed the project to a custom stack based on MLAPI.
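
To give a flavor of the stack, here is a minimal sketch of the replication pattern MLAPI is built around (MLAPI has since been renamed Netcode for GameObjects, so some type names have shifted). The laser-pointer state is just an illustrative stand-in for any piece of shared state:

```csharp
using MLAPI;
using MLAPI.NetworkVariable;
using UnityEngine;

// Illustrative only: one owner-written, everyone-read variable,
// which MLAPI replicates to all connected clients automatically.
public class PointerState : NetworkBehaviour
{
    private readonly NetworkVariableBool laserOn = new NetworkVariableBool(
        new NetworkVariableSettings
        {
            WritePermission = NetworkVariablePermission.OwnerOnly,
            ReadPermission = NetworkVariablePermission.Everyone
        });

    [SerializeField] private LineRenderer laser;

    private void Update()
    {
        if (IsOwner)
            laserOn.Value = Input.GetKey(KeyCode.Space); // owner writes...
        laser.enabled = laserOn.Value;                   // ...everyone reads
    }
}
```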

Voice chat – To create an instructive environment, users had to be able to speak to one another. Additionally, users could mute themselves, and the instructor could mute everyone else. Voice bandwidth ended up being one of the limitations that pushed the project away from Normcore. I ended up using Dissonance VoIP for Unity.
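
A rough sketch of the two mute paths. Only DissonanceComms.IsMuted is actual Dissonance API here; the RPC plumbing and instructor checks are illustrative assumptions:

```csharp
using Dissonance;
using MLAPI;
using MLAPI.Messaging;
using UnityEngine;

// Sketch: self-mute via the local DissonanceComms, and an instructor
// "mute all" relayed through the MLAPI server to every client.
public class VoiceModeration : NetworkBehaviour
{
    private DissonanceComms comms;

    private void Awake() => comms = FindObjectOfType<DissonanceComms>();

    // Self-mute, bound to a UI button or controller input.
    public void ToggleSelfMute() => comms.IsMuted = !comms.IsMuted;

    // Instructor's "mute all": a real implementation would verify
    // that the sender is actually the instructor before fanning out.
    [ServerRpc(RequireOwnership = false)]
    public void MuteAllServerRpc() => MuteAllClientRpc();

    [ClientRpc]
    private void MuteAllClientRpc()
    {
        // Assuming this component lives on the instructor's player object,
        // IsOwner is true only on the instructor's own client.
        if (!IsOwner) comms.IsMuted = true;
    }
}
```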

Annotation – Users were given the ability to draw on the datasets to highlight and annotate features for discussion. This was actually a feature presented in one of the tutorials for Normcore, so the prototype implementation was a given. Moving to MLAPI and extending this feature gradually morphed the code into something else, but I was grateful for the starting point.
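
The synced-stroke idea boils down to something like this sketch (stroke IDs, parenting to the dataset, and undo are omitted, and the drawing client simply renders via the echoed RPC for simplicity):

```csharp
using System.Collections.Generic;
using MLAPI;
using MLAPI.Messaging;
using UnityEngine;

// Illustrative sketch: the drawing client streams stroke points through
// the server, and every client appends them to a LineRenderer.
public class AnnotationSync : NetworkBehaviour
{
    [SerializeField] private LineRenderer strokePrefab;
    private readonly Dictionary<int, LineRenderer> strokes =
        new Dictionary<int, LineRenderer>();

    [ServerRpc(RequireOwnership = false)]
    public void AddPointServerRpc(int strokeId, Vector3 point) =>
        AddPointClientRpc(strokeId, point);

    [ClientRpc]
    private void AddPointClientRpc(int strokeId, Vector3 point)
    {
        if (!strokes.TryGetValue(strokeId, out var line))
        {
            strokes[strokeId] = line = Instantiate(strokePrefab, transform);
            line.positionCount = 0; // clear any points baked into the prefab
        }

        line.positionCount++;
        line.SetPosition(line.positionCount - 1, point);
    }
}
```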

Avatar customization – Avatars had to be customizable so that individuals could be recognized. Name tags would also appear when a user pointed their hand at another user. Customization was similar to Mii creation, but a little simplified.
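
The name-tag reveal is conceptually just a raycast from the hand; NameTag here is an assumed helper component, not the project's actual code:

```csharp
using UnityEngine;

// Hypothetical sketch: ray from the hand, and any avatar it hits
// shows its name tag for a moment.
public class NameTagPointer : MonoBehaviour
{
    [SerializeField] private Transform hand;
    [SerializeField] private float range = 10f;

    private void Update()
    {
        if (Physics.Raycast(hand.position, hand.forward, out var hit, range)
            && hit.collider.TryGetComponent(out NameTag tag))
        {
            tag.ShowFor(1f); // keep visible briefly after the ray leaves
        }
    }
}
```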

Point cloud visualization – The datasets this app was built for were scans of supernova remnants: massive, multi-million-point datasets. They were simplified and encoded for Unity in Houdini, then rendered using Unity's VFX Graph system. This allowed the datasets to be rendered at high enough fidelity on standalone VR hardware (Quest 2) while still looking stunning. Every visualization also had to be grabbable and manipulable by any user, synced across all clients, with any attached annotations following along.
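
One common way to feed baked point data to a VFX Graph is a position texture the graph samples per particle. A sketch of that approach; the property names ("PositionMap", "PointCount") are whatever the graph exposes, and the Houdini export handling is omitted:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: pack point positions into a float texture and hand it to the
// VFX Graph, which spawns one particle per point.
public class PointCloudLoader : MonoBehaviour
{
    [SerializeField] private VisualEffect vfx;

    public void Load(Vector3[] points)
    {
        int width = Mathf.CeilToInt(Mathf.Sqrt(points.Length));
        var map = new Texture2D(width, width, TextureFormat.RGBAFloat, false)
        {
            filterMode = FilterMode.Point
        };

        var pixels = new Color[width * width];
        for (int i = 0; i < points.Length; i++)
            pixels[i] = new Color(points[i].x, points[i].y, points[i].z, 1f);

        map.SetPixels(pixels);
        map.Apply();

        vfx.SetTexture("PositionMap", map);
        vfx.SetUInt("PointCount", (uint)points.Length);
        vfx.Reinit(); // respawn particles with the new data
    }
}
```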

Standalone, service-free builds – The final product had to be deployable by any entity with the required hardware, without subscribing to a service. This posed challenges when searching for a multiplayer framework, as the norm for multiplayer implementations was (and still is) centralized dedicated servers. The goal was to deliver client builds per platform and a server build that could be run on whatever machine was at hand, much like the old days of setting up a Minecraft server.
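
In MLAPI terms, the split looks roughly like this, assuming the default UNet transport sits on the NetworkManager object (field names from memory, so treat them as approximate):

```csharp
using MLAPI;
using MLAPI.Transports.UNET;
using UnityEngine;

// Sketch of the service-free split: the same codebase starts as a
// dedicated server or as a client pointed at whoever is hosting.
public class SessionBootstrap : MonoBehaviour
{
    public void StartServer() => NetworkManager.Singleton.StartServer();

    public void StartClient(string serverIp)
    {
        var transport = NetworkManager.Singleton.GetComponent<UNetTransport>();
        transport.ConnectAddress = serverIp;
        NetworkManager.Singleton.StartClient();
    }
}
```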

Deployment

Initial deployment and most of our user testing utilized the Normcore prototype hosted on Normcore servers. With the completion of the MLAPI version, hosting was moved to an AWS server running a dedicated server build.

The Envision Center aided Professor Milisavljevic with the logistics of distributing VR headsets loaded with the app to his students. We typically lent out about ten headsets at a time. Professor Milisavljevic would then host a virtual lecture in the app with these students, reaching his entire class in about three rounds of lectures. We would sanitize and maintain headsets between rounds. This provided us with some great testing data and feedback for development. Students raved about the experience, and it seems to have had a positive influence on Professor Milisavljevic’s Rate My Professor score.

Lessons Learned

I went into this project knowing the very basics of multiplayer game coding. That is, clients are independent apps running on their own data. Multiplayer is all about choosing what data to synchronize, and how and when to do so. That’s it. But I learned just how much those three questions multiply, how many pitfalls there are, and how important it is to code for fault tolerance.

I also learned how powerful it can be to maintain debug tools for a project. The three big ones for this project were console commands, launch parameters, and desktop controllers for VR.

Console commands allowed for rapid testing of features independent of UI implementation. They also allowed for setting state during testing, prompting log generation, and testing individual functions outside of the normal flow of the software. I used Quantum Console from the Unity Asset Store, and would highly recommend it.
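
The pattern Quantum Console enables: tag a method with its [Command] attribute and it becomes callable from the in-game console with no UI work. The commands below are illustrative stand-ins, not the project's actual command set:

```csharp
using MLAPI;
using QFSW.QC;
using UnityEngine;

// Static methods tagged [Command] are picked up by Quantum Console.
public static class DebugCommands
{
    // e.g. type `who` in the console to list connected clients (server-side).
    [Command("who")]
    public static void ListClients()
    {
        foreach (var client in NetworkManager.Singleton.ConnectedClientsList)
            Debug.Log($"Client {client.ClientId}");
    }

    // e.g. `set-point-budget 500000` to stress-test point cloud rendering.
    [Command("set-point-budget")]
    public static void SetPointBudget(int count) =>
        Debug.Log($"Point budget set to {count}"); // hook up to the renderer here
}
```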

Launch parameters started out as a debugging feature in this app, but eventually became a necessity as the project moved to a custom dedicated server. Launch parameters allowed me to skip all of the connection configuration during testing. I made shortcuts in my build folder that would launch the build as a server or client (pointing to the IP of the server). Eventually, this allowed me to set up the dedicated server to run on boot. The only change in the build process between server and client was then platform selection and checking the Server box in the build settings to make the server version headless.
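
A minimal sketch of that routing, reusing the SessionBootstrap sketch from earlier (assumed to be on the same GameObject). The flag names are illustrative, and the parsing is deliberately bare-bones:

```csharp
using System;
using UnityEngine;

// Sketch of launch-parameter routing, e.g.:
//   CollabVR.exe -server
//   CollabVR.exe -client -ip 203.0.113.7
public class LaunchArgs : MonoBehaviour
{
    private void Start()
    {
        var args = Environment.GetCommandLineArgs();

        if (Array.IndexOf(args, "-server") >= 0)
        {
            GetComponent<SessionBootstrap>().StartServer();
        }
        else if (Array.IndexOf(args, "-client") >= 0)
        {
            int ipIndex = Array.IndexOf(args, "-ip");
            string ip = ipIndex >= 0 ? args[ipIndex + 1] : "127.0.0.1";
            GetComponent<SessionBootstrap>().StartClient(ip);
        }
        // No flag: fall through to the normal in-app connection UI.
    }
}
```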

A desktop controller for what was, by design, a VR app was always in scope. We wanted to allow those without VR headsets to join in on the session. It also became a powerful debug tool, as I did not have to mess with a headset while testing most of the features. Coupled with the launch parameters, I could test multiplayer features on my workstation just by launching a server and a client instance of the app. Additionally, the desktop build (and editor) was where the console commands worked, further empowering desktop testing. I highly recommend making a desktop controller for VR rigs, even if it is a little clunky or missing features. It can pay dividends in testing.
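
A desktop rig can be as simple as mouse look plus WASD on the camera. A bare-bones sketch, clunky by design but enough for testing:

```csharp
using UnityEngine;

// Minimal desktop stand-in for a VR rig, attached to the camera
// when no headset is present.
public class DesktopRig : MonoBehaviour
{
    [SerializeField] private float moveSpeed = 3f;
    [SerializeField] private float lookSpeed = 2f;
    private float pitch;

    private void Update()
    {
        // Mouse look while the right button is held.
        if (Input.GetMouseButton(1))
        {
            pitch = Mathf.Clamp(pitch - Input.GetAxis("Mouse Y") * lookSpeed, -89f, 89f);
            float yaw = transform.eulerAngles.y + Input.GetAxis("Mouse X") * lookSpeed;
            transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
        }

        // WASD movement on the horizontal plane.
        var input = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        transform.position += transform.TransformDirection(input) * moveSpeed * Time.deltaTime;
    }
}
```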

Press

https://www.purdue.edu/research/dimensions/purdue-prof-pieces-together-how-massive-stars-explode/

https://ryanschultz.com/2021/08/10/studying-supernova-explosions-using-collaborative-virtual-reality-at-purdue-university/

https://edscoop.com/purdue-university-virtual-reality-exploding-stars/