Written by: Timothy Porter, Underminer Studios LLC
Edited by: Alexandria Porter, Underminer Studios LLC
I am Timothy Porter, a pipeline technical artist and efficiency expert. I create unified systems that promote faster, more intuitive, and collaborative workflows for creative projects. With more than nine years in the entertainment industry, a Bachelor's degree in Computer Animation, and a sharp mind for technical details, I have used lessons learned from early-career gaming roles to develop methodologies for streamlining pipelines, optimizing for multiple platforms, and creating tools that make teams stronger and more efficient. I own Underminer Studios, an outsourcing and bleeding-edge technology company.
Overview
This article will teach you how to identify VR applications that are mixed reality (MR) ready, and how to enable MR mode in your Unity* VR applications. By the end of this article you will be able to calibrate your setup for the cleanest and most accurate configuration possible for making MR green screen videos. Since green screen MR can be useful to both developers and content creators (like streamers or YouTubers), the information in this article is presented from both perspectives. Keep that in mind; some of the detail goes beyond what a content creator needs to begin working in MR immediately, so use the section titles as guides to locate the information relevant to you. Getting an MR experience set up for the first time used to be painful and tedious, until now. Underminer Studios has created the Underminer Studios MR Configurator Tool to smooth the process.
Underminer Studios’ MR Configurator Tool
This tool was designed to speed up calibration of your MR setup. This article explains how to use the tool and helps you get the most out of your MR experiences. For VR users and streamers, making MR videos is a great way to show a different perspective to people who aren't wearing a head-mounted display (HMD), while for VR developers, MR videos are a great way to create trailers and show a more comprehensive view of the VR experience.
How does the Underminer Studios MR Configurator Tool help?
This tool automates the configuration of the controller/camera offset, massively reducing the time compared to the difficult manual process of camera alignment. Without this helper utility, you start with a blank externalcamera.cfg file and manually adjust the x, y, and z offset values to align the virtual camera with the real one. Every time you make a change you must shut down the application, restart it, and check the alignment, hoping that your configuration is correct, then repeat as necessary. This is a tedious and imprecise process. Our helper utility streamlines and automates the alignment and makes it much easier to calibrate your MR setup. Download and install the executable, then download and follow the documentation in the readme guide.
Though we had many use cases in mind to appeal to a broad audience, inevitably with developers there are always new, uncharted needs. We plan to update the tool periodically, and if you have ideas to improve the tool, please email us at info@underminerstudios.com.
How to use the MR Configurator tool
There is a handy dandy information file, MrSetUp.pdf, that covers the application setup step by step; a direct link is also provided with your install.
What is Mixed Reality?
In this case, mixed reality refers to a person on a green screen background, layered into video from an MR-enabled VR application. This is a great way to show people outside the VR headset what’s happening in the world within. A user can share their VR experience with others and can help create a more social gaming environment. Once you have a VR application that supports MR, all you need is a suitable camera, some green screen material, and an extra Vive* controller to create your very own MR VR experiences.
What’s required?
A powerful machine
Adding MR on top of VR requires a high-end system to handle the inherent stresses that these applications create. Without a powerful enough system you can encounter performance lag, which may lower your frame rate and create a less than optimal experience, especially for the user wearing the HMD. A higher-end PC is required to provide the MR experience and avoid those issues. I have provided a list below of optimal system requirements for running an MR experience.
An MR-enabled application
You can either take a previously made project that is MR-enabled or you can make one for yourself. We will cover both in this section.
How to tell if a VR application will work with this method
Config file
Test to see if a Unity-based VR title supports this MR mode by placing a configuration file named “externalcamera.cfg” into the same directory as the VR executable. For example, say you have a game located at C:\Program Files (x86)\Steam\steamapps\common\APPNAMEHERE\. Just put the file in that folder. Here is an example of a raw config file:
x=0
y=0
z=0
rx=0
ry=0
rz=0
fov=60
near=0.01
far=1000

Note that there is almost zero chance that this will work appropriately right away. Use our application to configure it, or go to the manual configuration section below and follow the instructions.
Connect a third Vive controller
Connect a third Vive controller (this controller needs to be plugged in via USB, since SteamVR* only supports two wireless controllers at a time). Launch the VR executable, and if you see a four-panel quartered view on the desktop, the app should work for this method of making MR videos. If you don’t see the four-panel quartered view, it’s likely the app wasn’t made in Unity, or doesn’t support this method of MR video. If you created the VR executable, read on for instructions on how to enable this MR mode. If it’s not your application, you will probably have to choose another application for your MR video.
Developers and Users
If you want to do the MR setup inside of Unity, go to the Developer section. If you want to learn how to play a ready-made MR game, move to the User section. First, we will cover the developer side of things, so if your goal is to make cool MR experiences, start here. Later, we will cover the end-user side of things, so if your game already has the development side configured, you can start there. We are going to limit this discussion and focus on how to make MR work within Unity and the SteamVR system, since there are many ways to create multiple cameras and green screens, as well as to composite them. I will be using the HTC Vive; I've seen others use the Oculus Rift* and Touch* controllers, but that's outside the scope of this article. Let's jump right in!
Developer side
I will show the current native SteamVR plugin method first. It takes a considerable amount of guesswork out of the system setup and provides you with a quick, high-quality MR setup. The tool provided sits on top of the current system, replacing some of the manual or tedious parts of the process with automated or helper solutions. If at any point the specific setup or process your project needs is not covered, there is no reason to abandon the rest, as the remaining steps are self-contained.
Native (built-in) SteamVR MR overview
The team that invented this tool was quite brilliant. Using the idea of clipping planes and the location of the player, the SteamVR setup creates multiple views to allow an MR experience. If you want to use the native plugin and enable this in your game you have two separate choices: +third controller and no third controller. Both require the use of externalcamera.cfg.
Example using externalcamera.cfg
This file goes into the root of your project as externalcamera.cfg. It tells the system how far, in meters, to offset the virtual camera from the tracked controller.
x=0
y=0
z=0
rx=0
ry=0
rz=0
fov=60
near=0.01
far=100
sceneResolutionScale=0.5
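To make those values a little more concrete, here is a minimal, hypothetical Unity C# sketch of how the offsets can be interpreted: the virtual camera is positioned and rotated relative to the tracked controller, and the fov/near/far values map onto the camera's projection settings. The class and field names below are invented for illustration; in practice the SteamVR_ExternalCamera script in the plugin does this work for you.

using UnityEngine;

// Illustration only: this is not the SteamVR plugin code, just a sketch of
// what the externalcamera.cfg values mean.
public class ExternalCameraOffsetExample : MonoBehaviour
{
    public Transform trackedController;  // the controller mounted on the physical camera
    public Camera virtualCamera;         // the in-game camera that mirrors the real one

    // Values copied from externalcamera.cfg
    public Vector3 positionOffset = Vector3.zero;   // x, y, z in meters
    public Vector3 rotationOffset = Vector3.zero;   // rx, ry, rz in degrees
    public float fov = 60f;
    public float nearClip = 0.01f;
    public float farClip = 100f;

    void LateUpdate()
    {
        // Place the virtual camera at the configured offset from the controller.
        virtualCamera.transform.position = trackedController.TransformPoint(positionOffset);
        virtualCamera.transform.rotation = trackedController.rotation * Quaternion.Euler(rotationOffset);

        // Unity's fieldOfView is the vertical FOV, which is why the cfg value must be vertical too.
        virtualCamera.fieldOfView = fov;
        virtualCamera.nearClipPlane = nearClip;
        virtualCamera.farClipPlane = farClip;
    }
}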
What setup to use?
The use of a third controller allows the user to move the camera. If you are planning to have a stationary camera see the No third controller section, below. If your game requires moving the camera, see the +third controller section, below.
No third controller—native SteamVR MR setup
- Pull in the extra controller prefab
- Set the Index of the SteamVR_Tracked Object (Script) to Device 2.
Users need to set up the externalcamera.cfg covered in the Example using externalcamera.cfg section, above.
Note: This requires always running from the Unity IDE unless you follow the How to not use Unity IDE section, below.
+third controller—native SteamVR MR setup
- Pull in the extra controller prefab
- Set the Index of the SteamVR_Tracked Object (Script) to Device 3.
This is simple in concept. The only thing you’ll need is the extra controller that is attached via USB to the computer playing the game.
Note: This requires always running from the Unity IDE unless you follow the How to not use Unity IDE section, below.
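If you prefer to set the device slot from code rather than the Inspector, here is a minimal sketch. It assumes the SteamVR Unity plugin's SteamVR_TrackedObject component is attached to the extra controller prefab; the script name itself is made up for this example.

using UnityEngine;

public class AssignCameraControllerIndex : MonoBehaviour
{
    void Awake()
    {
        // SteamVR_TrackedObject exposes a public index field (EIndex enum).
        var trackedObject = GetComponent<SteamVR_TrackedObject>();

        // Device3 matches "Device 3" in the Inspector dropdown (+third controller setup).
        // Use EIndex.Device2 instead for the stationary, no-third-controller setup above.
        trackedObject.index = SteamVR_TrackedObject.EIndex.Device3;
    }
}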
How to not use Unity IDE—native SteamVR MR setup
Both setups require running the project within the editor. If you want to make a standalone build, there is a bit of extra work to do.
- Add the “SteamVR_ExternalCamera” prefab at the root of your hierarchy.
- Drag and drop the "Controller (third)" object into the External Camera field of the "SteamVR_Render" script on the [SteamVR] object.
- In SteamVR_Render.cs add the following code:
void Awake()
{
#if (UNITY_5_3 || UNITY_5_2 || UNITY_5_1 || UNITY_5_0)
    var go = new GameObject("cameraMask");
    go.transform.parent = transform;
    cameraMask = go.AddComponent<SteamVR_CameraMask>();
#endif
    // If an externalcamera.cfg file is present, spawn the external camera prefab and read the config.
    if (System.IO.File.Exists(externalCameraConfigPath))
    {
        if (externalCamera == null)
        {
            var prefab = Resources.Load<GameObject>("SteamVR_ExternalCamera");
            var instance = Instantiate(prefab);
            instance.gameObject.name = "External Camera";

            externalCamera = instance.transform.GetChild(0).GetComponent<SteamVR_ExternalCamera>();
            externalCamera.configPath = externalCameraConfigPath;
            externalCamera.ReadConfig();
        }
    }
}
User side
If you already have a VR-ready system, you can jump to the Running MR section, below.
System requirements
I have included both high-end (a) and low-end (b) options below.
Shopping list:
Green screen kit
a. StudioPRO* 3000W continuous output softbox lighting kit with 10ft x 12ft support system, $383.95 (or similar).
b. ePhotoInc* 6 x 9 Feet cotton chroma key backdrop, $18.99.
Extra controller
a. Vive controller, $129.99.
b. This solution does not need an extra controller, but if you aren't the developer who created the game, you must keep the camera stationary and do some workarounds as discussed below.
Camera
a. Panasonic* HC-V770 with accessory kit, $499.99 (or similar camera with HDMI out for the live camera view; a DSLR or mirrorless digital camera will probably work, but be aware that their sensors are not designed to be run for long periods of time and can overheat).
Video capture card
a. Magewell* XI100DUSB-HDMI USB Capture HDMI 3.0, $299.00 (or similar HDMI capture device).
Computer
a. You've probably already got a VR-capable PC if you're reading this. Beyond the minimum VR spec, you'll need a system with enough power to handle the extra work you're going to ask it to do (running the quartered view at 4x the resolution you intend to record, plus doing the layer capture, green screen chroma key, and MR compositing). A high-end, sixth-gen or later Intel® Core™ i7 processor (for example, a 7700K or similar) is recommended.
4K Monitor
a. Because of the way MR captures the quartered view window, you’ll want to be able to run that window at 4x the resolution of the final video output. This means you need to be able to run that window at 1440p resolution if you want to record at 720p, since you’re only capturing a quarter of the window at a time. If you want to record at 1080p, you’ll need to run the window at 2160p. For that, you’re going to want a monitor that can handle those resolutions; probably 4K or higher.
A little more about some of the options
Green screen
You could use outdoor carpet (like AstroTurf*) as a backdrop. It looks like it gets decent results and it should last for a very long time, but anything in a single color should work just fine. Green is recommended, as most systems (OBS*, or the screen capture provided for this tutorial) utilize green as a cut-out or chroma key.
Controller
If the project was not set up appropriately and requires an extra controller, there is a possible solution involving faking the third controller in software. Using this option is outside the scope of this article, but if you want to try it out you can learn more here.
Camera
There is a HUGE difference between using a real camera and a webcam. The camcorder option listed above produces great results without costing as much as a pro camcorder. If you use a still camera (DSLR or mirrorless), be aware that its sensor is often not designed to run constantly and can overheat; this is why such cameras often have a 20 or 30 minute limit on video recording. Be careful so you don't harm your equipment.
Video capture card
If you are using an external camera, a capture card is required to get the HDMI output of the camera to appear as a usable source on the PC. The one listed above uses USB and is a great all-around capture card. Compared to an internal card that is tied to one system, the best part of the USB capture card is portability. For an onsite demo with publishers, clients, or other developers, you can just throw it in a bag, send them a build, and show everyone in the room what is going on in the game. It lets you convey information and ideas quickly.
Computer
The project we are doing is computationally intensive, so CPU choice is very important. A modern, high-end Intel Core i7 processor, like a 7700K, is well suited to a project like this because many of the processes are single-thread intensive (like the compositor from SteamVR), so high single-core performance makes a real difference. A quad-core or better CPU also helps with the work of capturing, compositing, and recording your MR video.
Running MR—setup
To get the setup view, you only need the .cfg file described below and a game that supports MR. Some of these games include Fantastic Contraption*, Job Simulator*, Space Pirate Trainer*, Zen Blade*, Tilt Brush*, and many more.
Only after fulfilling these two requirements will the setup view appear:
- Add a file called externalcamera.cfg to the root folder of the application (next to the game's executable).
- Have a third controller that is plugged into your system.
Running MR—step by step
Note: These steps will not align the experience with the real world until you configure the .cfg file using the steps below.
Turn off SteamVR
If SteamVR is running, it can interfere with the steps that follow, so it's best to turn it off. Also, if you run into issues later, a restart of SteamVR will usually help.
Put an externalcamera.cfg file into the root of your project
Next, you will need to put the file in the correct location at the root of your project. If you find that your project doesn’t show a four-panel quartered screen, then you will want to verify that root location, after you check the controller.
Set up your green screen and lights
You will be compositing people into the VR environment. To do this correctly you will need to have a green screen setup to cut the person out of the real world and put them into the VR world.
Connect your camera to your computer / capture card
The extra overhead of running VR and MR at the same time almost necessitates a capture card instead of only using a webcam. A capture card also lets you pull in video from a secondary camera.
Affix your controller to your camera
The system always needs to know where the camera is, and the way we do that is by having it track the Vive controller attached to the camera. The config file above provides the offset and camera information to the system based on the controller's location relative to the camera. The more rigidly the controller is attached to the camera, the better.
Unplug any controllers that are attached to your system
SteamVR gets confused during this process. If you get into the project and realize that the third camera is attached to the wrong controller, unplug the third controller and plug it back in. This should solve the issue.
Turn on SteamVR
Now that everything is ready, tell SteamVR to start up.
Turn on the two controllers not attached to the camera
We only want to turn on the controllers that aren't attached to the camera so they land in the correct SteamVR handset slots. This is a crucial step, so don't skip it. I also recommend waving the controllers directly at a lighthouse.
Plug into the system the controller that is attached to the camera
Now that Steam knows where the first two controllers are you can plug in the third controller. As stated before, if you get into the project and realize that the third camera is attached to the wrong controller, unplug the third controller and plug it back in.
Hold Shift and double-click your game of choice
This allows you to open the project at the highest resolution. For some reason SteamVR also gives preferential treatment to applications run as administrator, so this should help.
Choose the desired resolution (4x the resolution at which you want to record)
This is where a 4K or higher monitor comes in handy. Since you’re only capturing one-fourth of the window (and compositing multiple layers), you’ll need to choose the correct window size here. If you want to record at 720p, choose 2560 x 1440. If you want to record at 1080p, choose 3840 x 2160. You might have to try different recording resolutions, depending on your system performance and the desired quality of the recording.
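To make the arithmetic explicit, here is a tiny hypothetical helper (the names are made up for this sketch): since each quadrant of the window is half the window's width and height, the window has to be twice the target recording resolution in each dimension.

// Hypothetical helper: window size needed for a given recording resolution.
public static class QuarteredView
{
    public static void WindowSizeForRecording(int recordWidth, int recordHeight,
                                              out int windowWidth, out int windowHeight)
    {
        // Each quadrant is half the window in each dimension, so double the target.
        windowWidth = recordWidth * 2;
        windowHeight = recordHeight * 2;
    }
}
// Example: a 1280x720 target needs a 2560x1440 window; 1920x1080 needs 3840x2160.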
Open OBS or XSplit*
Now we are moving on to the live compositing portion of the process. Both of these programs have been tested for MR compositing, although others out there might work as well.
Add a cut of the upper-left corner and use the upper-right corner as the alpha; label this layer “Foreground”
This is the part we will composite over people. If you don’t have time to match up the handsets exactly to the VR space, choose a skin for your controller that uses large symbols. This will hide the fact that everything isn’t exactly matched up. Here is a how-to which showcases exactly how to change your controller skins in SteamVR.
Add the video stream from your camera and clear out the background with a chroma filter; label this “Live”
Putting the live person into the VR environment is the most crucial part of this project. Depending on the program you are using there are a multitude of ways that you can do this. Below is a screenshot for XSplit showcasing that you could also color key out a layer, which will remove a single color from the image.
Add a cut of the bottom-left corner; label this layer "Background"
We will put this layer in the bottom position in whichever program you are using for compositing. If the background isn't visible, repeat the step above.
Turn off your game and configure the config file
To make the config file, either use our tool outlined in the beginning of this article, or follow the section below. A word of warning: Manual configuration is not only difficult to get right, it’s also a very slow and laborious task. Every time there is a change made you need to restart your program. The average time for setup is about one hour. We have reduced the process to three minutes on average using the MR Configurator tool. It is also more accurate, since the long setup time usually causes people to give up before the config is perfect.
How to manually calculate the information in externalcamera.cfg
x=0
y=0
z=0
rx=0
ry=0
rz=0
fov=60
near=0.01
far=100
sceneResolutionScale=0.5
Configuring
Note: Remember that you can use our tool to skip this entire section.
Field of view (FOV) – This must be the vertical FOV
FOV is the hardest value in the setup, so find it first. Most camera manufacturers publish FOV values for the camera, but usually not the vertical FOV. Most of these techniques come from the camera world. Here is an article on how to find the FOV.
Note: The FOV of a camera is dependent on the focal length. Once you have your settings, do not zoom in or out on your camera!
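If your camera's specifications list only the focal length and sensor size, or only a horizontal FOV, the standard rectilinear-lens formulas below will get you to the vertical value. This is a self-contained sketch with made-up names, not part of the SteamVR tooling; plug in the numbers from your camera's spec sheet.

using System;

public static class FovMath
{
    // Vertical FOV in degrees from sensor height and focal length (both in mm).
    public static double FromSensor(double sensorHeightMm, double focalLengthMm)
    {
        return 2.0 * Math.Atan(sensorHeightMm / (2.0 * focalLengthMm)) * 180.0 / Math.PI;
    }

    // Vertical FOV in degrees from a quoted horizontal FOV and the image aspect ratio.
    public static double FromHorizontal(double horizontalFovDeg, double aspectWidth, double aspectHeight)
    {
        double hRad = horizontalFovDeg * Math.PI / 180.0;
        double vRad = 2.0 * Math.Atan(Math.Tan(hRad / 2.0) * (aspectHeight / aspectWidth));
        return vRad * 180.0 / Math.PI;
    }
}

// Example: a full-frame sensor (24 mm tall) at a 35 mm focal length gives roughly 37.8 degrees vertical FOV.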
Rotation
RX, RY, and RZ are the rotational angles, in degrees. 0,0,0 means the handset is level with the camera. Y+ is up, Z+ is forward, and X+ is to the left.
Distance
X, Y, and Z should be measured with a tape measure. Remember, these values are in meters.
Test
Open your game and with OBS or XSplit running, see if things line up. If not, shut down your game and try again.
Troubleshooting
If your system or game lags, options include lowering the canvas size, lowering the frame rate, using the GPU to encode video, or recording only without streaming. Any of these could also make things worse, depending on the game and your system, and with so many variations it is impractical to give fixed profiles. To change these settings manually, try the following:
- Lower canvas size
- Lower frame rate—be careful here; this can introduce further choppiness if below 24
- Render using the GPU
- Record only; do not stream
This article is an extension of my skills as a mentor and teacher. I am often able to lead the way to new and exciting techniques, and I thoroughly enjoy sharing my knowledge with others. Enjoy your MR experiences and share your feedback with me at info@underminerstudios.com.