Using the Intel® RealSense™ Camera with TouchDesigner*: Part 3


The Intel® RealSense™ camera (R200) is a vital tool for creating VR and AR projects and real-time performance interactivity. I found TouchDesigner*, created by Derivative*, to be an excellent program for utilizing the information provided by the Intel RealSense cameras.

This third article is written from the standpoint of creating real-time interactivity in performances using the Intel RealSense camera in combination with TouchDesigner. Interactivity in a performance always adds a magical element. This article includes example photos and videos from an in-progress interactive dance piece I am directing and creating visuals for, along with demos showing how you can create different interactive effects using the Intel RealSense camera (R200). The interactive performance dance demo takes place in the Vortex Immersion dome in Los Angeles, where I am a resident artist. The dancer and choreographer is Stevie Rae Gibbs, and Tim Hicks assisted me with cinematography and VR live-action shooting. The music was created by Winter Lazerus. The movies embedded in this article were shot by Chris Casady and Winter Lazerus.

Things to Consider When Creating an Interactive Immersive Project

Just as in any performance, there needs to be a theme. The theme of this short interactive demo is simple: liberation from what is trapping the dancer, the box weighing her down. The interactivity contributed to this theme. The effects were linked to the skeletal movements of the dancer, and some were linked to the color and depth information provided by the Intel RealSense camera. The obvious linking of the effects to the dancer contributed to a sense of magic. The choreography and dancing had to work with the effects. Beyond the theatrical lighting, care had to be taken to put enough light on the subject so that the Intel RealSense cameras could register properly. The camera's distance from the dancer also had to be considered, taking into account the range of the camera and the effect wanted. The dancer had to be careful to stay within the effective camera range.

The demo dance project is an immersive full dome performance piece so it had to be mapped to the dome. Having the effects mapped to the dome also influenced their look. For the Vortex Immersion dome, Jeff Smith of Eye Vapor has created a TouchDesigner interface for dome mapping. I used this interface as the base layer within which to put my TouchDesigner programming of the interactive effects.

Jeff Smith on Mapping the Dome Using TouchDesigner:

“There were several challenges in creating a real time mapping solution for a dome using TouchDesigner. One of the first things we had to work through was getting a perspective corrected image through each projector. The solution, which is well known now, is to place virtual projectors inside a mapped virtual dome and render out an image for each projector. Another challenge was to develop a set of alignment and blending tools to be able to perfectly calibrate and blend the projected image. And finally, we had to develop custom GLSL shaders to render real time fisheye imagery.”

Tim Hicks on Technical Aspects of Working with the RealSense Camera:

“Working with the Intel RealSense camera was extremely efficient in creating a simple and stable workflow to connect our performer’s gestures through TouchDesigner, and then rendered out as interactive animations. Setup is quick and performance is reliable, even in low light, which is always an issue when working inside an immersive digital projection dome.”

Notes for Utilizing TouchDesigner and the Intel RealSense Camera

Like Part 1 and Part 2, Part 3 is aimed at those familiar with using TouchDesigner and its interface. If you are unfamiliar with TouchDesigner, before you follow the demos I recommend that you review some of the documentation and videos available here: Learning TouchDesigner. The Part 1 and Part 2 articles walk you through the use of the TouchDesigner nodes described in this article, and provide sample .toe files to get you started.

A free non-commercial copy of TouchDesigner is available and is fully functional, except that the highest resolution is limited to 1280 x 1280.

Note: When using the Intel RealSense camera, it is important to pay attention to its range for best results.

Demo #1: Using the Depth Mapping of the R200 and SR300 Cameras

This is a simple and effective way to create interactive colored lines that respond to the movement of the performer. In the case of this performance, the lines wrapped and animated around the entire dome in response to the movement of the dancer. A Python sketch of the finished network follows the steps.

  1. Create the nodes you will need, arrange them, and wire them in a horizontal row in this order:
    • RealSense TOP node
    • Level TOP node
    • Luma Level TOP node
  2. Open the Setup parameters page of the RealSense TOP node and set the Image parameter to Depth.
  3. Set the parameters of the Level TOP and the Luma Level TOP to offset the brightness and contrast. Judge these settings by the result you see in the effect.
    Figure 1. You are using the Depth setting in the RealSense TOP node for the R200 camera.
  4. Create a Blur TOP and a Displace TOP.
  5. Wire the Luma Level TOP to the Blur TOP and the top connector on the Displace TOP.
  6. Connect the Blur TOP to the bottom connector of the Displace TOP (Note: the filter size of the blur should be based on what you want your final effect to look like).
    Figure 2. Set the Filter for the Blur TOP at 100 as a starting point.
  7. Create a Ramp TOP and a Composite TOP.
  8. In the Ramp TOP, choose the colors you want your lines to be.
  9. Connect the Displace TOP to the top connector of the Composite TOP and the Ramp TOP to the bottom connector of the Composite TOP.
    Figure 3. You are using the Depth setting in the RealSense TOP node for the R200 camera.
     
    Figure 4. The complete network for the effect.
  10. Watch how the line reacts to the performer's motions.
    Figure 5. Video from the demo performance of the colored lines created from the depth mapping of the performer by the RealSense camera.
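
The same network can also be built programmatically with TouchDesigner's Python API, which is handy if you want to recreate or tweak the effect from a script. Below is a minimal sketch of the Demo #1 chain, run from a Text DAT; the operator type names (realsenseTOP, lumalevelTOP) and parameter names (image, size) are assumptions to verify against your TouchDesigner build.

    # Minimal sketch: building the Demo #1 network in Python.
    # Operator-type and parameter names are assumptions -- verify in your build.
    n = op('/project1')

    rs    = n.create(realsenseTOP, 'realsense1')   # assumed OP type name
    level = n.create(levelTOP, 'level1')
    luma  = n.create(lumalevelTOP, 'lumalevel1')   # assumed OP type name
    blur  = n.create(blurTOP, 'blur1')
    disp  = n.create(displaceTOP, 'displace1')
    ramp  = n.create(rampTOP, 'ramp1')
    comp  = n.create(compositeTOP, 'comp1')

    rs.par.image = 'depth'    # step 2: Depth image (assumed parameter/menu token)
    blur.par.size = 100       # step 6 note: starting filter size (assumed name)

    # Wire the chain (steps 1, 5, 6, and 9).
    rs.outputConnectors[0].connect(level.inputConnectors[0])
    level.outputConnectors[0].connect(luma.inputConnectors[0])
    luma.outputConnectors[0].connect(blur.inputConnectors[0])
    luma.outputConnectors[0].connect(disp.inputConnectors[0])   # top input
    blur.outputConnectors[0].connect(disp.inputConnectors[1])   # bottom input
    disp.outputConnectors[0].connect(comp.inputConnectors[0])
    ramp.outputConnectors[0].connect(comp.inputConnectors[1])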

Demo #2: RealSense TOP Depth Mapping Second Effect

In this demo, we use TouchDesigner with the depth feature of the Intel RealSense R200 camera to project duplicates of the performer, offset in time. I used it to project several images of the dancer moving at different times, creating the illusion of more than one dancer. Note that this effect did not make it into the final dance performance, but it is well worth using.

  1. Add a RealSense TOP node to your scene.
  2. On the Setup parameters page for the RealSense TOP node, for the Image parameter select Depth.

  3. Create two Level TOP nodes and connect the RealSense TOP node to each of them.
    Figure 6. You are using the Depth setting in the RealSense TOP node for the R200 camera.
  4. Adjust the Level node parameters to give you the amount of contrast and brightness you want in your effect. You might go back after seeing the effect and readjust the parameters. As a starting point for both Level TOPs, on the Pre parameters page, set the Brightness parameter to 2 and the Gamma parameter to 1.75.
  5. Create a Transform TOP and wire the level2 TOP to it.
  6. In the Transform TOP parameters, on the Transform page, set the Translate x parameter to 0.2. Note that a Translate x value of 1 would move the image fully off screen.
  7. Create two Cache TOP nodes and wire one to the Transform TOP and one to the level1 TOP.
  8. On the cache1 TOP's parameters Cache page, set Cache Size to 32 and Output Index to -20.
  9. On the cache2 TOP's parameters Cache page, set Cache Size to 32 and Output Index to -40. I am using the Cache TOPs to save and offset the timing of the images. Note that once you see how your effect works with your performance, you will want to go back and readjust these settings.

    Notes on the Cache TOP: The Cache TOP can be used to freeze images by turning its Active parameter to Off. (You can set the cache size to 1.) The Cache TOP acts as a delay if you set Output Index to a negative number and leave the Active parameter On. Once a sequence of images has been captured by toggling the Active parameter on and off, it can be looped by animating the Output Index parameter. A Python sketch of this delay setup follows the steps.

    For more information on the Cache TOP, see the TouchDesigner documentation.

    Figure 7. You could add in more Level TOPs to create more duplicates.
  10. Wire both Cache TOPs to a Difference TOP.
    Figure 8. The Cache TOPs are wired to the Difference TOP so that both images of the performer will be seen.
     
    Figure 9. The entire network for the effect. Look at the effect when projected in your performance, go back, and adjust the node parameters as necessary.
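
If you prefer to configure the delay from a script, here is a minimal sketch of the Cache TOP settings from steps 8 and 9 and the notes above, assuming the default node names cache1 and cache2; the parameter names (cachesize, outputindex, active) are assumptions to verify on the Cache TOP's parameter page.

    # Minimal sketch: configuring the Cache TOP delays (steps 8 and 9).
    # Parameter names are assumptions -- verify in your build.
    for name, delay in (('cache1', -20), ('cache2', -40)):
        c = op(name)
        c.par.cachesize = 32        # frames kept in the cache
        c.par.outputindex = delay   # negative index reads that many frames back
        c.par.active = True         # keep capturing new frames

    # Freeze-frame variant from the notes: stop caching and hold the current image.
    # op('cache1').par.active = False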

Demo #3: RealSense TOP Color Mapping For Texture Mapping

In this demo, we use the RealSense TOP node to texture-map geometry; in this case, the boxes are textured with the dancer's moving image. A Python sketch of the key parameter settings follows the steps.

  1. Create a Geometry COMP, go inside it, down one level (/project1/geo1), and create an In SOP.
  2. Go back up to project1 and create a Box SOP.
  3. In the Box SOP parameters, set the Texture Coordinates to Face Outside. This will ensure that each face gets the full texture (0 to 1).
  4. Wire the Box SOP to the Geometry COMP's input.
  5. Create a RealSense TOP node, and on the parameters Setup page set the Model to R200 and the Image to Color.
  6. Create a Phong MAT, and on the parameters RGB page set the Color Map to realsense1, or alternatively drag the RealSense TOP node into the Color Map parameter.
  7. On the Geo COMP's Render parameter page, set the Material parameter to phong1.
  8. Create a Render TOP, a Camera COMP, and a Light COMP.
  9. Create a Reorder TOP, and on the Reorder parameter page set Output Alpha to Input 1 and One using the drop-down menus.
    Figure 10. The entire network to show how the Intel RealSense R200 Color mode can be used to texture all sides of a Box Geo.
     
    Figure 11. The dancer appears to be holding up the box, which is textured with her image.
     
    Figure 12. Multiple boxes with the image of the dancer animate around the dancer once she has lifted the box off herself.
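
The texture assignment in steps 3 and 5 through 7 can also be made in Python. This is a minimal sketch assuming the default node names from the steps above; every parameter name and menu token here (texture, image, colormap, material) is an assumption to confirm in your TouchDesigner build.

    # Minimal sketch: steps 3 and 5-7 as Python parameter assignments.
    # Parameter names and menu tokens are assumptions -- verify in your build.
    op('box1').par.texture = 'face'           # Texture Coordinates: Face Outside
    op('realsense1').par.image = 'color'      # RealSense TOP set to Color
    op('phong1').par.colormap = 'realsense1'  # same as dragging the TOP onto Color Map
    op('geo1').par.material = 'phong1'        # Geo COMP Render page > Material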

Demo #4: RealSense CHOP Movement Control Over a Large Particle Sphere

For this effect, I wanted the dancer to be able to interact playfully with a large particle ball. She moves towards the sphere and it moves away from her.

  1. Create a RealSense CHOP node. On the parameters Setup page, set the Model to R200 and the Mode to Finger/Face Tracking. Turn on the Person Center-Mass World Position and the Person Center-Mass Color Position.
  2. Connect the RealSense CHOP node to a Select CHOP node.
  3. On the Select CHOP's Select page, set Channel Names to person1_center_mass:tx.
  4. Create a Math CHOP node, leave the defaults for now (you can adjust them later as needed in your performance), and wire the Select CHOP node to the Math CHOP node.
  5. Create a Lag CHOP node and wire the Math CHOP node to that.
  6. Connect the Lag CHOP node to a Null CHOP node and connect the Null CHOP node to a Trail CHOP node.
    Figure 13. The entire network to show how the RealSense R200 CHOP can be hooked up. The Trail CHOP node is very useful for seeing if and how much the RealSense camera is working.
  7. Create a Torus SOP, connect it to a Transform SOP, and then connect the Transform SOP to a Material SOP.
  8. Create a Point Sprite MAT.
  9. On the Point Sprite MAT's Point Sprite parameters page, choose a yellow color.
  10. On the Material SOP's parameters page, set the Material to pointsprite1.
  11. Create a Copy SOP, keep its default parameter settings, and wire the Material SOP to the bottom connection on it.
  12. Create a Sphere SOP and wire it to a Particle SOP.
  13. Wire the Particle SOP to the top connector in the Copy SOP.
  14. On the Particle SOP's State parameter page, set Particle Type to Render as Point Sprites.
  15. Connect the Copy SOP to a Geo COMP. Go one level down (/project1/geo1), delete the default Torus SOP inside, and create an In SOP.
    Figure 14. For the more advanced, a Point Sprite MAT can be used to change the look of the particles.
  16. Export the person1_center_mass:tx channel from the Null CHOP to the Transform SOP's Translate x parameter on the Transform page. (A Python sketch of an expression-based alternative follows the steps.)
    Figure 15. Exporting the channel.
     
    Figure 16. The large particle ball assumes a personality as the dancer plays with it, trying to catch it.
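
Instead of the drag-and-drop export in step 16, the same link can be made with a parameter expression, which is easy to adjust during rehearsal. A minimal sketch, assuming the node names null1 and transform1 from the steps above:

    # Minimal sketch: drive the Transform SOP's Translate X from the tracked
    # center-of-mass channel (step 16) with an expression instead of an export.
    tx = op('transform1').par.tx
    tx.mode = ParMode.EXPRESSION
    tx.expr = "op('null1')['person1_center_mass:tx']"

    # To make the sphere move away from the dancer rather than with her,
    # negate the channel:
    # tx.expr = "-op('null1')['person1_center_mass:tx']"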

Demo #5: Buttons to Control Effects

Being able to turn interactive effects on and off is important. In this demo, I will show the simplest way to do this using a button. A Python sketch of an expression-based alternative follows the steps.

  1. Create a Button COMP.
  2. Connect it to a Null CHOP.
  3. Activate and export the channel from the Null CHOP to the Render parameter, on the Render page, of the Geo COMP from Demo #4. Pressing the button will turn the render of the Geo COMP on and off.
    Figure 17. An elementary button setup.
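
As in Demo #4, the export in step 3 can be replaced by a parameter expression. A minimal sketch, assuming the Geometry COMP is geo1 and the button's Null CHOP is null1 with a channel named v1 (check the actual operator and channel names in your network):

    # Minimal sketch: toggle the Geo COMP's Render flag from the button's
    # Null CHOP with an expression instead of an export.
    render = op('geo1').par.render
    render.mode = ParMode.EXPRESSION
    render.expr = "op('null1')['v1']"   # assumed channel name -- check your Null CHOP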

Summary

This article is designed to give the reader some basic starting points, techniques, and ideas for using the Intel RealSense camera to create interactivity in a performance. There are many more sophisticated effects to be explored using the Intel RealSense camera in combination with TouchDesigner.

Related Applications

Many useful apps have been created for the Intel RealSense cameras.

https://appshowcase.intel.com/en-us/realsense/node/9167?cam=all-cam - a drummer app for Intel RealSense cameras.

https://appshowcase.intel.com/en-us/realsense?cam=all-cam - apps for all Intel RealSense cameras.

About the Author

Audri Phillips is a visualist/3D animator based in Los Angeles, with a wide range of experience that includes over 25 years working in the visual effects/entertainment industry in studios such as Sony*, Rhythm and Hues*, Digital Domain*, Disney*, and DreamWorks* feature animation. Starting out as a painter, she was quickly drawn to time-based art. Always interested in using new tools, she has been a pioneer in using computer animation/art in experimental film work, including immersive performances. She has now taken her talents into the creation of VR. Samsung* recently curated her work into its new Gear Indie Milk VR channel.

Her latest immersive works/animations include multimedia animations for "Implosion a Dance Festival" (2015) at the Los Angeles Theater Center and four fulldome concerts in the Vortex Immersion dome, one with the well-known composer/musician Steve Roach, the most recent being the fulldome concert "Relentless Universe". She also created animated content for the dome show for the TV series "Constantine*", shown at the 2014 Comic-Con convention. Several of her fulldome pieces, "Migrations" and "Relentless Beauty", have been juried into "Currents", the Santa Fe International New Media Festival, and the Jena FullDome Festival in Germany. She exhibits in the Young Projects gallery in Los Angeles.

She writes online content and a blog for Intel®. Audri is an adjunct professor at Woodbury University; a founding member and leader of the Los Angeles Abstract Film Group; founder of the Hybrid Reality Studio (dedicated to creating VR content); a board member of the Iota Center; and an exhibiting member of the LA Art Lab. In 2011, Audri became a resident artist of Vortex Immersion Media and the c3: CreateLAB. A selection of her works is available on Vimeo, creativeVJ, and Vrideo.

