
During a RenderMan forum at SIGGRAPH last year (“Stupid RenderMan Tricks”), a Google VR engineer named Mach Kobayashi presented a method for rendering 360 degree stereoscopic images using Pixar’s RenderMan. At the time, I thought it was less “stupid trick” and more “awesome technique,” one that I couldn’t wait to try. Recently, I tried it out on a 3D scene of NASA’s Earth science satellites.

Standard render:
[Image: fleet_standard]

360 degree stereo render:
[Image: fleet_360_high-ipd]
360 degree, or omnidirectional, rendering seems pretty straightforward – the trick is accurately rendering stereo 360 degree images. As Mach points out, if you shoot rays in every direction from two static points (left/right eye), the effective separation between the two eyes varies with ray direction, producing an inconsistent interpupillary distance (aka the distance between your eyes, which should remain constant). Mach’s solution was to rotate the two ray origins around a circle while rendering, so the two eyes stay a constant interpupillary distance apart in every direction – giving correct stereo/3D everywhere in the image.
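To make the geometry concrete, here’s a minimal Python sketch of the idea (just the math – Mach’s actual implementation is a RenderMan shader, and the function name and axis conventions below are my own). Each output pixel maps to a longitude/latitude viewing direction, and each eye’s ray origin is offset along the tangent of a circle whose diameter is the interpupillary distance:

```python
import numpy as np

def ods_ray(u, v, ipd=0.064, eye=+1):
    """Ray origin and direction for one omnidirectional-stereo pixel.

    u, v: pixel coordinates normalized to [0, 1) (u = longitude, v = latitude)
    ipd: interpupillary distance, in scene units
    eye: +1 for one eye, -1 for the other (sign convention is arbitrary)
    """
    theta = (u - 0.5) * 2.0 * np.pi   # longitude, -pi..pi
    phi = (v - 0.5) * np.pi           # latitude, -pi/2..pi/2

    # Viewing direction on the unit sphere (Y-up convention)
    direction = np.array([np.sin(theta) * np.cos(phi),
                          np.sin(phi),
                          np.cos(theta) * np.cos(phi)])

    # The eye origin sits on a circle of diameter ipd, offset perpendicular
    # to the horizontal viewing direction, so each ray is tangent to the circle.
    origin = eye * (ipd / 2.0) * np.array([np.cos(theta), 0.0, -np.sin(theta)])
    return origin, direction
```

For any viewing direction, the two origins (eye = ±1) sit at opposite ends of a diameter of the circle, so the eye separation is exactly the interpupillary distance no matter where you look.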

I recently had the opportunity to implement this method within our pipeline at the NASA Scientific Visualization Studio. We’re starting to explore VR content creation, and 360 videos seem like a great stepping stone toward full, interactive VR experiences. We already use Maya and RenderMan in our pipeline, so I was curious to see how hard it would be to create a stereo, omnidirectional camera. It turns out to be pretty straightforward, thanks to Mach’s work. I added a few lines to Mach’s shader to adjust camera location and viewing direction, which allowed me to ‘fly’ the omnidirectional camera through my scene.
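The ‘few lines’ in question boil down to translating the ray origin and rotating everything about the vertical axis. Continuing the Python sketch from above (again, my own naming – not the shader code itself):

```python
def ods_ray_moving(u, v, cam_pos, yaw, ipd=0.064, eye=+1):
    """ods_ray with a camera position and yaw control, so the rig can 'fly'."""
    origin, direction = ods_ray(u, v, ipd=ipd, eye=eye)

    # Rotate both the eye offset and the viewing direction about the Y axis,
    # then translate the origin to the camera's position in the scene.
    c, s = np.cos(yaw), np.sin(yaw)
    rot_y = np.array([[c, 0.0, s],
                      [0.0, 1.0, 0.0],
                      [-s, 0.0, c]])
    origin = rot_y @ origin + np.asarray(cam_pos, dtype=float)
    direction = rot_y @ direction
    return origin, direction
```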

The setup:
I started with Mach’s shader code, available here. I made a few changes to add camera location and yaw controls, compiled the shader, then imported it into Slim.

[Image: fleet_slim]

In Maya, I have a locator (highlighted in green) to represent the omnidirectional camera, and a camera pointed at a plane (highlighted in white). I attach the omnidirectional camera shader to the plane, then render the scene from the perspective of the camera pointed at the plane.

[Image: fleet_maya]
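If you’d rather script the rig than build it by hand, a rough Maya Python sketch might look like this (object names and placement values are made up for illustration – the shader itself still gets attached through Slim):

```python
import maya.cmds as cmds

# Locator standing in for the omnidirectional camera position
omni_loc = cmds.spaceLocator(name='omniCam_loc')[0]

# 2:1 plane to carry the omnidirectional stereo shader
plane = cmds.polyPlane(name='odsPlane', width=2, height=1,
                       subdivisionsX=1, subdivisionsY=1)[0]
cmds.rotate(90, 0, 0, plane)  # stand the plane up so it faces the camera

# Render camera aimed squarely at the plane
cam = cmds.camera(name='odsRenderCam')[0]
cmds.move(0, 0, 5, cam)
```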

The resulting image is a top/bottom 360 degree stereoscopic rendering. A video in this format can be uploaded to YouTube or Facebook (after adding some spherical video metadata) and viewed in VR using a mobile viewer like Google Cardboard (I use the Google Tech C1 Glass, but any viewer works). You can also play these videos using standard media players compatible with the Oculus Rift, HTC Vive, Gear VR, etc.

If you don’t have a VR viewer, you can still view your content directly on your phone (move the phone to change the view) or in certain web browsers (click and drag to pan around). In these modes, YouTube recognizes that the content should not be played back in stereo and displays only one half of the image.

One interesting note – the apparent scale of an object when viewed in stereo/VR is determined by the interpupillary distance. The difference between the left and right images is how our brain judges how large an object is and how far away it is. The smaller the interpupillary distance, the less variation between the left/right images (the rendering rays originate from nearly the same position). When I first rendered my test scene with a spinning Earth, the interpupillary distance was unrealistically large. Viewed in VR, the Earth looked like it was the size of a baseball floating in front of me, instead of the massive planet I expected to see. Easy enough to fix – just reduce the interpupillary distance. For real-world scale, make sure your scene/object units match the real interpupillary distance units. In the image below, the interpupillary distance has been lowered dramatically, so the left and right images appear very similar (less left/right variation than in the image above). In VR, the Earth still appears about the same size in the video, but it is ‘less 3D,’ telling our brains that the object is much farther away, and therefore much larger.

[Image: 360 degree stereo render with reduced interpupillary distance]
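A quick back-of-the-envelope way to see the scale effect, assuming a viewer with a real interpupillary distance of about 64 mm (the constant and function below are mine, for illustration):

```python
HUMAN_IPD = 0.064  # meters; approximate real-world interpupillary distance

def perceived_distance(actual_distance, render_ipd):
    """Distance the brain infers for a point rendered at actual_distance
    (in the same units as render_ipd), given the stereo disparity it sees."""
    disparity = render_ipd / actual_distance  # small-angle approximation
    return HUMAN_IPD / disparity              # = actual * HUMAN_IPD / render_ipd

# Render the Earth 10 units away with a 10x-too-large IPD and it reads
# as sitting only 1 unit away (and therefore 10x smaller):
print(perceived_distance(10.0, 0.64))   # -> 1.0
print(perceived_distance(10.0, 0.064))  # -> 10.0 (true scale)
```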

Videos coming soon!

I’ll be attending back-to-back conferences this summer. First is the Gordon Research Conference on Visualization in Science & Education. This will be my first Gordon conference, and I’ve heard great things about the smaller, specialized format.

Immediately after the Gordon conference I’m off to LA for SIGGRAPH 2015! Once again I’ve been accepted to present my work at the Dailies session of the conference. I’ve also been asked to give a longer talk about my work, specifically about my workflow for creating technically accurate, data-driven visualizations. (The only bummer is that the talk is scheduled at the same time as the Avengers 2 production session. The current game plan involves dressing like Thor to boost attendance at my talk. Think I’m kidding? You’ll need to attend to find out…)

NASA’s Scientific Visualization Studio will have a great showing at SIGGRAPH this year – the studio will be represented by four Dailies talks, my extended talk, one piece in the Electronic Theater, and two pieces in the VR Village.

Going to SIGGRAPH and interested in learning about scientific visualization? Details for my sessions are below:

Dailies:
Tuesday, 11 August
Room TBD
3:45-5:15 pm

Extended Dailies talk:
Thursday, 13 August
Room 402AB
3:45-5:15 pm


In a nutshell, it’s been a crazy several months.

SIGGRAPH 2014 in Vancouver was great. I was selected again to present my work at the Dailies session, where I shared work I had done on visualizing objects in space. I learned later that the audience voted my presentation one of the best of the Dailies! It’s great to hear that people enjoy my work and appreciate scientific visualization!

Speaking of scientific visualizations… my big news is that I’ve accepted a job at NASA’s Goddard Space Flight Center! I’ve joined the Scientific Visualization Studio (SVS) – a group tasked with creating scientific visualizations based on data from various NASA missions. I’ve only been there for a month and a half, but I’ve already worked on several interesting projects, including hurricane/typhoon visualizations and global ocean salinity visualizations. It’s been a dream of mine to work at NASA (as any aerospace engineer would agree), so I’m excited to be doing work I love for such an amazing organization.

Stay tuned for more NASA-themed visualizations!

I’ve been working on conceptualizing some of my ideas for additional levels for vrTANKS. Nothing too fancy – just experimenting with colors and shapes. The current demo is a simplified version of the first scene. Adding obstacles to the levels will certainly make the gameplay more interesting (though it will require a rewrite of the enemy tank AI!).

It’s rather fun to design landscapes knowing that you will eventually get to explore them from a first person/VR perspective!

Lots of news from OculusVR recently – a new developer kit is on the way (pre-ordered mine – can’t wait for positional tracking and a new, high resolution display!), and less than a week later OculusVR was acquired by Facebook for $2B. Reactions to the deal have been all over the place. As an early Kickstarter backer of OculusVR, I can’t say I was too pleased initially, but I guess we will see how the dust settles. If anything, the acquisition confirms that VR is about to be a big deal.

[Image: vrTANKS_LevelConcept_Blue_01]

[Image: vrTANKS_LevelConcept_Green_01]

[Image: vrTANKS_LevelConcept_Orange_01]

[Image: vrTANKS_LevelConcept_Purple_01]

Results of the Intel Perceptual Computing Challenge have been posted, and my vrDrums application took third place! 

vrDrums is an application I made that combines the Intel Perceptual Computing SDK, the Creative Interactive Gesture Camera Kit, and the Oculus Rift virtual reality headset.

The complete list of winners is posted here.  Congrats to all who participated!

I recently finished my submission for the Intel Perceptual Computing Challenge. I was selected to participate in the “Open Innovation” category after I submitted a proposal to build something that combined the gesture recognition capabilities of the Creative Interactive Gesture Camera with the virtual reality capability of the Oculus Rift. They sent me a fancy new camera, and after poking around for a couple of weeks I came up with a virtual drum set you can play with your fingers. The delay makes playing anything at speed a bit tricky, but it’s something fun to play with! Here is the video I submitted:

This is the basic game demo I recently submitted to the official Oculus VR Game Jam.

Great Scott, Asteroids! is a simple space shooter. You sit in the gunner seat of a large spaceship, and you are tasked with clearing an asteroid field to avoid damage to your ship.

[Image: GreatScott-Asteroids_vrGameJam_KelElkins_Screenshot_med]

Controls are simple – use your head-tracked turret to aim, and fire shots with the Xbox controller trigger (or the spacebar on a keyboard).

I was hoping to take this idea further, but I ran out of time, so this is more of a tech demo than a full-on game. It’s super simple right now, and there are loads of things to improve, but I hope you guys enjoy it.

Download here: 
https://dl.dropboxusercontent.com/u/9361448/GreatScott-Asteroids_KelElkins_vrGameJam.zip

Let me know what you think!


It’s been a crazy several weeks! In a nutshell:

– I released an early build of a Unity game I have been working on for the Oculus Rift (vrTANKS) about a week before SIGGRAPH. The response has been incredible – it’s great to see so many people enjoying my game. I even got a mention in PCMag! I’ve gotten a lot of great feedback, so hopefully I can carve out some time to work on a few of the more popular suggestions.

– SIGGRAPH was awesome, as always. My Dailies presentation went really well – I think there were about 1200 people in that room, and I was voted as one of the top presentations! (Keep an eye out for the SIGGRAPH webpage – it sounds like a recording of my presentation might end up on the Dailies page.) Lots of great sessions this year – they even had a Virtual Reality session with presentations by developers working with the Oculus Rift. The developers had some great tips that I’m hoping to roll into my own Rift projects (Denny Unger’s method of simulating neck bend translation based on head pitch – brilliant!). As usual, the week flew by and I wasn’t able to catch every session I wanted to see, but it was still a great conference. I’m already looking forward to SIGGRAPH 2014 – can’t wait to go back to Vancouver!

– Several weeks ago I submitted a couple project ideas to the Intel Perceptual Computing Challenge, and they accepted one of my proposals and sent me a free camera to play with.  I’m hoping to combine the depth sensor on the camera with the Oculus Rift to create a new, immersive experience.  (Check out my blog post on my submission!)

– I have also started yet another game for the Oculus Rift. I found out a week after I published vrTANKS that OculusVR is running a VR Game Jam competition over the next couple weeks (bummer that pre-existing projects can’t be submitted!). I’ve got an idea that builds on a variation of my vrTANKS mechanic, so I’m hoping to have something simple and fun to contribute to the competition. (Check out my blog post on my submission!)

All great projects – I’m excited to see how everything turns out. It looks like vrTANKS updates will have to wait until after the VR Game Jam, so if you’re waiting for those, please be patient! Keep the feedback coming!

Oh yeah, I’m also getting married in September 🙂

Busy busy, but all great things!

Cheers,

Kel

I’d like to share an early build of a game/demo for the Oculus Rift that I’ve been working on for the past couple weeks.

[Image: vrTANKS_v0.1]

vrTANKS is an arcade-style tank simulator. Your goal is to destroy as many enemy tanks as you can before being destroyed yourself.  Pilot your tank using an Xbox controller or keyboard – tank weapons fire in the direction you are looking.

vrTANKS was built from the ground up specifically for the Rift, so I hope you guys like it!  Give it a go and let me know what you think.  I’ve got some ideas for future development (better models, additional enemies, alternative settings, powerups, etc), but I’d like to get some feedback early on to see if it is worth taking the idea further.  

Questions/comments/suggestions/complaints/anything – hit me at Kel.Elkins@gmail.com

Download vrTANKS v0.1
https://share.oculusvr.com/app/vrtanks

Controls:
Xbox controller – joystick to move, right trigger to fire
Keyboard – W, A, S, D to move, spacebar to fire

Tips: keep moving, and be on the lookout for special enemies!


I’m about 4 weeks in with the Oculus Rift, so I thought this may be a good time for an update.

On Motion Sickness

In my ‘First Impressions’ post, I voiced my concerns about motion sickness in the Rift. My first experience wearing the Rift was awesome, but also very nauseating, so I had some real concerns about the technology moving forward. I’m very happy to report that I’m doing much better now, and I can handle longer sessions (1hr+) without feeling too bad. (I followed Valve’s advice of limiting initial sessions to 15-20 minutes for the first several days, and slowly worked up from there.)

My advice for first time users:

  • When you first put on the headset, resist the urge to move around in the space.  Begin by looking around you, slowly.  Remain stationary, and look up, down, left, and right. The first time you put on the headset, you’re going to get that ‘WOW’ moment, and you’ll want to move around, but you need to give your body time to adjust.
  • Once you get acclimated to the environment, start with simple, forward movements. Try to avoid motions that feel particularly unnatural – strafing sideways, rotating view with a joystick, etc.
  • Don’t try to ‘push through’ motion sickness – it will only get worse if you keep playing. If you start feeling crappy, that is your cue to take a break.
  • Stick to the simple, easier demos as you begin working in VR – I recommend trying the Unity Tuscany scene (great for a first experience), then perhaps Titans of Space, Blue Marble, or Ocean Rift. Proton Pulse is a great, simple arcade game that is fast-paced but won’t leave you feeling motion sick. The Rift Coaster is a riot, but be sure you have worked up to it, or you might be left feeling queasy!


On Working with the Rift

I am having a blast programming for the Rift!  I am working on a simple arcade style concept in Unity, and so far Rift integration has been pretty straightforward.  The Oculus guys did a great job setting everything up so you can really hit the ground running.  It was as simple as importing an Oculus package into Unity – the package pulled in the necessary cameras, scripts, etc. Everything works right away, so you can really focus on your game content, instead of tinkering with the stereoscopic views and head tracking.

I’m hoping to have a playable Alpha of my game out sometime in the next couple weeks, so stay tuned. The concept is based on a favorite childhood game, and so far it is playing out exactly as I hoped it would in VR.  Hell, even my fiancée enjoys playing it, and she isn’t a gamer!

In the meantime, here is a video I shot of one of my co-workers trying the Rift for the first time!