Earlier this year I worked on a dozen shots for the short film "Portal: No Escape", directed by Dan Trachtenberg, together with a remote team of vfx artists supervised by Jon Chesson. The film, with the talented actress Danielle Rayne in the leading role, premiered at this year's Comic Con in San Diego and was released online last week, going viral with over 5 million views in less than 6 days and being marked as the 'most liked video' of August (recommended viewing in 1080p HD in FULLSCREEN with sound UP):
After the initial contact with Dan was made via Twitter, my good friend Marijn Eken and I were asked to do the matchmoving for this project, with me being responsible for the Hallway sequence and most of the Roof shots. In this blog post I'm hoping to give a little insight into some of the matchmove work that was done on this amazing short film.
The film was shot on a Red One camera with a set of 35, 50 and 75mm Lomo anamorphic lenses, with most shots done on the 50mm. The tracking plates were handed to us as a (quite dark) 1080p unsqueezed and letterboxed Targa sequence, with no additional info on the original Red and anamorphic modes used during the shoot, nor on how the footage was cropped to HD.
IMAGE DISTORTION
One of the first things that stood out when I studied the tracking plates was the very prominent high frequency rolling shutter, especially towards the end of the sequence (see the image wobbles in shots 17 and 18).
A logical tool for tackling this image-distortion problem, which is common in CMOS cameras, is the Rolling Shutter plug-in by The Foundry. However, using their suggested settings (a correction value of 0.32 for the Red One camera) introduced serious artefacts in the corrected footage, even after experimenting with the other input parameters of the plug-in:
These artefacts could be 'fixed' by a paint artist later in the vfx pipeline, but they can also mess up a 3D track, because tracking points might 'jump' on bad frames and therefore not deliver accurate results. So we decided to work with the original tracking plates and see how far they would get us...
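For readers unfamiliar with the effect: a rolling shutter means each scanline of the sensor is exposed slightly later than the one above it, so fast motion skews and wobbles the image. Below is a toy Python sketch of that readout model and a per-row correction for a purely horizontal pan. It only illustrates the idea, it is not what the plug-in actually does, and a real tool would estimate the motion from the footage itself rather than take it as an input.

```python
import numpy as np

def correct_rolling_shutter(frame, velocity_px, correction=0.32):
    """Toy per-row correction for a purely horizontal pan.

    Models a CMOS readout where row r is exposed (correction * r / height)
    of a frame interval later than the top row, and resamples each row to
    undo the drift it picked up during that delay. 'frame' is a 2D
    (grayscale) array, 'velocity_px' the estimated horizontal image motion
    in pixels per frame.
    """
    height, width = frame.shape
    cols = np.arange(width, dtype=float)
    corrected = np.empty_like(frame, dtype=float)
    for r in range(height):
        # Readout delay of this row, as a fraction of the frame interval.
        delay = correction * r / float(height)
        # Shift the row back by that much drift (the sign convention depends
        # on how the motion estimate is defined).
        corrected[r] = np.interp(cols + delay * velocity_px, cols, frame[r])
    return corrected
```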
The next step in the 3D tracking workflow is removing any lens distortion from the tracking plates. Because no lens distortion grids were available here, I used 'check lines' to unbend lines in the plate that were assumed to be straight in reality. Unexpectedly, this led me to conclude that this unsqueezed anamorphic footage had no significant lens distortion.
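SynthEyes has its own lens workflow for this, but the idea behind 'check lines' is easy to sketch: sample a few points along an edge that should be straight in reality, fit a line through them, and see whether a simple one-parameter radial distortion model makes the fit any better. In the sketch below the point values, the image centre and the helper names are all hypothetical; residuals that are already near zero at k1 = 0 would point to the same "no significant distortion" conclusion.

```python
import numpy as np

def undistort(points, k1, center, focal=1000.0):
    """One-parameter radial undistortion sketch: r_u = r_d * (1 + k1 * r_d^2)."""
    p = (np.asarray(points, dtype=float) - center) / focal   # normalise
    r2 = np.sum(p**2, axis=1, keepdims=True)
    return p * (1.0 + k1 * r2) * focal + center

def straightness(points):
    """RMS distance (in pixels) of the points to their best-fit straight line."""
    pts = np.asarray(points, dtype=float)
    centred = pts - pts.mean(axis=0)
    # Direction of the best-fit line = first right singular vector.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    residual = centred - np.outer(centred @ vt[0], vt[0])
    return np.sqrt(np.mean(np.sum(residual**2, axis=1)))

# Hypothetical points clicked along a door frame that should be straight in
# reality; compare the residual at k1 = 0 against nearby distortion values.
edge = [(212.0, 118.0), (214.5, 310.0), (217.0, 502.0), (219.5, 694.0)]
for k1 in (-0.05, 0.0, 0.05):
    print(k1, straightness(undistort(edge, k1, center=np.array([960.0, 540.0]))))
```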
CAMERA TRACKING
To get a good first solve for the camera motion, I decided to ignore the whole rolling shutter problem for a moment and just try to track the original tracking plates...
Using SynthEyes this turned out to be pretty straightforward. For most shots it was nothing more than increasing the gamma to make potential features in dark areas more apparent, running the Auto-tracker to track the shot, manually deleting the bad trackers, solving the shot with a known 50mm lens on a 24.4 x 13.725 mm sensor, and finally orienting the ground plane in a similar way for each shot to simplify Hallway reconstruction later on. The only shot that needed a different approach was shot 13a, which was almost completely out of focus. By carefully placing 11 manual trackers (using SynthEyes' very efficient workflow for supervised tracking), I was able to solve the shot with the same lens and sensor.
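Locking the solve to a known focal length and film back constrains it considerably. As a quick sanity check, the field of view implied by a 50mm lens on that 24.4 x 13.725 mm back (a 16:9 back, matching the unsqueezed 1080p plates) can be worked out in a couple of lines; the helper name is mine:

```python
import math

def field_of_view(focal_mm, back_mm):
    """Angle of view (degrees) for a given focal length and film-back dimension."""
    return math.degrees(2.0 * math.atan(back_mm / (2.0 * focal_mm)))

# 50mm lens on the 24.4 x 13.725 mm back used for the solves.
h_fov = field_of_view(50.0, 24.4)     # ~27.4 degrees horizontally
v_fov = field_of_view(50.0, 13.725)   # ~15.6 degrees vertically
print(round(h_fov, 1), round(v_fov, 1))
```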
However, even though the camera solve wasn't corrupted by the rolling shutter, the wobble in the plates of course remained... and thus some of the solved 3D positions for the 2D trackers were still sliding in areas where the high frequency rolling shutter appeared. Realising that I would probably have to face this problem at some point, I decided to build a proxy model for the Hallway first...
HALLWAY RECONSTRUCTION
My initial idea was to build one Hallway model that matched the plates in all the shots, so that the scene scale and position within the Hallway would be consistent from shot to shot, which would be a great benefit for the particle simulations that needed to be done later on. This proved to be quite difficult, partly because the green screen Portal stand-in on the wall changed from shot to shot, and because the walls seemed to slant more in one shot than in another.
I started out building a proxy model for shot 14, which had the best view through the Hallway towards the start. Then I copied this model to shots 13 and 16, which have the same viewing direction, changed the scale, position and orientation of the new scene (where needed) to match the model from shot 14 as closely as possible, and finally fine-tuned the geometry in areas where differences between shots would appear... In this way the proxy model could vary a little from shot to shot, but consistency in scene scale, position and orientation would be maintained.
Continuing with the shots looking towards the other end of the Hallway, I again used the proxy model from shot 14 as a base to build on, and applied the same principles to reconstruct matching geometry for the remaining shots.
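The actual matching was done by eye in the 3D package, but the same idea can be expressed as a best-fit similarity transform: given a handful of hand-picked corresponding points (say, the same wall and floor corners on the shot-14 proxy and on a new shot's solve), compute the uniform scale, rotation and translation that maps one onto the other. A minimal sketch using Umeyama's method, with hypothetical point values:

```python
import numpy as np

def similarity_fit(src, dst):
    """Best-fit uniform scale, rotation and translation mapping src onto dst
    (Umeyama's method). Both arrays are Nx3, with rows in correspondence."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    u, d, vt = np.linalg.svd(cov)
    sign = np.eye(3)
    if np.linalg.det(u) * np.linalg.det(vt) < 0:
        sign[2, 2] = -1.0
    rot = u @ sign @ vt
    scale = np.trace(np.diag(d) @ sign) / np.mean(np.sum(src_c**2, axis=1))
    trans = mu_d - scale * rot @ mu_s
    return scale, rot, trans

# Hypothetical corresponding corners: shot-14 proxy (src) vs the same corners
# as they come out of another shot's solve (dst), roughly in metres.
src = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 0.0, 20.0], [0.0, 2.6, 0.0]]
dst = [[0.4, 0.05, 1.0], [1.87, 0.05, 1.0], [1.87, 0.05, 20.6], [0.4, 2.6, 1.0]]
s, R, t = similarity_fit(src, dst)
print(s, t)    # scale factor and offset to bring the new scene in line
```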
3D ROLLING SHUTTER SIMULATION
Once I had a proxy model for the entire Hallway in place, the idea was born to simulate the effect of the high frequency rolling shutter in 3D by deforming the model slightly from frame to frame, to get a better match. Obviously this was a cheat that might help with compositing, but one that should never be used as input for any particle simulations, because those could be influenced by the vibrating walls and ceiling.
An important thing to realize here is that high frequency rolling shutter can be a very local problem. The ceiling might be vibrating up and down while the floor is not, the left wall might be vibrating left and right while the right wall is not, or the right wall might be skewing close to the camera while in the distance it is not. But because in this sequence the camera was always looking along the length of the Hallway, I suspected this image distortion could be simulated by using a lattice deformer.
After some experiments showed that a 3x3x3 lattice around the complete Hallway model was the ideal setup, I was able to find a perfect match for every frame of the sequence by animating individual lattice points in areas where the high frequency rolling shutter appeared:
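The 3D work on my end went through Maya (see the scene delivery notes further down), so a minimal Maya Python sketch of this kind of setup might look like the following; the object name, frame numbers and offset are placeholders, and in practice the lattice points were of course keyed by eye against the plate:

```python
import maya.cmds as cmds

# Wrap the hallway proxy ('hallwayProxy' is a placeholder name) in a 3x3x3 lattice.
ffd, lattice, base = cmds.lattice('hallwayProxy', divisions=(3, 3, 3),
                                  objectCentered=True)

# Key one lattice point where the ceiling wobbles: rest, nudge, back to rest.
# Frame numbers and the 0.04 unit offset are hypothetical.
point = lattice + '.pt[1][2][1]'          # [s][t][u]: top-centre of the lattice

cmds.currentTime(117)
cmds.setKeyframe(point)                   # rest position before the wobble
cmds.currentTime(118)
cmds.move(0, 0.04, 0, point, relative=True)
cmds.setKeyframe(point)
cmds.currentTime(119)
cmds.move(0, -0.04, 0, point, relative=True)   # back to rest
cmds.setKeyframe(point)
```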
PORTAL GUN RECONSTRUCTION
For making believable gun blasts, particle work and contact lighting on the Portal gun, a seamless object track and an accurate reconstruction of the Portal gun were essential.
Normally I would place trackers on the Portal gun and calculate a motion path and point cloud first, before doing any reconstruction, but because of the lack of trackable features on the gun and the fact that in a lot of frames the black parts of the gun almost disappeared into the background, this workflow was bound to fail.
So I started with building a proxy model for the Portal gun based on available reference photos:
The reference didn't get me all the way there though, because the gun in the photos was slightly different from the one in the tracking plates... the barrel was damaged (probably during the shoot) and, as I found out, it could slide back and forth, which also caused the cables to bend.
So to be 100% sure my proxy model for the Portal gun would match, I needed to go back to the tracking plates and fine-tune my first rough version:
Basically this meant that I needed to set up the Portal gun model for rotomation (more on this later), and position and orient it on a number of 'key' frames in a shot, chosen for their clearly different viewing angles on the gun, giving me enough information to fine-tune the proxy model.
Unfortunately, though, this model couldn't be used for all shots, because it turned out that a second 'stunt' version of the gun was used in shots 17 and 18, where she's fighting off the guard. The shape of this 'stunt' version was so different from the 'original' Portal gun that I had to build a separate model for it:
PORTAL GUN ROTOMATION
While an accurate proxy model for the Portal gun would normally be an ideal starting point for a geometry track, here it was bound to fail, just like the regular object track for this sequence... In some shots the black parts of the gun were almost indistinguishable from the background, and in others the gun moved in and out of sight entirely. Altogether this made rotomation the best way to go.
The first thing to do before starting any rotomation is to make sure the pivot point and rotation order are chosen wisely and the scaling is worked out properly. Luckily I got some measurements from set, which gave me a good starting point to set this up (see the sketch after these measurements):
width of hallway = 5' = 1.524 m
height of actress = 5' 3½'' = 1.613 m
length of gun = 27" = 0.686 m
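As a rough illustration of how such measurements translate into a setup, here is a small Maya sketch, assuming a scene working in metres; the node name, pivot position and rotation order are placeholders, just examples of the kind of choices to lock down:

```python
import maya.cmds as cmds

GUN_LENGTH_M = 27 * 0.0254                     # 27" from set = 0.6858 m

# Scale the gun proxy ('portalGun_proxy' is a placeholder) so that its longest
# bounding-box extent matches the measured length.
bbox = cmds.exactWorldBoundingBox('portalGun_proxy')   # [xmin, ymin, zmin, xmax, ymax, zmax]
modelled_length = max(bbox[3] - bbox[0], bbox[4] - bbox[1], bbox[5] - bbox[2])
factor = GUN_LENGTH_M / modelled_length
cmds.scale(factor, factor, factor, 'portalGun_proxy', relative=True)

# Put the pivot somewhere sensible for hand animation (e.g. the grip) and pick
# a rotation order that suits the motion; both values here are placeholders.
cmds.xform('portalGun_proxy', worldSpace=True, pivots=(0.0, 0.0, -0.25))
cmds.xform('portalGun_proxy', rotateOrder='zxy', preserve=True)
```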
Using this information, I scaled, positioned and oriented the model of the Portal gun inside the reconstructed Hallway model on a number of 'key' frames in shot 14. Then, even though everything seemed to look right in frame, I did the same thing in a couple of other shots, to make sure the scaling I found in shot 14 would place the gun in the Hallway correctly in all the other shots of the sequence as well. To illustrate the importance of this step, check out how much influence a proper scaling has on the positioning of the gun, noting that the rotomation of the gun matches perfectly in frame for both scaling values:
Once the scaling for the Portal gun was locked down, the process of rotomation was basically just animating the gun by hand to make it match on every frame, while making sure that the motion path of the gun looked realistic in 3D. For every shot I would start with a correctly positioned and oriented gun on a number of 'key' frames in the shot, and fill in the areas in between from there...
While doing this, it proved to be very important to keep an eye on the animation curves... Working from 'key' frames can sometimes result in 'unnatural' jumps in the animation curves, which might look fine in both 2D and 3D, but can cause strange problems when particle simulations are attached to the gun. The reason is that particle simulations don't just evaluate on whole frames, but also calculate substeps between frames, and those substeps look very different between the two versions of the curves:
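A quick way to see what a simulation actually samples is to query the animated channels at fractional frame times as well as on whole frames. A minimal Maya sketch, where the attribute, frame range and substep size are placeholders:

```python
import maya.cmds as cmds

# Sample the gun's animation at quarter-frame substeps, roughly the way a
# particle solver does, to spot jumps that are invisible on whole frames.
for frame in range(100, 103):
    for sub in (0.0, 0.25, 0.5, 0.75):
        t = frame + sub
        value = cmds.getAttr('portalGun_proxy.translateY', time=t)
        print(t, value)
```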
SCENE DELIVERY
One more thing to mention before showing the final result: while FBX worked fine for getting all this tracking data from Maya to Softimage, Nuke had a bit of trouble importing all the animation curves correctly. Luckily I ran into the very handy FromMaya2Nuke Python script, which automatically prepares a Nuke scene including the animated camera and obj-sequences for all moving geometry, making both my life and the compositors' lives a lot easier...
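For context, an 'obj-sequence' is nothing more than one OBJ file per frame. FromMaya2Nuke automates this, but a minimal sketch of doing it by hand from Maya could look like the following; the path, node name and frame range are placeholders, and the objExport plug-in needs to be loaded:

```python
import maya.cmds as cmds

cmds.loadPlugin('objExport', quiet=True)
cmds.select('portalGun_proxy')              # placeholder node name
start, end = 101, 148                       # hypothetical shot range
for frame in range(start, end + 1):
    cmds.currentTime(frame)
    # Write one OBJ per frame; Nuke can read the numbered files as a sequence.
    cmds.file('/tmp/portalGun.%04d.obj' % frame,
              exportSelected=True, type='OBJexport', force=True,
              options='groups=0;ptgroups=0;materials=0;smoothing=0;normals=1')
```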
FINAL RESULT
Of course I can't finish this post without showing the final result... so here goes, with compliments to Dan Trachtenberg, Danielle Rayne, Jon Chesson, Paul Griswold and everyone else who did a great job on this sequence (recommended viewing in 1080p HD in FULLSCREEN):
Also check out what other team members say about working on this project:
Interview with Dan Trachtenberg - directed and wrote the film
Audio interview with Dan Trachtenberg and Mike Zarin (composer)
Interview with Paul Griswold - did all the particle work on the gun, gun blasts and the portals
Interview with Jonathan Skabla - composited part of the Cell and Roof shots