Sunday, 15 January 2012

Nelson St. Rebuild

Once the shoot had taken place and we got the shot we wanted in the can, it was on to the arduous process of post-production. As the hard work had been put in during pre-production, the tracking process was relatively easy. The techniques I used can be seen in one of my earlier posts, although for this particular shot I actually ended up using the NukeX camera tracker instead of Autodesk Matchmover, as for some reason Matchmover was having a hard time solving. There were a few small issues with exporting the camera track from Nuke to Maya, but luckily I happened upon a neat little Python script that did a good job of exporting it for me. That particular script can be found here: http://www.nukepedia.com/gizmos/python-scripts/import-export/fromnuke2maya-export-nuke-cameras/
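If you'd rather not rely on a third-party script, the core idea is simple enough to sketch with Nuke's Python API: bake the camera's animation out frame by frame to a plain text file and re-key it onto a camera in Maya. The snippet below is just a rough sketch (the camera name and output path are placeholders), not the linked script itself.

# Rough sketch: bake a tracked Nuke camera's animation out to a plain
# text file (frame, translate, rotate, focal) for re-keying in Maya.
# 'Camera1' and the output path are placeholders.
import nuke

cam = nuke.toNode('Camera1')
first = int(nuke.root()['first_frame'].value())
last = int(nuke.root()['last_frame'].value())

with open('/tmp/nelson_st_cam.txt', 'w') as f:
    for frame in range(first, last + 1):
        tx, ty, tz = [cam['translate'].getValueAt(frame, i) for i in range(3)]
        rx, ry, rz = [cam['rotate'].getValueAt(frame, i) for i in range(3)]
        focal = cam['focal'].getValueAt(frame)
        f.write('%d %f %f %f %f %f %f %f\n'
                % (frame, tx, ty, tz, rx, ry, rz, focal))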

So, with the footage tracked and the camera imported into Maya, I set about the rather large job of rebuilding the Nelson Street environment. As outlined in my earlier post about Environmental Rebuild, I did this in order to get realistic reflections and shadows on any geo I placed within the scene.

Using the original footage as a guide, I built the entire area out of polygons and textured them via a combination of camera projection and textures captured during the shoot by my lovely lady Elaine.
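For anyone curious how a camera projection hooks up in Maya, the wiring boils down to a file texture running through a projection node that's linked to the tracked camera. A rough maya.cmds sketch (node names and the plate path are placeholders, not my actual scene) might look like this:

# Rough sketch: project a frame of the plate through the tracked camera
# onto the rebuilt geo's shader. Node names and paths are placeholders.
import maya.cmds as cmds

plate = cmds.shadingNode('file', asTexture=True, name='plate_frame')
cmds.setAttr(plate + '.fileTextureName', '/path/to/plate.0001.jpg', type='string')

proj = cmds.shadingNode('projection', asTexture=True, name='plate_proj')
cmds.setAttr(proj + '.projType', 8)  # 8 = perspective projection
cmds.connectAttr(plate + '.outColor', proj + '.image')

# Link the projection to the imported tracked camera's shape node
cmds.connectAttr('trackedCameraShape.message', proj + '.linkedCamera')

# Drive a simple surface shader with the projection and assign it to the geo
shader = cmds.shadingNode('surfaceShader', asShader=True, name='projection_mat')
cmds.connectAttr(proj + '.outColor', shader + '.outColor')
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name='projection_matSG')
cmds.connectAttr(shader + '.outColor', sg + '.surfaceShader')
cmds.sets('nelson_st_geo', edit=True, forceElement=sg)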


This screen grab shows the environment visible to the camera rebuilt within Maya. The green static cameras on the walkway chart the path of the tracked camera and have been used to project frames from the original footage onto the geometry.


This screen grab is the view from the actual camera track, looking at the rebuilt and matched geometry with a mixture of projections (lo res) and textures (hi res).

Once this environment was finished, it was then a matter of checking the track and the geometry to ensure everything lined up okay. In order to cut down on render time, I added some metallic spheres into the scene and hid all the environmental geometry apart from in the spheres' reflections. This meant that all the computer needed to render was the spheres and the reflections within them. Much less computationally intensive.
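In Maya terms, that trick is just a matter of toggling render stats on the environment's shape nodes. Something along these lines would do it (a sketch that assumes the environment geo is currently selected):

# Rough sketch: hide the environment geo from camera rays but leave it
# visible in reflections, so only the chrome spheres carry the render cost.
# Assumes the environment meshes are selected.
import maya.cmds as cmds

env_shapes = cmds.ls(selection=True, dagObjects=True, shapes=True, type='mesh')
for shape in env_shapes:
    cmds.setAttr(shape + '.primaryVisibility', 0)     # invisible to the camera
    cmds.setAttr(shape + '.visibleInReflections', 1)  # still shows in the spheres
    cmds.setAttr(shape + '.castsShadows', 1)          # keep its shadow contribution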
Once the spheres were rendered it was just a matter of importing the renders into Nuke and comping them back over the original footage.
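The comp itself is only a couple of nodes. Scripted out, it would be something like the following (file paths are placeholders; I built mine in the node graph rather than through Python):

# Rough sketch of the comp: sphere render pass merged 'over' the plate.
# File paths are placeholders.
import nuke

plate = nuke.nodes.Read(file='/path/to/nelson_st_plate.####.dpx')
spheres = nuke.nodes.Read(file='/path/to/sphere_pass.####.exr')

comp = nuke.nodes.Merge2(operation='over', inputs=[plate, spheres])  # B = plate, A = spheres

out = nuke.nodes.Write(file='/path/to/nelson_st_comp.####.dpx')
out.setInput(0, comp)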


The screens were a last-minute addition within Nuke, as I still had the original camera track inside of Nuke. Having the same camera in both programs really saves a lot of time.
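Because the same tracked camera lives in the Nuke script, dropping in a screen is essentially just a Card rendered through that camera with a ScanlineRender. Something along these lines (a sketch with placeholder names and transforms, not my exact node graph):

# Rough sketch: render screen content on a Card through the tracked camera,
# then merge it over the plate. Names, paths and transforms are placeholders.
import nuke

screen_src = nuke.nodes.Read(file='/path/to/screen_content.####.exr')
card = nuke.nodes.Card(inputs=[screen_src])
card['translate'].setValue([1.2, 0.8, -3.5])  # positioned by eye against the geo
card['rotate'].setValue([0.0, 35.0, 0.0])

render = nuke.nodes.ScanlineRender()
render.setInput(1, card)                    # obj/scn input
render.setInput(2, nuke.toNode('Camera1'))  # the tracked camera

comp = nuke.nodes.Merge2(operation='over',
                         inputs=[nuke.toNode('Read1'), render])  # screen over the plate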
All in all I'm pretty happy with how this came out, and especially happy with the track within Nuke. It seems the Nuke tracker is far superior to Matchmover.
