Monday 16 January 2012

CG Elements

With the virtual environment created and the footage shot and tracked, it then came time to populate the scene with CG elements.
The overall look that I am going for with this shot is a kind of fantasy-futurist vision of Bristol. So, naturally, there will be flying cars and large buildings. The composition of the original shot allowed for a number of CG elements to be discreetly placed within the scene. Here is a small example of a few of the elements I am currently working on and how they will fit into the final shot.


The above image is a tower that I modelled a while ago, but for this project there have been a few issues with texturing, which I am currently working through.

The tower is situated amongst other skyscrapers at the end of the street and into the distance.
All of these elements have been lit using the Mental Ray Physical Sun and Sky illumination. This gives a very realistic representation of where the sun is placed and how it illuminates the geometry. All elements are matted, making the final comping job possible.

As promised, a flying car. All elements reflect the environmental geometry, with the environment's primary visibility hidden.

Once all these elements were in place I did a quick playblast to check the track and how it all fit together.

Sunday 15 January 2012

Nelson St. Rebuild

Once the shoot had taken place and we had the shot we wanted in the can, it was on to the arduous process of post-production. As the hard work had been put in during pre-production, the tracking process was relatively easy. The techniques I used can be seen in one of my earlier posts, although for this particular shot I actually ended up using the NukeX camera tracker instead of Autodesk MatchMover. For some reason MatchMover was having a hard time solving. There were a few small issues with exporting the camera track from Nuke to Maya, but luckily I happened upon a neat little Python script that did a good job of exporting it for me. That particular script can be found here: http://www.nukepedia.com/gizmos/python-scripts/import-export/fromnuke2maya-export-nuke-cameras/
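The fiddly part of any Nuke-to-Maya camera export is turning each frame's 4x4 world matrix into the translate and XYZ-order Euler rotation channels that Maya expects. Here's a minimal sketch of that per-frame maths (this is not the actual Nukepedia script, and it assumes the camera carries no scale or shear):

```python
import math

def compose(rx, ry, rz, t):
    """Build a row-major 4x4 world matrix from XYZ Euler angles (radians)
    and a translate. Rotation is applied as R = Rz @ Ry @ Rx, matching
    Maya's default 'xyz' rotate order."""
    cx, sx = math.cos(rx), math.sin(rx)
    cy, sy = math.cos(ry), math.sin(ry)
    cz, sz = math.cos(rz), math.sin(rz)
    return [
        [cz*cy, cz*sy*sx - sz*cx, cz*sy*cx + sz*sx, t[0]],
        [sz*cy, sz*sy*sx + cz*cx, sz*sy*cx - cz*sx, t[1]],
        [-sy,   cy*sx,            cy*cx,            t[2]],
        [0.0,   0.0,              0.0,              1.0],
    ]

def decompose(m):
    """Recover translate and XYZ Euler rotation (degrees) from a 4x4
    world matrix, assuming no scale or shear. This is the core of what
    a Nuke-to-Maya camera exporter has to do for every frame."""
    ry = math.asin(max(-1.0, min(1.0, -m[2][0])))
    rx = math.atan2(m[2][1], m[2][2])
    rz = math.atan2(m[1][0], m[0][0])
    t = (m[0][3], m[1][3], m[2][3])
    return t, tuple(math.degrees(a) for a in (rx, ry, rz))
```

Run per frame and keyframed onto a Maya camera, this round trip is essentially all the export script needs to do (plus matching the focal length and film back).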

So, with the footage tracked and the camera imported into Maya, I set about the rather large job of rebuilding the Nelson Street environment. As outlined in an earlier post about environmental rebuilds, I did this in order to give realistic reflections and shadows to any geo I placed within the scene.

Using the original footage as a guide, I built the entire area out of polygons and textured them via a combination of camera projection and textures captured during the shoot by my lovely lady Elaine.


This screen grab shows the environment visible to the camera rebuilt within Maya. The green static cameras on the walkway chart the path of the tracked camera and have been used to project frames from the original footage onto the geometry.


This screen grab is the view from the actual camera track, looking at the rebuilt and matched geometry with a mixture of projections (lo-res) and textures (hi-res).

Once this environment was finished it was then a matter of checking the track and the geometry to ensure everything lined up okay. In order to cut down on render time, I added some metallic spheres into the scene and hid all the environmental geometry everywhere apart from in the spheres' reflections. This meant that all the computer needed to render was the spheres and the reflections within them. Much less computationally intensive.
Once the spheres were rendered it was just a matter of importing the footage into Nuke and comping it back over the original footage.
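The reflection-only trick works because the renderer still fires mirror rays off the spheres into the hidden geometry. The mirror direction is the standard reflection formula, sketched here in plain Python just to show what the renderer is doing:

```python
def reflect(d, n):
    """Mirror-reflect an incoming direction d about a unit surface
    normal n: r = d - 2(d.n)n. This is the ray the renderer follows
    into the hidden environment when only the spheres are visible
    but the geometry still shows up in their reflections."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))
```

So a ray heading straight down at an upward-facing surface bounces straight back up, and only the spheres themselves ever need shading.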


The screens were a last-minute addition within Nuke, as I still had the original camera track inside of Nuke. Having the same camera in both programs really saves a lot of time.
All in all I'm pretty happy with how this came out, and especially happy with the track within Nuke. Seems the Nuke tracker is far superior to MatchMover.

The Shoot

So, after much pre-production, location scouting and technical investigation, we were finally ready to shoot. For this I called on my good friends, talented technicians and lady friend for a morning of shooting. As I planned to film on a Friday morning, I called up the Bristol Film Office, who advised me that there just happened to be another shoot going on at the exact same time in the exact same place, and that it wouldn't be possible to film then. I kindly thanked the lady, hung up the phone and filmed there anyway.

For the shoot I hired out a number of pieces of equipment from the media centre. The first and most important piece of kit was the camera. New to UWE this year was the Panasonic AF101 film camera. These are some pretty slick cameras, shooting in HD, but the nice part is the use of different lenses.


We had the choice of using either a 14mm pancake lens or a standard 35mm macro lens. The choice of lens is very important in the post-production process, especially when it comes to matchmoving, as excessive lens distortion can be problematic in gaining a smooth track. After some tests we decided to stick with the 35mm macro set to infinity. This allowed us not to worry about focus issues, both on location and in post-production.
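The reason lens distortion hurts the track is easy to see with a one-parameter radial distortion model (a simplified Brown model, with a made-up coefficient): features near the frame edge get displaced far more than those near the centre, so a solver that assumes a pinhole camera sees inconsistent motion.

```python
def distort(x, y, k1):
    """One-parameter radial lens distortion (simplified Brown model).
    (x, y) are normalised image coords with (0, 0) at the lens centre;
    k1 < 0 pulls points inward (barrel, typical of wide lenses),
    k1 > 0 pushes them outward (pincushion)."""
    r2 = x*x + y*y
    f = 1.0 + k1 * r2
    return x * f, y * f
```

With k1 = -0.2 (barrel), a point halfway to the edge slides in by five per cent while the centre doesn't move at all, which is exactly the kind of inconsistency that throws a pinhole solve.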

As the location test shots had demonstrated, a handheld camera was proving very difficult to track. There were a number of ways around this, including a ghetto tripod steadicam setup, however as we had a dolly at our disposal, we decided to go for the Key West dolly (no track).


The dolly proved very good at attaining smooth motion, although some stabilization will need to be added to the final shot to de-jitter some of the pan.

Ollie, Rob and myself setting up the dolly and camera.
Once all the equipment was set up we did a number of takes, testing out lenses and varying heights of the camera. Every take that was shot was logged. Additional measurements were taken, such as camera height from the ground, the lens used, and measurements of objects and architecture on location, to input into the matchmoving software in order to recreate the environment.

After a number of tests, this is the final shot that I settled on. This shot, although far from perfect, had smooth, fluid enough motion to track and good height off the ground to see the geographical features.


All in all the shoot went very smoothly and we managed to get a good enough shot to manipulate in post. I feel that the location recces and pre-production test shots were immensely valuable in avoiding any major mistakes that might have called for a re-shoot.

Tuesday 8 November 2011

Camera Projection

Yesterday was supposed to be the day when this shot came together; it was supposed to be a shining moment, and then the track didn't work and blew my whole operation. I ended up doing a blog entry about reflections and shadows and some other hot air to try and cover up the fact my track didn't work.

Well, today, even though I'm ashamed of it, I'm going to post the track that looks like arse, but I'm also going to post a little bit about camera projection, which in this case has been a bit of a get-out-of-jail-free card.

As I said in my last post, there were a number of benefits to recreating the physical environment within Maya, accurate lighting and shadows to name a few, but the ability to generate a believable camera projection in place of texturing every surface is another.


This image, which I'm reusing because I'm too lazy to render another shot out, shows the camera projection at work. In my last post I spoke about how camera projection works to generate reflections onto the ball; this time, however, I am using the projection to actually texture the geometry.


This Maya view window animation gives a graphic representation of the geometry, the renderable camera and the projection camera.


This is my track that unfortunately didn't work out, which is always a shame. In this particular render, I hid the virtual geometry and just rendered out the ball with its reflections (last blog entry) and then added the actual footage back in at the comp stage. This helped cut down render times, although as you can see, because the ball's movement didn't match the footage it looks horrible, and there is next to nothing that can be done to fix it apart from retracking the footage.

So, what I decided to do was re-render the scene, only this time I would render only the virtual environment, with one single frame of the footage projected onto it. This image, to be precise:


As the original footage now appears nowhere in this sequence, there can be no conflict between track and footage. The following shot is the virtual geometry with that one single frame being projected onto it by a locked-off projection camera, and the match-moved camera moving through the scene.

As you can see there are a few anomalies towards the end as the renderable camera moves out of the range of the projected frame, although if I had the inclination to fix it, this could be rectified with multiple projection cameras throughout the scene, stitching together the whole footage from 4 or 5 frames. Also notice that the ball now fits perfectly in the sequence, with no slippage at all: because there is no original footage, it is 100% virtual.

All in all I'm pretty impressed with the way it came out and it just goes to show that even when things fuck up badly, there's still a positive to come out of it!

Saturday 5 November 2011

Shadows and Environmental Reflections

So, after creating a virtual environment from a live action plate I thought I'd take a look at what it was possible to achieve within this environment.
Unfortunately for this exercise, when I rendered the scene out it soon became obvious that the 3D track was not up to par, and as such the geometry I placed into the scene slipped and jumped about the place, thus making the whole shot look like pure arse.
This was a bit of a shame because all the work up to then looked promising. This is, however, another good lesson learnt, and it just reinforces the practice of getting things right at the input, or initial, stage of production. That age-old adage, "you can't polish a dog turd", still holds true today and is something I will be especially mindful of when it comes to producing the money shots.

Just because the sequence looked rubbish doesn't mean that the still frames looked bad, so here's a little look at what I did.




Back at the old location again, I decided in true 3D CG style to put a sphere right in the middle of the scene. Why, I hear you ask? Because I can. But this isn't just any sphere, this is a sphere with a true-to-life shadow and environmental reflections!

If you saw my previous blog entry on 3D environments, you'll notice that from this match-moved footage I recreated the environment within Maya. As this environment very closely matched the actual physical space, and the camera move was (roughly) taken from the physical camera, it allowed me to use a technique known as camera projection to map the reflections onto the sphere. Here's how it works:


This mash of a rendering above is the geometry of the match-modelled space. However, instead of texturing this space with UV co-ordinates, I duplicated the 3D match-moved camera, movement, translation and all, and used this second identical camera as a sort of film projector, if you will. This projection camera projected the original footage that I filmed back onto the blank 3D geometry (which you can see above), thus creating perfectly matched reflections for the sphere to pick up on. Once this was achieved it was just a matter of rendering all the elements out separately for comp.
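Under the hood a projection camera is just a pinhole in reverse: every point on the geometry is pushed through the projector's lens to find which pixel of the plate should colour it. Here's a rough sketch of that lookup, with illustrative focal-length and film-back numbers rather than the values from the real shoot:

```python
def project_uv(point, focal_mm=35.0, aperture_mm=36.0, aspect=16.0 / 9.0):
    """Project a point given in projector-camera space onto the image
    plane and return (u, v) in 0-1 texture space. The camera looks down
    -Z; focal length and horizontal film back (aperture) are in mm,
    which is the same pinhole maths a projection camera uses to drive
    a texture. All numbers here are illustrative assumptions."""
    x, y, z = point
    if z >= 0:
        return None  # behind the projector: nothing to project
    sx = focal_mm * x / (-z * aperture_mm)             # -0.5..0.5 across frame
    sy = focal_mm * y / (-z * aperture_mm / aspect)    # vertical film back = h/aspect
    return sx + 0.5, sy + 0.5
```

A point dead ahead of the projector lands at (0.5, 0.5), the centre of the plate, and points off to the side pick up pixels towards the frame edge, which is exactly how the footage ends up wrapped over the blank geometry.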
This is done via Maya's render layers, where it is possible to turn off the primary visibility of geometry at render time but still have it picked up in reflections.

 
This image is an example of that: all the other geometry has had its primary visibility disabled but is still casting reflections.
The final element in this shot was the shadow, which again was rendered in the same way. Using the 3D geo to help with the shadow, I rendered out a simple shadow pass on the alpha channel for use in comp.
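Comping a shadow pass held in the alpha channel amounts to darkening the background plate wherever the pass has coverage. A minimal per-pixel sketch, with an assumed density knob for grading the shadow strength (the knob name and value are mine, not a Maya or Nuke setting):

```python
def comp_shadow(bg_rgb, shadow_alpha, density=0.6):
    """Darken a background pixel by a shadow pass stored in alpha.
    shadow_alpha is 0 where there is no shadow and 1 at full coverage;
    density is an assumed grading control for how dark the shadow gets."""
    k = 1.0 - shadow_alpha * density
    return tuple(c * k for c in bg_rgb)
```

Where the pass is empty the plate passes through untouched, and at full coverage the plate is simply scaled down, which is why a shadow-only alpha pass is all the comp needs.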


So there you have it: all the elements of placing 3D geo into a match-moved scene. There were a few final touches, such as a colour correction and grade and a defocus on the ball to fit it into the scene, but I'll cover them properly down the line. And maybe next time I can get the track to work so there's some proper footage to look at!

Friday 4 November 2011

3D Environments and plate stabilization.

Having looked at pretty basic matchmoving in previous posts, I thought I'd take it a step further today and try to match-move some footage and then recreate the environment within Maya to match the camera.

After much trial and error with footage, also outlined in previous posts, I managed to find a piece of footage that was appropriate for this exercise. The main issue I kept coming up against was the jerkiness of the handheld footage; however, on this take I managed to feed the footage into Nuke first, track it and smooth it out using Nuke's de-jitter transform option. I might also add that the de-jitter option in Nuke is something that AE lacks, and is one of the most useful tools out there for film makers.

Above footage before the De-Jitter tracker was added

Footage after De-Jitter
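De-jitter boils down to smoothing the tracked 2D curve and then moving each frame by the difference between the raw and smoothed positions. A toy box-filter version of that smoothing step (emphatically not Nuke's actual filter, just the idea):

```python
def smooth_track(points, radius=2):
    """Box-filter a 2D tracker curve. For each frame, average the
    positions within +/- radius frames; the per-frame offset
    (smoothed - raw) is what a de-jitter transform would apply
    to the plate to kill high-frequency shake."""
    out = []
    n = len(points)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out
```

A perfectly steady track comes out unchanged, while a jittery one gets pulled towards its local average, which is why the smoothed plate solves so much better.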

Once the footage was stabilized I exported it into Autodesk MatchMover once again and tracked it, creating a bunch of 3D track points. With the footage stabilized and less movement in the X and Y axes than in previous shots, the solve worked out very well, although there were a few focal aberrations due to the fact that it was my girlfriend's handheld stills camera on autofocus...

Once the shot had been solved I imported it into Maya and got on with digitally recreating the environment. As I didn't have my tape measure handy when I shot this footage, a lot of the geometry placement came down to guesswork and trial and error. This was a good lesson for next time, however: it is vitally important to take measurements of your environment!


The above frame is a snapshot of the very basic scene recreation. There is no detail as such, and it is all basic geometry, but with the match-moved camera and the now-virtual environment, this was the result of the render.


As you can see, the movement exactly matches that of the physical camera and the environment more or less lines up with its physical counterpart (apart from a few slips here and there!)
The real power of this technique is not in recreating an already-existing physical space, but in providing a great frame of reference for adding 3D elements to the original footage. With the ground plane and walls added to the Maya scene, this ultimately helps to achieve realistic shadows, lighting and reflections for anything else I may want to add to the scene. Although this one is very rough, it is very promising for things to come!

Wednesday 2 November 2011

Motion Graphics with After Effects

Yesterday I had a brief look at motion graphics and basic compositing inside of Nuke, and how all the elements of a mock advert fit together. Today I'm going to be looking at something similar within After Effects.

As a general rule of thumb, After Effects is sneered at within the compositing community at large as something of a kiddies' toy application, or a kind of "my first compositing" package, and to be honest it's not without some foundation. However, this doesn't mean After Effects doesn't have its place within the creative industry, because as a motion graphics application, AE is second to none.

The tools provided with a program like Nuke are perfectly suited to the high-end compositing arena, more specifically working as a finishing package for assets created elsewhere (although with the addition of elements like its 3D particle system, that is beginning to change). After Effects, on the other hand, is much more suited to creating these assets and compositing them in an in-the-box style of workflow.

The first clip that I made for this exercise was a pretty straight piece of motion graphics, serving as another mock advert for my impending street scene. The matte for this particular shot was outlined in my Green Screen and Chroma Key blog entry a few days ago, when looking at the technique of hard and soft matting.



This video differs from the Nuke motion graphics advert in the sense that all of the assets contained in this advert were created within After Effects itself, without having to rely on Maya for anything. This is where the power of After Effects lies. * I should point out at this point that the After Effects workflow is seriously compromised by its instability, which is something I have had a lot of issues with. This is an area where Nuke wins hands down, being one of the most stable applications I've used.


Anyway, after attaining a good key using Keylight (a cross-application keyer that works the same in AE as in Nuke), I then set out to create the sequence's look.
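A screen-difference key like Keylight starts from how much greener a pixel is than its other two channels. This toy matte (emphatically not Keylight's real algorithm, which adds screen colour, gain and despill controls) shows the core idea:

```python
def green_key(r, g, b):
    """Toy screen-difference matte for a green screen. The amount by
    which green exceeds the larger of red and blue is treated as
    'screen' (transparent); the returned matte is 1.0 for solid
    foreground and 0.0 for pure screen."""
    screen = max(0.0, g - max(r, b))
    return max(0.0, min(1.0, 1.0 - screen))
```

Pure screen green keys fully out, while neutral greys and skin tones, where green never dominates, stay solid, which is why a well-lit screen pulls so cleanly.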

As the content of this particular shot oozed cheese, I wanted to create a style of advert that matched Adam's dynamic performance. When I think about cheesy adverts, I generally can't go past Japanese advertising. I love it. My general perception of these adverts is that they are short and to the point, with lots of bright colours.

As a point of reference, I was influenced by a recent music video by Mark Ronson. This clip is retro to the max, and it is this old-school retro visual style that I tried to emulate.


With this in mind, I first set out to create some very simple animating vector graphics. I wanted them to be bright, colourful and, above all, simple, sticking with this Japanese-inspired theme. This I did with three simple checkerboard layers, stretched out to create bars and animated over one another. The fonts were then created from a free download of Chinese font packs, again animated over simple, bright vector graphics.


This was all relatively easy, and it all looked pretty good once it was done. After the animating part was finished I then turned my attention to the grade, as this was what would sell the shot. Ungraded, this shot looked terrible. The lighting was one of the hardest things to contend with, as it is so stark, and the lack of any definition made it hard to get a professional look.

The first thing I did was to create an adjustment layer that would affect all elements of the shot equally, to give it some coherency. With this adjustment layer I reduced a lot of the red in his face and bumped up the contrast to give it a filmic look. On this layer I also added a glow to make the whites pop; this also served as a light-wrap-style effect that would tie the FG and BG elements together.
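The adjustment-layer grade can be sketched as a red pull followed by a contrast push about a mid-grey pivot. All the numbers below are illustrative assumptions, not the settings actually used:

```python
def grade(rgb, contrast=1.2, red_gain=0.9, pivot=0.435):
    """Simple grade: scale down the red channel, then push contrast
    about a mid-grey pivot, clamping to 0-1. contrast, red_gain and
    pivot are assumed example values, not real AE settings."""
    r, g, b = rgb
    r *= red_gain
    out = tuple((c - pivot) * contrast + pivot for c in (r, g, b))
    return tuple(max(0.0, min(1.0, c)) for c in out)
```

Values at the pivot are untouched, darks get darker and brights brighter, which is the "filmic" contrast bump, while the red gain quietly pulls the skin tones back.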


Thus creating the final product that is fit for Japanese television.