Tuesday 8 November 2011

Camera Projection

Yesterday was supposed to be the day when this shot came together, it was supposed to be a shining moment, and then the track didn't work and blew my whole operation. I ended up doing a blog entry about reflections and shadows and some other hot air to try and cover up the fact that my track didn't work.

Well, today, even though I'm ashamed of it, I'm going to post the track that looks like arse, but I'm also going to post a little bit about camera projection, which in this case has been a bit of a get-out-of-jail-free card.

As I said in my last post, there were a number of benefits to recreating the physical environment within Maya, accurate lighting and shadows to name a couple, but the ability to generate a believable camera projection in place of texturing every surface is another.


This image, which I'm reusing because I'm too lazy to render out another shot, shows the camera projection at work. In my last post I spoke about how camera projection works to generate reflections on the ball; this time, however, I am using the projection to actually texture the geometry.


This Maya view window animation gives a graphic representation of the geometry, the renderable camera and the projection camera.


This is my track that unfortunately didn't work out, which is always a shame. In this particular render, I hid the virtual geometry and just rendered out the ball with its reflections (last blog entry), then added the actual footage back in at the comp stage. This helped cut down render times, although, as you can see, because the ball's movement didn't match the footage it looks horrible, and there is next to nothing that can be done to fix it apart from re-tracking the footage.

So, what I decided to do was re-render the scene, only this time I would render only the virtual environment with a single frame of the footage projected onto it. This image to be precise:


As the original footage now appears nowhere in this sequence, there can be no conflict between track and footage. The following shot is the virtual geometry with that one single frame being projected onto it by a locked-off projection camera, and the match-moved camera moving through the scene.

As you can see there are a few anomalies towards the end as the renderable camera moves out of the range of the projected frame, although if I had the inclination to fix it, this could be rectified with multiple projection cameras throughout the scene, stitching the whole environment together from 4 or 5 frames of the footage. Also notice that the ball now fits perfectly in the sequence with no slippage at all, due to there being no original footage; it is 100% virtual.

All in all I'm pretty impressed with the way it came out and it just goes to show that even when things fuck up badly, there's still a positive to come out of it!

Saturday 5 November 2011

Shadows and Environmental Reflections

So, after creating a virtual environment from a live action plate I thought I'd take a look at what it was possible to achieve within this environment.
Unfortunately, when I rendered the scene out for this exercise it soon became obvious that the 3D track was not up to par, and as such the geometry I placed into the scene slipped and jumped about the place, making the whole shot look like pure arse.
This was a bit of a shame because all the work up to then looked promising. It is, however, another good lesson to learn and just reinforces the practice of getting things right at the input, or initial, stage of production. That age-old adage, "you can't polish a dog turd", still holds true today and is something I will be especially mindful of when it comes to producing the money shots.

Still, just because the sequence looked rubbish doesn't mean that the still frames looked bad, so here's a little look at what I did.




Back at the old location again, I decided in true 3D CG style to put a sphere right in the middle of the scene. Why, I hear you ask? Because I can. But this isn't just any sphere, this is a sphere with a true-to-life shadow and environmental reflections!

If you saw my previous blog entry on 3D environments, you'll notice that I recreated this environment within Maya from the match-moved footage. As this environment very closely matched the actual physical space, and the camera move was (roughly) taken from the physical camera, it allowed me to use a technique known as camera projection to map the reflections onto the sphere. Here's how it works:


This mash of a rendering above is the geometry of the match-modelled space. However, instead of texturing this space with UV co-ordinates, I duplicated the 3D match-moved camera, movement, translation and all, and used this second identical camera as a sort of film projector, if you will. This projection camera projected the original footage that I filmed back onto the blank 3D geometry (which you can see above), thus creating perfectly matched reflections for the sphere to pick up on. Once this was achieved it was just a matter of rendering all the elements out separately for comp.
This is done via Maya's render layers, where it is possible at render time to turn off the primary visibility of geometry but still have it picked up in reflections.
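For the curious, the maths behind a projection camera is just a pinhole camera run in reverse: every point on the geometry gets pushed back through the projector's frustum to find which pixel of the filmed frame lands on it. Here's a minimal Python sketch of that idea, simplified to a camera looking straight down -Z; the focal length and film back values are made up for illustration, not taken from my actual camera or Maya's internals:

```python
def project_to_uv(point, cam_pos, focal_mm, h_aperture_mm, aspect):
    """Map a 3D point (world space) through a pinhole camera sitting at
    cam_pos and looking straight down -Z, returning (u, v) in 0-1 image
    space, i.e. which bit of the projected frame lands on that point."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:
        return None  # behind the projector, nothing lands here
    # Perspective divide: the further away the point, the smaller it films.
    sx = focal_mm * x / -z
    sy = focal_mm * y / -z
    # Normalise against the film back (aperture) to get 0-1 UVs.
    u = sx / h_aperture_mm + 0.5
    v = sy / (h_aperture_mm / aspect) + 0.5
    return (u, v)

# A point 10 units in front of the projector, slightly right of centre.
uv = project_to_uv((2.0, 0.0, -10.0), (0.0, 0.0, 0.0), 35.0, 36.0, 16 / 9.0)
print(uv)  # → roughly (0.694, 0.5)
```

Because the projection camera is an exact duplicate of the match-moved camera, every projected pixel lines up with where it was filmed from, which is why the texture sits so convincingly on blank geometry.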

 
This image is an example of that: all the other geometry has had its primary visibility disabled but is still casting reflections.
The final element in this shot was the shadow, which again was rendered in the same way. Using the 3D geo to help with the shadow, I rendered out a simple shadow pass on the alpha channel for use in comp.


So there you have it. All the elements of placing 3D geo into a match-moved scene. There were a few final touches, such as a colour correction and grade and a defocus on the ball to fit it into the scene, but I'll cover those properly down the line. And maybe next time I can get a track to work so there's some proper footage to look at!

Friday 4 November 2011

3D Environments and plate stabilization.

Having looked at pretty basic match moving in previous posts, I thought I'd take it a step further today and try to match move some footage and then recreate the environment within Maya to match the camera.

After much trial and error with footage, also outlined in previous posts, I managed to find a piece of footage that was appropriate for this exercise. The main issue I was coming up against was the jerkiness of the handheld footage; however, on this take I managed to feed the footage into Nuke first, track it and smooth it out using Nuke's de-jitter transform option. I might also add that the de-jitter option in Nuke is something that AE lacks and is one of the most useful plugs out there for filmmakers.

Above footage before the De-Jitter tracker was added

Footage after De-Jitter
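For anyone curious what a de-jitter actually does, the rough idea is: track a feature, smooth its path over time, then shift each frame by the difference between its jittery position and the smoothed one, so the high-frequency shake goes but the overall camera move stays. A crude Python sketch of that idea, using a simple moving average (this is the concept only, not Nuke's actual algorithm):

```python
def dejitter_offsets(track, window=5):
    """Given a per-frame 2D track of one feature, return the translation to
    apply to each frame so high-frequency jitter is removed while the
    overall camera move is kept. Smoothing is a centred moving average."""
    n = len(track)
    half = window // 2
    offsets = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        sx = sum(p[0] for p in track[lo:hi]) / (hi - lo)
        sy = sum(p[1] for p in track[lo:hi]) / (hi - lo)
        # Move each frame from its jittery position onto the smoothed path.
        offsets.append((sx - track[i][0], sy - track[i][1]))
    return offsets

# A steady left-to-right pan with alternating half-pixel handheld jitter.
track = [(float(i) + (0.5 if i % 2 else -0.5), 0.0) for i in range(9)]
offsets = dejitter_offsets(track)
```

Applying those offsets pulls each frame back onto the smooth pan, which is exactly what makes the footage trackable afterwards.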

Once the footage was stabilized I exported it into Autodesk MatchMover once again and tracked the footage, creating a bunch of 3D track points. With the footage stabilized and less movement in the X and Y axes than in previous shots, the solve worked out very well, although there were a few focal aberrations due to the fact it was my girlfriend's handheld stills camera on auto focus...

Once the shot had been solved I imported it into Maya and got on with digitally recreating the environment. As I didn't have my tape measure handy when I shot this footage, a lot of the geometry placement came down to guesswork and trial and error. This was a good lesson for next time, however: it is vitally important to take measurements of your environment!


The above frame is a snapshot of the very basic scene recreation. There is no detail as such, and it is all basic geometry, but with the match-moved camera and the now virtual environment, this was the result of the render.


As you can see, the movement exactly matches that of the physical camera and the environment more or less lines up with its physical counterpart (apart from a few slips here and there!)
The real power of this technique is not in recreating an already existing physical space, but in providing a great frame of reference for adding 3D elements to the original footage. With the ground plane and walls added to the Maya scene, it ultimately helps to achieve realistic shadows, lighting and reflections for anything else I may want to add to this scene. Although this one is very rough, it is very promising for things to come!

Wednesday 2 November 2011

Motion Graphics with After Effects

Yesterday I had a brief look at motion graphics and basic compositing inside of Nuke and how all these elements of a mock advert fit together. Today I'm going to be looking at something similar within After Effects.

As a general rule of thumb, After Effects is sneered at within the compositing community at large as something of a kiddies' toy application, or a kind of "my first compositing" package, and to be honest that's not without some foundation. However, this doesn't mean After Effects doesn't have its place within the creative industry, because as a motion graphics application, AE is second to none.

The tools provided with a program like Nuke are perfectly suited to the high-end compositing arena, more specifically working as a finishing package for assets created elsewhere (although with the addition of elements like its 3D particle system, that is beginning to change). After Effects, on the other hand, is much more suited to creating those assets and compositing them in an in-the-box style of workflow.

The first clip that I made for this exercise was a pretty straight piece of motion graphics, serving as another mock advert for my impending street scene. The matte for this particular shot was outlined in my Green Screen and Chroma Key blog entry a few days ago when looking at the technique of hard and soft matting.



This video differs from the Nuke motion graphics advert in the sense that all of the assets contained in it were created within After Effects itself, without having to rely on Maya for anything. This is where the power of After Effects lies. I should point out at this point that the workflow of After Effects is seriously compromised by its instability, something I have had a lot of issues with. This is an area where Nuke wins hands down, being one of the most stable applications I've used.


Anyway, after attaining a good key using Keylight (a cross-application keyer that works the same in AE as in Nuke), I then set out to create the sequence's look.

As the content of this particular shot oozed cheese, I wanted to create a style of advert that matched Adam's dynamic performance. When I think about cheesy adverts, generally I can't go past Japanese advertising. I love it. My general perception of these adverts is short and to the point, with lots of bright colours.

As a point of reference I was influenced by a recent film clip by Mark Ronson. This clip is retro to the max, and it is this old-school retro visual style that I tried to emulate.


With this in mind, I first set out to create some very simple animating vector graphics. I wanted them to be bright, colourful and, above all, simple, sticking with the Japanese-inspired theme. This I did with 3 simple checkerboard layers stretched out to create bars and animated over one another. The fonts were then created from a free download of Chinese font packs, again animated over simple, bright vector graphics.


This was all relatively easy and it all looked pretty good once it was done. After the animating part was finished I then turned my attention to the grade, as this was what would sell the shot. Ungraded, this shot looked terrible. The lighting was one of the hardest things to contend with, as it is so stark, and the lack of any definition made it hard to get a professional look.

The first thing I did was create an adjustment layer that would affect all elements of the shot equally, to give it some coherency. With this adjustment layer I reduced a lot of the red in his face and bumped up the contrast to give it a filmic look. On this layer I also added a glow to make the whites pop; this also served as a light-wrap-style effect to tie the FG and BG elements together.
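As a rough illustration of what that adjustment layer does per pixel, here's a tiny Python sketch; the red gain, contrast and pivot values are invented for the example, not the actual settings I dialled in:

```python
def grade(pixel, red_gain=0.85, contrast=1.2, pivot=0.5):
    """Apply the two corrections described above to one RGB pixel
    (0-1 floats): pull the red channel down, then push contrast
    around a mid-grey pivot, clamping to legal values."""
    r, g, b = pixel
    r *= red_gain  # take some of the red out of the face
    # Contrast: values above the pivot get brighter, below get darker.
    return tuple(min(1.0, max(0.0, (c - pivot) * contrast + pivot))
                 for c in (r, g, b))

# A slightly red, washed-out skin tone before and after the grade.
print(grade((0.8, 0.5, 0.4)))
```

The point of doing this on one shared adjustment layer is exactly the coherency mentioned above: every element picks up the same correction, so nothing looks pasted in.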


Thus creating the final product that is fit for Japanese television.

Tuesday 1 November 2011

Motion Graphics with Nuke

Last time around I was looking at different techniques of Chroma keying, and all for very good reason, because today, I'm going to be showing you what I did with those lovely keyed sequences of my highly talented crack(ed) acting squad.

First up we'll look at Nesbeth. As you may recall, when we last left him he was digging the smooth sounds in his new headphones while I was trying to nail a key on the reflective objects within the scene. As I mentioned before, this didn't turn out to be a problem, because for this particular ad I used the key to artistic effect.


As you can see, the final look of Matt in this ad is simply the alpha channel inverted. This is a very simple technique to achieve in Nuke. When using Keylight to key a green screen, this is what you will get as an alpha channel. It is simply a matter of using the shuffle node, a handy little node that allows you to "shuffle" the colour channels of any given image around. In this case the alpha was inverted and then shuffled into the R, G & B channels respectively, giving this as a final result.
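For anyone who wants the logic spelled out, here's what that shuffle setup does, sketched per pixel in Python (Nuke of course operates on whole channels at once, not pixel by pixel like this):

```python
def inverted_alpha_to_rgb(pixel):
    """Invert the alpha of an RGBA pixel (0-1 floats) and shuffle that
    inverted alpha into the R, G and B channels, as per the node setup."""
    r, g, b, a = pixel
    inv = 1.0 - a
    return (inv, inv, inv, inv)

# A solid (alpha 1) pixel goes black; a keyed-out (alpha 0) pixel goes white.
print(inverted_alpha_to_rgb((0.2, 0.8, 0.1, 1.0)))  # → (0.0, 0.0, 0.0, 0.0)
```

Which is why the subject reads as a black silhouette against white: anywhere the key was solid goes dark, and anywhere the green screen was keyed out goes light.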

Once the overall visual style was accomplished (and it is based on those Apple iPod ads) it was just a matter of finessing the final product. Again in true Apple style, I added a particle system which I created with Trapcode Particular and controlled via a few simple expressions written into the particle emitter's transform attributes (where it moves in 3D space), telling it to move around in a random fashion (random(x/2)/5, if you really want to know).
Particle system on its own screened over the existing keyed footage.
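If you're wondering how a random() expression doesn't make every render different, the trick is that it is seeded per property and frame, so the same frame always produces the same value. A Python stand-in for that behaviour (the seeding scheme and the divide-by-five scaling are mine, echoing the random(x/2)/5 idea rather than AE's actual internals):

```python
import random

def emitter_offset(frame, axis_seed, spread):
    """Deterministic pseudo-random offset for one transform axis at a
    given frame: the same frame and axis always give the same value, so
    the motion looks random but is repeatable between renders."""
    rng = random.Random(hash((frame, axis_seed)))
    return rng.uniform(0.0, spread / 2.0) / 5.0

# Asking twice for the same frame gives the identical offset.
a = emitter_offset(12, 1, 100.0)
b = emitter_offset(12, 1, 100.0)
print(a == b)  # → True
```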

The next element to add was the headphones. This was pretty simple, as I just made a loose roto shape and essentially cut them out to separate them from the rest of the footage. By doing this I was able to give the particles and the headphones the same colour, and again control their changing colour with an expression written into a hueshift node, this time the expression being t*1, or time*1. This makes the hue of the affected element equal the time in the timeline, and thus it will change as the time does.
Rotoscoped headphones enabled separation from the rest of the scene.
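To make the hue-equals-time idea concrete, here's a little Python sketch using a 0-1 hue wheel; Nuke's hueshift works in its own units, so treat this as the concept rather than the node itself:

```python
import colorsys

def hue_at_time(base_rgb, t):
    """Shift the hue of an RGB colour (0-1 floats) by t turns of the hue
    wheel, wrapping around, like a hueshift driven by the timeline."""
    h, l, s = colorsys.rgb_to_hls(*base_rgb)
    return colorsys.hls_to_rgb((h + t) % 1.0, l, s)

# Pure red shifted half way round the wheel comes out as (near enough) cyan.
print(hue_at_time((1.0, 0.0, 0.0), 0.5))
```

Because both the particles and the roto'd headphones read the same expression, they cycle through the colours in perfect sync without any keyframing.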

That was pretty much it for visual elements. This style of advert is very clean and would have easily become over-cluttered, so I thought keeping it simple was the best plan of action. The only other element added was the text. I wanted to keep this in line with the particle theme, and I was also looking for an excuse to have a play with some fluid simulations within Maya. I soon figured out that it was actually pretty easy to take some text with an alpha channel and plug it into the fluid's density, making the text the starting point of the fluid.


Once this was rendered from Maya, it was inverted in Nuke to fit the rest of the scene, and a Min blend node was used to remove the white and keep the black (the opposite of a normal alpha channel).
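The Min blend really is as simple as it sounds, a per-channel minimum, which is why the black text survives and the white background drops out. Sketched per pixel in Python:

```python
def min_blend(a_pixel, b_pixel):
    """Per-channel minimum of two RGB pixels, like a Min merge: whatever
    is darker wins, so white areas in one input reveal the other input."""
    return tuple(min(a, b) for a, b in zip(a_pixel, b_pixel))

# The white background loses to the plate underneath; black text always wins.
print(min_blend((1.0, 1.0, 1.0), (0.3, 0.6, 0.9)))  # → (0.3, 0.6, 0.9)
```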
The final script in Nuke looked a little like this...Exactly like this actually.

This image represents the node-based system that Nuke uses as its main interface. At the top of the tree lies the main green screen footage, with the 2 nodes below it being the keyer and the shuffle node that create the inverted alpha look (B&W). To the right of that is the headphone roto branch, with the hueshift node (shifting colour) and 2 glows to light them up; these 2 paths were then merged together.

This is the basis of how the Nuke interface works and once you wrap your head around it, it is a lot more streamlined and refined than working in layers.

Anyway, that's enough of that for the moment. Next time I'll be having a look at working with motion graphics in After Effects.