Blending CG & Camera Projection
At first, I thought I would create an alpha to use as a matte in Nuke, but instead found that an animated Bezier with a feathered edge gave much more control over how much of the edge I wanted to remove on specific frames. An alpha would have been unnecessary work.
I connected the Bezier to the merge node of the jaw render and the footage – that way all nodes from the jaw were affected, giving me a blend between them. Similarly, I used Beziers to lighten up certain parts of the face, so I could blend the tones of the mesh into the real footage without destroying the blended edges.
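The blend the feathered Bezier gives at the merge boils down to a per-pixel linear mix, where the matte value ramps from 1 to 0 across the feather. Here is a toy sketch of that idea (my own simplified numbers, not Nuke's internals):

```python
# Toy sketch of a feathered-matte blend: the matte ramps from 1 to 0
# across the feather, so the CG jaw dissolves into the plate instead
# of cutting off at a hard edge.

def feather_blend(cg_row, plate_row, matte_row):
    """Per-pixel linear mix; matte 1.0 = CG render, 0.0 = footage."""
    return [c * m + p * (1.0 - m)
            for c, p, m in zip(cg_row, plate_row, matte_row)]

cg_row    = [0.8, 0.8, 0.8, 0.8, 0.8]    # rendered jaw pixels (made up)
plate_row = [0.2, 0.2, 0.2, 0.2, 0.2]    # original footage pixels
matte_row = [1.0, 0.75, 0.5, 0.25, 0.0]  # feathered edge ramp

# values fade smoothly from the CG level to the plate level
print(feather_blend(cg_row, plate_row, matte_row))
```

Widening the feather in the roto shape simply stretches that ramp over more pixels, which is why it gives per-frame control that a baked alpha would not.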
I had a hard time getting the camera projection to work, though the real problem was actually time, which I did not have. I used a lot of tutorial tips for reference when creating the setup, and tried many different angles. Finally I got the projection to work, and it looked excellent! …Until my actor turned his head. The rotation obviously did not come through properly from the camera track. I tried using points on his face that would not move or be involved in any facial expressions, and the trackers seemed to move correctly. So the problem might have been that I didn't set the right preference values for the camera in the tracker, and I might have missed a rotation option somewhere. But since I didn't have time to find a solution, I had to use an easier method for now. I will, however, look into this again after the presentations – it would be a shame to have gotten this far without using it! I can truly recommend camera projection for painting/putting effects onto your footage (when it works :)) – it saves time, is more efficient, and makes it easier to see where on the footage/UV you're making changes.
For the projection, I first created a scene connected to the camera track, using the original footage as the source.
I added a Project3D node and connected it to the node I wanted the camera to project. I brought my imported base mesh into the 3D view and aligned it to the projection using a point cloud. Since I had a base mesh built to fit my actor, it was an easy setup. I connected everything to a ScanlineRender, changed the projection mode to UV, and so had a frame with UV reference over the footage that I could paint on in Photoshop.
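Under the hood, this step pushes each mesh point through the tracked camera to find which footage pixel lands on it, then writes that colour at the point's UV coordinate instead of its screen position. The projection half of that can be sketched with basic pinhole-camera math (my own simplified formulas and default values, not Nuke's code):

```python
# Toy sketch of the projection math behind a Project3D setup: a
# camera-space point is projected to pixel coordinates on the plate.
# Rendering in UV mode would then store that plate colour at the
# point's UV position, giving a paintable UV-reference frame.

def project_to_screen(point, focal=50.0, width=1920, height=1080,
                      h_aperture=36.0):
    """Pinhole projection of a camera-space point (x, y, z, with
    z < 0 in front of the camera) to pixel coordinates.
    focal/h_aperture are assumed example values in mm."""
    x, y, z = point
    # perspective divide, scaled by focal length over the film back
    ndc_x = (focal * x / -z) / (h_aperture / 2.0)
    ndc_y = (focal * y / -z) / (h_aperture / 2.0) * (width / height)
    px = (ndc_x * 0.5 + 0.5) * width
    py = (ndc_y * 0.5 + 0.5) * height
    return px, py

# A point straight down the lens axis projects to the frame centre,
# so its UV position would receive the centre pixel of the plate:
print(project_to_screen((0.0, 0.0, -10.0)))
```

This also shows why the head turn broke things: the projection is only correct while the tracked camera and the mesh stay in the same relationship they had when the frame was baked to UV.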
By creating a second scene similar to the first, I could use this setup to project my changes and still have a moving image. This was achieved by connecting an alpha to the image as well, removing everything but my changes to the image.
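The alpha in that second scene works like a standard over merge: wherever it is zero, the moving plate passes through untouched, and wherever it is one, only the re-projected paint fix shows. A toy straight-alpha sketch (my own simplified version with made-up pixel values):

```python
# Toy sketch of merging the re-projected paint fix over the moving
# plate: the alpha keeps only the painted change, frame after frame.

def merge_over(change, alpha, plate):
    """Straight-alpha over: alpha 0.0 passes the plate through,
    alpha 1.0 shows only the projected change."""
    return [c * alpha + p * (1.0 - alpha) for c, p in zip(change, plate)]

paint_fix = [0.9, 0.1, 0.1]                      # repainted pixel (RGB)
frames = [[0.2, 0.3, 0.4], [0.25, 0.35, 0.45]]   # two frames of footage

# outside the alpha, the footage keeps moving untouched:
print([merge_over(paint_fix, 0.0, f) for f in frames])
# inside the alpha, the fix sits on top on every frame:
print([merge_over(paint_fix, 1.0, f) for f in frames])
```

Because the fix is projected through the tracked camera per frame, it sticks to the footage while everything outside the alpha stays the original moving image.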