Here is what I have been working on this weekend: lighting tests and a slap comp. The lighting is not perfect yet and the slap comp's rotos are of course not 100%, but this is how the sculpture looks right now. My real hand is behind the CG hand, in the same pose as the sculpture. I will proceed with animation and texturing this week, so hopefully I will have a test render of this shot on Sunday.
At the beginning of this project I planned to model and sculpt my hand from reference images alone. However, since my neighbours from the project “Sabotage”, who are both sitting next to me, were exploring the capabilities of Agisoft Photoscan, I could not keep my curiosity at bay. They took a series of photos 360 degrees around their deer and then generated geometry from them in Photoscan. For my project I did not need a perfect copy of my hand, especially since my CG hand is going to be slightly larger than the real one. But since I now know that I will probably have to animate the fingers in Maya, I thought it would be nice to at least have the correct proportions. Therefore I did a few tests…and a couple more. In the end, I did not expect the scan to go as well as it did.
My first thought was to use the scan only as a proportion reference. The first tests turned out useless: Photoscan did not recognise the hand in the images. This was due to three main problems:
1: The skin. Photoscan calculates its point cloud much like tracker points, and my skin was simply too smooth, without any features to track.
2: The background. Although the background was blurry and the hand took up most of the image space, it still disturbed the calculations.
3: Holding still. If you are to take a 360-degree photo series of a hand, the rest of the body should preferably not be in the picture. This means holding the hand away from the body and above the head, and keeping that pose for as long as it takes to shoot approximately 30 images. Even a slight movement can create an extra thumb, which did happen.
With help from Johan Lilja, who took the pictures and helped solve some of the problems, I ended up like this:
Strapped to a table with tape, with the armrest of a chair as support for stability, a grid literally drawn on my hand with an ink pen, inside a bunk draped with the University's white curtains. Oh, we had fun! The table was there so I could hide my head under it. Yes, it looked pretty strange.
In total we took 32 images around my hand with a 50mm lens:
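For anyone curious about the coverage this gives: photogrammetry wants plenty of overlap between neighbouring frames. Here is a quick back-of-the-envelope sketch in Python. Note that the 36 mm full-frame sensor width is my assumption, not something from the shoot; the numbers shift with a smaller sensor.

```python
import math

SENSOR_WIDTH_MM = 36.0   # assumed full-frame sensor; adjust for your camera
FOCAL_MM = 50.0
NUM_IMAGES = 32

# Horizontal field of view of a 50mm lens on the assumed sensor.
fov_deg = math.degrees(2 * math.atan(SENSOR_WIDTH_MM / (2 * FOCAL_MM)))

# Angular step between consecutive shots in a full 360-degree orbit.
step_deg = 360.0 / NUM_IMAGES

# Rough fraction of each frame's view shared with the next frame.
overlap = 1 - step_deg / fov_deg

print(f"FOV = {fov_deg:.1f} deg, step = {step_deg} deg, overlap = {overlap:.0%}")
```

With these assumptions each frame shares roughly 70% of its view with the next one, which is comfortably more than structure-from-motion tools usually need.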
And this is the result from Photoscan after that session:
I was very happy with the results. The model was not perfect, but good enough to use as a base for the high-poly model. I took the model into Mudbox, smoothed it out, increased the polycount and made the fingernails a bit more precise. I then retopologised it in Topogun, baked the normals in xNormal, and extended the arm in Maya. This is how it looks right now:
You can also see the beginning of the stand/stone that the hand originates from.
Now, did it go faster with Photoscan than sculpting my own hand? In the end, maybe not. The time it took to test and solve the problems I encountered could have been spent on sculpting. If I had known how to do it right away, the scanning definitely would have won. Still, I am happy with the results: I got a great bake, and I now have an exact CG version of my hand on my computer. Cool!
Milestone 2 and its presentation are now over and posted. Here is my camera material, edited in Premiere. I am a bit unsure about the final clip, but it will have to do for now. Something else I have noticed is that some of my clips have a line of dead pixels across the middle of the image. This is something I will have to fix in post. Although I'm not that far behind schedule, I am starting to get the overall feeling that I need to pick up my pace. I really want as much time as I can get for the texturing part. There is a lot to be done. Here we go, Milestone 3!
I have cut my movie in Premiere and started to track my selected shots. My plan was to use PFTrack for my whole tracking process. However, for some reason I can't seem to get a good or sensible solve, even for just the scene and with the reference images. Therefore I tried tracking in Autodesk MatchMover instead. It was still a hard track; the camera movement has very little parallax. But together with the reference frames I finally got a sensible solve. YES! Analyzing this, I now know that reference frames definitely are a great help when solving camera movements like mine. It would not have worked without them. Also, I think the reason I couldn't get a good solve in PFTrack is that I just don't understand how to make the constraints work properly. I have to look this up. But for now I am happy it worked out well with MatchMover.
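To see why little parallax hurts a solve, the classic stereo relation depth = focal × baseline / disparity is enough: the smaller the camera movement (baseline), the more a pixel of tracking noise corrupts the recovered depth. A hypothetical sketch, where the focal length, baseline and depth numbers are made up purely for illustration:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic two-view relation: z = f * B / d."""
    return focal_px * baseline_m / disparity_px

FOCAL_PX = 2000.0    # hypothetical focal length in pixels
TRUE_DEPTH = 3.0     # hypothetical point depth in metres
NOISE_PX = 1.0       # one pixel of tracking error

errors = {}
for baseline in (0.5, 0.05):  # large vs. tiny camera movement
    d = FOCAL_PX * baseline / TRUE_DEPTH                      # ideal disparity
    z_noisy = depth_from_disparity(FOCAL_PX, baseline, d + NOISE_PX)
    errors[baseline] = abs(z_noisy - TRUE_DEPTH) / TRUE_DEPTH
    print(f"baseline {baseline} m -> relative depth error {errors[baseline]:.1%}")
```

With a tenth of the camera movement, the same one-pixel tracking error produces roughly ten times the depth error, which is exactly why wide-baseline reference frames stabilise a low-parallax solve so much.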
Something else I have tried out is the geometry tracking in PFTrack. And it works…sort of. From the start I had already taken into account that geometry tracking could not solve all of the animation on the hand, the fingers for example, as they constantly cover each other. Exporting the geometry track from PFTrack results in an animated joint chain in Maya. Now that I have experimented with it, I think I will keep just the joint that drives the forearm and then, later in Maya, build on that chain and add the hand and finger joints that I will animate myself. The forearm joint will add more realism to the overall animation, providing small movements that would be hard for me to animate.
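The idea of layering my own hand joints on top of the tracked forearm joint is basically forward kinematics: each joint rotates relative to its parent, and everything downstream inherits that motion. A minimal 2D sketch of the principle in plain Python (not Maya code):

```python
import math

def fk(bone_lengths, joint_angles_deg):
    """2D forward kinematics: each angle is relative to the parent joint.
    Returns joint positions from the root out to the tip."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for length, angle in zip(bone_lengths, joint_angles_deg):
        heading += math.radians(angle)   # child inherits parent rotation
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        points.append((x, y))
    return points

# Root angle = the tracked forearm; the rest = hand-keyed finger joints.
print(fk([2.0, 1.0, 1.0], [10.0, 30.0, 30.0]))
```

In the real rig the tracked forearm joint supplies that first rotation, so the keyframed finger animation automatically rides on top of the subtle tracked movement.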
The geometry track is a fast way to get some additional animation. However, from the start I wanted the object tracking to work, but once again the solve is not quite right. I am aware that I can only track features that are static relative to each other. I have tried to track the forearm's markers, since there isn't much twisting movement in the arm. If I have time I will try this again, but for now it's good to know that the geometry track is working.
While waiting for the filmed material last week, I did some testing on how to solve the cracking/fissure effect. Right now I have discovered a way involving booleans. In the past, booleans and I haven't really agreed with each other, so I found it a surprise that they could provide a rather good and simple solution. What I did was to use the difference operation on a plane and the proxy hand. I then animated the history on the plane to make it gradually cut through the hand. The plane's vertices could also be transformed to make it a crack rather than a clean cut. The good thing about doing it this way is that I can make a custom animated crack to my liking, which is something that is hard to do with Maya's shatter effect. Perhaps I could find a way to use both effects for increased realism. Here are a couple of quick examples I made. In the last one I turned the piece that is cut off into a rigid body and let it fall. I simply keyed the visibility and switched the hand for the duplicated, cut one.
The only problem with the boolean operation is that it sometimes creates black triangles as the animated plane moves into the hand geometry. However, this can be fairly easily fixed with a few tweaks to the plane.
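The boolean trick can be illustrated in 2D: cutting geometry with a plane is just clipping against a half-plane, and animating the plane's position animates the cut. A small self-contained sketch of that idea (a 2D stand-in for the Maya difference operation, not the actual Maya setup):

```python
def clip(poly, normal, offset):
    """Keep the part of a convex 2D polygon on the side where
    dot(normal, p) <= offset, i.e. 'behind' the cutting plane."""
    out = []
    for i in range(len(poly)):
        p, q = poly[i], poly[(i + 1) % len(poly)]
        fp = normal[0] * p[0] + normal[1] * p[1] - offset
        fq = normal[0] * q[0] + normal[1] * q[1] - offset
        if fp <= 0:
            out.append(p)            # p is on the kept side
        if fp * fq < 0:              # this edge crosses the cutting plane
            t = fp / (fp - fq)
            out.append((p[0] + t * (q[0] - p[0]),
                        p[1] + t * (q[1] - p[1])))
    return out

def area(poly):
    """Shoelace formula."""
    return abs(sum(p[0] * q[1] - q[0] * p[1]
                   for p, q in zip(poly, poly[1:] + poly[:1]))) / 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
# 'Animate' the cut by sliding the plane across the shape frame by frame.
for frame, cut in enumerate((0.25, 0.5, 0.75)):
    piece = clip(square, (1, 0), cut)
    print(f"frame {frame}: remaining area = {area(piece):.2f}")
```

Jittering the plane's vertices per frame is what turns the straight cut into a ragged crack, which is exactly the vertex-transform trick described above, just in one dimension higher.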
One of my main concerns for the project is the integration of the CG hands on top of my real hands. After the first week of the project had passed, I had researched enough to work out a plan for how to do this. These were my thoughts:
What I want is to insert a CG model (slightly larger, as I want the stone to have some thickness) in front of my real hand. In the later part of the clip I will start to animate the model, which will correspond to the real hand's movements. At the joints of the hand, smaller debris will start to fall and additional cracks will spread. The real hand's main purpose is to act as a reference (both for modelling and animation) and as a background behind the holes and fissures the cracks create. How much of the real hand I reveal depends on how the effect looks in the end.
The greenscreen was a choice I made a few days before filming. Now I can remove the real hand more easily (for example in the beginning, when I want the sculpture to be still, or if I want the animation to differ from the real hand). I also took the time on set to film the background without the hand or greenscreen. I can use this material for projection purposes later on.
I hoped the markers on my hand would work for object tracking. I decided that if I could get that track to function properly, it would be a great help for the animation of the hand. This was also a reason why I decided to have reference cameras on set. PFTrack also has a feature called geometry tracking that I wanted to try out if the others should fail.
Managing files from a RED camera is a bit different from working with normal files. The 4K resolution and the raw images you are able to extract from the camera need another workflow to stay as uncompressed as possible. Although my final delivery will be in 1080p, it is still good to know the workflow and when it is time to reformat the files. Here is a workflow chart I made based on Simon Tingell's lecture.
If you are able to, you want to stay in Premiere as much as you can. For my project, however, I want to take the clips into Nuke to be able to do the compositing I need. I am more comfortable with grading in Nuke, and since I need to do some greenscreen work, CG implementation and 2D tracking, Nuke is my first choice of program for this.
First you edit/cut your clips in Premiere. Then you send an XML file to REDCINE-X, where you can adjust ISO, temperature, exposure and so on. Yes, these settings can be changed in both Premiere and Nuke; the important part is that you change the gamma to redlogfilm and export the frames as DPX (1080p in my case). The DPX files are later read with the Cineon colorspace in Nuke, which gives the images the correct gamma to work with inside Nuke. When you later export the files back to Premiere (once again as DPX) for final editing and effect transitions, you have to add a Cineon converter filter to each clip in Premiere; otherwise Premiere will read the files as sRGB by default. For me, it would be unnecessarily heavy to work in 4K in Nuke for most of my shots. However, my hero shot has a greenscreen, and since I will be doing greenscreen keying and despill removal, I think I will keep that one in 4K to have as much information as I can.
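The reason the redlogfilm/Cineon gamma matters is that DPX stores a log encoding of the light, and the Cineon colorspace undoes it. A sketch of the standard Cineon log-to-linear conversion, using the usual reference black/white code values of 95 and 685 (check your tool's exact defaults; some add a soft clip near white):

```python
import math

REF_BLACK, REF_WHITE = 95, 685   # standard Cineon reference code values
GAIN = 0.002 / 0.6               # 0.002 density per code step, 0.6 film gamma
OFFSET = 10 ** ((REF_BLACK - REF_WHITE) * GAIN)

def cineon_to_linear(code):
    """10-bit Cineon/log code value -> scene-linear light."""
    return (10 ** ((code - REF_WHITE) * GAIN) - OFFSET) / (1 - OFFSET)

def linear_to_cineon(lin):
    """Inverse conversion, for going back out to log DPX."""
    return REF_WHITE + math.log10(lin * (1 - OFFSET) + OFFSET) / GAIN

print(cineon_to_linear(95))    # reference black -> 0.0
print(cineon_to_linear(685))   # reference white -> 1.0
```

If Premiere instead reads those code values as plain sRGB, everything above reference white is crushed and the blacks are lifted, which is exactly why the Cineon converter filter is needed on the way back.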
Together with the photographer Mattias, I went to Nordanå to shoot my clips. Although I think we got everything I needed, what a hectic day it turned out to be. Using a RED ONE camera definitely has its advantages, but there is a lot more to think about and additional equipment to bring. With help from Mattias, our teachers Arash and Samuel, and my fellow student Caroline Näslund, it turned out okay somehow. Looking at the data before me, I now know: all that extra trouble was definitely worth it. Thanks, everybody!
When I am done with some editing of the material, I will start on the hand shot in front of the greenscreen. Since it is the hero shot of my project, I will be spending many hours getting to know it better from here on. For now, here I am taking some white-ball references for the lighting.
A few days have passed since I last posted, and this is what I have been up to.
I have arranged the location, which will be the museum Nordanå here in Skellefteå. On Monday morning we will start shooting, and since we are able to be there all day, hopefully one day is all that's needed. I have also gone through my storyboard with the photographer Mattias Sjöstedt. He had some great ideas on how to evolve the shots and the storytelling. Therefore I have now revised my previz and updated it. Instead of redoing it in Maya, I went to Nordanå with a friend who helped me film a previz on set. I have added a couple of shots but still managed to keep the number of rendered frames the same as in the last storyboard, which is good. You can find it here or in MS1:
Since we are filming on Monday, I have been preparing and planning which items to bring. As stated in my research document, I will be working with reference frames and sequences when matchmoving. Therefore I have arranged for two additional cameras to be brought to the location. I have also painted my own mini greenscreen, which will be placed behind my hand in the hero VFX shot.
Furthermore, now that it's been decided we will have access to a RED camera, I have been looking into the workflow for managing a RED camera's files. A workflow chart will be posted later.