Agisoft Photoscan vs Modelling

At the beginning of this project I planned to model and sculpt my hand from reference images alone. However, since my neighbours from the project “Sabotage”, who are both sitting next to me, were exploring the capabilities of Agisoft Photoscan, I could not keep my curiosity at bay. They took a series of photos 360 degrees around their deer and then generated geometry from them in Photoscan. For my project I did not need a perfect copy of my hand, especially since my CG hand is going to be slightly larger than the real one. But since I now know that I will probably have to animate the fingers in Maya, I thought it would be nice to at least have the correct proportions. Therefore I did a few tests… and a couple more. In the end, I did not expect the scan to go as well as it did.

Testing

My first thought was to use the scan only as a proportion reference. The first tests turned out to be useless: Photoscan did not recognise the hand in the images. This was due to three main problems:

1: The skin. Photoscan builds its point cloud from trackable features, much like tracker points in matchmoving. My skin was simply too smooth, without any features to track (see the sketch after this list).

2: The background. Although the background was blurry and the hand took up most of the image space, it still disturbed the calculations.

3: Hard not to move. If you want to take a 360-degree series of photos of a hand, the rest of the body should preferably not be in the picture. This means holding the hand away from the body and above the head, and keeping that pose for as long as it takes to shoot approximately 30 images. Even a slight movement can create an extra thumb, which did happen.
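As a side note on the first problem: Photoscan uses its own feature detection internally, but you can get a rough feel for whether a photo offers anything to track with a quick script. This is just a minimal sketch using OpenCV's ORB detector (not what Photoscan actually uses), and the file name and threshold are made up for illustration:

```python
import cv2

def count_trackable_features(image_path, max_features=5000):
    """Rough estimate of how many features a matcher could latch onto."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    orb = cv2.ORB_create(nfeatures=max_features)
    keypoints = orb.detect(img, None)
    return len(keypoints)

if __name__ == "__main__":
    # "hand_photo.jpg" is a hypothetical file name
    n = count_trackable_features("hand_photo.jpg")
    print(f"{n} trackable features found")
    if n < 500:  # arbitrary threshold, just a sanity check
        print("Probably too few features - consider drawing a grid on the skin.")
```

A bare, smooth hand scores very low on a check like this, which is exactly why the ink grid described below made such a difference.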

The solution

With help from Johan Lilja, who took the pictures and helped solve some of the problems, I ended up with this:

IMG_7568

 

Strapped to a table with tape, with the armrest of a chair as support for stability, a grid literally drawn on my hand with an ink pen, inside a booth draped with the University's white curtains. Oh, we had fun! The table was there so I could hide my head under it. Yes, it looked pretty strange.

In total we took 32 images around my hand with a 50mm focal length:

delens_7629

Still, this is the result from Photoscan after this session:

Hand_scan_02

Hand_scan_01
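For anyone curious about what happens between the photos and this mesh, Photoscan Pro also exposes the processing steps through a Python scripting API. The following is only a minimal sketch assuming the 1.x API; the exact names and parameters may differ between versions, and the paths are placeholders rather than the ones I actually used:

```python
import glob
import PhotoScan  # available inside Photoscan Pro's built-in Python console

doc = PhotoScan.app.document
chunk = doc.addChunk()

# Add the photos taken around the hand (placeholder path)
chunk.addPhotos(sorted(glob.glob("hand_session/*.jpg")))

# Align cameras: detect features and match them across photos
chunk.matchPhotos(accuracy=PhotoScan.HighAccuracy)
chunk.alignCameras()

# Build the dense point cloud and a mesh from it
chunk.buildDenseCloud(quality=PhotoScan.MediumQuality)
chunk.buildModel(surface=PhotoScan.Arbitrary)

# Export the mesh for sculpting and retopology
chunk.exportModel("hand_scan.obj")
doc.save("hand_scan.psz")
```

The GUI workflow does the same thing through the Workflow menu, which is what matters here: align, densify, mesh, export.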

I was very happy with the result. The model was not perfect, but it was good enough to use as a base for the high-poly model. I took the model into Mudbox, smoothed it out, increased the polycount and made the fingernails a bit more precise. I then took it into Topogun for retopology, baked the normals in xNormal, and extended the arm in Maya. This is how it looks right now:

Hand_sculpture

Hand_sculpture_wireframe

 

You can also see the beginning of the stand/stone that the hand originates from.

Now, did it go faster with Photoscan than sculpting my own hand would have? In the end, maybe not. The time it took to test and solve the problems I encountered could have been spent on sculpting. If I had known how to do it right away, scanning definitely would have won. Still, I am happy with the results: I got a great bake, and I now have an exact CG version of my hand on my computer. Cool!
