Courtesy: Digital Tutors
Breaking Down Sandman (Spider-Man 3)
It was quite a shock to hear Sam
Raimi admit last summer at Comic-Con that he still wasn't sure how Sony
Pictures Imageworks was going to pull off Sandman. That's because the look of
the new villain in Spider-Man 3 literally kept changing. Fortunately, everything
turned out all right in the end for this complex, shape-shifting sand creature
that is intended to evoke the pathos of the legendary golem.
Development began two years ago
with a team of technical directors, led by sand effects supervisor Doug Bloom,
who came up with a pipeline and toolset. "We figured that the more they
could duplicate the physics of sand, the better off they'd be, since story and
storyboards and animatics were still being worked on," Bloom explains.
"We wanted to prepare to emulate any possible behavior. We wanted the sand
to look as realistic as possible and then later art direct and break away from
the reality of physics."
After six months of sand tests,
they did side-by-sides of live-action footage and CG tests. What came out of
that was a successful development effort in which all of the software R&D,
programming and custom tools were ready to roll right into production.
"During that process we had a team of maybe four full-time people doing
custom software development, some of which was done with C++, some of which was
done with Python and a lot of tools that were developed were exposed as
plug-ins to Houdini, the particle effects package from Side Effects,"
Bloom continues. "And all the tools were developed as libraries so we
could link into them easily from other packages as necessary. One of the tools
developed was a fluid and gas simulation engine and that was done early on. And
during the sand test sequence, one of our effects TDs wrote a user interface
that connected up to the fluid engine. And later on, as we ramped up for
production with more and more TDs, we exposed the UI to the fluid solver and
moved most of the work to Houdini at that point. Everything was done in an open
system because when we were going through the sand tests, aside from trying to
match these tests, we still weren't clear what was going to be required of the
character. So we wanted to create as many tools as possible that could be
portable across different applications in a fashion that would allow us to have
the various tools communicate and share data.
For more details of
all the visual effects in this record-breaking film, check out VFXWorld's
additional Spider-Man 3 coverage.
"One of the big examples was
the fluid solver, which shared the same data formats as another simulator,
which was called Spheresim, which is a stripped down rigid body simulator that
only deals with spheres. It removes all of the extra calculations you need for
other shapes as well as any calculations you'd need for the rotation. So the
nice thing about that system was that it allowed us to simulate sand grains
piling up, and what we'd do is have each sand grain represented by a single
sphere. In the case of a very close-up shot or even a shot that might be a
little wider, each sphere would represent a cluster of 10-50 sand grains. The
nice thing about this application was, because it was developed in the same
library structure of C++ code, it actually shared forces and other data formats
with the fluid solver, allowing us to take all of these little spheres that
were stacking up like little rigid bodies as a result of the Spheresim
algorithm, and at any point we could flip a switch and have them enter into a
gas or fluid simulation, creating a nice swirly, turbulent motion that we could
then render as fine sand grains, fine dust or individual rocks.
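A toy version of the sphere-only simulation idea might look like the following sketch. It is emphatically not Spheresim; the ground-plane collision and the "turbulence" stand-in are invented for illustration. What it shows is the key design point: because both routines operate on the same particle records, flipping from rigid stacking to fluid-like motion is just a matter of calling a different step function.

```python
# Illustrative sketch (not the actual Spheresim): spheres fall under
# gravity, come to rest on a ground plane with no rotation to compute,
# and can be "flipped" into a swirly velocity field at any step.
import math

def step_rigid(spheres, dt=0.02, g=-9.8):
    """One step of a sphere-only sim: gravity plus ground-plane collision."""
    for s in spheres:
        s["vy"] += g * dt
        s["y"] += s["vy"] * dt
        if s["y"] < s["r"]:              # sphere rests on the ground plane
            s["y"], s["vy"] = s["r"], 0.0

def flip_to_fluid(spheres, t, dt=0.02):
    """Hand the same particles to a cheap, fluid-like velocity field."""
    for s in spheres:
        s["vx"] = math.sin(s["y"] * 3.0 + t)   # turbulence stand-in
        s["x"] += s["vx"] * dt
```

Dropping rotation and non-sphere shapes, as Bloom notes, is what makes simulating one sphere per grain (or per cluster of 10-50 grains) tractable.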
[Image: There were a total of 260 sand shots in Spider-Man 3, requiring a host of new tools.]
"This allowed us to mix and match Sandman as a solid
character in a human form. In the [Flint]
Marko atomized sequence, for example, you see him dissolving and blowing away
into individual sand grains. Again, that was done with this whole suite of
tools that shared this common file format. At this moment, he's a polygonal
mesh. And at a particular frame, we're going to swap this out for millions of
little particles that will be constrained to the mesh. You won't actually see
this transition, but this allowed us to pick individual particles off that mesh
and have them blow away.
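The mesh-to-particles swap can be sketched in a few lines. This is a hedged illustration, assuming pinned particles that track animated mesh points until individually released; the class and function names are invented.

```python
# Toy version of the swap described above: particles are pinned to mesh
# points until individually released, after which a constant wind field
# carries them off. The wind model and names are illustrative only.

class MeshParticle:
    def __init__(self, pos):
        self.pos = list(pos)
        self.released = False

def advect(particles, mesh_points, wind=(1.0, 0.2, 0.0), dt=0.04):
    """Pinned particles follow the animated mesh; released ones blow away."""
    for p, m in zip(particles, mesh_points):
        if p.released:
            p.pos = [c + w * dt for c, w in zip(p.pos, wind)]
        else:
            p.pos = list(m)   # constrained: copy the mesh point's position
```

Because the audience never sees the frame where the mesh is swapped for constrained particles, the effect reads as a continuous character dissolving grain by grain.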
"In addition to 'smart' sand that he
has full control over, the direction we got was that he's constantly forming
and trying to refine his shape, and as the movie progressed, he was able to
control that sand more and more. But what happened was in the process of
sucking up all of the sand, the extra sand would drip off or be tossed off. And
in both the [bank heist truck] sequence and final battle, you have the added
complexity of interaction with objects or characters. In the truck sequence,
when he was being shot at with bullets, the basis of all those effects came off
the base tool set, which we call Sandstorm. So here the effects artist was able
to start with a character that had been animated in Maya and, by using
Sandstorm, was able to scatter particles all over the polygonal surface and
also select regions of polygons and fill a volume defined by that region with
sand as well. So with the timing and placement of the bullet hits being driven
by the animation department, that allowed the effects td to take that volume of
particles -- and the remaining particles that had been scattered on the
surface -- and create the dynamics for the bullet hits and the impacts."
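The bullet-hit setup, with impact timing and placement supplied by animation, could be approximated by a radial-impulse routine like this sketch. All names, radii and strengths here are made up; it simply shows particles inside the hit radius receiving an outward kick while everything else is untouched.

```python
# Hedged sketch of an animation-driven bullet hit: a hit event supplies
# a position, and particles within `radius` of it get a radial impulse
# whose strength falls off with distance. Parameters are illustrative.
import math

def apply_hit(particles, hit_pos, radius=0.5, strength=4.0):
    """Push particles within `radius` of the hit point radially outward."""
    for p in particles:
        d = [a - b for a, b in zip(p["pos"], hit_pos)]
        dist = math.sqrt(sum(c * c for c in d))
        if 0.0 < dist < radius:
            falloff = strength * (1.0 - dist / radius) / dist
            p["vel"] = [v + c * falloff for v, c in zip(p["vel"], d)]
```

Driving such a routine from keyframed hit times is one plausible reading of how animation timing could steer the effects TD's dynamics.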
There were a total of 260 sand
shots in Spider-Man 3, requiring a host of other new tools as well. The volume
renderer was developed at Imageworks for the whole facility in conjunction with
the Spider-Man R&D. What that allowed them to do was to have priority over
features that were added as well as bug fixes. "For the first
year-and-a-half, our show was the only one to use the renderer," Bloom
adds. "And the main developer, Magnus Wrenninge, came on to the show to
help push some shots through with the use of the volume renderer."
In addition, there was a RenderMan plug-in called the Sand
Instancer, which was the key to being able to render all of the data.
"Along with that was a custom shading language," Bloom suggests,
"which allowed the lighters or effects artists to write little shaders or
custom expressions to drive the level-of-detail controls, determining whether
an individual grain was rendered as a point or a model, and also to control
the size of grains, their color and a rough distribution of size and density.
One shot went on for two weeks: dialing the actual size of the sand grains as
the camera pulls in to Sandman's face. This allowed the lighter to dial the
size back and forth and modify the distribution of the grain size without ever
having to go back to the effects TD."
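The level-of-detail control Bloom mentions can be imagined as a simple render-time test on projected grain size. The function below is an invented illustration, not the actual shading-language expression; the focal scaling and threshold are assumptions.

```python
# Illustrative render-time LOD test: a grain is drawn as a full model
# only when its approximate projected size crosses a threshold;
# otherwise it collapses to a point. All numbers are made up.
def grain_lod(grain_radius, distance, focal=35.0, threshold=0.002):
    """Return 'model' or 'point' from an approximate projected size."""
    projected = grain_radius * focal / max(distance, 1e-6)
    return "model" if projected > threshold else "point"
```

Deciding point-versus-model at render time is what lets a lighter dial grain size on a camera move without regenerating any simulation data.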
Bloom says the whole experience
became a significant educational effort in how to mix and match simulation
engines and use the new renderers. They held training meetings about once a
week and took input from users on features that would allow them greater flexibility.
One tweak that illustrates the flexibility of the tools was altering what
happens when Spider-Man takes the feet out from under Sandman during the bank
heist sequence. Originally the effects team had envisioned the legs turning
into loose sand grains but revised the effect to have his legs turn into chunks
for greater impact.
Meanwhile, with so much attention paid to Sandman, digital
effects supervisor Ken Hahn came straight off of Ghost Rider to help out. His
main focus was on the birth of Sandman.
"We were given a lot of
conceptual art that was very helpful from E. J. Krisor," Hahn explains.
"One of the problems early on that I saw was that there was a disconnect
between the conceptual art drawings and what we were trying to achieve in
effects and animation. So we needed to really get animation and effects
thinking on the same page because on the animation side, they weren't thinking
in terms of volumes and particle sets, and they were animating the character as
more of a typical rigid-bone, articulated skeleton. And we really insisted that
it's more of a fluid character encompassed within a flexible membrane. And once
we started having some discussions about animating his actions in terms of
pulling one volume of sand from one location to another and seeing what kind of
flow and direction there could be, it was a matter of coming up with a system
that was simplified.
"There was definitely a bit of compromise in the
beginning. Animation felt they had to do things a certain way and effects did
too and even rendering guys they needed data sets a certain way. Once we laid
down the commonalities amongst all of us, we understood what everyone's
limitations and capabilities were, which were very important."
According to Bloom, the core
R&D team is now taking the tools that were developed and repackaging them
so that other movies can start making use of them. The volume renderer is being
used on Beowulf, and I Am Legend has now started using it. The Sand Instancer
is being repurposed for use in rendering debris during collisions, such as the
destruction of a building or cars colliding. The Sand Instancer is becoming
more of a generic RenderMan plug-in that allows you to render large amounts of
data efficiently and also control a good portion of the look at render time
without having to regenerate data. Spheresim and the fluid simulator and file
formats are now being repackaged for vfx movies from Imageworks (Tonight, He
Comes) or Sony Pictures Animation.
It's all about
"efficiency and ease of use," offers Bloom.
Bill Desowitz is
editor of VFXWorld.
Note: Readers may
contact any VFXWorld contributor by sending an e-mail to editor@vfxworld.com.