Friday Flashback #331


 

BY ANY OTHER NAME:
Studio Ghibli Changes Everything with Spirited Away

… since 1995, the Studio Ghibli 3D team, armed with SOFTIMAGE®|3D, have been more than helping out with the visuals. The full transition from traditional ink & paint techniques and shooting to digital I & P and compositing was made in 1997.

by Michael Abraham

People have come to expect miracles from Hayao Miyazaki. Since he co-founded Studio Ghibli (with lifelong colleague and sometime creative collaborator Isao Takahata) in 1985, the now-revered anime director has been the creative force behind a long list of animated films that simultaneously manage to be intensely thoughtful, critically acclaimed and hugely successful. Any filmmaker – hell, any artist – can tell you how difficult it is to hit all three points. Miyazaki’s latest offering hits all three harder than ever before.

Miyazaki’s formula, if you can call it that, involves using dazzling visuals and engaging fables to suspend our disbelief, thereby clearing the way for some truly trenchant insights. The stories and insights are Miyazaki’s idea, but since 1995, the Studio Ghibli 3D team, armed with SOFTIMAGE®|3D, have been more than helping out with the visuals. The full transition from traditional ink & paint techniques and shooting to digital I & P and compositing was made in 1997.

“We are a traditional animation production studio,” says Mitsunori Kataama, 3D-CG Supervisor at Studio Ghibli. “There are about 150 people presently working here. Within that group, we have three sections using computers for production – ten people work on ink and paint, four in compositing and seven of us in 3D-CG. We mainly use Silicon Graphics workstations, with over thirty CPUs, including those used as servers. We also use Linux and Mac OS computers.”

That setup makes for an immensely clever, and ultimately virtuous, method, one employed to great effect in Miyazaki’s most recent film. Set in modern-day Japan, Spirited Away (or Sen to Chihiro no Kamikakushi, the Japanese title) joins the daily life of ten-year-old Chihiro, a somewhat spoiled and ill-tempered girl unhappy to be moving to a new town with her family.

On their way to their new home, Chihiro and her family pass through a mysterious tunnel only to find themselves in a world not of their choosing. When her hungry parents mistakenly eat food reserved for the gods, they are suddenly transformed into pigs, leaving Chihiro as their only hope. A great many things change in this new land: a young boy becomes a dragon, an origami bird transforms into a witch and a filthy bather is reincarnated as a river god. Even Chihiro is forced to barter her real name for her survival with the evil witch Yu-baaba, who gives her the more generic sounding Sen in its place. To rescue her parents and regain her name, Sen must also change from a frightened little girl into a courageous heroine.

In creating Spirited Away, Miyazaki claims to have been making a gift specifically for his friends’ daughters, all of whom were about 10 years old at the time he got the idea. After two years and a painstaking blend of traditional cel animation and seamlessly integrated digital technology, however, it seems that his gift is being shared by just about everybody. At the time of this writing, Spirited Away is poised to overtake James Cameron’s Titanic as the most successful film ever shown in Japan.

Although Studio Ghibli works almost exclusively on feature animations, with the occasional short thrown in for good measure, Spirited Away was a big job even by their standards. All of the animation, backgrounds, compositing and 3D work were accomplished in-house. Working diligently on 100 of the movie’s 1,400 scenes, Kataama and his team dealt primarily with complicated scenes that were impossible to create solely by hand and that called for intense 3D camera work and object animation.

“We used several different techniques,” says Kataama matter-of-factly. “We added depth information to original 2D images by mapping hand-drawn backgrounds onto 3D models. In the end, we also used SOFTIMAGE|3D to calculate a reflection and a highlight component, which we then added to the hand-drawn background. We also developed a unique 2D texture shader, so we could have multiple-position camera-texture projection for mapping our background images. We have also developed a plug-in to make changing a particular field of vision much easier.”
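The core of this approach is camera projection: a flat painted background is treated as if a virtual camera were projecting it onto 3D geometry, so each 3D point looks up its color from the plate. Ghibli's actual shader is proprietary; the sketch below is only a minimal pinhole-projection illustration, with the camera position, focal length and image size as made-up parameters.

```python
def project_to_background(point, cam_pos, focal, img_w, img_h):
    """Project a 3D point into pixel coordinates of a painted background
    plate, assuming a pinhole camera at cam_pos looking down -Z.
    Returns (u, v) texel coordinates, or None if the point is behind
    the camera or off the plate. Illustrative only -- the studio's
    2D texture shader supported multiple projector cameras."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:
        return None  # point is behind (or at) the camera plane
    u = img_w / 2 + focal * x / -z   # horizontal pixel coordinate
    v = img_h / 2 - focal * y / -z   # vertical, flipped for image space
    if 0 <= u < img_w and 0 <= v < img_h:
        return (u, v)
    return None
```

A renderer would call this per shading point and sample the plate at the returned (u, v), which is what lets a hand-drawn background "stick" to 3D geometry as the camera moves.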

Another significant challenge faced by the Studio Ghibli 3D team involved the creation of a realistic, ever-changing sea surface, which required the in-house development of another 2D texture shader and several material shaders. According to Kataama:

“To accurately express the look of the waves, we created a 2D texture shader that would generate a procedural texture. We really appreciate that SOFTIMAGE|3D offers such a valuable environment for developing new functions. The high-quality rendering result was extremely effective in our efforts to draw rays that would act as both reflections and highlights. For that, we were very happy to have the Ray Tracer, which we could not find anywhere else.”
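A common way to build the kind of procedural wave texture Kataama describes is to sum several sine waves of different amplitudes, frequencies and directions. The studio's shader parameters are unknown; everything below, including the component values, is an illustrative guess at the general technique.

```python
import math

def wave_height(x, y, t,
                components=((1.0, 0.35, 0.9, 1.1),
                            (0.5, 0.9, -0.4, 2.3),
                            (0.25, 1.7, 1.3, 3.7))):
    """Procedural sea-surface height at point (x, y) and time t,
    as a sum of sine waves. Each component is a hypothetical tuple
    (amplitude, frequency, direction_angle, speed)."""
    h = 0.0
    for amp, freq, angle, speed in components:
        # direction the wave travels in, as a unit vector
        dx, dy = math.cos(angle), math.sin(angle)
        h += amp * math.sin(freq * (dx * x + dy * y) + speed * t)
    return h
```

Evaluating this per pixel (or per vertex) and feeding the result into bump or displacement gives an ever-changing surface without storing any animation data, which is the appeal of a procedural shader.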

Kataama pauses reflectively before continuing. “Where I used to work, we used separate in-house applications for modeling, animation, and texture editing. When I joined Studio Ghibli, SOFTIMAGE|3D immediately enabled me to do everything in an integrated environment. Even an animator working on his first 3D project can do sophisticated animation work with it right away.”

Looking to the future, Kataama and Studio Ghibli have great plans for SOFTIMAGE|XSI™. Although they are still in the evaluation phase, Kataama has already seen enough to know what will be particularly useful.

“In the coming year, we are planning to switch all of our work to SOFTIMAGE|XSI,” he explains patiently. “So far, we have been most impressed with Render Passes. In our work, we do final image control at the compositing stage, so it is a big help that Render Passes can separate the 3D output into its various elements. In the past, we needed to prepare separate scene data for each render, but using Render Passes means we can make multiple materials from one scene. I’ve also had a chance to look at the Render Tree, which I found very easy to use. I was very happy because even I can create shaders, even though I have no programming skills. We also have high expectations for the Subdivision Surfaces functions.”
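The value of separated render passes comes from a simple compositing identity: the final image is (approximately) the sum of its lighting components, so each can be rebalanced after rendering. The sketch below is a generic illustration of that idea, not XSI's actual pass system; the gain parameters are hypothetical.

```python
def composite_passes(diffuse, specular, reflection,
                     spec_gain=1.0, refl_gain=1.0):
    """Recombine separately rendered RGB passes into a final pixel:
    beauty ~= diffuse + specular + reflection. The gain factors let a
    compositor rebalance an element without re-rendering the scene."""
    return tuple(d + spec_gain * s + refl_gain * r
                 for d, s, r in zip(diffuse, specular, reflection))
```

For example, dimming only the reflections in a shot becomes a matter of lowering `refl_gain` in the composite, rather than adjusting materials and rendering the whole scene again.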

Although they have yet to evaluate the animation functions in SOFTIMAGE|XSI, Kataama and his Studio Ghibli team already know that the Animation Mixer will soon be coming in handy:

“We are planning to create a human crowd,” says Kataama. “What we have in mind is likely impossible without the Animation Mixer. We are also looking forward to the new Toon Shader, which will help us to create an even better hand-drawn animation look.”

And, no doubt, another Miyazaki masterpiece. A film by any other name would never look this great.

Friday Flashback #330


REAL TROOPERS
Computer Graphics World, September 1997

by Barbara Robertson

In late 1991, when Phil Tippett and his colleagues at Tippett Studio (Berkeley, CA) began thinking about the stop-motion animation they expected to do for their next big movie, they had no idea that the movie, Jurassic Park, would instead signal the end of traditional stop-motion animation at Tippett Studio.

“We were talking about how we’d have to build new rigs, how we’d expand,” says Jules Roman, producer and co-owner of Tippett Studio along with her husband, Phil Tippett. “But meanwhile, Dennis Muren at Industrial Light & Magic had been working in CG and showed a test shot to Steven [Spielberg].

“I remember Phil had a cold and wasn’t feeling very good at the time,” says Roman. “And then Dennis called and said, ‘Well, Phil, we’re going in a different direction. We’re going to do all CG.’ Phil promptly got pneumonia and took to his bed.”

“I was completely devastated,” Tippett says. “The CG models held up; they yielded a new look and fluid motion. I got horribly sick.”

One of the alien insects created by Tippett Studio for the TriStar production of Paul Verhoeven’s Starship Troopers.

Little did he know that in approximately five years he would describe his studio as 99.99% dependent upon computer graphics; that by August 1997, Tippett Studio would have completed a 2 1/2-year-long movie project in which all the character animation would be created with CG. That movie, Paul Verhoeven’s Starship Troopers, based on Robert Heinlein’s novel, is scheduled for release this fall.

Phil Tippett began his career in visual effects with George Lucas’ Star Wars, released in 1977. In 1983, he became head of the Lucasfilm creature shop and won an Academy Award for his work on Return of the Jedi, which included the memorable character “Jabba the Hutt” and other aliens. In 1985, he founded Tippett Studio, which created creatures for a string of movies, including Willow, resulting in an Academy Award nomination, Ghostbusters 2, all three of Paul Verhoeven’s Robocop movies, and Honey, I Shrunk the Kids.

Art director Craig Hayes began work on armatures such as this to give stop-motion animators a way to practice their skill in the CG world without having to learn and use 3D animation software. On the table in front of Hayes is the DID used at Tippett Studio to control the onscreen warrior bug for the movie Starship Troopers.

As it turned out, Tippett was to play a large role in Jurassic Park as well and would ultimately share an Academy Award for that movie. Fortunately, Craig Hayes, art director and visual effects supervisor at Tippett Studio, had been working on an armature device to feed stop-motion animation data into Softimage’s 3D animation software. And Dennis Muren was interested in keeping Tippett involved in the project. The Dinosaur Input Device, or DID, as the armature came to be called, was used for 15 of the 52 computer animation shots in Jurassic Park, including the road sequence in which the T-Rex attacks the tourists’ jeep and the kitchen sequence in which the two velociraptors hunt the children.

“We’d send ILM motion data, and they’d send us back the rendered dailies,” Roman says. “It pointed the way to our future.” (This year Hayes, Brian Knep, Rick Sayre, and Thomas Williams received a Technical Achievement Award from the Academy of Motion Picture Arts and Sciences for the now renamed “Digital Input Device.”)

Time For A Change

Around the time Jurassic Park was playing in movie theaters, Tippett received the first screenplay for Starship Troopers. It was time to make the big transition to computer graphics.

“You can’t do CG on your own in a garage with a camera and a puppet,” says Roman. “CG is highly technical, it requires all kinds of infrastructure, and it’s extraordinarily expensive. Our business went from a build-up, tear-down kind of thing into something where we’re only talking about build up.”

The 15-person crew that worked on Jurassic Park grew to around 30 people when the studio got a contract to create two CG sequences for the movie Three Wishes. “It was a very quiet movie,” says Roman, “but it gave us a fantastic opportunity to get our CG legs.” Meanwhile, they began working on tests for Starship and began hiring people–eventually tripling in size. The final “test” was an eight-shot mini movie directed by Paul Verhoeven. “After that, we felt pretty confident we could do the production we had been doing but with the computers,” says Hayes, who has been with the studio since its early days.

For a Starship Troopers battle scene, animators at Tippett Studio used a combination of keyframe, puppet, and procedural animation techniques within Softimage 3D to orchestrate the movements of 700 nine-foot-tall warrior bugs.

For Starship Troopers, Tippett Studio designed all the creatures, which are giant bugs, and created all 218 creature shots. (Other studios, Sony Pictures Imageworks, ILM, and Boss Film, worked on spaceship effects.) “This is like a World War II movie,” Hayes explains. “The bugs are the enemy.” Nine-foot-tall warrior bugs that resemble scorpions are the generic ground troops, mosquito-like “hoppers” function as an air force, and big tanker bugs that look like beetles act as battering rams. In addition, a 10-story-tall bug launches the bug-equivalent of rockets, and an evil brain bug does unspeakable things to its enemies.

“No matter what you do to them, all they want to do is kill you,” says Jeremy Cantor, an animator who worked on the brain bug. “They don’t think. They just kill.” What do they kill? Humans.

By loosening some joints to make the armature more flexible, Trey Stokes, animation director at Tippett Studio, transformed the warrior bug DID into a digital puppet that was used for real-time animation.

Adds Trey Stokes, animation supervisor: “We tried to give the sense that these are real creatures that exist in the same world as the humans, so they’re living under the same gravity and have to obey the same physical laws. Of course, we cheated mercilessly for the needs of the shots. I guess the only character we injected into them was to give them the sense that they enjoy their job.”

As is typical in most CG effects studios, the production pipeline includes modelers, painters, animators, technical directors for lighting and rendering, rotoscopers, and compositors. The design process starts in the art department–Hayes’ territory. Here, the modelers and painters create the drawings, build the maquettes, digitize the maquettes to create 3D models, and paint texture maps. They built their own 3D digitizer, but they use standard commercial software for modeling and painting: Softimage 3D from Microsoft Softimage (Redmond, WA), Amazon 3D Paint from Interactive Effects (Tustin, CA), and Photoshop from Adobe (Mountain View, CA).

What’s unusual is that all the modelers and painters have come from traditional backgrounds; none had any CG experience before being introduced to computers at Tippett Studio. “We’ve made a specific effort to get people who have production experience or movie experience, and who have done real sculpture, real painting, real mold-making. I would consider it a deficit if somebody didn’t have that experience,” Hayes says.

Nor are their animation techniques always typical of a CG effects studio. Fifteen of the animators worked entirely within Softimage, but four of the animators used a new DID (now called a “Bug Input Device”) for stop-motion animation and to do real-time puppeteering. “Originally the DID was only going to be used for stop motion, but I realized if we just loosened the joints and put some springs on it to make it flex and bend, we could get into real-time animation,” says Stokes. Stokes worked on one of the first real-time CG animations, “Mike the Talking Head,” with performance-animation pioneers Michael Wahrman and Brad DeGraf, and later helped puppet the alien in the movie Species at Boss Film. He and Tippett found the real-time DID helpful for previsualization as well as for final animation.

“Phil and I used the DID to do some early tests, and while we were messing around we improvised a move that became a signature move for the warrior bugs,” says Stokes.

The DID was created for only the warrior bug, since that creature accounts for the bulk of the animation. In one scene, for example, 700 warrior bugs storm the barricades. “That scene was done with a combination of techniques,” says Cantor. “We would do 20 or 30 very long walk cycles and apply those to the 700 bugs.” By adjusting the motion curve slightly for each bug’s cycle, they could make sure no two bugs would be doing the same thing at the same time. Using Alias/Wavefront’s (Toronto) Dynamation software, they would set up motion paths on the terrain, put one cycle on each path, then set up a force field for the objects. That motion would then be moved into Softimage, where it could be tweaked if needed, according to Doug Epps, CG Supervisor.
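The crowd scheme Cantor describes — a handful of walk cycles, offset per individual so no two neighbors step in sync — can be sketched in a few lines. The function names and the bug/cycle counts below are hypothetical stand-ins; the studio's actual Dynamation/Softimage setup is not public.

```python
import random

def crowd_phases(num_bugs, num_cycles, cycle_len, seed=0):
    """Assign each crowd member one of a few shared walk cycles plus a
    random time offset, so identical cycles never play in lockstep.
    Returns a list of (cycle_index, time_offset) pairs."""
    rng = random.Random(seed)  # seeded for repeatable crowds
    return [(rng.randrange(num_cycles), rng.uniform(0.0, cycle_len))
            for _ in range(num_bugs)]

def sample_cycle(cycles, assignment, t):
    """Look up one bug's pose at time t: its assigned cycle, evaluated
    at (t + offset) wrapped around the cycle length."""
    idx, offset = assignment
    cycle = cycles[idx]
    return cycle[int((t + offset) % len(cycle))]
```

With, say, 25 cycles spread over 700 assignments, many bugs share a cycle, but the random offsets keep the repetition from reading on screen.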

With a similar goal in mind, that of tweaking the procedural animation for a scene in which some of the walking warrior bugs needed to look as if they’d been bombed, Stokes wrote a little program that let him blend real-time puppet animation with procedural animation.
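The article doesn't describe how Stokes's tool worked internally; the most natural reading is a per-channel weighted blend between the procedural motion and the live puppet input, which the sketch below illustrates under that assumption.

```python
def blend_channels(procedural, puppet, weight):
    """Linearly blend two lists of animation channel values (e.g. joint
    rotations) sampled at the same frame. weight is clamped to [0, 1]:
    0 = pure procedural walk, 1 = pure real-time puppet input.
    A guess at the spirit of the tool, not its actual code."""
    w = max(0.0, min(1.0, weight))
    return [p * (1.0 - w) + q * w for p, q in zip(procedural, puppet)]
```

Animating the weight over a shot would let a bug walk procedurally, then hand control to the puppeteer for the moment it gets bombed, which matches the stated goal of tweaking procedural motion for specific beats.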

The foreground bugs in the battle scenes were all animated within Softimage, adding a third method of CG animation. “In one shot there were 90 bugs done by hand by three animators, and behind those bugs were another 600 or 700 in the background done procedurally,” says Cantor.

A Natural Mix

This mixture of animation techniques seems only natural to Tippett. “If you have talented computer graphics animators at your disposal, you let them work with the tools they know; if you have stop-motion animators, give them the tools they’re familiar with; if you have decent puppeteers, give them some real-time puppeteering stuff. We want to let every craftsperson work in the milieu that’s the most familiar,” he explains.

The sense of creating a studio that still has a flavor of an artisan’s workshop rather than a data-processing factory is important to Tippett and Hayes, who remain somewhat uncomfortable with computer technology even though they acknowledge its advantages.

“What’s good about computer graphics is that it gives you a million opportunities to go back and fix something,” says Hayes. “And I don’t have as many cuts or abrasions. But the keyboard is really a lousy way to work.”

Tippett agrees that the editorial capability is the main benefit of CG, but close behind is the elimination of physical constraints. “Now we can manipulate the physical world with a plasticity that a painter has in the two-dimensional world.

“But I have a lot of arguments with the technology,” he adds. “I perceive a significant numbing of the workforce, although I’m not exactly sure why. It may be the banality of the activity itself. There’s a level of relaxation that reminds me very much of watching television. I don’t know if it’s a result of anything physiological, but there’s something mesmerizing about the process.”

He continues, “The intensity with which people make things by using their hands can be very exciting, and the level of enthusiasm this engenders helps make the thing more than what it would be if you were just putting something together in a banal way. But if you try to work like that on a computer, you get carpal tunnel syndrome because it’s a repetitive task.”

It isn’t just the change in working methods that bothers Tippett; an attitude change concerns him as well.

“I’m beginning to see a tendency, particularly with people from a computer-graphics perspective rather than a craft- or art-driven perspective, to live in a world where there’s no relationship to consequences,” he says. “They think you can always fix something. You can always change it, edit it. So when your $80,000 piece of junk that you just bought blows up, they just accept it. They don’t get angry.

“I really don’t like it,” Tippett continues. “When anyone is performing in real space, there’s a relationship to everything in the world around him. The architecture and certainly the light and shadow play a great deal into one’s awareness of how one moves, and to not have a shadow–it’s like Peter Pan or something. How do you know what the best dramatic highlight is?

“Computers are getting stronger and more powerful, and you can work in a milieu that approximates the real world, but you’re not getting the whole story,” he says. “Once you put the key light in the right place, the performance you thought was so clear and so clean can look very confusing.”

The picture-making knowledge Tippett and Hayes have to offer is one of the things that drew animator Cantor to Tippett Studio. “It’s great to be in a place where this 45-year-old legend of the industry walks over to your desk and goes, ‘Not like this, like this,’ and physically acts out a warrior bug attack. I’m a much better animator than I was a year ago,” he says.

If what you want to see is a bug of a different size, why not just redraw it right on the screen, as Phil Tippett is doing for this scene in Starship Troopers?

Adds Stokes: “It’s often difficult to get computer animators to stand up and act out, so part of our job is to teach them how to get up and move around. Phil will get down on all fours and start crawling around in front of your desk.”

“When Phil starts doing a demo, you feel embarrassed for not doing it,” Cantor says. “Phil is the exact opposite of a guy in a suit who makes all the decisions. He works with us. You don’t feel like you work for him so much, although of course you do. At the end of the day, he’s the dad, you know.”

That role might soon change. For the next two movies, Universal’s Virus and Disney’s My Favorite Martian, Roman hopes she and Phil will take less of a hands-on role. The crew seems ready. “The studio has been debugged and we’ve gone through our first mission together,” says Stokes. “It’s really a tight organization now.”

Phil Tippett and Adam Valdez; as dinosaur supervisor for Jurassic Park, Phil Tippett made the transition from traditional stop-motion techniques to CG animation.

What will Tippett and Roman do now that they’ve more than weathered the storm? “I think we will be more like ‘God-mothering’ and ‘God-fathering,’ and that will be a good thing,” Roman says. “It’s time for Phil to step back a bit to do something he wants to do on his own. The next step is not to be a major part of a movie but to be the movie. I think Phil has that experience, and that is the next logical thing to do.”

And that will put Tippett fully back on track, something he didn’t expect when he first learned that ILM would use CG for Jurassic Park. “I’m exhausted,” he says. “But it’s worked out pretty well.”

hat tip: http://www.angelfire.com/film/philtippett/articles/realtroopers.html