Friday Flashback #339



For Resident Evil 4, Capcom Co., Ltd. put SOFTIMAGE|XSI at the core of its development pipeline to establish an efficient workflow, manage huge volumes of data, and create seamlessly integrated visual sophistication from event scenes to game play.

Even if you think that you’re ready to be thrilled, terrified and amazed, Resident Evil 4 might still be too much for you. It’s one fast-paced, hair-raising, visually incredible game that’s sure to be on everyone’s list for 2005 and beyond. This is definitely not your big brother’s game.

Resident Evil 4 Uses SOFTIMAGE|XSI to Redefine Survival Horror

By Alexandra Pasian

On January 27th, 2005, Capcom Co. Ltd released the latest installment in their Resident Evil franchise and redefined the survival horror genre. With spellbinding visuals, three-dimensional game play and dynamic camera work, Resident Evil 4 will have you seeing and experiencing things that you’ve never seen or felt before. Yoshiaki Hirabayashi, a Designer at Capcom, talked to us about the major role that SOFTIMAGE|XSI played in the franchise’s new look and feel.


In the past, Capcom used both SOFTIMAGE|XSI and SOFTIMAGE|3D for the development of the Resident Evil franchise, including on Resident Evil for GameCube. For Resident Evil 4, however, the development environment was migrated exclusively to SOFTIMAGE|XSI for everything from character modeling and animation to the outputting of scene data to the actual equipment.

When asked why Capcom selected SOFTIMAGE|XSI as their main creation tool, Hirabayashi explains: “The most obvious advantage to XSI is the fact that it builds on our already substantial knowledge and experience with SOFTIMAGE|3D. In addition, we have developed a real trust in the product through the support that Avid Technology offers. And, ultimately, we know that you have to select high quality tools if you want to create high quality games.”

In order to achieve the quality that they wanted for Resident Evil 4, the team at Capcom first had to concentrate on their workflow. With more than ten times the amount of content of other installments in the series, the team had to be confident in their development pipeline. In the end, the content for the game was completed faster than usual because the developers at Capcom created an efficient workflow using SOFTIMAGE|XSI as the core of their pipeline.

With an environment that enables outputting to actual equipment, SOFTIMAGE|XSI helped to make Resident Evil 4 a reality. In their game development environment, Capcom also used the Animation Mixer in XSI to manage the volume of data, such as motion data and camera animation, that was necessary for game creation.

In order to export the scene data to the game, for example, the data to be output from the Animation Mixer and the character nodes had to be selected in such a way that, after pressing a single button, the data could be played on the actual equipment.

Even though such operations normally require five to six steps, the developers were able to customize XSI so that, by coordinating VBScript and a proprietary tool, these operations were done in one step. This meant that the designers could play a scene in a split second without paying attention to the program running behind XSI.
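The one-button setup can be illustrated with a small sketch: several manual steps chained behind a single entry point. The function and data names below are hypothetical, invented for illustration; they are not the actual XSI/VBS API, which the article does not document.

```python
# Hypothetical sketch of a one-button export in the spirit of Capcom's
# VBScript + proprietary-tool setup. All names are illustrative, not the XSI SDK.

def select_character_nodes(scene):
    # Step 1: pick out the nodes whose data must be exported.
    scene["selected"] = [n for n in scene["nodes"] if n["type"] == "character"]
    return scene

def gather_mixer_clips(scene):
    # Step 2: collect motion and camera clips from the Animation Mixer.
    scene["clips"] = [c for n in scene["selected"] for c in n["clips"]]
    return scene

def write_console_data(scene):
    # Step 3: convert clips to the console's format (stubbed as strings here).
    scene["output"] = [f"{c['name']}@{c['frames']}f" for c in scene["clips"]]
    return scene

def send_to_devkit(scene):
    # Step 4: hand the converted data to the actual equipment.
    scene["status"] = "playing on devkit"
    return scene

def one_button_export(scene):
    """The designer presses one button; every step runs in order."""
    for step in (select_character_nodes, gather_mixer_clips,
                 write_console_data, send_to_devkit):
        scene = step(scene)
    return scene

scene = {"nodes": [
    {"type": "character", "clips": [{"name": "walk", "frames": 30}]},
    {"type": "prop", "clips": []},
]}
result = one_button_export(scene)
# result["output"] == ["walk@30f"]; result["status"] == "playing on devkit"
```

The point is not the stub bodies but the shape: once each manual operation is a callable step, collapsing five or six of them into one button is a loop.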

According to Hirabayashi: “The Animation Mixer in XSI is very intuitive, allowing the designers to easily understand the interface. And our developers appreciate the open and flexible environment. Because of all the advantages XSI has to offer, we were able to produce content faster and with better quality that surpassed even the director’s expectations.”


According to Hirabayashi: “There were big changes and big challenges on Resident Evil 4 as compared to previous installments. The toughest challenge involved creating all of the cinematic portions of the game as in-game cut scenes.” The team at Capcom used in-game cut scenes to create the cinematic content for the game so that the game portions and event portions would tie together seamlessly. By employing in-game cut scenes, the team was able to reduce the discrepancies in visual quality between game and cinematics, which, they felt, would allow players to concentrate on their game play. And they were absolutely right.

Resident Evil 4 has players on the run for their lives. It is wonderfully scary and offers some of the best graphics out there. What’s more, it boasts game play that is so intense that it prompted one reviewer to say: “You don’t own Resident Evil 4, it owns you.” This tension is due, in no small part, to the fact that the event scenes and game portions of Resident Evil 4 fit so well together visually.

To achieve this visual cohesion, the team at Capcom believed that they had to make sure that their in-game cut scenes had the same visual quality as a pre-rendered movie. In order to achieve the look they wanted, the team at Capcom turned to SOFTIMAGE|XSI.


Being able to control the volume of data was extremely important on this project. The team knew that they had to limit the number of polygons used in modeling, but also that reducing the polygon count—in order to make room for the right amount of texture data, for example—would result in a loss of lighting quality. Achieving and keeping the right balance between quality and data volume was quite a challenge. According to Hirabayashi, the team met this challenge using SOFTIMAGE|XSI.

“For the process of controlling the volume of data, we have to thank the powerful polygon modeling functionality in SOFTIMAGE|XSI that allowed us to quickly edit the model data. And, since XSI allowed us to make small edits to texture easily using such features as UV development, we were able to maintain the ideal quality for our cinematics and were also able to control the volume of data. This project would not have been possible without SOFTIMAGE|XSI.”
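The trade-off Hirabayashi describes is essentially a memory-budget problem: every polygon removed frees room for texture data, and vice versa. A minimal sketch, with invented byte sizes and budgets (not actual GameCube figures):

```python
# Illustrative sketch of the polygon/texture trade-off described above.
# The byte counts and budget are invented for the example.

VERTEX_BYTES = 32    # assumed size of one vertex in memory
TEXEL_BYTES = 0.5    # assumed size of one compressed texel

def model_cost(vertices, texture_pixels):
    # Total data volume for one character model.
    return vertices * VERTEX_BYTES + texture_pixels * TEXEL_BYTES

def fits_budget(vertices, texture_pixels, budget_bytes):
    return model_cost(vertices, texture_pixels) <= budget_bytes

budget = 1_000_000  # hypothetical per-character budget

# A dense mesh with a small texture fits...
dense = fits_budget(25_000, 256 * 256, budget)      # 800_000 + 32_768
# ...and so does a leaner mesh with a much larger texture...
lean = fits_budget(10_000, 1024 * 1024, budget)     # 320_000 + 524_288
# ...but you cannot have both at once.
too_big = fits_budget(25_000, 1024 * 1024, budget)
```

Keeping both quality and data volume in range means moving along this curve per asset, which is why fast polygon editing and easy texture tweaks mattered so much to the team.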


Friday Flashback #335

Interview with Joseph Kasparian
Textures & Lighting Lead on 300 at Hybride, Joseph Kasparian talks about creating 540 visual effects shots, the production process, and how to work on movies shot entirely on green screen.
March 8th, 2007, by Raffael Dickreuter

Joseph Kasparian, Textures &
Lighting Lead at Hybride.

Tell us how and why you got started in the cg industry?
When I was 18, I dreamed of becoming the lead guitarist of a big band that would tour around the world or becoming a professional skateboarder. For some enigmatic reason, I ended up in Finance at the University of Montreal (HEC). But the day I saw the T-1000 in Terminator 2, I realized what I really wanted to do for a living, and that was special effects.

In 1992, a close friend of mine introduced me to the world of computer graphics. I realized that the possibilities were endless. I kept on studying full time in Finance but was spending 6 to 8 hours a day learning 3D at home. It was the biggest hobby I ever had.

When Jurassic Park came out, I heard the professional software used for the dinosaurs was Softimage 3D and that it was made in my hometown, Montreal. My dream of working in that field was more possible than ever. As soon as I got my degree in 1996, I took a specialized course in 3D animation at the NAD Centre. Once I finished, I got a job at Hybride in 1997 as a 3D animator. It’s been 10 years now.

What do you like to do in your spare time?
From what I remember, I used to play guitar, mountain bike and snowboard. But now I have a 4-year-old son and I’m trying to spend most of my spare time with him and my wife. So the correct answer would be: I play with Legos and Transformers and make time to not miss the incredible kids’ channel shows.

Tell us about your responsibilities on 300 as Lighting and Texturing Supervisor
As the Textures and Lighting lead, I have to evaluate the complexity of each sequence with the supervisors. I establish procedures to speed up the artists’ work, which includes constant R&D on new techniques to texture and light scenes. I make sure the outputs of my team suit the compositors’ needs. I guide my colleagues technically so that they follow the art direction and keep the composition in place. And finally, I take care of delivering some of the more complex shots.

What were the biggest challenges in order to deliver the desired look on this production?
We had to transform the very stylistic look of the renowned American comic book author into film: silhouetted images, painted skies with brush-stroke effects, contrasting colors, charcoal blacks, and so on. For the environments created, each structure had to be an obvious part of Frank Miller’s surreal world before being photo-real. The Hot Gates walls and skies were probably the most important aspects to define before starting production. It was crucial that each vendor respected the exact same look. To help that process, we received detailed documentation and concept art about the look we had to achieve.

The wolf is another good example of the style they were looking for: a surreal beast in a photo real environment developed with an artistic touch. In the graphic novel, the wolf appears only as a huge black silhouette with red eyes. We kept those elements and integrated skin, muscles and hair to go a few steps closer to the real world.


What custom tools or techniques were used especially in the area of lighting?
Many lighting techniques were used depending on the location. At first, we had to rely on precise layouts. Once we had client approval, and depending on the number of shots, we chose to move forward with regular textures-and-lighting techniques or matte paintings.

We created high dynamic range environments for each set. The effort on textures was critical to accomplishing the movie’s style. In fact, we had to reduce the contrast levels on the 3D side and bring them back with the live footage in the final comp. The combination of flat textures with dynamic lighting helped us deliver images that allowed great flexibility for extensive color-correction sessions.
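The round trip described here (rendering flat, then restoring contrast in the comp) can be sketched as a simple scale around mid-grey. The 0.5 and 2.0 factors below are illustrative, not Hybride’s actual values:

```python
# Sketch of the contrast round trip: renders are kept flat so they grade
# flexibly, and contrast is restored at comp time. Factors are illustrative.

def scale_contrast(value, factor, pivot=0.5):
    """Scale contrast around a mid-grey pivot (values in 0..1)."""
    return pivot + (value - pivot) * factor

def flatten_for_render(value):
    return scale_contrast(value, 0.5)   # low-contrast 3D output

def restore_in_comp(value):
    return scale_contrast(value, 2.0)   # inverse applied in the comp

v = 0.9
flat = flatten_for_render(v)       # ≈ 0.7
restored = restore_in_comp(flat)   # ≈ 0.9, back where it started
```

Because the flattening is invertible, the compositors can push the grade hard in either direction without the 3D renders clipping first.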

The textures created were either a mixture of procedurals built in DarkTree and called within XSI, or a careful use of quality pictures taken from the actual set.
The matte paintings were developed over shaded scenes lit using final gathering or light rigs.

For many locations, specific light rigs were built at a very early stage, and we used internal scripts to automate the creation of the passes required by the compositing department.
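A minimal sketch of what such pass-creation scripting might look like; the pass and rig names are invented for illustration and are not Hybride’s actual pipeline:

```python
# Hypothetical sketch of scripting pass creation per light rig, in the
# spirit of the internal scripts mentioned above. Names are illustrative.

PASS_TYPES = ["beauty", "diffuse", "specular", "shadow", "depth", "matte"]

def passes_for_rig(rig_name, pass_types=PASS_TYPES):
    """Return one named output pass per type for a given light rig."""
    return [f"{rig_name}/{p}" for p in pass_types]

def build_all_passes(rigs):
    # One call sets up every pass the compositing department expects.
    return {rig: passes_for_rig(rig) for rig in rigs}

passes = build_all_passes(["hot_gates_day", "hot_gates_storm"])
# passes["hot_gates_day"][0] == "hot_gates_day/beauty"
```

Automating this keeps the pass naming consistent across locations, which is exactly what a compositing department depends on.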

Besides the comic book, what inspirations were used for this kind of look?
Sin City and Sky Captain and the World of Tomorrow were without a doubt a major influence because they had allowed us to work on very stylized movies shot entirely on green screen. We acquired great confidence and expertise in that kind of movie.

One of the biggest sources of inspiration was the artwork sent by the production. It definitely captured the essence of the comic book and gave us a great head start.
The detailed and very technical documents from the art department were essential to properly define each location and to develop the appropriate look for the backgrounds and skies. Each section was defined by a specific color palette, to the great pleasure of the artists.


How long was the production schedule for the team at Hybride?
Hybride produced 540 visual effects shots for a total of 45 minutes. A total of 95 Hybride employees worked on the project for 16 months: 45 in the 3D department, 35 in the compositing department and 15 in administration and technical support.

We had a wide variety of shots to do:

  • Animation: wolf, warriors (Spartans, Persians, Free Greeks, Immortals), troops animation (thousands of warriors), whip, banners, swords, weapons, spears, axes, etc…
  • Virtual Environments: backgrounds for all of our scenes, mountains, cliffs, plains, valleys, winter sceneries, oceans, skies (clouds, lightning, moon), etc.
  • Particles: Snow, embers, fire, rain, blood, smoke, dust, dirt.

Was the film entirely shot on blue screen? To what extent is the film live action, and to what extent CG?
Because the film was shot entirely on a blue-screen background, each step of post-production required a considerable amount of work. The live footage included the heroes and the ground; everything else was CG.

How was the live-action footage processed to not only blend with the CG elements but at the same time look stylized?
We had to start by matching the 3D environment to the live footage so that they appeared to have been shot simultaneously. Once the integration was done, we stylized the whole image with the help of different passes (mattes, depth fading, etc.). The extensive use of color correction definitely eased the integration of the live footage with the CG elements.

How was XSI used in this production? 
XSI was used to generate almost every 3D element. The software was very efficient for creating complex environments, doing character animation and managing army scenes with more than 100,000 soldiers.

Which features were especially useful?
The ability to build solid output pipelines with the creation of customizable passes was without any question a very important feature in XSI.

These features were also very useful:
1. GATOR, to transfer any surface attributes
2. Ultimapper, to grab normal maps and occlusion maps
3. The Render Tree, to build complex shaders
4. The FX Tree, to do practically everything with image clips

Which areas of the software should be improved?
The procedurals could be improved in XSI. More than ever, we are asked to generate full-CG environments, and procedurals are very efficient for that need. It would also be nice to improve the Texture Editor with tools such as UV Layout’s flatten tool and Deep UV’s relax tool. And since Pixologic’s ZBrush is now used everywhere, it would be interesting to include a ZTool reader in XSI. That would allow us to interactively change a mesh’s resolution in a different way than with referenced models.

Do you think we will see more movies in the future like this and what possibilities do you see in this kind of filmmaking?
More and more movies are being shot entirely on green screen. The director can do what he wants with each shot, and I think this way of filming will become a standard. The big advantage is the infinite creative freedom that lasts long after shooting is finished. For 300, the graphic quality quickly takes precedence over the complex, technical process of generating the images, which allows the audience to plunge into Frank Miller’s fantasy world. It is obvious that a new trend of film is emerging, one that joins more than ever the art of telling a story with the art of drawing that story.

Is there anything you would like to say to the rest of the cg community?
Nothing is more thrilling than working in a field in constant explosion. Not evolution, explosion. Each year, I’m overwhelmed by the images produced by the movie and the game industry. All technical borders are falling apart. The tools are more efficient and the artists more talented. There is no doubt that everything that made this profession exciting ten years ago is true more than ever and I’m delighted to be part of it.


Hybride near Montreal, Canada.


Friday Flashback #334

From “Prototyping 3D Games: Lesson learned from Riven,” in the March 1998 issue of Game Developer Magazine.

“Softimage’s tools are really flexible, and are one of the biggest strengths of that whole package I think,” said Richard of the 3D application used to create RIVEN. “A lot of these animations were so complex in terms of the geometry that we knew we were only going to have one shot at fully rendering this thing — it was just going to take so much time. So we really tried to make sure that we had seen it as many times as possible in its various primitive stages, including wireframe and shaded views.”

For example, even with four SGI servers, the submarine adventures at the bottom of Jungle Island Bay lived in the queue for months because the atmospheric shaders and reflections caused the  animations to creak out very slowly, frame by frame.

Also in the March 1998 edition, a review of 3D Studio MAX R2, with a list of competitors with estimated prices:

  • Softimage 3D ($7,500)
  • Softimage 3D Extreme ($15,000)



Friday Flashback #331


Studio Ghibli Changes Everything with Spirited Away

… since 1995, the Studio Ghibli 3D team, armed with SOFTIMAGE®|3D, have been more than helping out with the visuals. The full transition from traditional ink & paint techniques and shooting to digital I & P and compositing was made in 1997.

by Michael Abraham

People have come to expect miracles from Hayao Miyazaki. Since he co-founded Studio Ghibli (with lifelong colleague and sometime creative collaborator Isao Takahata) in 1985, the now-revered anime director has been the creative force behind a long list of animated films that simultaneously manage to be intensely thoughtful, critically acclaimed and hugely successful. Any filmmaker – hell, any artist – can tell you how difficult it is to hit all three points. Miyazaki’s latest offering hits all three harder than ever before.

Miyazaki’s formula, if you can call it that, involves using dazzling visuals and engaging fables to suspend our disbelief, thereby clearing the way for some truly trenchant insights. The stories and insights are Miyazaki’s idea, but since 1995, the Studio Ghibli 3D team, armed with SOFTIMAGE®|3D, have been more than helping out with the visuals. The full transition from traditional ink & paint techniques and shooting to digital I & P and compositing was made in 1997.

“We are a traditional animation production studio,” says Mitsunori Kataama, 3D-CG Supervisor at Studio Ghibli. “There are about 150 people presently working here. Within that group, we have three sections using computers for production – ten people work on ink and paint, four in compositing and seven of us in 3D-CG.  We mainly use Silicon Graphics workstations, with over thirty CPUs, including those used as servers. We also use Linux and Mac OS computers.”

That setup makes for an immensely clever, and ultimately virtuous, method, and it is employed to great effect in Miyazaki’s most recent film. Set in modern-day Japan, Spirited Away (or Sen to Chihiro no Kamikakushi, the Japanese title) joins the daily life of ten-year-old Chihiro, a somewhat spoiled and ill-tempered girl unhappy to be moving to a new town with her family.

On their way to their new home, Chihiro and her family pass through a mysterious tunnel only to find themselves in a world not of their choosing. When her hungry parents mistakenly eat food reserved for the gods, they are suddenly transformed into pigs, leaving Chihiro as their only hope. A great many things change in this new land: a young boy becomes a dragon, an origami bird transforms into a witch and a filthy bather is reincarnated as a river god. Even Chihiro is forced to barter her real name for her survival with the evil witch Yu-baaba, who gives her the more generic sounding Sen in its place. To rescue her parents and regain her name, Sen must also change from a frightened little girl into a courageous heroine.

In creating Spirited Away, Miyazaki claims to have been making a gift specifically for his friend’s daughters, all of whom were about 10 years old at the time he got the idea. After two years and a painstaking blend of traditional cel animation and seamlessly integrated digital technology, however, it seems that his gift is being shared by just about everybody. At the time of this writing, Spirited Away is poised to overtake James Cameron’s Titanic as the single most successful film ever shown in Japan.

Although Studio Ghibli works pretty exclusively on feature animations, with the occasional short thrown in for good measure, Spirited Away was a big job even by their standards. All of the animation, backgrounds, compositing and 3D work were accomplished in-house. Working diligently on 100 of the movie’s 1400 scenes, Kataama and his team dealt primarily with complicated scenes impossible to create solely by hand, including intense 3D camera work and object animation.

“We used several different techniques,” says Kataama matter-of-factly. “We added depth information to original 2D images by mapping hand-drawn backgrounds onto 3D models. In the end, we also used SOFTIMAGE|3D to calculate a reflection and a highlight component, which we then added to the hand-drawn background. We also developed a unique 2D texture shader, so we could have a multiple-position camera-texture projection for mapping our background image. We have also developed a plug-in to make changing a particular field of vision much easier.”

Another significant challenge faced by the Studio Ghibli 3D team involved the creation of a realistic, ever-changing sea surface, which required the in-house development of another 2D texture shader and several material shaders. According to Kataama:

“To accurately express the look of the waves, we created a 2D texture shader that would generate a procedural texture. We really appreciate that SOFTIMAGE|3D offers such a valuable environment for developing new functions. The high-quality rendering result was extremely effective in our efforts to draw rays that would act as both reflections and highlights. For that, we were very happy to have the Ray Tracer, which we could not find anywhere else.”

Kataama pauses reflectively before continuing. “Where I used to work, we used separate in-house applications for editing modeling, animation, and texture. When I joined Studio Ghibli, SOFTIMAGE|3D immediately enabled me to do everything in an integrated environment. Even an animator working on his first 3D project can do sophisticated animation work with it right away.”

Looking to the future, Kataama and Studio Ghibli have great plans for SOFTIMAGE|XSI™. Although they are still in the evaluation phase, Kataama has already seen enough to know what will be particularly useful.

“In the coming year, we are planning to switch all work to SOFTIMAGE|XSI,” he explains patiently. “So far, we have been most impressed with Render Passes. In our work, we do final image control at the compositing stage, so it is a big help that Render Passes can separate 3D into various elements. In the past, we needed to prepare scene data when rendering, but using Render Passes means we can make multiple materials from one scene. I’ve also had a chance to look at the Render Tree, which I found very easy to use. I was very happy because even I can create shaders, even though I have no programming skills. We also have high expectations for the Subdivision Surfaces functions.”

Although they have still to evaluate the animation functions in SOFTIMAGE|XSI, Kataama and his Studio Ghibli team already know that the Animation Mixer will soon be coming in handy:

“We are planning to create a human crowd,” says Kataama. “What we have in mind is likely impossible without the Animation Mixer.  We are also looking forward to the new Toon Shader, which will help us to create an even better hand-drawn animation look.”

And, no doubt, another Miyazaki masterpiece. A film by any other name would never look this great.

Friday Flashback #330

Computer Graphics World, September 1997

by Barbara Robertson

In late 1991, when Phil Tippett and his colleagues at Tippett Studio (Berkeley, CA) began thinking about the stop-motion animation they expected to do for their next big movie, they had no idea that the movie, Jurassic Park, would instead signal the end of traditional stop-motion animation at Tippett Studio.

“We were talking about how we’d have to build new rigs, how we’d expand,” says Jules Roman, producer and co-owner of Tippett Studio along with her husband, Phil Tippett. “But meanwhile, Dennis Muren at Industrial Light & Magic had been working in CG and showed a test shot to Steven [Spielberg].

“I remember Phil had a cold and wasn’t feeling very good at the time,” says Roman. “And then Dennis called and said, ‘Well, Phil, we’re going in a different direction. We’re going to do all CG.’ Phil promptly got pneumonia and took to his bed.”

“I was completely devastated,” Tippett says. “The CG models held up; they yielded a new look and fluid motion. I got horribly sick.”

One of the alien insects created by Tippett Studio for the TriStar production of Paul Verhoeven’s Starship Troopers.

Little did he know that in approximately five years he would describe his studio as 99.99% dependent upon computer graphics; that by August, 1997, Tippett Studio would have completed a 2 1/2 year-long movie project in which all the character animation would be created with CG. That movie, Paul Verhoeven’s Starship Troopers, based on Robert Heinlein’s novel, is scheduled for release this fall.

Phil Tippett began his career in visual effects with George Lucas’ Star Wars, released in 1977. In 1983, he became head of the Lucasfilm creature shop and won an Academy Award for his work on Return of the Jedi, which included the memorable character “Jabba the Hutt” and other aliens. In 1985, he founded Tippett Studio, which created creatures for a string of movies, including Willow, resulting in an Academy Award nomination, Ghostbusters 2, all three Robocop movies, and Honey, I Shrunk the Kids.

Art director Craig Hayes began work on armatures such as this to give stop-motion animators a way to practice their skill in the CG world without having to learn and use 3D animation software. On the table in front of Hayes is the DID used at Tippett Studio to control the onscreen warrior bug for the movie Starship Troopers.

As it turned out, Tippett was to play a large role in Jurassic Park as well and would ultimately share an Academy Award for that movie. Fortunately, Craig Hayes, art director and visual effects supervisor at Tippett Studio, had been working on an armature device to feed stop-motion animation data into Softimage’s 3D animation software. And Dennis Muren was interested in keeping Tippett involved in the project. The Dinosaur Input Device, or DID, as the armature came to be called, was used for 15 of the 52 computer animation shots in Jurassic Park, including the road sequence in which the T-Rex attacks the tourists’ jeep and the kitchen sequence in which the two velociraptors hunt the children.

“We’d send ILM motion data, and they’d send us back the rendered dailies,” Roman says. “It pointed the way to our future.” (This year Hayes, Brian Knep, Rick Sayre, and Thomas Williams received a Technical Achievement Award from the Academy of Motion Picture Arts and Sciences for the now renamed “Digital Input Device.”)

Time For A Change

Around the time Jurassic Park was playing in movie theaters, Tippett received the first screenplay for Starship Troopers. It was time to make the big transition to computer graphics.

“You can’t do CG on your own in a garage with a camera and a puppet,” says Roman. “CG is highly technical, it requires all kinds of infrastructure, and it’s extraordinarily expensive. Our business went from a build-up, tear-down kind of thing into something where we’re only talking about build up.”

The 15-person crew that worked on Jurassic Park grew to around 30 people when the studio got a contract to create two CG sequences for the movie Three Wishes. “It was a very quiet movie,” says Roman, “but it gave us a fantastic opportunity to get our CG legs.” Meanwhile, they began working on tests for Starship and began hiring people–eventually tripling in size. The final “test” was an eight-shot mini movie directed by Paul Verhoeven. “After that, we felt pretty confident we could do the production we had been doing but with the computers,” says Hayes, who has been with the studio since its early days.

For a Starship Troopers battle scene, animators at Tippett Studio used a combination of keyframe, puppet, and procedural animation techniques within Softimage 3D to orchestrate the movements of 700 nine-foot-tall warrior bugs.

For Starship Troopers, Tippett Studio designed all the creatures, which are giant bugs, and created all 218 creature shots. (Other studios, Sony Pictures Imageworks, ILM, and Boss Film, worked on spaceship effects.) “This is like a World War II movie,” Hayes explains. “The bugs are the enemy.” Nine-foot-tall warrior bugs that resemble scorpions are the generic ground troops, mosquito-like “hoppers” function as an air force, and big tanker bugs that look like beetles act as battering rams. In addition, a 10-story-tall bug launches the bug-equivalent of rockets, and an evil brain bug does unspeakable things to its enemies.

“No matter what you do to them, all they want to do is kill you,” says Jeremy Cantor, an animator who worked on the brain bug. “They don’t think. They just kill.” What do they kill? Humans.

By loosening some joints to make the armature more flexible, Trey Stokes, animation director at Tippett Studio, transformed the warrior bug DID into a digital puppet that was used for real-time animation.

Adds Trey Stokes, animation supervisor: “We tried to give the sense that these are real creatures that exist in the same world as the humans, so they’re living under the same gravity and have to obey the same physical laws. Of course, we cheated mercilessly for the needs of the shots. I guess the only character we injected into them was to give them the sense that they enjoy their job.”

As is typical in most CG effects studios, the production pipeline includes modelers, painters, animators, technical directors for lighting and rendering, rotoscopers, and compositors. The design process starts in the art department–Hayes’ territory. Here, the modelers and painters create the drawings, build the maquettes, digitize the maquettes to create 3D models, and paint texture maps. They built their own 3D digitizer, but they use standard commercial software for modeling and painting: Softimage 3D from Microsoft Softimage (Redmond, WA), Amazon 3D Paint from Interactive Effects (Tustin, CA), and Photoshop from Adobe (Mountain View, CA).

What’s unusual is that all the modelers and painters have come from traditional backgrounds; none had any CG experience before being introduced to computers at Tippett Studio. “We’ve made a specific effort to get people who have production experience or movie experience, and who have done real sculpture, real painting, real mold-making. I would consider it a deficit if somebody didn’t have that experience,” Hayes says.

Nor are their animation techniques always typical of a CG effects studio. Fifteen of the animators worked entirely within Softimage, but four of the animators used a new DID (now called a “Bug Input Device”) for stop-motion animation and to do real-time puppeteering. “Originally the DID was only going to be used for stop motion, but I realized if we just loosened the joints and put some springs on it to make it flex and bend, we could get into real-time animation,” says Stokes. Stokes worked on one of the first real-time CG animations, “Mike the Talking Head,” with performance-animation pioneers Michael Wahrman and Brad DeGraf, and later helped puppet the alien in the movie Species at Boss Film. He and Tippett found the real-time DID helpful for previsualization as well as for final animation.

“Phil and I used the DID to do some early tests, and while we were messing around we improvised a move that became a signature move for the warrior bugs,” says Stokes.

The DID was created for only the warrior bug since that creature accounts for the bulk of the animation. In one scene, for example, 700 warrior bugs storm the barricades. “That scene was done with a combination of techniques,” says Cantor. “We would do 20 or 30 very long walk cycles and apply those to the 700 bugs.” By adjusting the motion curve slightly for each bug’s cycle, they could make sure no two bugs would be doing the same thing at the same time. Using Alias/Wavefront’s (Toronto) Dynamation software, they would set up motion paths on the terrain, put one cycle on each path, then set up a force field for the objects. That motion would then be moved into Softimage where it could be tweaked if needed, according to Doug Epps, CG Supervisor.
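The cycle-reuse idea, a small library of walk cycles spread over 700 bugs with a per-bug offset and retiming so no two move in lockstep, can be sketched like this (all numbers are illustrative, not taken from the production):

```python
import random

# Sketch of the crowd technique described above: each bug reuses one of a
# small library of walk cycles, shifted and retimed so the crowd never syncs.

def assign_crowd_motion(num_bugs=700, num_cycles=25, seed=42):
    rng = random.Random(seed)  # fixed seed keeps the assignment repeatable
    crowd = []
    for bug in range(num_bugs):
        crowd.append({
            "bug": bug,
            "cycle": rng.randrange(num_cycles),   # which walk cycle to reuse
            "offset": rng.uniform(0.0, 1.0),      # phase shift along the curve
            "speed": rng.uniform(0.95, 1.05),     # slight retiming of the cycle
        })
    return crowd

crowd = assign_crowd_motion()
# Bugs sharing a cycle still step out of phase thanks to distinct offsets.
same_cycle = [b for b in crowd if b["cycle"] == crowd[0]["cycle"]]
```

Twenty-odd hand-animated cycles plus randomized phase and speed is enough variety that the eye reads 700 individuals rather than one looping clone.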

With a similar goal in mind, tweaking the procedural animation for a scene in which some of the walking warrior bugs needed to look as if they’d been bombed, Stokes wrote a small program that let him blend real-time puppet animation with procedural animation.
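The article doesn’t describe how Stokes’s program worked internally, but the standard way to blend two animation sources is a per-joint weighted average between the procedural pose and the live puppet pose. The sketch below illustrates that general technique under those assumptions; the function and joint names are hypothetical.

```python
def blend_pose(procedural, puppet, weight):
    """Blend a procedural pose with a real-time puppet pose.

    Poses are dicts mapping joint names to angle values.
    weight = 0.0 keeps the pure procedural walk; weight = 1.0 hands
    the joint entirely to the puppeteer's input. Intermediate values
    let a performer layer a stagger or flinch over a walk cycle.
    """
    return {joint: (1.0 - weight) * procedural[joint] + weight * puppet[joint]
            for joint in procedural}

# Example: overlay 25% of a puppeteered lurch onto a procedural step.
frame = blend_pose({"hip": 0.0, "knee": 30.0},
                   {"hip": 10.0, "knee": 50.0},
                   0.25)
```

Ramping the weight up over a few frames as a bug is hit, then back down, would let the procedural walk carry the crowd while the puppeteer supplies only the reaction.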

The foreground bugs in the battle scenes were all animated within Softimage, adding a third method of CG animation. “In one shot there were 90 bugs done by hand by three animators, and behind those bugs were another 600 or 700 in the background done procedurally,” says Cantor.

A Natural Mix

This mixture of animation techniques seems only natural to Tippett. “If you have talented computer graphics animators at your disposal, you let them work with the tools they know; if you have stop-motion animators, give them the tools they’re familiar with; if you have decent puppeteers, give them some real-time puppeteering stuff. We want to let every craftsperson work in the milieu that’s the most familiar,” he explains.

The sense of creating a studio that still has a flavor of an artisan’s workshop rather than a data-processing factory is important to Tippett and Hayes, who remain somewhat uncomfortable with computer technology even though they acknowledge its advantages.

“What’s good about computer graphics is that it gives you a million opportunities to go back and fix something,” says Hayes. “And I don’t have as many cuts or abrasions. But the keyboard is really a lousy way to work.”

Tippett agrees that the editorial capability is the main benefit of CG, but close behind is the elimination of physical constraints. “Now we can manipulate the physical world with a plasticity that a painter has in the two-dimensional world.

“But I have a lot of arguments with the technology,” he adds. “I perceive a significant numbing of the workforce, although I’m not exactly sure why. It may be the banality of the activity itself. There’s a level of relaxation that reminds me very much of watching television. I don’t know if it’s a result of anything physiological, but there’s something mesmerizing about the process.”

He continues, “The intensity with which people make things by using their hands can be very exciting, and the level of enthusiasm this engenders helps make the thing more than what it would be if you were just putting something together in a banal way. But if you try to work like that on a computer, you get carpal tunnel syndrome because it’s a repetitive task.”

It isn’t just the change in working methods that bothers Tippett; an attitude change concerns him as well.

“I’m beginning to see a tendency, particularly with people from a computer-graphics perspective rather than a craft- or art-driven perspective, to live in a world where there’s no relationship to consequences,” he says. “They think you can always fix something. You can always change it, edit it. So when your $80,000 piece of junk that you just bought blows up, they just accept it. They don’t get angry.

“I really don’t like it,” Tippett continues. “When anyone is performing in real space, there’s a relationship to everything in the world around him. The architecture, and certainly the light and shadow, play a great deal into one’s awareness of how one moves, and to not have a shadow, it’s like Peter Pan or something. How do you know what the best dramatic highlight is?

“Computers are getting stronger and more powerful, and you can work in a milieu that approximates the real world, but you’re not getting the whole story,” he says. “Once you put the key light in the right place, the performance you thought was so clear and so clean can look very confusing.”

The picture-making knowledge Tippett and Hayes have to offer is one of the things that drew animator Cantor to Tippett Studio. “It’s great to be in a place where this 45-year-old legend of the industry walks over to your desk and goes, ‘Not like this, like this,’ and physically acts out a warrior bug attack. I’m a much better animator than I was a year ago,” he says.

[Caption: If what you want to see is a bug of a different size, why not just redraw it right on the screen, as Phil Tippett is doing for this scene in Starship Troopers?]

Adds Stokes: “It’s often difficult to get computer animators to stand up and act out, so part of our job is to teach them how to get up and move around. Phil will get down on all fours and start crawling around in front of your desk.”

“When Phil starts doing a demo, you feel embarrassed for not doing it,” Cantor says. “Phil is the exact opposite of a guy in a suit who makes all the decisions. He works with us. You don’t feel like you work for him so much, although of course you do. At the end of the day, he’s the dad, you know.”

That role might soon change. For the next two movies, Universal’s Virus and Disney’s My Favorite Martian, Roman hopes she and Phil will take less of a hands-on role. The crew seems ready. “The studio has been debugged and we’ve gone through our first mission together,” says Stokes. “It’s really a tight organization now.”

[Caption: Phil Tippett and Adam Valdez; as dinosaur supervisor for Jurassic Park, Phil Tippett made the transition from traditional stop-motion techniques to CG animation.]

What will Tippett and Roman do now that they’ve more than weathered the storm? “I think we will be more like ‘God-mothering’ and ‘God-fathering,’ and that will be a good thing,” Roman says. “It’s time for Phil to step back a bit to do something he wants to do on his own. The next step is not to be a major part of a movie but to be the movie. I think Phil has that experience, and that is the next logical thing to do.”

And that will put Tippett fully back on track, something he didn’t expect when he first learned that ILM would use CG for Jurassic Park. “I’m exhausted,” he says. “But it’s worked out pretty well.”
