Friday Flashback #339



Summary

For Resident Evil 4, Capcom Co. Ltd. put SOFTIMAGE|XSI at the core of its development pipeline to establish an efficient workflow, manage huge volumes of data, and create seamlessly integrated visual sophistication from event scenes to gameplay.

Even if you think that you’re ready to be thrilled, terrified and amazed, Resident Evil 4 might still be too much for you. It’s one fast-paced, hair-raising, visually incredible game that’s sure to be on everyone’s list for 2005 and beyond. This is definitely not your big brother’s game.

Resident Evil 4 Uses SOFTIMAGE|XSI to Redefine Survival Horror

By Alexandra Pasian

On January 27th, 2005, Capcom Co. Ltd released the latest installment in their Resident Evil franchise and redefined the survival horror genre. With spellbinding visuals, three-dimensional game play and dynamic camera work, Resident Evil 4 will have you seeing and experiencing things that you’ve never seen or felt before. Yoshiaki Hirabayashi, a Designer at Capcom, talked to us about the major role that SOFTIMAGE|XSI played in the franchise’s new look and feel.

XSI AT THE CORE OF THE DEVELOPMENT PIPELINE

In the past, Capcom used both SOFTIMAGE|XSI and SOFTIMAGE|3D for the development of the Resident Evil franchise, including on Resident Evil for GameCube. For Resident Evil 4, however, the development environment was migrated exclusively to SOFTIMAGE|XSI for everything from character modeling and animation to outputting scene data to the actual hardware.

When asked why Capcom selected SOFTIMAGE|XSI as their main creation tool, Hirabayashi explains: “The most obvious advantage to XSI is the fact that it builds on our already substantial knowledge and experience with SOFTIMAGE|3D. In addition, we have developed a real trust in the product through the support that Avid Technology offers. And, ultimately, we know that you have to select high quality tools if you want to create high quality games.”

In order to achieve the quality that they wanted for Resident Evil 4, the team at Capcom first had to concentrate on their workflow. With more than ten times the amount of content of other installments in the series, the team had to be confident in their development pipeline. In the end, the content for the game was completed faster than usual because the developers at Capcom created an efficient workflow using SOFTIMAGE|XSI as the core of their pipeline.

With an environment that enables output directly to the actual game hardware, SOFTIMAGE|XSI helped to make Resident Evil 4 a reality. In their game development environment, Capcom also used the Animation Mixer in XSI to manage the volume of data, such as motion data and camera animation, that was necessary for game creation.

In order to export the scene data to the game, for example, the data output to the Animation Mixer and the character nodes had to be selected in such a way that, after pressing a single button, the data could be played on the actual hardware.

Even though such operations normally require five to six steps, the developers were able to customize XSI so that, by coordinating VBScript and a proprietary tool, these operations were done in one step. This meant that the designers could play a scene in a split second without needing to pay attention to the programs running behind XSI.
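The one-button idea — chaining what were five or six manual steps into a single scripted command — can be sketched in Python. Everything below (the function names, the scene layout, the export stub) is hypothetical illustration, not Capcom's actual VBScript tool:

```python
# Hypothetical sketch of collapsing a multi-step export into one command.
# Step and data names are illustrative; Capcom's real tool ran as VBScript
# inside XSI and talked to proprietary console hardware.

def select_character_nodes(scene):
    """Pick the character nodes whose data must be exported."""
    return [n for n in scene["nodes"] if n["type"] == "character"]

def gather_mixer_data(scene, nodes):
    """Collect Animation Mixer clips (motion, camera) for the selected nodes."""
    names = {n["name"] for n in nodes}
    return [c for c in scene["clips"] if c["target"] in names]

def export_to_hardware(clips):
    """Write clip data in the console's runtime format (stubbed here)."""
    return {"exported_clips": len(clips)}

def one_button_export(scene):
    """The single entry point the designer presses: chains all steps."""
    nodes = select_character_nodes(scene)
    clips = gather_mixer_data(scene, nodes)
    return export_to_hardware(clips)

scene = {
    "nodes": [{"name": "leon", "type": "character"}, {"name": "cam1", "type": "camera"}],
    "clips": [{"target": "leon", "kind": "motion"}, {"target": "leon", "kind": "camera"}],
}
print(one_button_export(scene))  # {'exported_clips': 2}
```

The point of the wrapper is that the designer never sees the intermediate steps: one call, one result on the target hardware.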

According to Hirabayashi: “The Animation Mixer in XSI is very intuitive, allowing the designers to easily understand the interface. And our developers appreciate the open and flexible environment. Because of all the advantages XSI has to offer, we were able to produce content faster and with better quality that surpassed even the director’s expectations.”

STREAMLINING VISUAL SOPHISTICATION

According to Hirabayashi: “There were big changes and big challenges on Resident Evil 4 as compared to previous installments. The toughest challenge involved creating all of the cinematic portions of the game as in-game cut scenes.” The team at Capcom used in-game cut scenes to create the cinematic content for the game so that the game portions and event portions would tie together seamlessly. By employing in-game cut scenes, the team was able to reduce the discrepancies in visual quality between game and cinematics, which, they felt, would allow players to concentrate on their game play. And they were absolutely right.

Resident Evil 4 has players on the run for their lives. It is wonderfully scary and offers some of the best graphics out there. What’s more, it boasts game play that is so intense that it prompted one reviewer to say: “You don’t own Resident Evil 4, it owns you.” This tension is due, in no small part, to the fact that the event scenes and game portions of Resident Evil 4 fit so well together visually.

To achieve this visual cohesion, the team at Capcom believed that they had to make sure that their in-game cut scenes had the same visual quality as a pre-rendered movie. In order to achieve the look they wanted, the team at Capcom turned to SOFTIMAGE|XSI.

MANAGING HUGE DATA VOLUME

Being able to control the volume of data was extremely important on this project. The team knew that they had to limit the number of polygons used in modeling but also knew that reducing the number of polygons—in order to add the right amount of texture data, for example—would result in a reduction of light. Achieving and keeping the right balance between quality and the data volume was quite a challenge. According to Hirabayashi, the team met this challenge using SOFTIMAGE|XSI.

“For the process of controlling the volume of data, we have to thank the powerful polygon modeling functionality in SOFTIMAGE|XSI that allowed us to quickly edit the model data. And, since XSI allowed us to make small edits to texture easily using such features as UV development, we were able to maintain the ideal quality for our cinematics and were also able to control the volume of data. This project would not have been possible without SOFTIMAGE|XSI.”

 

Friday Flashback #335


Interview with Joseph Kasparian
Textures & Lighting Lead on 300 at Hybride, Joseph Kasparian talks about creating 540 visual effects shots, the production process, and how to work on movies shot entirely on green screen.
March 8th, 2007, by Raffael Dickreuter

Joseph Kasparian, Textures &
Lighting Lead at Hybride.

Tell us how and why you got started in the CG industry.
When I was 18, I dreamed of becoming the lead guitarist of a big band that would tour around the world, or becoming a professional skateboarder. For some enigmatic reason, I ended up in Finance at the University of Montreal (HEC). But the day that I saw the T1000 in Terminator 2, I realized what I really wanted to do for a living: special effects.

In 1992, a close friend of mine introduced me to the world of computer graphics. I realized that the possibilities were endless. I kept on studying full time in Finance but was spending 6 to 8 hours a day learning 3D at home. It was the biggest hobby I ever had.

When Jurassic Park came out, I heard the professional software used for the dinosaurs was Softimage 3D and that it was made in my hometown, Montreal. My dream of working in that field seemed more possible than ever. As soon as I got my degree in 1996, I took a specialized course in 3D animation at the NAD Centre. Once I finished, I got a job at Hybride in 1997 as a 3D animator. It’s been 10 years now.

What do you like to do in your spare time?
From what I remember, I used to play guitar, mountain bike and snowboard. But now I have a 4-year-old son and I try to spend most of my spare time with him and my wife. So the correct answer would be: I play with Legos and Transformers, and I do my best not to miss the incredible kids’ channel shows.

Tell us about your responsibilities on 300 as Textures and Lighting Lead.
As the Textures and Lighting lead, I have to evaluate the complexity of each sequence with the supervisors. I establish procedures to speed up the artists’ work, which includes constant R&D on new techniques to texture and light scenes. I make sure the outputs of my team suit the compositors’ needs. I guide my colleagues technically so that they follow the art direction and keep the composition in place. And finally, I take care of delivering some of the more complex shots.

What were the biggest challenges in order to deliver the desired look on this production?
We had to translate the renowned American comic book author’s very stylistic look into film: silhouetted images, painted skies with brush-stroke effects, contrasting colors, charcoal blacks, and so on. For the environments we created, each structure had to be an obvious part of Frank Miller’s surreal world before being photo-real. The Hot Gates walls and skies were probably the most important aspect to define before starting production. It was crucial that each vendor respected the exact same look. To help that process, we received detailed documentation and concept art about the look we had to achieve.

The wolf is another good example of the style they were looking for: a surreal beast in a photo-real environment developed with an artistic touch. In the graphic novel, the wolf appears only as a huge black silhouette with red eyes. We kept those elements and integrated skin, muscles and hair to bring it a few steps closer to the real world.


What custom tools or techniques were used especially in the area of lighting?
Many lighting techniques were used depending on the location. At first, we had to rely on precise layouts. Once we had the client’s approval, and depending on the number of shots, we chose to move forward with either regular Textures and Lighting techniques or matte paintings.

We created high dynamic range environments for each set. The effort on textures was critical to achieving the movie’s style. In fact, we had to reduce the contrast levels on the 3D side and bring them back with the live footage in the final comp. The combination of flat textures with dynamic lighting helped us deliver images that allowed great flexibility for extensive color-correction sessions.
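The flatten-then-restore approach can be illustrated with a toy example: scale the CG image’s contrast down around a mid-gray pivot before handing it to comp, then scale it back up once the live footage is in place. The 0.5 factor and the pivot are invented values, not Hybride’s:

```python
# Toy illustration of rendering with reduced contrast and restoring it in comp.
# The 0.5 flatten factor and mid-gray pivot are made-up, not production values.

def scale_contrast(pixels, factor, pivot=0.5):
    """Scale each value's distance from the pivot by `factor`."""
    return [pivot + (p - pivot) * factor for p in pixels]

render = [0.1, 0.5, 0.9]                   # a high-contrast CG "image"
flat = scale_contrast(render, 0.5)         # flattened pass handed to comp
restored = scale_contrast(flat, 1.0 / 0.5) # contrast brought back in the comp

# Restoring is lossless here, which is what leaves room for grading decisions:
assert all(abs(a - b) < 1e-9 for a, b in zip(render, restored))
print([round(v, 3) for v in flat])  # [0.3, 0.5, 0.7]
```

In a real float pipeline the flattened pass keeps its precision, so the compositors can push the contrast and color wherever the grade needs it without clipping.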

The textures created were either a mixture of procedurals done in Dark Tree and called from within XSI, or made good use of quality pictures taken from the actual set.
The matte paintings were developed over shaded scenes lit using final gathering or light rigs.

For many locations, specific light rigs were built at a very early stage, and we used internal scripts to automate the creation of the passes required by the compositing department.
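Scripted pass creation of this kind boils down to enumerating every rig/channel combination so that no pass is forgotten. A minimal Python sketch, with hypothetical rig and channel names rather than Hybride’s internal ones:

```python
# Hypothetical sketch of scripting render-pass creation per light rig.
# Rig and channel names are illustrative, not Hybride's internal pipeline.

COMP_CHANNELS = ["beauty", "specular", "shadow", "depth"]

def create_passes(light_rigs, channels=COMP_CHANNELS):
    """Build one named pass per (rig, channel) pair for the compositors."""
    return [f"{rig}_{ch}" for rig in light_rigs for ch in channels]

passes = create_passes(["hot_gates_day", "hot_gates_storm"])
print(len(passes))  # 8
print(passes[0])    # hot_gates_day_beauty
```

The payoff is consistency: every location delivers the same predictable set of passes, so the compositing department’s templates work unchanged from shot to shot.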

Besides the comic book, what was used to inspire the artists for this kind of look?
Sin City and Sky Captain and the World of Tomorrow were without a doubt a major influence, because they had allowed us to work on very stylized movies that were shot entirely on green screen. We acquired great confidence and expertise in that kind of movie.

One of the biggest sources of inspiration was the artwork sent by the production. It definitely captured the essence of the comic book and gave us a great head start.
The detailed and very technical documents produced by the art department were essential to properly define each location and to develop the appropriate look for the backgrounds and skies. Each section was defined by a specific color palette, to the great pleasure of the artists.


How long was the production schedule for the team at Hybride?
Hybride produced 540 visual effects shots for a total of 45 minutes. A total of 95 Hybride employees worked on the project for 16 months: 45 in the 3D department, 35 in the compositing department and 15 in administration and technical support.

We had a wide variety of shots to do:

  • Animation: wolf, warriors (Spartans, Persians, Free Greeks, Immortals), troops animation (thousands of warriors), whip, banners, swords, weapons, spears, axes, etc…
  • Virtual Environments: backgrounds for all of our scenes, mountains, cliffs, plains, valleys, winter sceneries, oceans, skies (clouds, lightning, moon), etc.
  • Particles: Snow, embers, fire, rain, blood, smoke, dust, dirt.

Was the film entirely shot on blue screen, or to what extent is the film live action and to what extent CG?
Because the film was shot entirely on a blue-screen background, each step involved in post-production required a considerable amount of work. The live footage included the heroes and the ground. Everything else was CG.

How was the live-action footage processed to not only blend with the CG elements but at the same time look stylized?
We had to start by matching the 3D environment to the live footage so it appeared they were shot simultaneously. Once the integration was done, we stylized the whole image with the help of different passes (mattes, depth fading, etc.). The extensive use of color correction definitely eased the integration of the live footage with the CG elements.

How was XSI used in this production? 
XSI was used to generate almost every 3D element. The software was very efficient for creating complex environments, doing character animation and managing army scenes with more than 100,000 soldiers.

Which features were especially useful?
The ability to build solid output pipelines with the creation of customizable passes was without question a very important feature in XSI.

These features were also very useful:
1. GATOR, to transfer any surface attributes
2. Ultimapper, to grab normal maps and occlusion maps
3. The Render Tree, to build complex shaders
4. The FX Tree, to do practically everything with image clips

Which areas of the software should be improved?
The procedurals could be improved in XSI. More than ever, we are asked to generate full CG environments, and the use of procedurals is very efficient for that need. It would also be nice to improve the Texture Editor with tools such as UV Layout’s flatten tool and Deep UV’s relax tool. And since Pixologic’s ZBrush is now used everywhere, it would be interesting to include a ZTool reader in XSI. That would allow us to interactively change mesh resolution in a different way than with referenced models.

Do you think we will see more movies in the future like this and what possibilities do you see in this kind of filmmaking?
More movies are being shot entirely on green screen. The director can do what he wants with each shot, and I think this way of filming will become a standard. The big advantage is the infinite creative freedom that lasts long after the shoot is finished. For 300, the graphic quality quickly takes precedence over the complex, technical process of generating the images, which allows the audience to plunge into Frank Miller’s fantasy world. It is obvious that a new kind of film is emerging, one that joins more than ever the art of telling a story with the art of drawing that story.

Is there anything you would like to say to the rest of the cg community?
Nothing is more thrilling than working in a field in constant explosion. Not evolution, explosion. Each year, I’m overwhelmed by the images produced by the movie and the game industry. All technical borders are falling apart. The tools are more efficient and the artists more talented. There is no doubt that everything that made this profession exciting ten years ago is true more than ever and I’m delighted to be part of it.

 

Hybride near Montreal, Canada.


 

Friday Flashback #334


From “Prototyping 3D Games: Lesson learned from Riven,” in the March 1998 issue of Game Developer Magazine.

“Softimage’s tools are really flexible, and are one of the biggest strengths of that whole package I think,” said Richard of the 3D application used to create RIVEN. “A lot of these animations were so complex in terms of the geometry that we knew we were only going to have one shot at fully rendering this thing — it was just going to take so much time. So we really tried to make sure that we had seen it as many times as possible in its various primitive stages, including wireframe and shaded views.”

For example, even with four SGI servers, the submarine adventures at the bottom of Jungle Island Bay lived in the queue for months because the atmospheric shaders and reflections caused the animations to creak out very slowly, frame by frame.

Also in the March 1998 edition, a review of 3D Studio MAX R2, with a list of competitors with estimated prices:

  • Softimage 3D ($7,500)
  • Softimage 3D Extreme ($15,000)


 

Friday Flashback #331


 

BY ANY OTHER NAME:
Studio Ghibli Changes Everything with Spirited Away

… since 1995, the Studio Ghibli 3D team, armed with SOFTIMAGE®|3D, have been more than helping out with the visuals. The full transition from traditional ink & paint techniques and shooting to digital I & P and compositing was made in 1997.

by Michael Abraham

People have come to expect miracles from Hayao Miyazaki. Since he co-founded Studio Ghibli (with lifelong colleague and sometime creative collaborator Isao Takahata) in 1985, the now-revered anime director has been the creative force behind a long list of animated films that simultaneously manage to be intensely thoughtful, critically acclaimed and hugely successful. Any filmmaker – hell, any artist – can tell you how difficult it is to hit all three points. Miyazaki’s latest offering hits all three harder than ever before.

Miyazaki’s formula, if you can call it that, involves using dazzling visuals and engaging fables to suspend our disbelief, thereby clearing the way for some truly trenchant insights. The stories and insights are Miyazaki’s idea, but since 1995, the Studio Ghibli 3D team, armed with SOFTIMAGE®|3D, have been more than helping out with the visuals. The full transition from traditional ink & paint techniques and shooting to digital I & P and compositing was made in 1997.

“We are a traditional animation production studio,” says Mitsunori Kataama, 3D-CG Supervisor at Studio Ghibli. “There are about 150 people presently working here. Within that group, we have three sections using computers for production – ten people work on ink and paint, four in compositing and seven of us in 3D-CG.  We mainly use Silicon Graphics workstations, with over thirty CPUs, including those used as servers. We also use Linux and Mac OS computers.”

That setup makes for an immensely clever, and ultimately virtuous, method, and it is employed to great effect in Miyazaki’s most recent film. Set in modern-day Japan, Spirited Away (or Sen to Chihiro no Kamikakushi, the Japanese title) joins the daily life of ten-year-old Chihiro, a somewhat spoiled and ill-tempered girl unhappy to be moving to a new town with her family.

On their way to their new home, Chihiro and her family pass through a mysterious tunnel only to find themselves in a world not of their choosing. When her hungry parents mistakenly eat food reserved for the gods, they are suddenly transformed into pigs, leaving Chihiro as their only hope. A great many things change in this new land: a young boy becomes a dragon, an origami bird transforms into a witch and a filthy bather is reincarnated as a river god. Even Chihiro is forced to barter her real name for her survival with the evil witch Yu-baaba, who gives her the more generic sounding Sen in its place. To rescue her parents and regain her name, Sen must also change from a frightened little girl into a courageous heroine.

In creating Spirited Away, Miyazaki claims to have been making a gift specifically for his friend’s daughters, all of whom were about 10 years old at the time he got the idea. After two years and a painstaking blend of traditional cel animation and seamlessly integrated digital technology, however, it seems that his gift is being shared by just about everybody. At the time of this writing, Spirited Away is poised to overtake James Cameron’s Titanic as the single most successful film ever shown in Japan.

Although Studio Ghibli works almost exclusively on feature animations, with the occasional short thrown in for good measure, Spirited Away was a big job even by their standards. All of the animation, backgrounds, compositing and 3D work were accomplished in-house. Working diligently on 100 of the movie’s 1400 scenes, Kataama and his team dealt primarily with complicated scenes impossible to create solely by hand, including intense 3D camera work and object animation.

“We used several different techniques,” says Kataama matter-of-factly. “We added depth information to original 2D images by mapping hand-drawn backgrounds onto 3D models. In the end, we also used SOFTIMAGE|3D to calculate a reflection and a highlight component, which we then added to the hand-drawn background. We also developed a unique 2D texture shader, so we could have a multiple-position camera-texture projection for mapping our background images. We have also developed a plug-in to make changing a particular field of vision much easier.”
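Camera-texture projection, the technique Kataama describes for sticking a hand-drawn background to 3D geometry, amounts to using each vertex’s screen position in a chosen camera as its texture coordinate. A minimal sketch with a simple pinhole camera; the model and numbers are illustrative, not Studio Ghibli’s shader code:

```python
# Minimal camera-projection sketch: a background painting is "stuck" to
# geometry by using each vertex's screen position in a projector camera
# as its UV. Pinhole camera at the origin looking down -Z; values invented.

def project_uv(point, focal=1.0):
    """Perspective-project a 3D point and remap NDC [-1, 1] to UV [0, 1]."""
    x, y, z = point
    ndc_x = focal * x / -z  # camera looks down -Z, so -z > 0 in front of it
    ndc_y = focal * y / -z
    return ((ndc_x + 1) / 2, (ndc_y + 1) / 2)

# A vertex one unit left of center, two units in front of the camera:
print(project_uv((-1.0, 0.0, -2.0)))  # (0.25, 0.5)
```

As long as the render camera matches the projector camera, the painting lines up exactly; moving the render camera away from it is what reveals the added depth.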

Another significant challenge faced by the Studio Ghibli 3D team involved the creation of a realistic, ever-changing sea surface, which required the in-house development of another 2D texture shader and several material shaders. According to Kataama:

“To accurately express the look of the waves, we created a 2D texture shader that would generate a procedural texture. We really appreciate that SOFTIMAGE|3D offers such a valuable environment for developing new functions. The high-quality rendering result was extremely effective in our efforts to draw rays that would act as both reflections and highlights. For that, we were very happy to have the Ray Tracer, which we could not find anywhere else.”
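A procedural wave texture of this general kind is often built as a sum of sine waves whose height drives the surface shading. The sketch below is a generic illustration of that idea; the wave count, frequencies and amplitudes are invented, not Studio Ghibli’s shader:

```python
# Generic procedural-wave sketch: several directional sine waves summed
# into a height field that a shader could turn into shading or normals.
# All wave parameters here are invented for illustration.
import math

WAVES = (
    # (frequency, direction-mix, amplitude)
    (1.0, 0.6, 0.30),
    (2.3, 1.7, 0.15),
    (4.1, 0.9, 0.07),
)

def wave_height(x, y, t, waves=WAVES):
    """Sum directional sine waves; `t` animates the surface over time."""
    h = 0.0
    for freq, mix, amp in waves:
        h += amp * math.sin(freq * (x + mix * y) + t)
    return h

# Sample one row of a tiny grid at t=0 to see the pattern vary:
row = [round(wave_height(x * 0.5, 0.0, 0.0), 3) for x in range(4)]
print(row)
```

Because the texture is a function of position and time rather than a painted image, it never repeats or pixelates, which is what makes the procedural approach attractive for an ever-changing sea.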

Kataama pauses reflectively before continuing. “Where I used to work, we used separate in-house applications for modeling, animation, and texture editing. When I joined Studio Ghibli, SOFTIMAGE|3D immediately enabled me to do everything in an integrated environment. Even an animator working on his first 3D project can do sophisticated animation work with it right away.”

Looking to the future, Kataama and Studio Ghibli have great plans for SOFTIMAGE|XSI™. Although they are still in the evaluation phase, Kataama has already seen enough to know what will be particularly useful.

“In the coming year, we are planning to switch all work to SOFTIMAGE|XSI,” he explains patiently. “So far, we have been most impressed with Render Passes. In our work, we do final image control at the compositing stage, so it is a big help that Render Passes can separate 3D into various elements. In the past, we needed to prepare scene data when rendering, but using Render Passes means we can make multiple materials from one scene. I’ve also had a chance to look at the Render Tree, which I found very easy to use. I was very happy because even I can create shaders, even though I have no programming skills. We also have high expectations for the Subdivision Surfaces functions.”

Although they have still to evaluate the animation functions in SOFTIMAGE|XSI, Kataama and his Studio Ghibli team already know that the Animation Mixer will soon be coming in handy:

“We are planning to create a human crowd,” says Kataama. “What we have in mind is likely impossible without the Animation Mixer.  We are also looking forward to the new Toon Shader, which will help us to create an even better hand-drawn animation look.”

And, no doubt, another Miyazaki masterpiece. A film by any other name would never look this great.