Friday Flashback #268

i am 4 wallpaper

We used both SOFTIMAGE|XSI and Avid|DS extensively in the making of the Britney Spears Toxic video. The intelligent, fast and intuitive interface in XSI, coupled with the overall speed of the software, meant that we could get the job done faster and in a style that our competitors couldn’t match.

– Amy Yukich, Executive Producer, KromA

Friday Flashback #267

Softimage Store circa 2003: mental ray licenses for $1,895


mental ray v3.1 – High-powered, photo-realistic rendering

The industry-standard raytrace renderer for generating images of outstanding quality and unsurpassed realism, mental ray achieves scalable performance on both multi-processor machines and across a network of CPUs, without any sacrifice in render quality.

mental ray v.3.1 licenses are available from Softimage for users seeking additional power for their SOFTIMAGE|XSI v.3.0 systems or looking to render .mi files from external 3D applications such as Alias|Wavefront’s Maya*, Discreet’s 3ds max*, or Side Effects’ Houdini.

Friday Flashback #265

Interview With Francois Lord
The CG Supervisor at Buzz Image Group talks about demanding schedules, XSI, Softimage|Behavior and the CG industry.
March 6th, 2004, by Will Mendez, Bernard Lebel, Raffael Dickreuter

Tell us a bit about your background
After studying sciences for two years in college, I decided that I needed something a little more creative. I was already doing 3D animation as a hobby on my Amiga. I realized that it would become more and more popular in the movie and game business and that it would be possible to have a career in 3D. I studied at the NAD Centre when I was 20 years old; it was an eight-month program back then. I got a job at Buzz right after that, hired to work on The Real Adventures of Jonny Quest. Although this project was a total nightmare, I learned a lot during the months it lasted. Then I was assigned to various projects, from commercials to movie shots. I volunteered to learn a new camera tracking software called 3D-Equalizer. This gave me the opportunity to go on set with the VFX supervisor to place the trackers myself and take pictures and notes. A few years later, I was going on set alone.

Then I left Buzz to join a new company called Behaviour Studio. I worked there for fifteen months doing mostly animation, special effects and camera tracking. It was a very good experience to work in a new company because we were a small team and I got to decide how the 3D department was going to work. This is where I learned mental ray and Perl. The team didn’t grow much though, and the company was sold about a year later, just as business was getting profitable. I left again to join Big Bang FX/Animation, where I worked for six months on a ride called Oceania. It was 90% CG, in HD resolution with stereoscopic projection. That was fun. Then I came back here to Buzz about three years ago, for a total of eight years in the business.
I’ve spent all these years in Montreal and I have no intention of leaving. Many of my friends went to L.A.; I never felt the need to reach for that higher prestige. I prefer staying here in a small company, being involved in a bigger part of the job as an artisan, rather than going to an industrial shop to be a small part of a huge pipeline. Now I am CG supervisor. I am responsible for many of the project evaluations and I often lead the team on bigger projects.

What are your sources of inspiration?
I like IMAX movies. I don’t know why. Maybe it’s because they require so much money to produce that they end up being among the best documentaries. They always have this astonishing photography, images you only dreamed of and very good music to support it. Of course, there is nothing like the actual experience of the IMAX theater itself.

What do you do in your spare time?
Watching movies and documentaries, playing music (piano), skiing, cycling and rollerskating, drinking beer and playing LAN games with the guys after a work day (currently it’s Medal of Honor and Homeworld 2… we got tired of Quake; we played it too much).

Describe your typical work day
I usually arrive between 9:10 and 9:30, which is about half an hour before most of the team (yes, they are lazy). First, I read my mail. This takes a while because of the many mailing lists I follow. I also forward the many jokes I receive.
We don’t have a strict pipeline with dailies, so the rest of the day changes from project to project. I spend my day trying to work between all the questions and evaluations I’m asked for. Then we play a little before lunch. We go out to one of the many restaurants around the company. We come back and work until 6:00 or 6:30 p.m. Then we play for about an hour. On Wednesdays, we play badminton (I mean real badminton) followed by a beer at the Quartier Latin. The other days it’s just Medal of Honor.

How is being a CG Supervisor different from being a “regular” artist?
I’m always distracted by producers who need evaluations, coordinators who need revisions, animators who have technical questions, and sometimes the president, who is showing the company to potential partners and wants me to show those little crowd making-ofs we did. I must spend a third of my time helping people, answering questions and evaluating projects.
Evaluating a project is not easy. I often have to deal with written concepts that have no storyboard. You have to imagine what the director will want to see and how he will cut it. You have to consider the shooting constraints to decide what is better done in 3D, in comp or in live action. You also have to deal with the pressure: the numbers you give will be used as-is in the budget, and you don’t want anyone to work overtime because of you.

Do you use XSI exclusively?
Yes, for 3D. We prefer to learn one piece of software as deeply as we can rather than many superficially. Also, teamwork is better when we all speak the same language. The older members of the team all come from a Soft3D background; the newer members learned XSI directly at school. One of the characteristics of the 3D department at Buzz is that we do most of our compositing ourselves in Shake. The FXTree is getting better and better, but it came in too late. Shake is still our preferred tool but has ceased to evolve on Windows. The FXTree may well become its replacement one day, although I wouldn’t use it to comp movie shots yet.

What aspect of your work do you enjoy the most?
Quake? No seriously, I think it’s when a client comes with a concept that doesn’t require any shooting, and there is no external director involved. Then we can create a final product from scratch with only the agency and the end client to please. No director, no VFX supervisor, no intermediate producer to screw the whole thing up. Then I can get my hands on a bit of directing, and I like that. Dollar Bank and Videotron were good examples of this; I co-directed both commercials.

What is/are the things that thrilled you the most in your career so far?
My first experience as a supervisor was quite something. It was a Coke commercial. It was the first time I went on a set alone. The director was Christian Duguay, one of the star directors in Montreal, and when he asks for something, he wants it. It took two long days of shooting in the cold with the director telling me: “I’m gonna shoot like I want, and you do what you can to fit the effects in”. Then he flew to L.A. for his next movie, so I had to deal with the agency alone in the Flame room. The Flame guy was even younger than me (he was 21, I was 22) and the agency had a little trouble trusting us. I also had to do an approval with the director by video conference (that was quite new at the time) with the agency behind my back watching my every move. That was extremely stressful. In the end, all went well and the client was happy. Then I knew I could do it, I could be a supervisor… I just needed a lot more practice at it.

Can you tell us about the techniques used for the Gaz Metro commercials?
Not really… I didn’t work on this. These three commercials were created by one guy alone at Buzz. His name is Philippe Sylvain; maybe he could have his own interview. :-) I can say that the campaign was a big success, it has won many prizes, and the final look and concept were a collaboration between the Diesel agency and Philippe, who worked many weeks in a row to complete it.

How does XSI make your daily work easier?
When you work in commercials, you quickly realize how difficult clients are. They don’t know what they want; they say something one day and the day after they say something else. They will ask you to change tiny little details just to justify their salary. So you need a tool that allows you to turn around very quickly. Although I’m not used to working with clients at my back, I often have to make changes to a scene while the client is waiting in the Flame room. This is where the passes, the animation mixer and other high-level editing features come to the rescue.

What is your favorite XSI feature and what would you like to see in future releases?
The one feature that I love the most is the Passes. When I first moved from Soft3D to XSI, I immediately saw the time and energy savings it gave me as a generalist. From then on, I could animate, texture, light and set all the passes on an element for a commercial, then do a quick composite to show the client, and quickly modify the animation to the client’s needs in a few hours without having to redo all the passes again. Very time-saving. I like the fact that they don’t rely on presets; I can build my own complex passes to fulfill my needs.
One thing I would like to have is a powerful particle system. Right now in XSI, to achieve complex particle behaviors, you have to script it. And it is almost impossible to get anything that looks volumetric, like thick smoke. XSI is very good for surfacing, but when it comes to fluids, gases and volumetric effects, it just fails.

Being one of the first companies to use Softimage|Behavior in production, can you tell us about the process of using software that had not been production-proven, and how did you overcome any stumbling blocks?
There were no show-stopping problems in Behavior when we began working with it. We found it surprisingly stable and complete for version 1.0 software. The downside was that we had to program everything we wanted it to do. A few examples came out of the box, but we needed more. We were lucky enough to have enough time during the production of Dollar Bank to try things and see what was achievable. We struggled a little bit with importing the data back into XSI because of the high number of elements involved. We had to filter the curves to reduce the 16 million keyframes to a minimum. Then we hit the rendering wall. Behavior creates the animation for thousands of characters easily, but it doesn’t give you any tool to render them. We had to find ways to split the scenes into layers. It wasn’t as easy as expected because of the camera movements.
The end result was not as impressive as we expected because the client changed his mind in mid-production. It was supposed to be a one-shot commercial, with the camera starting at street level and rising up to reveal the entire city of Pittsburgh. But it ended up as a multi-cut edit, showing different parts of the city that people would recognize.
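The interview doesn’t say how Buzz’s curve filtering actually worked, but the general idea of thinning a densely baked function curve can be sketched like this: drop every key whose value can be reproduced, within a tolerance, by interpolating between its retained neighbours. A minimal greedy version in plain Python (the function name and tolerance are hypothetical, and linear interpolation is assumed):

```python
def decimate(keys, tol=1e-3):
    """Greedy keyframe thinning on a baked (time, value) curve.

    A key is dropped when linearly interpolating between the last kept
    key and the following key reproduces its value within `tol`.
    """
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        t0, v0 = kept[-1]
        t1, v1 = keys[i + 1]
        t, v = keys[i]
        predicted = v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        if abs(predicted - v) > tol:
            kept.append(keys[i])
    kept.append(keys[-1])
    return kept

# A perfectly linear baked curve collapses to its two endpoints.
ramp = [(float(i), 2.0 * i) for i in range(100)]
assert len(decimate(ramp)) == 2
```

Applied per f-curve across thousands of crowd characters, even a simple pass like this cuts millions of baked keys down to the handful that actually shape the motion.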

What do you think of the Montreal 3D industry?
It seems to be as changeable as in any other major city. I think we are lucky to have such a big 3D market for a city of this size. But it’s too bad that the market is not big enough to support many big projects at the same time. Cinégroupe had one, but when it was over, many animators had to move away to find a job. There also seems to be a general problem with managing big projects in Montreal. We’ve heard so many stories of projects coming to town with big budgets, only to hear later that the quality rapidly degraded to an unacceptable level to fit the schedule. I guess this is in part because the people competent enough to manage such projects still prefer doing shots to spending their whole time filing paperwork. Stupid opportunists take the job instead. This is changing, but slowly.
It’s also interesting to see that many new companies have moved away from Softimage products, as it was almost the only player in Montreal a few years ago.

Do you think the school system is adequate for the needs of the professional industry?
Yes and no. Yes, because I see new students becoming better and better each year as the school programs evolve. We have seen young animators being hired and already kicking the asses of older, more experienced animators. Then a year later those same young guys were being kicked by the new ones. Although I must say that this is often because people stop learning new features and techniques when they begin their career.
No, because I have seen too many schools getting away with poor programs, inexperienced teachers (coming from the year before) and inadequate equipment. Those schools are only there for the money and don’t care about the quality of the education they provide. Also, I have heard that the Quebec government wants to create a program for 3D animation in public schools (CEGEPs). I don’t believe it could be very successful. I think that a 3D school must be able to update its program quickly to follow new technologies and techniques and adapt itself to the changing market. We all know how long it can take to reform an educational institution; 3D in general is moving way too fast for the government to adapt.

What would you say to people who want to get started in the industry?
Go to school. And before you study 3D, make sure you learn something else. I say this because 3D animators are expected to do many things. They must be able to animate, model, texture, paint, light, frame a camera, and even program a computer. You’re not going to learn all of this very deeply in a 3D school. So it is a good thing to learn at least one of those arts in its ‘real world’ form (I was going to say ‘analog’, but I think you can learn to paint in Photoshop, and you can learn the behavior of light and how to frame a camera with a digital one).
Also, choose your school carefully. Some of them are good, others are very bad. One good way to know which school is good is by communicating with a company working in 3D. Ask them where their employees come from.
And most importantly, make sure you like 3D animation before you spend thousands of dollars on a 3D course. The people I know who succeeded in 3D love 3D.

Having taught at Centre NAD, what are some of the things that users can gain from attending a school rather than learning at home?
At Centre NAD, and I’m sure it’s similar in other good schools, many teachers work in the industry. So not only do they know their subject matter very well, they know what’s important and what’s not, which techniques are used frequently and what is not worth spending two months learning. They also know good tricks to speed up production that took them many years to develop, and they can teach those tricks to the students in a matter of minutes.
Another important advantage of school is the fact that you learn to work in a team. At Centre NAD, the students must produce four different demos, three of which must be done in teams of at least two. This is good because they have to deal with their differences. They must learn what they are good at and what their teammates are better at. They also must learn to accept that their ideas are not always accepted and that their attitude is as important as their talent for success. Anyway, it’s a lot easier to find motivation when you work in a team than when you work alone, and motivation is very important to success.

What is your advice to fellow artists when dealing with crunch times?
It all comes down to your attitude. There is a balance between getting too involved in your work and being too detached. Getting too involved will consume you alive: you will spend too much time at work, you will begin to dream in wireframe, your skin color will change, your girlfriend will leave you, and finally the worst might happen: you will get tired of 3D. That’s a burnout. On the other hand, if you say that the project was badly evaluated, that it’s management’s fault we’re behind schedule, and that you’re not going to stay late nights and weekends because of the poor team we’ve got, then you should watch your back when the next wave of layoffs comes by.
It really is a matter of having the right balance between involvement and detachment. It’s just a job, but it’s a good job. And it’s always after the big crunches that you look back and say: “Now that was fun!”.
Stress management is also very important. For this, the best solution is a LAN party. I recommend one every day.

Friday Flashback #263

This coupon is good for $2,500 U.S. off the end-user price of a DS turnkey system


Back in 1999, a turnkey DS system ran about $130,000 US:

Pricing and Availability
Complete turnkey multi-stream systems, consisting of SOFTIMAGE|DS version 2.1 software and Intergraph Computer Systems StudioZ GT Workstations with dual Pentium III 500 MHz processors, plus video storage from either Intergraph or Avid are now available through authorized SOFTIMAGE|DS resellers. The turnkey solution has a USMSRP of $130,000. Hardware upgrade options are available for existing users of SOFTIMAGE|DS.



Friday Flashback #262

SOFTIMAGE|XSI User Profile: Capcom Co., Ltd

If you focused on just a single function then other tools may have seemed better than XSI, but they gave us the impression of functioning in isolation. But in XSI the different functions were fully and naturally connected together, which was very attractive.



SOFTIMAGE|XSI User Profile: Capcom Co., Ltd

Yoichiro Kadoguchi
Translated by: Alun Simpson

The making of Lost Planet was first disclosed at the end of 2005, and ever since, its visual quality has blown away the competition. It was released first in Japan and then in North America, where it immediately became a top-selling title for the Xbox 360. I heard that XSI had been used in the development of Lost Planet, so I decided to interview the Capcom development team and the producer, Jun Takeuchi.
Lost Planet’s Target Market

When I first saw Lost Planet, it didn’t seem to me at all like a Japanese game. Was it your goal from the beginning to develop a game for the non-Japanese market?

Takeuchi: Yes, from the start our design target was to develop a game that would be successful in North America. This was the first time for Capcom to have such an objective. We are pretty happy with what we produced.

Resident Evil and Devil May Cry were also bigger hits overseas than in Japan. I had always thought that all Capcom’s products were designed for overseas.

Takeuchi: In Resident Evil we made the dialog in English with Japanese subtitles only because we thought it would look cooler for the Japanese market, but this also resulted in the game being far more popular overseas than we expected. But like I said, Lost Planet was different because it was our goal from the beginning to be successful in North America.

For a while, I kept being invited to play Halo and Warcraft by people from Capcom. I was sure that if I took them up on the challenge I’d be killed straight away, and I didn’t want to put up with the motion sickness, so I respectfully declined!

Takeuchi: Ah! That was the Lost Planet development team! For a while they were so into those games that I began to wonder if it was interfering with their work! (Laughs) By the way, the latest research findings were incorporated into the Lost Planet design to reduce motion sickness as much as possible. You should try it!
Why Did Capcom Choose XSI?

Since Jun is here today, I would like to talk about old times. Some years ago you let us give out a press release saying that Capcom was using XSI as its main tool in game development. At that time, what were your reasons for choosing XSI as your main tool?

Takeuchi: As you know, Capcom used SOFTIMAGE|3D for a long time. During the transitional stage from PS1 to PS2, we had the option of switching to other tools, but in the end we went for XSI. When we selected XSI, the key points for us were the excellent quality of the animation background, the seamless interconnection of each function, and its high scalability.

I’m happy to hear that. But since XSI back then was still in an early version, weren’t there also many functions that were missing?

Takeuchi: Of course, we also had to create many of our own tools! (Laughs) If you focused on just a single function then other tools may have seemed better than XSI, but they gave us the impression of functioning in isolation. But in XSI the different functions were fully and naturally connected together, which was very attractive. By selecting XSI with its excellent productivity, we have been able to smoothly release a number of titles, from Onimusha 3 to the latest Lost Planet.

Softimage also owes a great deal to Capcom, because as a result of their detailed feedback we have been able to further enhance XSI.

When you gave the Onimusha 3 presentation at Siggraph, you mentioned the Onimusha engine. Did you use the Onimusha engine for Lost Planet?

Takeuchi: The Onimusha engine was created for the PS2. Since then, Capcom has been preparing for the next generation of engines, and has been investing quite a lot of resources in their development. MT Framework is a part of this development series, operating on the Xbox 360 and the PS3. As well as Lost Planet, we used MT Framework for Dead Rising. By the way, although some people say that MT is an abbreviation of “multi-thread”, others say it doesn’t actually stand for anything! (Laughs)

In the questionnaire at the Onimusha 3 presentation one respondent said: “Now I understand the power of the Onimusha engine!”

Takeuchi: (Laughs) I can see how a game developer might think that. But remember that even if the Onimusha engine or the later MT Framework version is used, the actual data are created with XSI. For example, if someone were to look at an animation scene, they may think mistakenly that the data was recorded with motion capture and imported into a game engine. However, at Capcom we have traditionally added motion to game animation by hand. This work is done individually by each designer, so the support of a tool with flexible animation functions such as XSI is indispensable. Further, MT Framework was designed on the premise of it being used together with XSI, so the compatibility is extremely high.
Background in Lost Planet

I’d like to move on to our main topic. One of Lost Planet’s major attractions is its high screen density. About how much was the volume of the data?

Hara: We didn’t use LOD for the background in Lost Planet. For normal landscapes we used 300,000 polygons, while combat scenes required much more data, about 600,000 polygons. We only used about 15,000 to 30,000 polygons when making Onimusha 3, so the amount of data has increased by as much as 20 times since then.

You used effects where objects are broken, and at other times objects fill up the entire screen.

Hara: That’s right. For example, snowfields aren’t flat, so they have to be created as data, and objects that previously needed to be built only once must now be built in many versions for use when they break. Even though what the eye actually sees is about 600,000 polygons, in fact we are using more.

Did you use PhysX as your physics engine?

Hara: No, we used Havok. When we started development, XSI was still at version 4.0, so it did not support PhysX. Version 5.0 was released during the project, but by that time we had already decided to go with Havok.

A lot of attention has been focused on the screen effects and motion blur, and the texture applied to each object is certainly very detailed. How did you set this up?

Hara: We add albedo, normal mapping, specular mapping and light mapping to each object. Each has a maximum 2K resolution. With the exception of some normal mapping, we used the render mapping function to create the objects. Light mapping was especially laborious. For HDR images, we created light mapping with an empty render map, but the number of images required was extremely high, with 300 required for a single area. Further, we had to adjust the UV to reduce the amount of data, which then meant we had to increase the density of texels in areas that can be easily seen by players. XSI’s unique UV function is extremely useful because it evenly distributes the texel density in polygon units. After this, we further edited the UV. In next generation consoles a relatively large number of polygons will be used, so we are now thinking about using a method other than mapping.
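As background on what “evenly distributing texel density” means: density can be measured per triangle as the ratio between its UV-space area and its world-space area, scaled by the texture resolution. This is not XSI’s actual implementation (which isn’t documented here), just a minimal illustration of the quantity being equalized, in plain Python:

```python
import math

def tri_area_3d(p0, p1, p2):
    """World-space area of a triangle (half the cross-product magnitude)."""
    ux, uy, uz = p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]
    vx, vy, vz = p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]
    cx, cy, cz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

def tri_area_uv(a, b, c):
    """UV-space area of a triangle (2D cross product)."""
    return 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1]))

def texel_density(points, uvs, texture_res):
    """Texels per world-space unit for one mapped triangle."""
    return texture_res * math.sqrt(tri_area_uv(*uvs) / tri_area_3d(*points))

# A unit triangle mapped to a full corner of a 2K map: 2048 texels per unit.
d = texel_density([(0, 0, 0), (1, 0, 0), (0, 1, 0)],
                  [(0, 0), (1, 0), (0, 1)], 2048)
assert abs(d - 2048.0) < 1e-6
```

Equalizing this value per polygon, then hand-editing the UVs to boost it where players look closely, is the workflow Hara describes.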

You based the main character, Wayne, on the Korean movie star, Lee Byung Hun. Doesn’t basing a character on an actor make facial animation even more difficult?

Kawano: Yes. It might have been different had we been able to use Face Robot, but when it was released by Softimage we were already in the middle of production. We considered using lip synch with a certain amount of automation, but when we tried it out we weren’t satisfied with the results. In the end we created the basic patterns and then applied animation to create the expressions. Please take a look at one of the scenes.

Did you use the GATOR function?

Kawano: When we created Lee Byung Hun’s face model, we first performed a 3D scan and recorded the shape data and texture. But as you know, although data obtained from 3D scans might seem at first glance to be OK, because the system generates polygons and UV automatically, the data is quite contaminated when viewed from a game data perspective. As such, we used the polygon data and UV that we had created as a model, and then used the GATOR function to transfer the data.

An important part of Lost Planet’s density is the excellent quality of its character animation. In particular, enemy movements that can so often be problematic, such as rotating and attacking while twisting and jumping, all seem to be effortlessly produced. How did you achieve this level of animation?

Nasu: As you know, when rotation control in bone animation exceeds the normal operating range, it becomes extremely difficult. It would be possible to use motion capture, but as Jun said earlier, at Capcom the animators manually add motion to games. Even if we were to edit motion capture data, we would still have to deal with the problem of curve discontinuity. Instead, we resolved the rotation axis problems by changing the order of rotation. XSI allows the XYZ order to be freely changed from the normal sequence, so we used this function when we thought that problems might occur. Animation layers can be used in XSI 6, and we have high hopes for this function.
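For readers unfamiliar with the rotation-order problem Nasu describes: when the middle axis of an Euler sequence reaches ±90°, the first and third rotation axes align and one degree of freedom is lost (gimbal lock); reordering the axes moves that singularity away from the range the animation actually uses. A small numerical illustration in plain Python (extrinsic rotations are assumed; this is not XSI code):

```python
import math

def rot(axis, deg):
    """3x3 rotation matrix about a principal axis."""
    a = math.radians(deg)
    c, s = math.cos(a), math.sin(a)
    if axis == "X":
        return [[1, 0, 0], [0, c, -s], [0, s, c]]
    if axis == "Y":
        return [[c, 0, s], [0, 1, 0], [-s, 0, c]]
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def compose(order, angles):
    """Extrinsic Euler rotation: the first axis in `order` is applied first."""
    R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    for axis in order:
        R = matmul(rot(axis, angles[axis]), R)
    return R

def close(A, B, tol=1e-9):
    return all(abs(A[i][j] - B[i][j]) < tol for i in range(3) for j in range(3))

# With Y in the middle of the order, Y = 90 deg collapses X and Z into a
# single degree of freedom: adding 20 deg to both X and Z changes nothing.
assert close(compose("XYZ", {"X": 10, "Y": 90, "Z": 20}),
             compose("XYZ", {"X": 30, "Y": 90, "Z": 40}))

# Move Y out of the middle and the same two poses stay distinct.
assert not close(compose("YXZ", {"X": 10, "Y": 90, "Z": 20}),
                 compose("YXZ", {"X": 30, "Y": 90, "Z": 40}))
```

That is the essence of the fix: a bone that swings wildly about one axis should not have that axis in the middle of its rotation order.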

Thank you very much! I’d also like to know how you added dynamic motions such as those of enemy tentacles.

Nasu: Previously, when creating the movement of a tentacle or hand, we defined all the bones, applied movements to an effector at the front end, and then propagated the movement to the rest of the chain by applying a time difference to each bone. But with this method it was difficult to capture the bending caused by the rebound if the tentacle hit the floor or an object, and it was hard to represent the scenery. For this reason, in Lost Planet we used XSI’s spine skeleton. With the spine we can control both the front and rear ends in both directions, making control very easy when it hits an object.

How did you make the tentacle extend?

Egawa: The default spine skeleton is restricted so that it cannot be extended, but if the expression in this section is rewritten then it can be. This modification enables the tentacle length to be changed. However, this time I only used one spine at a time. Using multiple spines together instead of just one should enable more complex movements and allow a greater degree of freedom, such as moving in a vortex. I want to try this out next time.

How do you add animation manually?

Nasu: I like moving the animation little by little in the animation editor, so my work is mostly based around this tool. Some people like adding animation to a rough f-curve and then adjusting the timing on the dope sheet. It all depends on the person. I also use the animation mixer a lot. In fact, I can’t imagine trying to work without it! (Laughs)

I believe that Capcom uses real-time IK. In such cases, what is your workflow when using the animation mixer? I guess you don’t convert it into FK?

Egawa: We plot each frame of the animation mixer’s editing results and reconvert the animation to IK information. However, there were too many keyframes for the final output, so we used a tool we had prepared for thinning out the data. When we use the animation mixer, we use the time warp function to add brakes, or we edit animation that we already have, such as weighted motion, to create new movements. We perform motion blending with MT Framework, blending and adjusting each animation item on the animation mixer to ensure high-quality blending.
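A time warp of the kind Egawa mentions can be thought of as resampling a clip through a frame-remapping function; “adding brakes” is a warp that decelerates toward the end. A minimal sketch in plain Python with linear sampling (the quadratic warp shape is a hypothetical example, not Capcom’s tool):

```python
def sample(keys, t):
    """Linearly sample a sorted (time, value) curve at time t."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

def time_warp(keys, warp, n_frames):
    """Resample a curve through `warp`, which maps output frame -> source time."""
    return [(f, sample(keys, warp(f))) for f in range(n_frames + 1)]

DURATION = 100

def ease_out(frame):
    """Quadratic ease-out: playback decelerates ('brakes') toward the end."""
    u = frame / DURATION
    return DURATION * (1 - (1 - u) ** 2)

# Halfway through the warped clip we are already 75% through the source,
# so the remaining motion plays back more slowly.
linear = [(0.0, 0.0), (100.0, 100.0)]
warped = time_warp(linear, ease_out, DURATION)
assert abs(warped[50][1] - 75.0) < 1e-9
```

The warped result would then be plotted back to per-frame keys and thinned, exactly the plot-then-decimate cycle Egawa describes.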

Hopes for Future XSI Versions

Does Capcom have any feedback for XSI?

(From left) Animator: Akihiko Nasu, Animator: Koichi Egawa, Character Modeler: Takahiro Kawano

Nasu: Oh yes! Almost more than I can tell you. I’d like to say that whenever XSI is upgraded, even the most detailed areas are properly corrected, so we have confidence in the process. For example, in an earlier version there was a problem where the snapshot information in the animation editor was not maintained if a different object was selected, but it was properly corrected in a later version. Although these small, detailed updates tend to be obscured when big new functions are released, for us animators they are extremely important. We hope that XSI updates in the future will continue to make our work easier. Ah, that reminds me, there was something I wanted to ask you after this interview. Do you have time?

Hmm, I might be here all night…! (Everyone laughs)

Visit the Capcom website
Visit the official LOST PLANET EXTREME CONDITION website

You can also read this story in Japanese on the Softimage Japan website.

Character Wayne by (C) Lee Byung Hun/FANTOM CO., LTD.