Friday Flashback #217

Interview With Alain Laferrière
The Program Manager of Softimage Japan talks about his career at Softimage, the Japanese industry, and XSI 4.0.
June 28th, 2004, by Raffael Dickreuter, Will Mendez

Alain Laferrière,
Program Manager,
Softimage Japan.

How did you get started in the cg industry?
I got interested in computer imagery from a young age, playing with the first video game consoles and personal computers and following the evolution of CG with a deep interest. The “demo scene” development communities for both C-64 and Amiga computers also brought a lot of innovation in realtime graphics synthesis.
At the University of Montreal I studied computer graphics and did an M.Sc. in human-computer interfaces (agents). There, I met Réjean Gagné, Richard Laperrière and Dominique Boisvert, who would later implement the Actor Module in SI3D, and many other friends who were, or are still, working at Softimage today. After I completed my studies I got a job interview at Softimage and was hired; it was a dream come true! I started in the modeling team, then moved on to motion capture & control, then headed the new Games team, and now Special Projects Japan.

What do you do in your spare time?
I love music very much and I’ve been DJ’ing on and off since the age of 13, so I buy records and practice mixing. I also like programming personal projects, reading, cooking, nature, going out with friends, going to the gym, etc.

Tell us a bit about the highlights of your 11-year career at Softimage

  • April 14th 1993 – my first day at Softimage!
  • solving a data loss problem with an Ascension “Flock of Birds” motion capture system that was not working properly with an Onyx computer, which was being used at a Jurassic Park theme park attraction at Universal Studios. I implemented a fix in Montreal and was sent to Los Angeles to configure the system. I had a chance to meet the movie actors, and Steven Spielberg made an appearance; it was a very impressive “Hollywood”-type first experience for me.
  • then I worked on an interesting project: one of our customers was using a Waldo-like hand manipulation device to radio-control the facial expressions of the robotic heads used for the Teenage Mutant Ninja Turtles movies. The actors had to wear the Turtle heads while servo-motors moved metal blades around their face to flex and animate the head’s latex skin, as well as a radio receiver backpack which was hidden in the turtle shell. Since the facial animation had to be redone live at each shot, it was not possible to produce a constant facial animation quality. First, I wrote a motion capture driver for the Waldo device, and a designer modeled the turtle head in SI3D and connected the Waldo inputs to its facial animation shapes. Now we were able to record a live facial animation using the Waldo device, then perfect it in SI3D by editing the animation curves. After that, I modified the motion capture communication protocol to support motion control, and wrote a control driver to radio-broadcast the data which had previously been motion captured and edited in SI3D. With this pipeline it was possible to author a perfect animation sequence which could be played back identically at each shot. The project had an extra dimension of danger, since there were urban legends of actors getting mutilated by the blades attached inside the robotic heads. Something that keeps your mind busy when your head is inside… 😉
  • Fall of 1994, I went to Japan with Daniel Langlois for a two-week business trip, which mutated into a large effort between Softimage and SEGA to create a first generation of 3D game authoring tools. I was called to stay there for nearly 3 months on that first trip, and to pay frequent visits to Japan afterwards. I supervised the project and other developers helped me design and implement the features (3D paint, color reduction, SEGA Saturn export/import and viewer, raycast polygon selection, etc.). It was a very intense and exciting coding experience. Following this, SI3D quickly became the standard for 3D game authoring in Japan.
  • RenderMap – a project which I started after talking to a Japanese game designer, who explained to me how he was planning to use clay models to create the characters of his next game. They would scan the clay models into high-resolution polygon mesh models with colors at vertices, and then transform them into texture data by rendering front and back view images and re-projecting them onto the low-resolution model. With this method there is no control over how the texture is distributed on the geometry, and although it works for polygons facing the camera, the texturing quality quickly degrades as polygons face away at an increasing angle. Instead, I thought it would be better to fire rays along the surface to capture the color information, and if those rays could hit a high-resolution version of the object, then we could carry the texturing data from high-res to low-res. RenderMap solved this by using mental ray to fire rays at the intersection of triangles and texels (according to their location on the object) and accumulating the weighted color contributions into a final color per texel. You could carry the high-res data to a low-res model by placing the high-res model in the same location as the low-res one, but making it slightly bigger. Since it uses mental ray, it also makes it possible to burn procedural effects into texture maps, generate vertex colors, normal maps, etc. My colleague Ian Stewart implemented the much improved version in XSI.

    Manta model before applying RenderMap – we can see the results of rendering using mental ray in the render region.

    Texture data after being pre-rendered with mental ray using RenderMap.

    Manta model after applying RenderMap.
  • dotXSI file format – an initiative which I launched over 7 years ago, at a time when our customers were asking us to design a high-level 3D data interchange solution. After much discussion with my colleagues in the Games team, we finally opted to use the Microsoft .X format concept, with many new data templates to support features which were not covered in basic .X files. Since that time, the format has become very popular in the cg and games markets as a generic data interchange solution between authoring tools and pipelines. I was initially hoping to see a good level of popularity one day, but it has now reached well beyond those early expectations. This was also an opportunity for us to understand the fundamental differences between a high-level generic format and an optimized “memory image” data format which can be loaded and used directly by a game engine without further optimizations. From this, we derived our strategy behind the dotXSI file format and the FTK (File Transfer Kit), which is a library to read and write dotXSI files and which can be used to write converters between dotXSI files and optimized memory image formats tailored to any custom game engine (or between any other format and dotXSI). I came up with the “.XSI” name suffix, which was later adopted by the marketing team for the name of the XSI product itself, although I had no involvement with that decision.
  • April 99 – moving to Japan
  • various Special Projects with Japanese clients
  • misc. SI3D features like vertex colors, generic user data, GC_ValidateMesh, SI3D 4.0; and now XSI tools
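The core RenderMap idea described above — fire rays from the low-res surface, sample a slightly larger high-res stand-in, and accumulate weighted color contributions per texel — can be illustrated with a toy sketch. This is not Softimage code: it bakes a hypothetical procedural "high-res" stripe pattern onto the texels of a 2D circle, with a few jittered subsamples per texel standing in for the weighted accumulation.

```python
import math

def bake_texels(n_texels, subsamples=4, high_radius=1.05):
    """Toy RenderMap-style bake in 2D: for each texel on a low-res circle,
    fire rays outward along the surface normal, sample the colour of a
    slightly larger high-res circle, and average the contributions."""
    texels = []
    for i in range(n_texels):
        total = 0.0
        for s in range(subsamples):
            # Jitter the sample position inside the texel's span.
            theta = 2.0 * math.pi * (i + (s + 0.5) / subsamples) / n_texels
            # The ray along the normal hits the high-res circle here.
            hx = high_radius * math.cos(theta)
            hy = high_radius * math.sin(theta)
            # Hypothetical high-res "colour": a procedural stripe pattern.
            total += 0.5 + 0.5 * math.sin(8.0 * math.atan2(hy, hx))
        texels.append(total / subsamples)
    return texels

colours = bake_texels(64)
```

Because the sampling happens where the texel lives on the surface (rather than from a fixed camera view), the baked quality does not degrade for polygons facing away from a particular direction, which is the problem the front/back re-projection method suffered from.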

Working in Japan: Was it difficult to adjust to a new location and culture?
Not really. I got a good feel for Japan during my first trip; having a routine and working on-site at SEGA was a chance to experience what life would be like living here. I came back on business trips about a dozen times, my interest growing until I relocated in April 99. Adapting to the Japanese culture is a life-long effort, but Japan is very welcoming so this can be relatively painless. Of course I always miss my friends and family, but fortunately I go back a few times a year and we stay connected by mail and phone.

What is the biggest difference when working for a company in Japan compared to Canada?
Since I work from home the differences for me are very small. But traditional Japanese companies are more hierarchical than American ones, so you have to be aware of that. Also, your ability to connect with a team will completely depend on your ability to communicate in Japanese.

Do companies in Japan and North America work closely together on projects?
Many Japanese companies have branch offices in North America; however, I don’t know how much of that is new production work, regional adaptation or collaboration on joint projects.

How do Japanese customers differ from Canadian customers?
The main difference of course is the language. Also, Japanese companies employ many designers and programmers, so there are many sources of feedback reporting issues and submitting suggestions for improvements. Sometimes there are misunderstandings, or information is missing that is needed to reproduce a problem or to understand what a specific request is about or why it’s important, so I keep an eye on all incoming feedback from Japan and work with our support teams in Japan and Montreal, and our Japan resellers, to make sure we have all the information needed to fully understand and address customer requirements.

Another difference is related to the fact that most customers in Japan are game developers. It is normally easier for film companies to switch pipelines from one 3D application to another than for game companies, because the final output for film is a rendered movie, while for games it is most often data which needs to be compatible with a run-time game engine. Since a large portion of our customers in Japan are game developers, they tend to migrate pipelines at a slower pace than customers in other markets: they need to validate workflow and data compatibility with a new pipeline before they can adopt it in production. Japanese customers are very thorough in their analysis of new technologies and do not migrate without serious consideration. XSI has already been used in production for a while by many Japanese companies, and its popularity is now accelerating. This is very exciting to see!

When dealing with high-profile gaming companies, how do you meet their demands for new features that are not yet implemented?
We gather all incoming feedback from Japan, prioritize according to severity, popularity and the amount of work involved, and then plan the next version’s features. In cases where a customer requires something specific to them, like special training (features, SDK), assistance to set up / migrate a pipeline, or custom development, there is always the possibility of purchasing R&D time from Softimage through our Special Projects team.

Part of your job is “managing escalation of critical issues”. Can you give us an example?
If a customer reports something very bad, like a production showstopper or something which prevents the adoption of XSI in production, then we may escalate the issue internally and implement a fix which is provided to the customer as a QFE (Quick Fix Engineering). This is a service available to customers under maintenance. QFEs done in answer to issues in our last released product are always integrated into the next public release: either a point release (Service Pack) or the next main version.

Softimage 3D has long been a favorite of Japanese companies, and we are seeing some migration to XSI. Why has it been so hard for them to move to XSI more quickly?
There is a cost involved in migrating, since you need to train designers and port your pipeline (workflow, plugins) from one application to another. We released our first generation of game tools in SI3D at the time when the first generation of 3D game consoles was coming out: SEGA Saturn, Sony PlayStation, Nintendo 64. Since we designed this in conjunction with SEGA, it quickly became the de facto standard for 3D game authoring in Japan. Then, Alias released Maya in 1998. Although the first version was not ready for production use (Japanese customers called it “mada”, i.e. “not yet”), Alias had 2 years to work on Maya until we released XSI 1.0. Of course, we lost some users from SI3D to Maya during this period. It was a hard time for us, but we wanted to build a solid architecture from the beginning rather than something we would need to patch along the way. Now we can see this bet is paying off, as our development speed has greatly improved and is now imposing a pace of innovation which is getting difficult for our competitors to follow. With XSI 4.0, which includes the new powerful and affordable Foundation product, and many new advanced features throughout our product line, we have everything we need to accelerate the expansion of our user community.

What features that you originally developed for SI3D have found their way into XSI?
Polygon raycasting, vertex colors, generic user data, polygon reduction, dotXSI support, RenderMap, pipeline tools (export, viewing), etc. These were implemented in XSI by my colleagues in Montreal. I wrote a few things for XSI like a User Normal Editing interactive tool and some realtime shader examples which I published on XSINet. Now I am studying the new Xgs and CDH features of XSI 4.0 (which I think are really cool, btw!).

What features excel in XSI for game companies?
Character modeling and CDK ( Character Development Kit ), polygonal modeling, polygon reduction, texturing, RenderMap / RenderVertex, CDH ( Custom Display Host ) and Xgs ( Graphics Synthesizer ), Realtime Shaders, dotXSI pipeline and FTK, ability to attach generic user data to scene elements and have it automatically supported through the dotXSI pipeline, general ease and flexibility of customization using the XSI Net View, Relational Views, Synoptic Views, and finally, the XSI SDK itself which is rich and now provides strong support for UI customization, among many new things in 4.0.

Will 4.0 change the gaming industry in Japan, and the gaming industry in general?
Definitely. The low price point of our new Foundation product is opening up access to XSI to middle and low-end segments of cg production. It will also become easier for 2nd and 3rd parties to collaborate with high-end clients using XSI on joint projects.

As for the features of 4.0, there are many innovations which bring exciting new opportunities to our users. For example, the new rigid body dynamics are based on ODE (Open Dynamics Engine), an open source, royalty-free solution. It is possible for game users to adopt ODE in their run-time engine and create realtime simulations which are entirely compatible with how things behave in XSI (or you could simply plot/bake and export the animation if you do not want to recompute it in the game engine). Also, the CDK (Character Development Kit) brings all the tools needed for making custom character animation rigs.
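The plot/bake workflow mentioned above — run the simulation once, then record the result as keyframes so the game engine never has to recompute it — can be sketched minimally. This is not ODE or XSI code: it steps a single falling body with fixed-step semi-implicit Euler integration (a stand-in for an ODE world step) and bakes one keyframe per frame.

```python
def simulate_fall(y0, frames, fps=30.0, g=-9.81):
    """Step one falling rigid body and bake its height to keyframes.

    Semi-implicit Euler: update velocity first, then position, with a
    fixed time step of 1/fps. Returns a list of (frame, height) keys."""
    dt = 1.0 / fps
    y, vy = y0, 0.0
    keys = []
    for frame in range(frames):
        keys.append((frame, y))   # bake the current state as a keyframe
        vy += g * dt              # integrate velocity
        y += vy * dt              # integrate position
    return keys

# Bake one second of free fall (31 keys at 30 fps) starting at height 10.
baked = simulate_fall(10.0, 31)
```

Once baked, the keys are plain animation data, so they can be exported through any pipeline and played back identically every time, which is exactly the appeal of plotting a simulation instead of re-running it in the engine.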

The CDH (Custom Display Host) allows an external application to communicate with XSI and display its output in an XSI view. It is possible for XSI to drive an external application and vice versa; this external application could be a game engine synchronized with XSI, or any custom tool which can be interacted with from within XSI. The Xgs (XSI Graphics Synthesizer) makes it possible to create scene-level realtime rendering effects and can communicate with realtime shaders for advanced effects.

The Polygon Reduction tool in 4.0 is simply amazing, Texture Layers provide a very powerful workflow for multi-texturing management, Material Libraries simplify material management and the SDK is both rich and powerful.

XSI v4.0 is our biggest release since v1.0. It contains many other interesting features which are not listed here, as well as a large number of fixes for issues which were reported by our customers around the world.

What advice would you give to an artist from North America or Europe who wants to start working in Japan?
Learn the whole package. Designers in Japan do not tend to specialize in one thing; instead they are called on to work on many different aspects of production: modeling, character design, animation, texturing, custom data editing, etc. The more flexible and proficient you are with the package, the more easily you will connect with a Japanese production team.

Learn Japanese. Even though you can usually find a few Japanese people in each company who speak a good level of English, that is the exception rather than the rule, and it would be better not to rely on it to bridge the gap with your Japanese colleagues. The better you understand and speak Japanese, the more chances you will have to connect with the team and contribute to the project. An American friend of mine who worked at a game company in Tokyo learned conversational Japanese very quickly because he was completely immersed in a Japanese working environment, but if you can learn some before heading to Japan it will make things a lot smoother and expand your opportunities.

Friday Flashback #109

Softimage started with $350,000 in venture capital funding. Here’s some comments from Loudon Owen, who with John Eckert helped finance and advise Softimage in its growth.

Proceedings of the Standing Senate Committee on Banking, Trade and Commerce
Issue 51 – Evidence
TORONTO, Thursday, April 29, 1999
The Standing Senate Committee on Banking, Trade and Commerce met this day at 9:00 a.m. to consider the present state of the financial system in Canada (equity financing).

From the opening comments by Mr. Loudon F. Owen, Managing Partner, McLean Watson Capital Inc.:

When we started, we were trying to raise money for a company in Montreal called Softimage. We were carrying around our little flip books but nobody wanted to give us money. We were quite astonished because we thought it was an exciting opportunity. We spoke to American venture capital firms, we spoke to Canadian venture capital firms and we decided there was an opportunity for a highly specialized venture group, so that is what we set up. We invest exclusively in software companies. We were highly focused, driven by what we perceived to be a market need. That was quite a few years ago and I think the market has changed dramatically in the last five years. However, that was what gave us the impetus to go forward.

I do not know if you have heard about Softimage. It is an animation software company. If you have seen Titanic, Jurassic Park, Death Becomes Her or most of the commercials on the television, you will have seen Softimage’s technology. The company was funded with $350,000. The shares which we receive from Microsoft are today worth $2.2 billion. It has 400 employees in Montreal and it was instrumental in building the animation industry in Montreal. There have been a variety of spinoff companies such as Discreet Logic and other companies in Montreal, so the company grew pretty dramatically. The only venture capital that went in was $350,000. After that it went public on the NASDAQ.

Our role was to invest. John Eckert and I shared the duties of chief operating officer, and we took it public on the NASDAQ. It was the first Quebec company to make its initial public offering on the NASDAQ. We considered the Canadian markets and elected not to go public here. We then sold it to Microsoft. We took the company from the initial point of investment, with its four employees, including the founder, Daniel Langlois, to over 200 when we sold it to Microsoft.

On the question of whether Quebec was a hotbed of entrepreneurship [at that point in time, 1999] due to a more favourable regulatory and tax climate, or just because people are more into the culture of entrepreneurship:

Do hotbeds of technology or clusters grow naturally because they are sponsored and supported? Again, it is a combination. I think Montreal’s animation, post-production and special effects community grew without any government support. For example, neither Softimage nor Discreet Logic had any significant government support or tax breaks. In fact, we tried to sell our first product to the CBC, and they would not buy it. They bought a French product, so we had to go to France and sell our first product there.

These companies grew up indigenously through their own creative efforts. What is happening now to sustain those industries and help them grow with their larger working capital requirements is assisted by government efforts.

Loudon Owen is co-founder of McLean Watson Capital. Prior to establishing McLean Watson Capital, Loudon and John Eckert financed and advised Softimage, a world leader in high-end 3D animation, in its growth from 4 to 250 employees, its IPO on Nasdaq in 1992 and the sale to Microsoft in 1994. Loudon and John served as the Joint COO for Softimage from 1993 to its sale.

Friday Flashback #108

Hmmm…last Friday was the 19th anniversary of the Microsoft purchase of Softimage (15 Feb 1994). I really missed it on that one. Now I’ll have to wait for the 20th anniversary; hopefully I’ll still be doing Friday Flashbacks this time next year.

Anyways, on to this week’s flashback…From Jurassic Park (1993) to Gladiator (2001), a “representative sample” of motion pictures created with Softimage products.

Gladiator Mill Film 2001
Jurassic Park 3 Industrial Light & Magic 2001
Moulin Rouge 2001
The Mummy Returns 2001
Shadows Mitch Levine Director 2000
Star Wars: Episode 1 The Phantom Menace Industrial Light & Magic 2000
X-MEN Pacific Ocean Post 2000
Fight Club Pixel Liberation Front & BUF 1999
Forces of Nature Dreamworks Pictures 1999
Galaxy Quest Industrial Light & Magic 1999
Stuart Little Centropolis FX 1999
The Mummy 1999
Antz Pacific Data Images & Dreamworks Pictures 1998
Babe: Pig in the City Animal Logic 1998
Deep Impact Industrial Light & Magic 1998
Deep Rising Industrial Light & Magic 1998
Fear & Loathing in Las Vegas Peerless Camera 1998
Flubber Industrial Light & Magic 1998
Godzilla Centropolis 1998
Jack Frost Industrial Light & Magic and Warner Bros 1998
Jurassic Park 2 Industrial Light & Magic 1998
Lost in Space Framestore 1998
Matrix Animal Logic 1998
Meet Joe Black Industrial Light & Magic 1998
My Favorite Martian Tippett Studio 1998
Prince of Egypt Dreamworks Pictures 1998
Saving Private Ryan Industrial Light & Magic 1998
Small Soldiers Industrial Light & Magic 1998
Snake Eyes Industrial Light & Magic 1998
Species II Digital Magic & Transfer 1998
Sphere Cinesite 1998
The Borrowers Framestore 1998
The Thin Red Line Animal Logic 1998
What Dreams May Come Pacific Ocean Post 1998
A Simple Wish Blue Sky 1997
Air Force One Cinesite 1997
Alien Resurrection Blue Sky – VIFX 1997
An American Werewolf in Paris Santa Barbara Studios 1997
Anastasia Fox Animation Studio 1997
Batman and Robin BUF Compagnie 1997
Contact Sony Pictures Imageworks and Weta Ltd. 1997
Men in Black Industrial Light & Magic 1997
Mortal Kombat: Annihilation The Digital Magic 1997
Spawn Industrial Light & Magic 1997
Speed 2 Industrial Light & Magic 1997
Starship Troopers Tippett Studio 1997
Star Wars Trilogy Industrial Light & Magic 1997
The Edge Peerless Camera 1997
The Fifth Element Digital Domain 1997
The Lost World Industrial Light & Magic 1997
The Relic VIFX 1997
Titanic Digital Domain 1997
101 Dalmatians Industrial Light & Magic 1996
12 Monkeys Peerless Camera 1996
Dragonheart Industrial Light & Magic 1996
Eraser Mass Illusion 1996
Joe’s Apartment Blue Sky 1996
Mars Attacks! Industrial Light & Magic 1996
Mission Impossible Industrial Light & Magic 1996
Space Jam Industrial Light & Magic 1996
Star Trek: First Contact Industrial Light & Magic 1996
Surviving Picasso Peerless Camera 1996
T2-3D Digital Domain 1996
The Adventures of Pinocchio MediaLab 1996
The Frighteners Weta Ltd. 1996
The Island of Dr. Moreau Digital Domain 1996
Balto Amblimation 1995
Casper Industrial Light & Magic 1995
Judge Dredd 1995
Jumanji Industrial Light & Magic 1995
La Cite des Enfants Perdus BUF Compagnie 1995
Star Trek: Generations Industrial Light & Magic 1994
The Flintstones Industrial Light & Magic 1994
The Mask Industrial Light & Magic 1994
The Shadow R/Greenberg & Associates 1994
Death Becomes Her Industrial Light & Magic 1993
Jurassic Park Industrial Light & Magic 1993

Friday Flashback #103

Just about 13 years ago to the day, the URL went live.

Sumatra is Coming from

Rather predictably, this sparked some debate on the mailing lists, with a number of different riffs on the URL, including “”:

Sorry Softimage, your software has served me well, but it’s time to wake up and smell the coffee. You sat around on your ass too long while I watched everybody around me switch to Maya, now it’s my turn. I’m actually excited to learn Maya, it seems like it’s creators are willing and able to stay up-to-date and on the cutting edge.


3d Discussion archive via the Wayback machine

Friday Flashback #99

In Dec 2012, there’s a rumor that ICE is “going to Maya” that’s causing some concern. Nobody wants to lose ICE 🙂

Let’s flash back five years to Dec 2007 when, in something of an ironic counterpoint, there was concern that rampant speculation about Moondust (aka ICE) would result in disappointment and a negative backlash.

…looking at all the nonsense floating around on the forums about Moondust, I already can see the negative posts when people realize it doesn’t do feature XYZ…

…It’s not going to go well for Softimage at launch if Moondust doesn’t meet expectations, and at this point, I’d be willing to bet that it won’t…

…Although it’s fun to speculate about Moondust, the over excited anticipation can only lead to disappointment…

Looking back, I don’t think that ICE did disappoint. What do you think?

web.archive of page

Friday Flashback #97

I came across this SOFTIMAGE|3D photo in an article on rotoscoping. It shows ILM co-supervisor Tom Bertino working on one of ILM’s Flubber shots.

Once the background plate was scanned into ILM’s Silicon Graphics computers, the match movers went to work. “We’re able to bring up that clip in the computer in a Softimage 3-D environment,” says Bertino. “The matchmovers then took what’s seen on film and recreated it in primitive wireframe models.”
Breaking the Mold: Physics of Jell-O Inspires CGI Stars of Flubber

A little more time on Google led me to some postings on vimeo from Philip Edward Alexy, who was the lead technical animator on Flubber.

First of all, sorry for the quality: this was ripped from a DVD copy of a D-beta tape.
As you can see, there is a heck of a lot more going on than you would think for this shot. As you see at the beginning, there is the Blob Flubber sitting in the matchmove representation of Robin Williams’ hand. Now keep in mind, back then, all of the matchmove stuff, both camera and object geometry, was HAND-ANIMATED. There was a crew of guys from the old practical ILM shop who transferred to the digital side: some of these guys had worked on “Empire Strikes Back” and onwards, so they knew how cameras worked and were able to use this experience to do the one thing that made ILM stand out back then: properly reconstruct scene and camera information in the computer.
So we have the Blob sitting there, with what appears to be some sort of orthopedic back-brace and a black fuzzy alien sitting in its belly. Well, the “brace” is in fact the up-vector construct I had to develop, because the Meta-Clay elements that made up the Blob Flubber were not spherical; they were shaped like an overlapping mass of blobby M&Ms, because the client wanted to get away from the “pear-shaped” Flubber that spherical Meta-Clay created. BUT, SOFTIMAGE|3D didn’t have an up-vector constraint (or if it did, it did not work well at all), so when they were lined up on the cluster-deformed path spline that held them in place, they would start flipping randomly along the shortest axis. This was bad because it looked like the Flubber was having a seizure when animated. So I had to invent an up-vector constraint that worked consistently. So that’s what the “brace” is, which had to be, at times, key-framed to prevent the flipping.
So what’s that alien? Why, it’s the Puppy Flubber rig, elements and geometry, all compressed, waiting for the moment Robin Williams sticks his fingers into the Blob Flubber. Presto-change-o: without any quick cutting or changing of the scene file, because of the nature of Meta-Clay, the Puppy Flubber pops up, all ready and IK-rigged, and the Blob Flubber lines up inside the body part of the Puppy.
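The up-vector fix Alexy describes — preventing elements on a path from flipping by supplying a consistent secondary direction — is essentially the standard up-vector frame construction. Here is a minimal hypothetical sketch (not the actual ILM tool): derive a side axis from a fixed world up and the path tangent, then re-orthogonalize the up, so every frame along the spline shares a stable orientation instead of spinning around the shortest axis.

```python
def frame_along_path(tangent, up=(0.0, 1.0, 0.0)):
    """Build a stable orientation frame from a path tangent and a fixed
    up vector: side = up x tangent, then corrected up = tangent x side.
    As long as the tangent never becomes parallel to the up vector, the
    resulting frames vary smoothly along the path and never flip."""
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])

    def norm(v):
        m = sum(c * c for c in v) ** 0.5
        return tuple(c / m for c in v)

    t = norm(tangent)
    side = norm(cross(up, t))   # perpendicular to both up and tangent
    u = cross(t, side)          # re-orthogonalized up: no random flipping
    return t, side, u

# A path segment heading along +X gets side -Z and corrected up +Y.
t, side, u = frame_along_path((1.0, 0.0, 0.0))
```

The degenerate case (tangent parallel to the up vector) is presumably why the construct still had to be hand key-framed at times, as the posting mentions.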

The puppy design, by Scott Leberecht, had to be envisioned with the tools of the time, which meant Meta-Clay balls in SOFTIMAGE|3D. If you have ever used that tool, you will understand what a task it was to get the right shape, and then rig it so it could be animated.

At the time, it was the densest CGI structure ever made. There were about three hundred Meta-Clay elements, all spine/spline/cluster controlled. It took about two minutes just to refresh to the next frame. It took me about two months to build and rig.

A bit of test animation, to show that ILM could actually DO the Character Flubber, that ended up in the official Disney trailer.
Little bit of trivia: all those bubbles you see? They’re not part of the shader: those are all individual pieces of geometry that are parented to the rig. Sometimes, because they would fly out of the mesh depending on the pose, they had to be key-framed.

This one shot took a year to do. Seriously.

Finally, in a Word document at, I found this. It’s attributed to a no-longer existing page at Philip Edward Alexy’s web site.

“We had thought of doing something where we could use B-spline patches that we could shape animate over time. But, that wasn’t practical because Flubber changed so much within a sequence that it would have been too time-prohibitive to model all the different forms. Even when he was just a little blob he changed so much that to do it using patches, shape animation and lattices just wouldn’t work.”

“Since the Flubber character was composed almost entirely of metaballs, the animators could easily turn him into anything from a pair of lips to a tail-wagging puppy to a hip-shaking mambo dancer. In addition to Softimage, ILM developed several custom effects to turn a blob into everything that blobs could possibly become within the animators’ collective imagination. Several Flubber models were developed: the Basic Blob, a male and female Actor-Flubber, a Scare-Flubber, a Puppy-Flubber, a Fingers-Flubber, a Bubble-Flubber and several others – each more difficult to pronounce in rapid succession.”