
Millimeter, July 1998

Expanding the Files: Pyro, Prosthetics, Models, and CGI All Play a Role in X-Files Film

By Erich Spencer


The mysterious nature of The X-Files TV universe has seamlessly transitioned to the big screen with The X-Files movie, not only in terms of the storyline, but also in the philosophical approach producers took toward special effects.

As with the TV show, the film mixes high-tech and conventional techniques, including the liberal use of prosthetics, animatronics, and digital effects (around 300 CGI shots). The end result satisfies the effects strategy of series creator and executive producer Chris Carter: "All visual effects must enhance the believability of the work."

To meet Carter's mandate while staying within the approximate $60 million budget and tight deadlines, the film's digital team frequently had to improvise in its collaboration with artists from several Hollywood digital effects shops. The bulk of the work was done at Light Matters/Pixel Envy, Pacific Palisades (co-owned by visual effects supervisor Mat Beck), and the L.A. division of Blue Sky/VIFX. The creative team of Carter, director Rob Bowman, producer Dan Sackheim, co-producer Frank Spotnitz, and Beck were all veterans of the TV series, which made the ambitious plan for the film feasible. In fact, several effects sequences come directly out of the TV show's mythology, including the mysterious "black oil."

That substance, which may or may not be an alien life form, plays a central role in the "Neanderthal sequence" that features a boy and a living Neanderthal squirming in pain because the black oil "really screws up their skin," explains Beck.

That sequence and the devastating explosion that destroys a building in Dallas early in the film typify the willingness of X-Files filmmakers to experiment with traditional techniques.

The challenge of the explosion scene was to focus on the devastating aftermath, seconds after the big blow-up. Yet, filmmakers also needed to show the power of the explosion itself. "The city of Los Angeles wouldn't give us a permit to blow up a real building, and our budget only provided money for us to build one miniature," explains Beck. "So we had to make some tough choices."

Hunter Gratzner Industries, Los Angeles, built a 23-1/2-foot-tall "aftermath" miniature of the devastated building. "Then, we did a detailed CG pre-visualization of what we needed for the explosion itself and went back to the city, which gave us permission to build a false front onto part of the Unocal building in downtown L.A. for a small, practical explosion," Beck adds.

The pre-viz, created at Light Matters/Pixel Envy in Alias software before Blue Sky/VIFX took over for the final shot, allowed filmmakers to figure out how to create the illusion that the entire building was exploding. The answer: mix shots of two small, practical explosions involving two separate facades. The first was a false front attached to the real Unocal building; the second was a model version of the facade attached to the front of the aftermath model.

Thanks to the pre-viz, filmmakers were able to determine the correct camera angle in mid-air over the building for a POV that captures the fireball rising up as the building collapses around it. That allowed them to shoot separate explosions with crane-held cameras positioned at exactly the same angle over both false fronts. The model and practical elements were comped together in Inferno by Blue Sky/VIFX artist Cesar Romero.

During the composite, Romero added portions of the miniature facade onto the background plate of the real building. He mixed fireball shots and other elements from the collapsing lower part of the building (taken from the explosion conducted at the real Unocal building) with fireball shots and other elements from the collapsing higher part of the building (taken from the explosion involving the false front attached to the model). "We put it all in a box, shook it, and out came the explosion," says Beck.
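Beck's description of the comp boils down to layering separately shot elements over a background plate. As an illustrative sketch only (the actual work was done in Inferno, and the plate values here are invented), the standard premultiplied "over" composite used to stack such elements can be expressed in a few lines with NumPy:

```python
import numpy as np

def over(fg, bg):
    """Standard 'over' composite for premultiplied RGBA float images:
    the foreground element covers the background in proportion to its alpha."""
    alpha = fg[..., 3:4]
    return fg + bg * (1.0 - alpha)

# Hypothetical tiny plates: a flat gray background building and a
# partially transparent fireball element shot against black.
h, w = 4, 4
bg = np.zeros((h, w, 4))
bg[..., :3] = 0.2          # gray building plate
bg[..., 3] = 1.0           # fully opaque background

fireball = np.zeros((h, w, 4))
fireball[1:3, 1:3] = [0.9, 0.4, 0.1, 0.8]  # premultiplied orange, 80% opaque

comp = over(fireball, bg)  # fireball element layered onto the building plate
```

Real comps stack many such elements (sparks, debris, smoke) in sequence, each "over" the result of the last.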

But the scene posed other challenges, as well, including complicated tracking work for a shot in which stars David Duchovny and Gillian Anderson walk through smoke in front of the wrecked building. For that shot, producers wanted to shoot the actors in front of a green screen and then digitally insert a shot of the aftermath model behind them, but they wanted to do it without the expense of a full motion-control camera setup.

Their solution was to combine a computerized match move with elements of a motion-control shot taken without use of a full motion-control rig.

"We used a motion-control head with encoders attached, but not a motion-control dolly, when we shot the actors," says John Wash, senior visual effects supervisor at Blue Sky/VIFX. "That gave us enough data in the computer to use for verification as we duplicated the camera move (using Ras-Track software from Hammerhead) before it moves around the building."

The move that continues around the building was done "full pan-and-tilt, with the conventional dolly no longer moving," adds Beck. That also provided helpful motion-control data when filmmakers used a conventional motion-control setup (provided by General Lift, Los Angeles) to shoot the aftermath miniature with elements such as sparks, flames and falling debris. Blue Sky/VIFX Inferno artist John Heller later combined the shot with the live-action plate to create the final camera move.

Improvised tracking also turned out to be the key to getting the Neanderthal sequence just right. Because nasty "blood worms" wiggle on the actors' bodies as they squirm, the sequence required the creation of CG body parts to replace certain limbs. That meant attaching reflectors onto the actors to record tracking data, but there were some twists.

Simple green stick-on reflectors were used on actor Lucas Black (who played the boy), but tracking the Neanderthal (played by Carrick O'Quinn) was more complicated. That character had to be shot in shadows (the boy was shot in daylight), and the camera could not pick up reflections from normal tracking markers without using ultra-bright lights. So filmmakers improvised again.

"I designed a soft ring-light rig to fit around the lens of the camera," Beck explains. "We then attached tiny reflective markers to the actor made out of Scotch Light material, which were able to reflect a softer light and were virtually invisible when no light was on them. Then, we rigged the ring light to flash on every other frame as we shot 48 frames per second. That allowed us to get the tracking data, while still giving us shots of the actor that we could use without needing to do dot removal. Because we shot at 48 frames a second, we were able to use every other frame for the tracking data and the others for the shots seen in the movie played back at 24 frames a second. The technique wouldn't work if the character needed to move really fast, but the Neanderthal is slow and lumbering."

In the case of the boy played by Black, one arm and his face are entirely digital. Tracking the CGI to the live action and then compositing the whole thing together was a complicated affair.

Unlike the tracking shot of Duchovny and Anderson in front of the wrecked building, tracking work on the Neanderthal sequence was performed by Colin Strause, 3-D supervisor at Light Matters/Pixel Envy, without special tracking software. Instead, Strause first animated CG limbs, blood, and other elements in Alias Power Animator (version 8.2) and then spent months painstakingly hand-tracking them onto the sequence, pushing both the Alias software and the Amazon Piranha Compositor (version 3) from Interactive Effects further into the tracking realm than ever before.

"We didn't use special tracking software because none of the ones available could match the elements we needed to match since the moving body elements themselves have moving blood worms on top of them," says Strause.

He goes on: "We felt it was better to do it frame-by-frame. It was painful and took a few months, but it was worth it. We would play back the comp frame-by-frame, counting off how many pixels were off in each frame, and then hand-move the CG body parts where they needed to go. We could do it because Piranha has real-time film playback. That allowed us to render the CG in Alias, comp the two together in Piranha, and then view the tracking shots in the compositor and do 2-D image stabilization."
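Strause's frame-by-frame method amounts to measuring the pixel offset between where the CG element landed and where it should sit on the plate, then nudging it by that amount. A toy sketch of that loop, with invented coordinate lists:

```python
def hand_track(cg_positions, target_positions):
    """Per frame: count how many pixels the CG body part is off from its
    target on the plate, then move it by exactly that offset."""
    corrected = []
    for (cx, cy), (tx, ty) in zip(cg_positions, target_positions):
        dx, dy = tx - cx, ty - cy  # pixel error read off the comp
        corrected.append((cx + dx, cy + dy))
    return corrected

cg = [(200, 310), (204, 312), (209, 315)]     # where the CG rendered, per frame
plate = [(202, 311), (205, 312), (208, 316)]  # where it should sit on the plate
fixed = hand_track(cg, plate)
```

The value of real-time film playback in Piranha was precisely that the artist could eyeball these offsets at speed rather than computing them; the loop above is the manual process, not any software the team used.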

The technique was especially useful in shots involving Black's face, which was so covered with blood worms that it required a complete CGI replacement. Since Black was flexing his facial muscles, Strause had to hand-animate the muscles on the CG model of the face (built from a Cyberware scan of Black's face). Matching them up was done with tracking reflectors, but not from the live-action shoot. Rather, Strause replicated that concept inside the computer, performing what he calls "real-time, 2-D tracking."

"We went into Alias and placed digital tracking dots onto a simple 2-D version of the model," he explains. "Then, on the 3-D face, I put the same blue dots in about the same places based on flecks of dirt or moles on the face. When I lined them up in Piranha, I could see where they matched up and where they didn't. Then, I could push the muscle around until they fit. That was the first time I ever tried anything like that."

Strause later went into Amazon 3-D Paint and projected the real shot of Black's face directly onto the 3-D model version of his face. This allowed him to add dirt and shadows exactly where they were located in the original shot. The CGI face was made more realistic because it was textured with real skin from the actor.

© 2003, PRIMEDIA Business Magazines & Media Inc.

