Visual Effects 3: Morphing to Digital
George Lucas was never satisfied with the technical tools of filmmaking. Suddenly, he had the money and the power to change them. Crucially, he was willing to share these new tools with other filmmakers, at a price, and generally a very high one.
Using electronic tools to edit or alter images was one thing. Creating the images was another, far harder job. Old-school analog TV didn’t have nearly enough detail to stand comparison with movies. Not even close. Optical scientists ran experiments to determine just how much television would have to improve to equal the appearance of film. The difference in visual resolution between the two was so great that it was hard to even measure it meaningfully. It would be like trying to reach the Moon by climbing a higher tree. Or so it seemed at the time. That verdict would change.
The first crude digital moving images were made in the mid-Sixties, by computers at Bell Laboratories and by the BESM-4 system in Russia. In 2001: A Space Odyssey, a few seconds of (supposedly) computer animation on a monitor show a dish antenna. In actuality, the tiny sequence was hand-drawn; real digital animation barely existed yet. But by the time Star Wars was made a decade later, it did. The plan of the attack on the Death Star is illustrated on a small screen with simplified wire-frame animation. At the time, that primitive “machine drawing” was state of the art.
The last major use of this technique in motion pictures was in the trailer and the opening credits of The Black Hole (1979). It was an impressive-looking way for the green grids of wire frame to leave us. They’d done their job in VFX history. From this point on, graphics pioneers like Utah’s Evans & Sutherland, and hardware companies like Sun and Silicon Graphics, made the leap to adding textured surfaces to those wire frames, creating solid-looking objects of greater and greater realism. Television, as noted, had more forgiving visual standards than feature films, so some very limited bits of digital animation began to be seen in TV commercials.
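To make that leap concrete: a wire-frame image is nothing more than a 3-D model’s edges projected onto a flat screen. Here is a minimal Python sketch of the idea, using a toy cube and made-up camera numbers rather than anything from an actual studio pipeline.

```python
# A toy wire-frame renderer: project a cube's edges onto a 2-D screen with a
# simple perspective divide. The cube, camera distance, and scale are illustrative.

# A cube centered on the origin: 8 vertices, plus the 12 edges connecting
# vertices that differ in exactly one coordinate.
vertices = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
edges = [(a, b) for a in range(8) for b in range(a + 1, 8)
         if sum(va != vb for va, vb in zip(vertices[a], vertices[b])) == 1]

def project(point, camera_distance=4.0, screen_scale=100.0):
    """Perspective-project a 3-D point onto the screen plane."""
    x, y, z = point
    depth = camera_distance - z  # distance from the virtual camera
    return (screen_scale * x / depth, screen_scale * y / depth)

# Each edge becomes one 2-D line segment; a vector display or plotter simply
# draws the lines, which is all a wire-frame image is.
segments = [(project(vertices[a]), project(vertices[b])) for a, b in edges]
print(len(segments), "line segments to draw")
```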
Disney’s TRON (1982) was hot stuff in its day. Not every scene was digital, but the ones that were attracted lots of attention. Nowadays, the visual calculations that made TRON have long since been bested by the on-the-fly rendering of ordinary home video game consoles, but forty years ago, producing this film required the services of a Cray-1, the pride of Chippewa Falls, Wisconsin, then one of the fastest and most expensive computers in the world.
That same year, Star Trek II’s “Genesis Effect” became the first semi-realistic computer-generated landscape in a feature film, benefiting from graphics software that exploited a reinvigorated mathematical field: fractals. It was created by a special Lucas-owned digital unit, distinct from ILM, that would eventually be spun off under a new name: Pixar.
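To give a rough sense of why fractals suited the job: recursively displacing midpoints by ever-smaller random amounts produces jagged, natural-looking detail from almost no input data. The Python sketch below shows the one-dimensional version of that idea; the Genesis sequence itself used far more sophisticated two-dimensional techniques, and every parameter here is illustrative.

```python
import random

def midpoint_displacement(left, right, depth, roughness=0.5, scale=1.0):
    """Return a jagged ridge line as a list of heights (2**depth + 1 samples)."""
    if depth == 0:
        return [left, right]
    # Displace the midpoint by a random amount, shrinking the range each level;
    # self-similar detail at every scale is what makes the result look natural.
    mid = (left + right) / 2 + random.uniform(-scale, scale)
    left_half = midpoint_displacement(left, mid, depth - 1, roughness, scale * roughness)
    right_half = midpoint_displacement(mid, right, depth - 1, roughness, scale * roughness)
    return left_half + right_half[1:]  # drop the duplicated shared midpoint

ridge = midpoint_displacement(0.0, 0.0, depth=8)
print(len(ridge), "height samples")  # 257 points of mountain profile
```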
The next step was a fully digital character. That happened in 1985 with the Stained-Glass Knight, who appeared for about half a minute in Young Sherlock Holmes. Since it was a fantasy being, the then-still-major shortcomings of digital didn’t matter. Same with James Cameron’s The Abyss (1989): the mysterious “water snake” was a purely digital invention that looked realistic enough to be treated as a real object. It was a new milestone in effects. But it was, like the others, a fundamentally unreal creature that an audience couldn’t compare to anything in the real world. Digital had proven able to do fantasy. Reality was still out of reach.
As the Eighties ended, the vast majority of VFX was still done on film all the way, and so were movies in general. Up through the end of the century, they remained film-based. The jazzy new effects were computer-based, but every Hollywood production still originated on 35mm Eastman Color Negative, and the final product still shipped around the world in heavy steel cans.
Gradually, the Society of Motion Picture and Television Engineers revisited the question of how much finer an electronic image would have to be to equal the perception of standard 35mm movies. It turned out that with digital, the jump was not nearly as unattainable as it once seemed. Those Fifties experiments compared television to original 35mm slides. But no movie audience ever sees original film; at the very best they see a copy of a copy of a copy. At each duplication step there are microscopic line-up errors, and each generation loses a little sharpness, so the image that actually reaches the theater is far softer than the pristine original.
By 1991, in Terminator 2, James Cameron was using digital to produce a liquid-metal robot in human form. It wasn’t quite photorealistic, but it was getting there. (The most amazing and terrifying effect in T2 is the Hiroshima-like scene of the nuclear blast, but it doesn’t quite fit here because it was a mixture of techniques.) That same year, this pioneering “morphing” technology was used most strikingly in Michael Jackson’s music video Black or White, whose startling person-to-person transformations come across as a surprisingly early ad for identity fluidity.
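For the technically curious, a morph boils down to two operations: warp both images toward a shared in-between geometry, then cross-dissolve them. The Python sketch below (using OpenCV) is a bare-bones version of that idea with a handful of hand-picked correspondence points; production morphs relied on dense, hand-animated meshes, and the file names and point coordinates here are purely illustrative.

```python
# A bare-bones morph: warp both images toward an in-between geometry, then
# cross-dissolve them. Points and paths below are hypothetical placeholders.
import cv2
import numpy as np

def morph_frame(img_a, img_b, pts_a, pts_b, t):
    """Blend img_a into img_b at parameter t in [0, 1]; images must match in size."""
    # Interpolate the matched feature points to an in-between shape.
    pts_mid = ((1 - t) * pts_a + t * pts_b).astype(np.float32)

    # Estimate a simple affine warp from each source toward that shared shape.
    warp_a, _ = cv2.estimateAffinePartial2D(pts_a, pts_mid)
    warp_b, _ = cv2.estimateAffinePartial2D(pts_b, pts_mid)

    h, w = img_a.shape[:2]
    a_warped = cv2.warpAffine(img_a, warp_a, (w, h))
    b_warped = cv2.warpAffine(img_b, warp_b, (w, h))

    # Cross-dissolve the two warped images.
    return cv2.addWeighted(a_warped, 1 - t, b_warped, t, 0)

# Usage: a 20-frame morph between two same-sized face images (paths hypothetical).
face_a = cv2.imread("face_a.png")
face_b = cv2.imread("face_b.png")
pts_a = np.float32([[120, 140], [200, 140], [160, 220], [160, 300]])  # eyes, nose, chin
pts_b = np.float32([[115, 150], [205, 150], [160, 230], [158, 310]])
frames = [morph_frame(face_a, face_b, pts_a, pts_b, t) for t in np.linspace(0, 1, 20)]
```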
It was Jurassic Park, two years later, that became, in effect, the Jazz Singer of digital character animation, a hit that would forever change audience expectations of what the movies can do. 1993’s audiences were wowed. They’d never seen anything approaching this level of magic. Steven Spielberg bet his whole movie on the effects being believable, and won. The value of CGI was proven. Hollywood never looked back.
Compare the digital visual effects of 1995’s Apollo 13 with those of roughly its Reagan-era equivalent, the film-based effects of 1983’s The Right Stuff. They’re both films that get a great response from a packed theater. The often-improvised, relatively low-tech techniques used in The Right Stuff are slightly funky and impressionistic, more or less matching the director’s tone, irreverent if not outright snarky. The digital effects of Apollo 13 are flawless and precise, resurrecting long-gone visions like the majesty of a Saturn V launch. (Apollo 13 also used some practical effects and models.)
Periodically, filmmakers set themselves the challenge of doing things with minimal or no CGI. Oppenheimer is a contemporary example, but from the very beginning of the digital era there have been films, like Francis Ford Coppola’s Bram Stoker’s Dracula (1992), that made a big point of not taking the easy way out, of staying old school.
Pre-Eighties visual effects were done by old guys with lots of experience but little technical education, who had apprenticed twenty years earlier on finicky equipment that was ancient even then. There were only so many old FX guys and old optical printers to go around; that bottleneck was one reason Lucas wanted to change the system. Nineties CGI brought new flexibility to VFX production. When Independence Day had too many effects shots for one company, the producers simply split the rush work among several FX houses. There was no custom equipment: the graphics workstations were leased in bulk, and the same software package was used across the whole enterprise.
Over the following years, motion picture special visual effects companies hired thousands of computer-trained graphic arts graduates, often right off campuses. The Star Wars and immediate post-Star Wars world of special effects was dominated by Boomers; the digital revolution brought Gen X into power in a big way. And for the first time, the numbers of women and men on movie tech credits began to equalize.
After setting HDTV standards, SMPTE got around to filmless movie theaters. More than 25 years ago, it determined that theatrical digital video of roughly 4K (in today’s terms) would do the job. When silent movies gave way to sound, almost every piece of filmmaking equipment had to change, and audiences saw and heard the difference immediately. But seventy-plus years later, when film gave way to digital at both ends of the movie pipeline, the change was barely noticeable to most people. Visual effects no longer had to be converted to film; they simply joined the normal digital editing workflow.
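For a rough sense of the scale involved (my arithmetic, not SMPTE’s study), compare the pixel counts of digitized standard-definition television, HDTV, and DCI 4K digital cinema. The jump from TV to theatrical quality works out to roughly a factor of thirty in pixels, large but hardly unattainable.

```python
# Rough orders of magnitude: pixel counts of common digital video formats.
formats = {
    "SD television (640 x 480)": 640 * 480,
    "HDTV (1920 x 1080)": 1920 * 1080,
    "DCI 4K cinema (4096 x 2160)": 4096 * 2160,
}
sd = formats["SD television (640 x 480)"]
for name, pixels in formats.items():
    print(f"{name}: {pixels / 1e6:.1f} megapixels, about {pixels / sd:.0f}x SD")
```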
By the turn of the century, movies like Gladiator used digital to do the things that only it could do, like aerial helicopter-style shots of crowded ancient Rome at its height. (It also brought deceased star Oliver Reed back to life for one necessary scene.) Overuse of CGI has been a recurring complaint in the nearly quarter century since, but for better or worse, 21st century audiences responded to the lavish spectacle that visual effects enabled. Studios backed these Cinematic Universes because people paid to see them.
George Lucas had wanted his prequel trilogy to be digital from start to finish, but when he began shooting, there were still few top-quality digital cameras, and the theaters weren’t ready in 1999. By the time of the next film in 2002, though most theaters still ran film, there were already a sizeable number of digital screens. George could justly boast in ads that a DVD copy of Star Wars: Episode II – Attack of the Clones was “a perfect clone.” From the point of view of copyright security, that boast would become a two-edged sword.
Ever since CGI pasted a reasonable facsimile of Oliver Reed’s face into Gladiator, it has been pressed to do more. When Paul Walker died in a car crash in 2013, his scenes in Furious 7 were completed with CGI that was vastly more capable than it had been 13 years earlier.
It’s never wise to bet against the wizardry of visual effects. That earlier post about dealing with the mortality of actors mentioned the decades-long SF concept of “synthespians”, actors reanimated by, uh, animators. Actors are now de-aged, sometimes pretty convincingly, and dead ones have briefly been brought back to life, usually not quite as convincingly. Yet. Nonetheless, it’s no small deal to Hollywood, currently embroiled in simultaneous strikes of writers and actors, partly motivated by fear of artificial intelligence. It’s real enough to them.
Visual effects has become a high-technology field, but its creative directors and business managers have been caught flat-footed by AI. VFX houses are going to have a tough time staying around if all a director has to do is say, “Okay, ChatGPT, a view of Saturn from its moon Titan,” and it appears on the screen instantly. With AI entering the writing arena as well, it’ll be able to do the whole job: dreaming up stories, drawing the backgrounds, composing the music, filling in realistic moving and acting images of background extras in whatever numbers are needed, and even doing the acting.
What’ll that do to Hollywood’s workforce? I’d like to be able to say: only a human can truly know what it is to be a human, so real artists will never be replaced for the big stuff, the prestige shows and films. For many of the simpler tasks, though, like illustrated textbook lessons, corporate training, and documentaries, those artists wouldn’t be replaced (well, not completely, and certainly not at first) so much as undercut by less soulful but cheaper alternatives. If this succeeds, is accepted, and makes money, they’ll try synthe-soaps, in Latino daytime TV markets first. How soon could that happen? Not long ago I would have said at least a generation or two away (in practical terms, say 15 to 30 years). Now? A guess: at this rate, fewer than five years.
In little more time than that, your smart TV, or even your smartphone, might well be a “thin client” that taps networked services to make up its very own waste-of-time entertainment: its own basic-cable-quality shows and movies, generated on the fly, tailored to your tastes, starring anyone whose image service you subscribe to, or starring you. We’ve come a long, long way since Georges Méliès took a trip to the Moon.
As alluring as some of that might sound, my bet is that it won’t take over everything. Completely machine-generated entertainment might very well have a dazzling start, then hit a limit of popular interest and acceptance, the way disco, polyester, and Atari did.
Hollywood has been re-learning a painful, periodic lesson, one that slaps it in the face every couple of decades or so: this is a business of hits. Artists have hunches, make bets that others will find the same things interesting that they do. Software can’t do that.
These articles are derived from lectures, talks and web posts. Most have also been posted on Ricochet.com.