Product Launches That Fail
Let's be honest: it's only human to get a kick out of the spectacle of powerful, infinitely rich companies making a disastrous entry into a splashy new field, like Google's recent epic fail with Gemini AI. These are rare, special cases of plunging right over the edge, involving an unusually choice degree of hubris and corporate humiliation.
Thirty years ago, it was Apple's turn in the barrel when it introduced Newton, predecessor of today's tablets and smartphones. To be fair, new products often don't succeed, even if they aren't paradigm-shaking novelties. Most of the time there's nothing unusual, let alone disgraceful, in that normal winnowing process. It doesn't usually involve becoming an instant, media-wide, national laughingstock that spoils the launch of a Whole New Thing. Yet at the time, Newton seemed almost miraculous.
Back then, business people often carried Day Runners, a trendy brand of small, totable loose-leaf notebook with calendars, notes, and contact info. Newton let you carry all that, reduced to the size of a paperback book and easily kept in sync with your desk computer. You could scribble notes to yourself, and—here comes the hangup—it could even read your handwriting. It was that one feature that led to ridicule, because Apple released it too soon, after insufficient testing. Product designer Jony Ive joined Apple to revise the early Newton, and Steve Jobs would keep him on for every major product launch thereafter. A year later, handwriting recognition worked much better, but by then the PR damage had been done.
Newton's main problem was that it wasn't networked. WiFi didn't exist yet. When your 2024 smartphone does handwriting and voice recognition, it is largely acting as a thin client: much of the work is done in the cloud and instantly returned to your screen. Newton, by contrast, had to do everything all by itself.
In retrospect, Apple's big mistake was overpromising, with a pretentious ad campaign that implied a lot more AI than the little MessagePad could deliver. Newton's basic idea wasn't dumb. The proof was the late-Nineties success of the Palm Pilot, a smaller, lighter, and cheaper personal digital assistant.
Four years before the first Betamax arrived, and more than a decade before Blockbuster stores would spread through our towns and cities, a U.S. company released Cartrivision, a domestically designed and built home video player. Unlike the handful of earlier attempts to sell home video recorders, this one had a specific marketing target and function: playing pre-recorded movies. It was sold through Sears, Roebuck; the actual manufacturing was done by Packard Bell, a respected if second-tier electronics company. Every Sears store that sold it also offered a fifty-film rental library of legal, licensed Hollywood feature films. No previous home video machine offered any.
This pioneering effort did things a little differently. Since most people didn't live within daily driving range of a Sears, those rentals weren't charged by the day, but by the number of viewings before a customer brought the cartridge back. (Each video cartridge had a simple mechanical counter, like an odometer. There were also a few cartridges for outright sale, mostly of the same kind of drearily cheap content that would later fill direct-to-VHS tapes.) The need to swap rental cartridges would bring customers back into the store frequently, which Sears liked.
It wasn't a crazy scheme, but it didn't quite work. Aimed at the top end of the market, for most of its existence Cartrivision was available only built into "wideboy"-styled mahogany console TV sets for the living room. But the target buyers already had expensive color TVs and didn't need another console set. The console-only design also meant the huge, heavy sets wouldn't be brought in for servicing, and few of the field technicians in the Sears service trucks had any training in fixing the moving parts of a videotape machine.
In contrast, when Sony came to America with Betamax, it too targeted a specific purpose: time-shifting of broadcast programs. Just about any of the previous video recorders could have done that, but Sony was the first to make it the key to sales.
Often, the development period has been so long and expensive that the sunk-cost fallacy takes over. That's what happened to RCA's incompatible, non-laser videodisc system; by the time it reached the market in 1981, RCA already knew it would flop, but after seven years of effort and $100 million, the company couldn't go back to its stockholders without making a valiant try. That four-year "try" wasted an additional $60 million.
In other cases, the product is stillborn, barely on the market at all. In 1951, it was CBS's early form of color TV; a much later example is HD DVD, the mid-2000s competitor of Blu-ray. In 2011, Hewlett-Packard's tablet computer lasted only 49 days before it was pulled. They all made it to the sales counter, but were almost immediately abandoned by companies that belatedly realized they weren't going to win.
Most technological products become cheaper over time, but some have an intrinsic wall of high cost and limited demand that never fully goes away. Supersonic flight, Concorde-style, never did enter a virtuous circle of becoming affordable. AT&T's 1970 Picturephone, at a monthly rent of $100 ($775 in today's money), couldn't attract enough customers to make calling each other worthwhile. Color TVs, which started out three times as expensive as black-and-white sets, spent ten years as a niche for the rich and didn't hit their 1955 sales goals until 1965, when they were only twice as expensive. But they doggedly hung around long enough to succeed, mostly because RCA, color's chief developer, never gave up.
The failure of the Alto computer, invented a half century ago at Xerox's Palo Alto Research Center and a decade ahead of its time, was more than a simple case of lazy or overcautious corporate timing. What made the Alto so unique, so advanced, was its windows-icons-and-mouse interface and its (relatively) high-definition bitmapped screen. At the time, that required so many expensive silicon storage chips that the product would have been impractically costly. Then a key engineering breakthrough allowed the computer to save most of that money by continuously, imperceptibly swapping parts of the screen image in and out of a smaller memory.
With the Alto, Xerox had a five-year head start on a workplace-capable, networked, windowing computer with a laser printer. But this is critical: they knew, or had to know, that it would be a rapidly depreciating asset. It was literally a case of use it or lose it. They didn't use it in time, so, in one of the great what-ifs of corporate history, they lost it.
Apple's nearly forgotten 1983 Lisa computer series preceded the Macintosh by a year, with much of the same system architecture—windows, icons, a mouse. The problem was a familiar one for groundbreaking new technology: it was way more expensive than planned. A well-equipped Lisa workstation cost about half as much as a car. From that point forward, the brand that once called itself "The computer for the rest of us" was twice as expensive as the computer for the rest of "them"—i.e., MS-DOS, later in the form of Microsoft Windows (with a capital "W").
Sometimes it's a problem of timing. The project is rushed to the market before it's ready, like first-generation fission power stations, the space shuttle, Newton, or VR headsets in the Nineties. Or it's too late to reach the market, like Polaroid's Polavision, instant (silent) home movies that would have rocked the Super 8 market in 1966. But when they finally appeared ten years later, they were overshadowed by truly "instant home movies"—video, with sound, on reusable tape. Sony's Walkman cassette players were world-beating hits, but their much-ballyhooed 1992 follow-up, the MiniDisc, had a unique Sony recording format that didn't have time to catch on before direct online digital delivery and storage (MP3 and Apple iTunes) took over from physical media.
At the other end of a particular technology's arc of birth, widespread adoption, and economic lifetime, there are often a few ideas, once novel, some quirky, that have had unpredictable staying power beyond the grave of obsolescence. Super 8 movies, vinyl records, vacuum-tube music amplifiers, instant film cameras: hobbyists gave them ghostly afterlives, decades after their abandonment by the mainstream market.
Some costly corporate misfortunes are outright blunders, compounded in many cases by infighting and ego. In the case of Google Gemini, it was wokeness combined with willful blindness. To be fair, plenty of other bad product launches were normal misjudgments, compounded by unexpected technical snags and plain bad luck.
Apple's very recent cancellation of its decade-long, multi-billion-dollar pursuit of a practical self-driving car is a rare, apparently laudable case of a megacorporation that prudently pulled back from the brink before taking that last, bet-the-entire-company risk of manufacturing and selling a vastly expensive new product. Even the world's most successful firms are wary of the tricky transition between a promising idea that works in a lab and a commercially viable product for sale to the public.
One suggestion, though: if you've got a risky new product, don't give it a too-easily-mockable name, like Edsel, Ishtar, or Gigli.
You don't always know at the outset what will fail; on the other hand, you don't always know when something established, something that seems to have been humming along practically forever, is about to enter a failure spiral.
These articles are derived from lectures, talks, and web posts. Most have also been posted on Ricochet.com.