Winner Takes All: 8 of the Most Dramatic Format Wars


If you’ve been to a thrift store in the past couple decades, you’ve likely seen the dusty casualties of bitter format warfare: a few 8-tracks mixed in with the cassettes, some massive LaserDiscs hiding in the LPs.

Several of the fiercest format battles in the past century or so were waged between competing media platforms, but some lesser-known ones—involving gas-powered kitchens, electrocuted elephants, and Edison’s hard-as-nails game face—were also downright ruthless.


It’s often said that the U.S. and Canada were built on railroads, but the tracks themselves weren’t always laid out the same way, as an 1887 issue of the Railroad Gazette explains.

In the early 19th century, many—but not all—Southern railroads laid their rails 5 feet apart (the wide “Russian” gauge), while most Northern lines adopted the British-developed gauge of 4 ft 8 1⁄2 in or 4 ft 9 in, a width derived from existing cart roads (i.e., wide enough for a team of two horses, which suited horse-drawn railcars, or “horsecars,” just fine). During the American Civil War, the Confederacy suffered from poor supply lines due to the region’s mixed track gauges; loads had to be transferred by hand from one line’s cars to the other’s.

In the 1880s, U.S. railroad heads and politicians conferred to finally standardize the system. One result of these meetings was that, during a 36-hour period starting on May 31, 1886, tens of thousands of workers moved about 11,500 miles of Southern track 3 inches closer together to make them compatible with the “standard” gauge. Another result of these meetings was that, because train schedules couldn’t reliably be set according to the local times of different towns, a GMT-based Standard Time was established, along with five official time zones (the fifth, then referred to as the Intercolonial, is now called the Atlantic and was used for Eastern Canada).

Various narrow-gauge railroads from before this era still exist (often for historical or tourist purposes), and some even proved useful during World War II, when tire rationing made trucks scarce. However, many were scrapped for their iron by 1943.

Today, a variety of track gauges are used throughout the world, but standard gauge remains the most common, with the 5-foot Russian gauge coming in second at some 140,000 miles laid throughout former Soviet states, Finland, and Mongolia.



Thomas Edison contributed plenty to late-19th- and 20th-century innovation, including some very firm opinions about how things ought to be. In 1877, he developed a method for recording and playing sound using tin-foil-wrapped cylinders; they weren’t very effective, though, and he abandoned the technology for various other interests. Seven years later, the Volta Laboratory team of Charles Sumner Tainter, Alexander Graham Bell, and Chichester Bell came to him with a superior wax-cylinder method for recording and playing media that they’d developed. Edison rebuffed them, setting the stage for a graphophone (Volta) vs. phonograph (Edison) showdown. What would have been an epic face-off was quickly relegated to the history books thanks to a new invention.

In 1887, Emile Berliner invented cheaper and more compact flat disks, quickly following this innovation with an early motorized gramophone to replace hand-cranked models. In the last few years of the 19th century and the first several of the 20th, both these newfangled disks and Edison’s cylinders were popular, each with its advantages and disadvantages. However, after Berliner’s gramophone business was effectively dismantled by legal troubles and his disk patent expired, Edison finally deigned to market the other man’s (ultimately far more successful) format alongside his own.



The battle for dominance in analog videocassette recording, waged between JVC’s Video Home System (VHS) and Sony’s Betamax (a.k.a. Beta) from around 1976 to 1988, is perhaps the most infamous format war of recent decades.

After the two formats beat out other early options and became industry front-runners, home viewers were faced with a choice (for the sake of which some stores offered side-by-side system comparisons): from a consumer perspective, the VHS tape’s longer recording time and cheaper player made the system a better investment. However, as Beta fans argued, Sony’s version had better picture, better sound, and was more durable. 

By the early 1980s, the U.S. marketplace (driven by tape-distributor decisions and other factors) had spoken: VHS controlled 70 percent of it, thanks largely to JVC’s less expensive format.



The road to supremacy for the digital video/versatile disc (or DVD)—that more compact format which killed off VHS and smoothed the transition to digital media—was a gradual one with lots of players along the way. In the early ‘90s, Sony and Philips repped the MultiMedia Compact Disc (MMCD) as the next big thing in visual media formats, while Toshiba and other companies lauded the Super Density disc (SD). In a rare moment of cooperation for the history of format wars, the companies agreed to combine the best elements of both disc types and first launched the optionally double-layer, double-sided DVD in Japan in 1996.

Meanwhile, Philips had also been developing another video disc format: the VideoCD (VCD), which gained steam in Japan and Europe while the DVD was beginning its successful world tour. VCDs were cheaper to produce and purchase, while the DVD, like Betamax before it, offered film aficionados a richer viewing experience. Ultimately, the film industry—which wasn’t keen on the fact that unprotected VCDs could be easily ripped with CD burners—put the kibosh on VCD and started printing its wares on DVD only.

One last format also tried (and failed) to unseat DVD as the go-to digital format: Divx (unrelated to the modern DivX), a rental system built around a disc costing approximately $5 that expired 48 hours after first viewing, with repeat charges (via a phone-line hookup) for subsequent viewings. While some film studios were game, video-rental businesses and collectors broadly were not; once again, the market spoke.



In 1994, the major videogame makers were readying the next generation of game consoles, and gamers saw companies going in two different directions. Some bet on the far bigger storage capacity of CD-ROM, including Sega, which had released the (not debate-ending) Sega CD two years prior, and Sony, which was prepping its first PlayStation for a 1995 launch. Nintendo, on the other hand, stuck with cartridges, hard at work on its “Ultra 64” console (later known as the Nintendo 64) with the aid of hired-gun experts from Silicon Graphics, the company whose technology helped make the film Jurassic Park a visual knockout.

The slimmer, higher-capacity CD-ROM and its brethren won over the whole industry after the Nintendo 64, one of the last cartridge holdouts, had its dignified day in the sun in the late ‘90s. However, many gamers continue to rue the industry’s abandonment of cartridges, pointing out that, in addition to being more durable, the format—while far more expensive to manufacture—offered faster load times and, by the time it was retired, graphics comparable to a CD-ROM’s.


Many a history and science buff has heard the tale of the bitter, lifelong feud between Thomas Edison and Nikola Tesla: how Edison, champion of direct current (DC), mocked his then-employee Tesla for the latter’s alternating current (AC) system, designed to solve the problem of DC’s limited transmission reach; how Tesla, enraged, brought his system to inventor and entrepreneur George Westinghouse, and the two began promoting AC across the country; how Edison threw shade, the safety gloves came off, and the AC/DC battle began.

To put weight behind the claim that his DC system was safe while Westinghouse’s was lethal, Edison began electrocuting animals with AC power in large public demonstrations—zapping horses, cows, and dogs—and tried to coin the phrase “Westinghoused” to describe electrocution. (Despite a common misconception, Edison didn’t electrocute Topsy the elephant, sentenced to death for killing three people, in 1903. She was simultaneously poisoned and electrocuted by employees of the Edison Company, most likely New York Edison, a company Thomas Edison hadn’t been associated with for years. The event happened a decade after the end of the War of Currents anyway.) Edison did, however, arrange for AC to power the first execution by electric chair in 1890, which took two (torturous) tries and several minutes, and garnered the response from Westinghouse, “They could have done better with an axe.”

After successfully demonstrating the long-range capabilities of AC at the International Electro-Technical Exhibition of 1891 in Frankfurt, Germany, Westinghouse and Tesla’s system quickly overtook DC in the U.S. and worldwide (and even General Electric, Edison’s company, quietly got on board with its own version of AC).



Edison’s investment in the future of electricity wasn't just rooted in the AC/DC battle, though. While about two dozen other inventors were working to perfect the incandescent light bulb in the mid-to-late-19th century, Edison—whose version of the bulb finally triumphed—had a broader vision: to not just replace then-popular gas and oil lamps as standard lighting, but also supplant gas power entirely with an electrical infrastructure that could, like gas, be piped directly into homes and metered for usage.

In the 1880s, Edison began building out his planned system of (due to DC’s distance limitations) heavily localized, shared generating stations in and around New York. His initial vision for electricity’s use—until wider infrastructure was developed, at least—targeted businesses and wealthy private customers. By the 1930s, however, consumer demand for in-home power had grown, and electricity proponents found themselves squared off with gas companies in a battle to be the modern age’s power source.

To compete with new electric appliances, gas companies and their proponents released a range of devices powered by manufactured gas, including radios—seen as a “killer app” for electricity—and refrigerators. Unlike noisy, electrically powered compression fridges (the descendants of which most of us use today), gas-powered absorption fridges like the 1932 Electrolux model (inspired in part by a patent for the absorption-based “Einstein refrigerator”) were almost silent, and cheaper to run. Consumers were faced with a major format choice; a 1931 Milwaukee Journal article laid out the difference between the two fridge types for curious buyers in the paper’s “Home Refrigeration Special Section.” Gas proponents even envisioned whole modern kitchens, chock-full of appliances, powered by the stuff.

In the end, however, electricity won over the masses with its greater diversity of possible applications (try to picture, for example, a gas-powered computer), the host of new appliances that electric companies released, and the fact that—unlike gas, which was perceived as dangerous and created smells and stains with its vapors—electricity was simply less obtrusive. The two energy sources also proved incompatible at times: in 1937, a reported 294 people died when leaking natural gas, likely ignited by a sparking electric light switch, triggered an explosion that all but leveled a high school in New London, Texas. By the ‘40s, electric companies had mostly absorbed gas ones, and methane-heavy natural gas slowly replaced hydrogen- and carbon monoxide-heavy manufactured coal gas in the home.



Even though the days of choosing between cartridges and discs may be behind us, major format wars are still being waged in a very big way. While the Apple vs. Microsoft feud has cooled significantly in recent years, the current tech face-off of Apple vs. Google, writes TIME, is “a war between two fundamentally different visions for the future of computing, described in simplistic terms as closed vs. open.” 

In other words, critics point out that the Apple model of technology is based on the company having complete control over its hardware and software, while Google has generally invited developers and consumers to try their own hands at making better Android products—or, as TIME puts it, “let a thousand flowers bloom.” 

One way the companies have cemented their status as combatants in a good, old-fashioned format war is by enthusiastically suing each other over patents on an enormous scale; in 2011, for example, both Apple and Google “spent more on patent litigation and intellectual property than on research and development” for the first time ever.

As for what the market will finally conclude on the matter of Apple vs. Google, only time—and consumers’ preferences—will tell.