When Y2K Sent Us Into a Digital Depression

iStock.com/Laspi

It's hard to pinpoint the exact moment when the paranoia first began to creep in. Sometime during the late 1990s, consumers noticed that their credit cards with expiration dates in the year 2000 were being declined by merchants. Shortly thereafter, people began stocking up on shelf-stable food and water, potentially condemning themselves to months of all-SPAM diets. A number of concerned citizens outside of Toronto, Canada, flocked to the Ark Two Survival Community, a nuclear fallout shelter turned bunker made up of dozens of decommissioned school buses buried several feet underground and protected by a layer of reinforced concrete.

In the months leading into New Year's Day 2000, millions of people steeled themselves for a worst-case scenario of computers succumbing to a programming glitch that would render them useless. Banking institutions might collapse; power grids could shut down. Anarchy would take over. The media had the perfect shorthand for the potential catastrophe: Y2K, for Year 2000. The term was used exhaustively in their coverage of a situation some believed had the potential to become one of the worst man-made disasters in history—if not the collapse of modern civilization as we knew it.

In the end, it was neither. But that doesn't mean it didn't have some far-reaching consequences.

John Koskinen of the President's Council on Y2K Conversion makes a public address
Michael Smith, Getty Images

The anticipatory anxiety of Y2K was rooted in programs that had been written for the enormous mainframe computers of the late 1960s. In an effort to conserve memory and speed up software, programmers truncated the date system to use two digits for the year instead of four. When the calendar was set to roll over to the year 2000, the belief was that "00" would be a proverbial wrench in the system, with computers unable to distinguish 2000 from 1900. Their calculations would be thrown off: a difference like 2000 minus 1998, computed on two-digit years, becomes 00 minus 98 and comes out negative. How computers would react was based mostly on theories.
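To make the arithmetic concrete, here is a minimal sketch of the two-digit-year problem. It is written in Python purely for illustration (the legacy systems in question were typically written in older languages such as COBOL), and the function name is hypothetical.

```python
# Illustrative only: how two-digit year storage breaks date arithmetic.

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Subtract two-digit years, the way many legacy programs did."""
    return end_yy - start_yy

# Through the 1990s the shortcut works fine:
print(years_elapsed(65, 98))  # 33 -- e.g., an account opened in 1965, checked in 1998

# At the rollover, the year 2000 is stored as 00 and the math goes negative:
print(years_elapsed(65, 0))   # -65 -- the program believes time ran backward
```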

That ambiguity was quickly seized upon by two factions: third-party software consultants and doomsday preppers. For the former, rewriting code became a cottage industry, with corporations large and small racing to revise antiquated systems and spending significant money and manpower to do so. General Motors estimated the cost of upgrading its systems would be about $626 million. The federal government, which began preparing for possible doom in 1995, ended up with an $8.4 billion bill.

Some of that cost was eaten up by soliciting analyses of the potential problems. The U.S. Department of Energy commissioned a study looking at the potential for problems with the nation's energy supply if computers went haywire. The North American Electric Reliability Council thought the risks were manageable, but cautioned that a single outage could have a domino effect on connected power grids.

As a result, many newspaper stories were a mixture of practical advice and a disclaimer: More than likely nothing will happen … but if something does happen, we're all screwed.

"Figuring out how seriously to take the Y2K problem is a problem in itself," wrote Leslie Nicholson in the January 17, 1999 edition of the Philadelphia Inquirer. "There is simply no precedent."

The prospect of economic and societal collapse fueled the second pop-up industry: survivalist suppliers. As people stocked up on canned goods, bottled water, flashlights, and generators, miniature societies like Ark Two began to spring up.

While the panic surrounding Y2K was dismissed by some as unwarranted, there was always fuel to add to the fire. The United States and Russia convened to monitor ballistic missile activity in the event a glitch inadvertently launched a devastating weapon. People were warned checks might bounce and banking institutions could freeze. The Federal Reserve printed $70 billion in cash in case people began hoarding currency. Even the Red Cross chimed in, advising Americans to stock up on supplies. Y2K was being treated like a moderate-category storm.

Adding to the concern was the fact that credible sources were sounding alarms. Edward E. Yardeni, then-chief economist at Deutsche Morgan Grenfell/C.J. Lawrence, predicted that there was a 60 percent chance of a major worldwide recession.

As New Year's Eve 1999 approached, it became clear that Y2K had evolved beyond a software hiccup. Outside of war and natural disasters, it represented one of the few times society seemed poised for a dystopian future. People watched their televisions as clocks hovered close to midnight, waiting to see if their lights would flicker or their landline phones would continue to ring.

A software program is represented by a series of ones and zeroes
iStock.com/alengo

Of course, nothing happened. So many resources had been expended on the problem that the majority of software-reliant businesses and infrastructure were prepared. There were no power outages, no looting, and no hazards. The only notable event of January 1, 2000 was the reporting of Boris Yeltsin's resignation and the arrival of Vladimir Putin as Russia's acting president.

With the benefit of hindsight, pundits would later observe that much of the Y2K concern was an expression of a more deeply rooted fear of technology. Subconsciously, we may have been primed to recoil at the thought of computers dominating our society to the extent that their failure could have catastrophic consequences.

All told, it's estimated that approximately $100 billion was spent making upgrades to offset any potential issues. To put that into context: South Florida spent $15.5 billion rebuilding after the mass destruction caused by Hurricane Andrew in 1992.

Was it all worth it? Experts seem to think so, citing the expedited upgrades of old software and hardware in federal and corporate environments.

That may be some small comfort to Japan, which could be facing its own version of Y2K in April 2019. That's when Emperor Akihito is expected to abdicate the throne to his son, Naruhito, the first such transition since the dawn of the information age. (Akihito has been in power since January 1989, following the death of his father.) That's significant because the Japanese calendar counts up from the coronation of a new emperor and uses the name of each emperor's era. Akihito's is known as the Heisei era. Naruhito's is not yet named, which means that things could get tricky as the change in leadership—and the need for a calendar update—comes closer.
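For software that stores dates this way, the fix resembles a miniature Y2K remediation: era tables have to be updated and year arithmetic revisited. Below is a minimal, hypothetical sketch of era-based year conversion, assuming only that the Heisei era began in 1989; the table, values, and function name are illustrative rather than taken from any real system.

```python
# Hypothetical sketch of Japanese era-year conversion.
# Assumes Heisei began in 1989 (Heisei 1); earlier eras and exact
# transition dates are omitted for brevity.

ERA_STARTS = [
    ("Showa", 1926),
    ("Heisei", 1989),
    # The era starting in 2019 has no announced name yet -- which is
    # exactly the problem maintainers of this kind of table face.
]

def to_era_year(gregorian_year: int) -> str:
    """Return 'EraName N' for the most recent era that had begun by that year."""
    name, start = max(
        (era for era in ERA_STARTS if era[1] <= gregorian_year),
        key=lambda era: era[1],
    )
    return f"{name} {gregorian_year - start + 1}"

print(to_era_year(1989))  # Heisei 1
print(to_era_year(2018))  # Heisei 30
```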

It's hard to predict what the extent of the country's problems will be as Akihito steps down. If history is any guide, though, it's likely to mean a lot of software upgrades, and possibly some SPAM.

Love Is On the Air: How The Dating Game Changed Television

Wikimedia Commons // Public Domain

Chuck Barris had a problem. As the creator and producer of a new ABC game show titled The Dating Game, Barris had thought it would be entertaining to see three men vie for the affections of a woman who quizzed them from behind a screen. Because they'd be unable to rely on visual cues or physical attraction, the contestant and her would-be suitors would have to assess their chemistry based on verbal interplay, and wouldn't see each other face-to-face until she selected a winner.

Unfortunately, early tapings of the game in 1965 had not gone well. Barris later recalled that both the men and women had tasteless responses, answering the contestant's questions with profane remarks full of sexual innuendo that would be unacceptable for daytime television. The shows could not be aired.

Then Barris had an idea. He asked a friend of his who was an actor to dress in a hat and raincoat to give the appearance of a law enforcement official. The man walked into the dressing room where the bachelors were waiting to go on air. He lied and told them that any profanity or overt sexual references would be a violation of Federal Communications Commission (FCC) policy, a federal offense. They might even get sentenced to jail time.

From that point on, there were no more problems with people uttering expletives on The Dating Game, a long-running series that acted as a precursor to The Bachelor as well as a host of other dating shows. Recognizable for its campy 1960s set, host Jim Lange blowing kisses at the audience, and its inane questioning of contestants, the show marked a pivotal shift away from game shows that offered monetary gain and instead offered a potentially greater reward: true love.

Barris, a game show legend who would go on to create The Newlywed Game and The Gong Show, was an ABC executive at the time. As head of daytime programming, he spent much of his time fielding show pitches from producers, most of which he thought were ill-conceived. He told fellow daytime executive Leonard Goldberg that he could come up with something better. But when Goldberg told him to try, Barris replied that he had a wife and child and couldn’t spare the time. Goldberg offered to listen to an informal pitch. Barris came up with The Dating Game.

Some have traced the show’s genesis to Helen Gurley Brown’s 1962 book, Sex and the Single Girl, which posited that women could enjoy more casual relationships without the prospect of marriage looming over their heads. Even in the more sexually adventurous ’60s, a show about a simple courtship, particularly one steered by a woman, was still seen as progressive.

At the time, game shows were built around contests that offered a prize, or at least bragging rights to having won. Jeopardy! and The Price Is Right were on the air handing out cash and cars. But Barris was more interested in an intangible benefit. Though the woman and her chosen suitor would be sent out on a dinner date, the expense was minimal, and no one was paid to appear on the show. For viewers, it was about who would find love, or at least the appearance of it.

To select contestants for the series, Barris devised a referral system. After recruiting an initial round of potential participants, his staff had them fill out several forms of personal information. One sheet was reserved for people they already knew who they felt would be a good fit for the series; blue forms were used for bachelors and pink ones for single women. Staffers would be on the phone all day, calling candidates and ushering them in for further evaluation.

For Barris, a contestant on The Dating Game needed to be gregarious, glib, and able to elaborate on answers. If questions weren’t up to snuff, his writers would help craft queries meant to elicit slightly salacious—but never profane—responses. (The questions ranged from perceptive to queries like, “If men are what they eat, which vegetable do you consider yourself?”) Test games would be held in Barris’s Hollywood offices. Out of a pool of 1000 possible contestants, the show would decide on 132 of them to fill their taping needs.

 

For a host, Barris chose Jim Lange, a popular radio personality, to move the game along. Each episode consisted of two complete games, usually a woman interrogating three men, though the format was soon changed to allow a reversal of roles, with three women vying for one man. Barris also enlisted celebrities or soon-to-be celebrities like John Ritter, Farrah Fawcett, Arnold Schwarzenegger, and Tom Selleck, and occasionally sprinkled in a crush, a work colleague, or someone else the contestant might know from their private life.

The show was an immediate hit on daytime when it premiered in December 1965. The series expanded to primetime in 1966 with a slight change in format: The “dates” now included travel to romantic hotspots like Paris and Rome in an effort to broaden the scope of the show. These trips required chaperones, a necessity, Barris said, because few parents would allow their young daughters out of the country with a virtual stranger.

The Dating Game aired on ABC through 1973 and entered syndication for one year. In 1978, it went into syndication again (Barris was no longer directly involved), with Lange returning as host. This version, however, was perceived as lewd, with contestants and producers making less of an effort to stifle the sexual wordplay. (“Let’s hear about your tool chest” was among the less-than-clever prompts offered by contestants.) Various other iterations have aired over the years, morphing into more elaborate find-a-mate series like The Bachelor, which expects contestants not only to have chemistry but eventually to wed. Strangely, that conceit seems more old-fashioned than the show that started the genre.

Those shows owe quite a debt to Barris, who eventually left television altogether after feeling as though he was becoming pigeonholed by his game show successes. Barris later penned his 1984 autobiography, Confessions of a Dangerous Mind (which was adapted into a 2002 movie starring Sam Rockwell, directed by George Clooney, and written by Charlie Kaufman), in which he claimed he was an assassin for the CIA and executed targets while chaperoning winners of The Dating Game. That sensational assertion is in doubt, but Barris’s contributions to romance as a television commodity are not. The notion of dating as entertainment goes back to his original idea, a simple partition, and a man in a raincoat.

The Unkindest Cut: A Short History of the Mullet

Peter Parks, AFP/Getty Images

Jerry Seinfeld wore it on primetime television for nine years. Brad Pitt thinks his career got off the ground because he wore one to his Thelma & Louise audition. Peter Dinklage’s high school photo went viral as a direct result of the bold choice.

For all of these men and millions of others, the mullet has had profound and lasting effects on their lives. Famously described as being “business in the front, party in the back” and sometimes referred to as a “squirrel pelt” or the “ape drape,” the short-front, long-backed hairstyle might be the most controversial cut in the history of grooming. What started it? And can anything kill it?

A man shows off his mullet
Peter Parks, AFP/Getty Images

Although it doesn’t have quite the same archaeological provenance as hieroglyphs or dinosaur bones, mullet historians believe there’s ample evidence to suggest that the hairstyle has been with mankind for centuries. Neanderthals may have favored it to keep hair out of their eyes and protect their necks from wind and rain. Greek statues dating back to the 6th century BCE sport the cut. Ancient civilizations in Mesopotamia and Syria rocked it.

Most of these populations embraced the cut for practical purposes: protection from the elements and visibility. But the direct lineage of the mullet to the modern day might be traceable from Native Americans, who often wore their hair short in front and kept it long in the back as a sign of their spiritual strength. The style was eventually appropriated by Western culture and made its way to settlements; colonial wigs, particularly George Washington’s, look a little mullet-esque.

The mullet remained dormant for much of the 20th century. Conformity led to sharp, practical cuts for men and traditional styles for women. That began to change in the 1960s, when counterculture movements expressed their anti-establishment leanings in their mode of dress. Long hair on guys became commonplace. In the 1970s, entertainers looking to appear even more audacious pushed their stage presence to extremes. For David Bowie, that meant a distinctive hairstyle that was cropped over the eyes and ears and left hanging in the back.

 David Bowie performs his final concert as Ziggy Stardust at the Hammersmith Odeon, London on July 3, 1973
Express/Express/Getty Images

Bowie’s popularity drew fresh attention to the mullet, although it didn’t yet have a name. The arrival of MTV led to even more exposure, which soon migrated to other media. Richard Marx’s blow-dried variant led to George Clooney’s The Facts of Life sculpt. Patrick Swayze’s ’do in 1989’s Road House deserved equal screen billing. Mel Gibson raced through three Lethal Weapon movies with a well-insulated neck. John Stamos consoled his widowed brother-in-law on Full House with an epic mullet. Richard Dean Anderson defused bombs on MacGyver for years with the “Arkansas waterfall.” Some fads last months. The mullet seemed to be hanging on for the long term.

But public derision was brewing. The style began to be appropriated by a demographic fond of trucker hats and sandals. The death blow came when the Beastie Boys mocked the cut on their 1994 track “Mullet Head,” a song the Oxford English Dictionary credits with naming the fad. (A “mullet head” had long been an insult used to label someone lacking in common sense: Mark Twain used it in 1884’s Adventures of Huckleberry Finn.) Suddenly, mullet-wearers were objects of ridicule and scorn, their locks outdated. For 1998’s Lethal Weapon 4, Gibson lost his trademark cut. It was the end of an era.

A man shows off his mullet
Peter Parks, AFP/Getty Images

Like most things in fashion, that would not be the end of the mullet. The cut has made periodic resurgences over the years, with people adopting ironic takeoffs or making legitimate attempts to return the coonskin cap-like look to its former glory. Moscow became ground zero for a follicular virus in 2005, when young men there suddenly began sporting the look. Some less flexible countries even became proactively anti-mullet: Iran banned it, among other Western styles, in 2010.

Men aren't the only ones to have rocked the style: Scarlett Johansson and Rihanna have both sported the look—albeit a decade apart.

Hairstylists have generally been unmoved by the waves of attention the mullet periodically provokes. “It's for people who are slightly confused, who believe they like long hair but don't want the image that they associate with long hair,” celebrity hairstylist Jose Eber told the Los Angeles Times in 2001. He declared it “nonsense.”

Dacre Montgomery in 'Stranger Things'
Dacre Montgomery rocks a mullet as Billy Hargrove in Stranger Things.
Netflix

But try telling that to the hairstyle's latest throng of fans, many of whom have been inspired to go back in time for the short-long look by Netflix's Stranger Things. "I cut at least one or two a week,” London hairstylist Idalina Domingos, who sports a shaggy-styled mullet herself, told The Guardian in August 2019. "There are these modern mullets, people are coming round to the idea. It’s a fun haircut to have and it's only going to get more popular."

For others, the cut is timeless. Kurri Kurri, a small mining town in Australia, is hosting its third annual Mulletfest, a celebration of all things badly shorn, on February 29, 2020. “We have so many mullets in town,” co-organizer Sarah Bedford said. “My father-in-law had one for 60 years."
