7 Misconceptions About the ‘80s

Simon Le Bon of Duran Duran rocks a mull—er, bi-level haircut. / Keystone/Hulton Archive/Getty Images

Whether you lived through the ‘80s or merely have absorbed some details via Stranger Things, take a look at some of the myths surrounding the “me decade,” adapted from Misconceptions on YouTube.

1. Misconception: Mullets were actually called “mullets.”

One of these kids is a real mullet head. / Leon Morris/GettyImages

Of all the oversized, regrettable hairstyles of the 1980s, one bad choice sits above the rest: the mullet, a.k.a. the squirrel pelt. The Arkansas waterfall. The ape drape. The practice of cutting your hair short in the front and sides and keeping it long in the back. It’s a look that says you know how to party and still show up for work more or less sober the next day.

And it’s a look that absolutely no one actually called a mullet in the ‘80s. It wasn’t until 1994, when the Beastie Boys released a song called “Mullet Head,” that the unfortunate hairstyle was given its equally unfortunate name. The phrase mullet head, as an insult for a stupid person, dates back to 1855. But before the Beastie Boys’ song, the ‘do was sometimes called a bi-level.

2. Misconception: “Stranger danger” plagued the country.

Children learn to avoid kidnappers. / David McNew/GettyImages

If you could fit inside a car trunk in the ‘80s, you were constantly warned about the perils of interacting with strangers. Newscasts and newspapers were rife with stories about missing kids and cautionary tales about child abductions. It was even given a catchy name: stranger danger. But was there really an epidemic of kidnappings?

There was not. There were, however, some unfortunate circumstances that led the public to believe there was. In the early 1980s, a number of missing children—including two paperboys in Iowa named Johnny Gosch and Eugene Wade Martin—received a great deal of media attention. The disappearance of Adam Walsh in 1981 only added to the concern. More than 38 million viewers tuned into a 1983 TV movie about his abduction. Cartoons had warnings about talking to strange adults. One survey estimated kids up to fifth grade were about as afraid of being kidnapped as they were of nuclear war.

The widespread coverage of these incidents made it seem like the danger was omnipresent. At one point, the media was reporting up to 50,000 children were being abducted annually, and the sight of missing kids on milk cartons meant anyone having breakfast was being confronted with the possibility of a child—maybe their child—going missing.

But even back in 1985, the Los Angeles Times was reporting data that cast some serious doubt on the supposed spate of child abductions. The FBI had reports of 67 stranger kidnappings that year, and the National Center for Missing and Exploited Children said they had “firm records” of 142 cases.

Obviously, every one of those cases is one too many, but the media focus on stranger danger misled the public about the actual risks posed to children. In 2018, for example, the National Center for Missing and Exploited Children reported helping law enforcement with 25,000 missing children cases. Of those, 23,500 were runaways and 1,000 had been abducted by family members, some of which may have been related to parental custody issues. In other words, there wasn’t, statistically speaking, that much danger from strangers—just a relative handful of high-profile cases that captured the public’s imagination and a much larger number of unfortunate, but less sensationalistic, stories not involving strangers at all.

In 2017, the center even called for an end to the phrase stranger danger, citing statistics that most crimes involved people the child knew and that at times it might actually be beneficial for a kid to reach out to a stranger if they need help. And in extreme circumstances, it’s even OK to approach someone with a mullet.

3. Misconception: The 1980s were all about greed.

Reading about the Wall Street crash on the New York City subway. / James Marshall/GettyImages

Everyone who remembers the 1980s remembers a decade of excess. Cocaine. Money. A questionable number of leg warmers. But did people in the ‘80s really have an unquenchable thirst for wealth?

Probably no more so than in any other decade. One way to define greed is by the amount of charitable giving being done, or lack thereof. By that metric, the ‘80s saw unprecedented generosity. In 1980, Americans gave roughly $65 billion to charity. By the end of the decade, that number had grown to over $100 billion. As a percentage of national income, that’s far higher than it was in the 25 years prior to 1980.

Was all that generosity a result of greater wealth? Could be. But charitable giving grew faster than what people in the ‘80s were spending on material goods: giving rose 68 percent over the course of the decade, while total consumer spending grew 48 percent.

It’s easy to see why people stereotype the ‘80s as the “me decade.” In the United States, income tax rates were slashed on the highest earners—but for much of the decade they were still higher than today’s top rates.

In the 1980s, the number of millionaires in the country went from 2.8 million to 3.2 million. But twice as many new millionaires were minted in the 1990s. 

Yes, many brokers liked flashy watches and suits. Madonna had a hit with “Material Girl.” But does flashy equal greedy? Greed typically means hoarding as much as you can. Record charitable giving doesn’t support that idea.

4. Misconception: Pay phones were untraceable criminal hotlines.

An L.A. Rams cheerleader makes a phone call at halftime. / George Rose/Getty Images

Before the proliferation of smartphones, making a call while outside of your home typically meant using a pay phone—those virtually indestructible public phones in booths or installed on streets that seemed to scream out, “Please use me to conduct illegal activity.” Many people thought that no one could trace a public phone, allowing drug dealers to cover their tracks. Some communities even lobbied to have pay phones removed, citing concerns over criminal activity.

But public pay phones actually worked a lot like regular landline phones. Inserting a coin and dialing a number created the same record of the date, time, and recipient of the phone call, making for a handy reference for law enforcement.

It’s true that some companies’ pay phones didn’t keep such records, but others did. And since most criminals didn’t bother making the distinction, anyone relying on a pay phone to conduct illegal business was taking a chance that their illicit activity would be discovered. The caller might be able to remain anonymous, but almost everything else, like the time and length of the call and the number on the other end, was fair game. Some cities even removed the ability for a pay phone to receive an inbound call to make it more difficult for dealers to treat the phone booth like a remote office. The phones simply weren’t a foolproof method of concealing a person’s identity.

Because of the stigma, though, a lot of pay phones were removed from places where they were of actual use to law-abiding citizens. Removing them likely did far more to keep innocent people from making innocuous calls than it did to help criminals keep themselves anonymous.

Interestingly, in 1946, only half of U.S. homes had a home phone. In some neighborhoods, one pay phone might serve multiple homes. And yes, criminals were up to pay phone mischief back then, too. Wise guys sometimes tied strings to coins to try to pull them back out of the machine after making calls. These would-be freeloaders were often thwarted, though, by string cutters inside the phones, a low-tech security measure that started to appear around the 1930s.

5. Misconception: Big ‘80s hair damaged the ozone layer.

The higher the hair, the closer to God. / Independent News and Media/GettyImages

Mullets were not the only questionable follicular choice of the ‘80s. Many men and women teased, preened, and shaped their hair into wavy cascades using voluminous amounts of hairspray.

In 1985, this vanity seemed to have brought the world to the brink of destruction. That’s when scientist Joseph Farman and others disclosed that the atmospheric ozone over Antarctica had been reduced by approximately 40 percent. Ozone, or trioxygen, is a gas that protects us from the sun’s potent UV rays. It’s nature’s sunscreen.

Farman and others pointed the finger at chlorofluorocarbons, or CFCs, a type of chemical that had been commonly used in hairspray, air conditioners, and refrigerators; levels of CFCs had risen high enough to damage the ozone layer.

But even though that theory was confirmed in the 1980s, it had actually been developed in the 1970s. It was in that decade that manufacturers voluntarily stopped using CFCs and the United States banned CFC use in aerosol products, except in the case of certain medical applications like inhalers. So those super-high hairdos in the ‘80s did not actively contribute to the hole in the ozone layer.

We don’t hear about the hole much anymore since the passing of the Montreal Protocol in 1987, which banned most ozone-depleting substances from use on a global level. With some luck, the ozone could be fully replenished in the next few decades.

6. Misconception: Everyone hated the taste of New Coke.

A billboard for New Coke. / Todd Gipstein/GettyImages

It’s considered one of the biggest consumer products blunders of all time. In April 1985, after months of research, Coca-Cola unveiled a drink they dubbed New Coke. It was a sweeter, more syrupy version of their classic recipe, one they hoped would better compete with the surging rivals at Pepsi. This wasn’t just an alternative; it was a replacement.

Why was Coca-Cola so confident in switching up one of the most beloved soft drinks in the world? Taste tests. Extensive market research demonstrated that subjects preferred a slightly less fizzy and slightly sweeter Coke. And this wasn’t a few people they cornered at a shopping mall. The company conducted a reported 190,000 taste tests, and the results prompted the new formula.

Unfortunately, what Coca-Cola didn’t count on was the emotional connection people had with the taste of OG Coke. New Coke was quickly condemned by soft-drink enthusiasts, and common wisdom has it that Coke pulled the drink from shelves almost immediately owing to mass outrage.

While the drink had plenty of detractors, though, none was as vocal as Gay Mullins, a semi-retired real estate agent who found New Coke so off-putting he sank $100,000 into a campaign against it. Mullins was often cited in the media, giving interviews and buzzworthy quotes like calling the lack of soda choice “un-American” and the new formula “unbelievably wimpy.” He sent out bumper stickers and set up telephone hotlines. Gay Mullins was waging a war against Coca-Cola, and he was winning.

It turns out that his motives may not have been entirely altruistic. Mullins later admitted he was hoping to cause enough commotion for Coca-Cola to pay him in hush money, or even inspire Pepsi to feature him in a campaign. When Coke finally relented and withdrew New Coke as its primary offering in June, Mullins said he’d be happy to speak on their behalf—for $200,000 per appearance. In the ultimate sign Mullins may not have been a true devotee, he couldn’t tell the difference between Coca-Cola Classic and New Coke in a blind taste test.

One other big misconception about New Coke: It didn’t actually go away in the ‘80s. Coca-Cola left it on shelves and let consumers decide which flavor they preferred. The company kept production of the product rolling until 2002, under the name Coke II.

7. Misconception: Grunge killed hair bands.

Kurt Cobain of Nirvana. / KMazur/GettyImages

Everyone knows the story. The ‘80s were ruled by Mötley Crüe, Poison, Van Halen—gods of rock who sported the kind of hair that could theoretically destroy the ozone layer. And then, in the early 1990s, the Seattle sound took over. Spandex pants were traded for cardigans and bands like Nirvana and Alice in Chains sounded the death knell for flashy rock bands.

Of course, grunge grew popular, but it wasn’t exactly at the expense of hair bands. Vince Neil of Mötley Crüe has said that he bought Nirvana’s Nevermind and passed it around, encouraging people to listen to it, and that his band’s business didn’t change. Grunge offered a new sound, but it wasn’t like New Coke. It wasn’t replacing other genres.

There also wasn’t really any rivalry. Kurt Cobain reportedly bought and loved Too Fast for Love by the Crüe. Alice in Chains opened for both Poison and Van Halen.

So what really happened to hair rock? Dee Snider of Twisted Sister once opined that hair bands did themselves in and were already in decline by the time grunge took over. “It became too commercialized, and then it got unplugged and [became] nothing but power ballads and acoustic songs, and it wasn’t metal anymore, it had to go, it had to change,” he said.

So why did the media portray a grunge takeover? Well, it made for a pat story. But it may have also been that hair band listeners were simply aging out of their ‘80s tastes and looking for something else, which they would have done with or without grunge. Cultural tastes change constantly. After all, you can’t rock a mullet forever. Unless you hold out just long enough for them to come back in style.