6 Misconceptions About the ’90s


A Furby. / Getty Images/Handout

Christmas Day, 1998. You’re splayed out on the floor of your family room, surrounded by a sea of crumpled wrapping paper and a fleet of toys and games. Your older sister’s stationed at the bathroom mirror, testing out a new eyeshadow. It’s blue. You can hear the tinny, muffled voices of the Beastie Boys emanating from the headphones of your older brother’s new Sony Walkman. You’ve eaten six Christmas cookies and it’s not even 10 a.m. Life is good.

Just then, a team of federal agents bursts through the front door and makes a beeline right toward you. You’re confused. You’re only 9 years old! What could they possibly want with you?

But they don’t want you. They want your Furby.

The last decade of the second millennium gave us a lot: the rise of the World Wide Web, the fall of the USSR, and multiple rom-coms starring Julia Roberts. With such seismic shifts in technology and culture came a fair number of misconceptions, from the idea that Furbys could function as spies to the myth that O.J. Simpson got acquitted because of some ill-fitting gloves. Let’s take a closer look at a few, adapted from an episode of Misconceptions on YouTube.

1. Misconception: Furbys could record your conversations.

In 1997, designers David Hampton and Caleb Chung conceived a Tamagotchi-type toy that you could pet. They imbued this ambiguous creature with the power of (limited) speech and called it a Furby—a variation on their working title, Furball. They sold the invention to Tiger Electronics, which was itself soon acquired by Hasbro. By the 1998 holiday season, Furby was the hottest toy on the market.

But the finer workings of this technological wonder were lost on the casual consumer, and rumors exaggerated its capabilities. The simplest misunderstanding was that Furby could record conversations, which it couldn’t. In January 1999, the National Security Agency went so far as to ban the toy from its Maryland headquarters. 

According to the NSA’s memo, or “Furby Alert,” as the media dubbed it, “Personally owned photographic, video, and audio recording equipment are prohibited items. This includes toys, such as Furbys, with built-in recorders that repeat the audio with synthesized sound to mimic the original signal.”

The fear was that Furbys would pick up classified intel on the premises and start spewing it out whenever NSA employees took them off site. Tiger Electronics president Roger Shiffman did his best to set the record straight, saying that “Although Furby is a clever toy, it does not record or mimic voices. The NSA did not do their homework. Furby is not a spy!”

Furby worries weren’t confined to the NSA. Some health care professionals thought Furbys could mess with medical equipment, and the FAA banned them during airplane takeoffs and landings just in case they could somehow screw up plane instruments.

And not all the misconceptions were tech-based. Another prevalent claim was that Furby fur came from dogs and cats. The fur was acrylic. As one company spokesperson joked, “Yep, a lot of acrylics were killed in the name of Furbys.”

2. Misconception: O.J. Simpson was acquitted because of ill-fitting gloves.

On June 15, 1995, in the middle of a Los Angeles courtroom, O.J. Simpson tried on a pair of leather gloves, one of which was bloodstained. They were kind of small.

And in the words of Simpson’s defense attorney Johnnie Cochran, if it doesn’t fit, you must acquit—which is exactly what the jury did in early October. Some 150 million people tuned in to watch the former football star be declared not guilty of the murders of his ex-wife Nicole Brown Simpson and Ronald Goldman.

But the verdict was never just about the gloves—nor was the statement “If it doesn’t fit, you must acquit.”

Sure, it seemed strange that Simpson would stab anyone to death wearing a pair of gloves he could hardly wriggle his fingers into. So did a few other details about the case.

In fact, the whole process of collecting and securing evidence was kind of a mess—photos were unlabeled, multiple items got tossed in the same bag, and police draped one of Nicole Brown Simpson’s own blankets over her body, potentially compromising any evidence that might have been collected, since her ex-husband presumably could have used the blanket previously.

As the Crime Museum pointed out, “sloppy maneuvering at the scene caused more bloody shoe prints to be left behind by LAPD than by the perpetrator.”

It was a cross-contamination nightmare from start to finish—and a golden opportunity for the defense to poke holes in the prosecution’s case. Cochran even went so far as to float the notion that blood and other evidence had been planted.

As Cochran explained, the ill-fitting gloves were a metaphor of sorts for all the other ways in which the prosecution’s theories just didn’t add up:

“Like the defining moment in this trial, the day [co-prosecutor Christopher Darden] asked Mr. Simpson to try on those gloves and the gloves didn’t fit, remember these words: If it doesn’t fit, you must acquit.”

The concept was suggested to Cochran by a colleague, who wrote that while the glove wasn’t a “home run from an evidentiary standpoint,” it gave the defense a theme for closing arguments: the standard of beyond a reasonable doubt could be paraphrased as “If it doesn’t fit, you must acquit.” As the colleague put it, “the evidence must fit the interpretation of the facts that leads to the conclusion of guilt . . . The problem with the prosecution’s case is that there are lots of places where the evidence doesn’t fit the picture they want the jury to see.” The glove itself might not have been as important as leaving the jury with some kind of earworm. According to Cochran, at one point the team toyed with “If it doesn’t make sense, you must find for the defense.”

The catchy couplet they landed on became a mantra that Cochran intoned multiple times during his closing argument in late September. By the end, it couldn’t have been clearer to jurors that if they had any doubts about Simpson’s guilt, they were legally required to find him not guilty. Which, of course, they did.

The glove was definitely a factor in the decision, but jury members weren’t swayed by that alone. As juror Lionel Cryer told reporters at the time, “It was garbage in, garbage out. … We felt there were a lot of opportunities for either contamination of evidence, samples being mixed or stored together.” 

If it’s garbage in, garbage out, that might equate to reasonable doubt.

3. Misconception: Y2K fears were completely unfounded. 

Mastercard's Y2K command center in January 2000. / Bill Greenblatt/Getty Images

In the spirit of conserving digital storage space, developers set a lot of early software to use dates with two-digit years rather than the full four. The year 1975 was just “75,” 1989 was “89,” and so on. As the ’90s progressed, people began to wonder what would happen in the year 2000. Any program doing calendar math could read the new year, stored as “00,” as 1900, leading to, well, problems.

And people weren’t just worried that this “Y2K bug” would cause a major administrative headache—they were worried about any machines or systems that functioned with computer chips. Could there be plane crashes, power grid failures, medical equipment malfunctions? A full-fledged apocalypse? It wasn’t clear exactly how the glitch might manifest, though that didn’t stop enterprising individuals from hawking Y2K survival kits and guides.

But companies and government entities had been working hard to update systems and preempt disaster. In 1999, the Department of Commerce estimated that the U.S. price tag for the endeavor between 1995 and 2001 would hit $100 billion. Plenty of economists felt that was too low.

So it all seemed pretty silly when January 2000 came and went more or less without incident—though there were some incidents. The U.S. Naval Observatory website proclaimed the date to be January 1, 19100. A Danish newborn was briefly 100 years old in hospital records. A New York video store charged a guy more than $91,000 for a century-overdue VHS rental of The General’s Daughter. (He wasn’t expected to pay the fine.)

Those temporary errors help prove that fears about the Y2K bug weren’t totally baseless. And tech experts generally agree that there would have been many more issues if so much effort hadn’t been put into preventing them.

As technology forecaster Paul Saffo told TIME, “The Y2K crisis didn’t happen precisely because people started preparing for it over a decade in advance. And the general public who was busy stocking up on supplies and stuff just didn’t have a sense that the programmers were on the job.”

4. Misconception: ’90s kids are Millennials.

Spend enough time on social media and you’ll eventually scroll past a picture of, say, Dunkaroos, or Pogs, or a stack of Disney VHS tapes in clamshell cases, with some variation of this caption: “Only ’90s kids will remember.” 

Let’s talk about the term ’90s kid.

Some people consider anyone born between 1990 and 1999 to fit the bill. To others, a true ’90s kid is someone whose formative childhood years fell during that decade. Since there’s no formal definition, neither can be called right or wrong. But using ’90s kid as a straight-up synonym for Millennial is not accurate.

Anybody born between 1981 and 1996 is officially a Millennial, and plenty of those people do see themselves as ’90s kids. If 1981 is your birth year, you spent many of your formative years in the decade, but someone born in 1978 could say the same, and they’re a member of Gen X. 

If you showed up between 1997 and 1999, on the other hand, you’re a Gen Z’er. But you can still claim the ’90s kid title on the grounds that you were born that decade. Plus, it’s not like all the cultural touchstones of the ’90s vanished on January 1, 2000. Dunkaroos weren’t discontinued in the U.S. until 2012. (And, we’re happy to report, they’ve been brought back.)

In other words, elder Gen Z’ers have roughly as much right to call themselves ’90s kids as someone who was born in ’95 or ’96. Just don’t do it on Facebook, or elder Millennials will probably start arguing with you in the comments. Wait, who are we kidding? Gen Z isn’t on Facebook.

5. Misconception: Al Gore claimed he invented the internet.

In March 1999, then-Vice President Al Gore appeared on CNN’s Late Edition to talk about his presidential aspirations. When host Wolf Blitzer asked why Democrats should back Gore for the nomination over former New Jersey senator Bill Bradley, this was part of his answer:

“During my service in the United States Congress, I took the initiative in creating the internet.”

Largely thanks to a WIRED article that spotlighted that portion of the interview, politicians and pundits started roasting Gore for the claim that he “invented the internet.” 

What he probably should’ve said was something like “I helped secure important government funding for the internet at a key point in its development.” Which he did—by sponsoring the 1991 High-Performance Computing and Communications Act. 

Some of the cool $600 million that the act granted to high-performance computing went to the National Center for Supercomputing Applications. Its developers—including future Netscape co-founder Marc Andreessen—ended up making Mosaic, the first popular web browser to feature inline images.  

So if Al Gore didn’t invent the internet, who did?

Back in the 1960s, the Department of Defense developed the Advanced Research Projects Agency Network, or ARPANET. Considered a precursor of today’s internet, ARPANET used phone lines to connect physically distant computers within the Pentagon’s research network.

Vinton Cerf, who worked on ARPANET, and his collaborator Robert Kahn are sometimes cited as the fathers of the internet. In the 1970s, they devised two new processes that worked together to package and transmit data: Transmission Control Protocol and Internet Protocol—often abbreviated as TCP/IP. The protocols didn’t rely on phone lines or require computers to be the same brand or even on the same network to swap information. It was a whole new mode of communication for the world’s computers.

For what it’s worth, Cerf and Kahn aren’t salty about Gore taking credit for making the internet a thing. They actually supported his claim. In the wake of the controversy, the pair released a statement detailing his role as an early internet champion dating back to the 1970s. 

“Al Gore was the first political leader to recognize the importance of the internet and to promote and support its development. … The vice president deserves credit … for his long-term and consistent articulation of the potential value of the internet to American citizens and industry and, indeed, to the rest of the world.”

Gore might have been guilty of inflating his role in the birth of the internet in a single interview. But his contributions were important enough to earn him a place in the Internet Hall of Fame. He, Cerf, and Kahn were all part of the very first induction class in 2012.

Also on the list that year was Tim Berners-Lee, the British computer scientist who invented the World Wide Web in 1989. Not long after that, he developed HTML, the markup language used to create web pages. 

By the way, World Wide Web isn’t a cute, catchy synonym for internet. Basically, the internet is a massive global network where information gets shared between computers. The World Wide Web is one place you can go to access that information. In other words, the World Wide Web comprises websites full of content that gets transmitted across the internet.

6. Misconception: Mary-Kate and Ashley Olsen are identical twins.

Here’s some vindication for all the people who have spent the last 30 years or so boasting about how easy it is to tell Mary-Kate and Ashley Olsen apart: the breakout stars of Full House and countless tween movies aren’t identical twins at all.

They’re fraternal (or dizygotic) twins, meaning they came from two separate eggs. Like any other pair of full siblings, they share roughly half of their DNA on average, so they should only look as similar as any other two close-in-age siblings.

Apparently they’re slightly different heights. They don’t have the same dominant hand, either: Ashley is right-handed, and Mary-Kate is left-handed.