
6 CES Technologies Ahead of Their Time


Every year since 1967, the Consumer Electronics Show (CES) has been an ideal place for companies to present their groundbreaking audio, video, computer, and video game products. Not every gadget on display in Las Vegas will succeed, but sometimes they flop because they're simply too far ahead of their time. Here are six technologies displayed at CES that, for one reason or another, weren't a hit then, but have since become a part of our daily lives.

1. Sony Data Discman (1991 Summer CES)

A hot topic among book lovers today is the potential demise of the printed page now that e-readers have become so popular. But people were having the same conversation in 1991, when Sony debuted the first e-reader, the Data Discman, at a VIP-only party at the Four Seasons Hotel during Summer CES in Chicago.

The Data Discman was about the size of a drugstore paperback, weighed just under 2 pounds, and featured a monochrome LCD screen and a full QWERTY keyboard. Users could search books (mainly dictionaries, encyclopedias, travel guides, and other reference materials) loaded onto 3.5” CDs that held up to 80,000 pages of text or 32,000 pictures. And when you were done reading, you could plug in your headphones and listen to a music CD, too.

Sony released several different versions of the Data Discman with varying features, like a flip-top screen. However, at $450 for the base model, it didn't catch on in America or Europe. (It was a hit in Japan.)

2. AT&T VideoPhone 2500 (1993 Winter CES)

While the concept of a videophone is almost as old as the telephone itself, and a handful of high-priced models aimed at businesses had been available since the late 1960s, AT&T’s VideoPhone 2500 was the first model marketed to home consumers.

Although the phone went on sale in 1992, AT&T used the 1993 Winter CES to kick off a large-scale campaign promoting the device and its full-color, 3.3” LCD screen, which could display video conversations over regular telephone lines.

Of course, for the video to work, both callers had to have their own VideoPhone. And at $1,599 each, it was not a small investment. Even a price drop to $999 just 13 months after its release didn't help sales. But perhaps the main reason the VideoPhone didn't take off was that consumers simply didn't want to see each other every time they picked up the phone. Naturally, AT&T tried to convince them otherwise with some clever marketing ideas. For example, VideoPhones were placed in the lobbies of 150 Hilton Hotels for use by traveling salespeople. The salesperson's family could visit a local AT&T store to talk to their road warrior on the VideoPhone, or even rent a model for a few days to try it at home. However, these efforts couldn’t sway public opinion, and the VideoPhone was discontinued in 1995.

Today, of course, we carry smartphones in our pockets that feature Skype, Google Hangouts, Apple FaceTime, and plenty of other apps that let us talk face-to-face using full-motion video as fast as our 3G or 4G cellular networks can handle. However, even now, video calls aren’t the norm. Maybe the videophone is a solution looking for a problem.

3. Sega Activator (1993 Winter CES)

Considered one of the worst video game controllers ever made, the Sega Activator, which debuted at CES in 1993, was an early but severely flawed attempt at motion-based gameplay for the Sega Genesis.

The Activator was a flat, octagonal frame that sat on the floor in front of the TV. Each section of the frame emitted an infrared beam that corresponded with a button on the standard Genesis controller. Players stood inside the frame and, waving their hands and feet, broke the path of the beam that corresponded with the button they wanted to push, making their video game avatar move accordingly. In theory, anyway.

The controls were less than intuitive, and the beams weren't very responsive, so the player usually flailed around like one of those dancing windsock men in front of a local car dealership, with few intended responses from the on-screen character.

The Activator's poor functionality, coupled with the fact that it cost $150 (nearly as much as the Genesis itself), meant that motion-controlled video games would have to wait until 2006, when Nintendo released its wildly successful Wii console.


4. AT&T Edge 16 (1993 Winter CES)

When Xbox Live debuted in 2002, it revolutionized video games. With Xbox Live and the similar PlayStation Network, gamers can not only play head-to-head against each other, but also talk via headset microphones and download exclusive game content like new characters or in-game equipment. Did you know Sega was offering the same thing back during the Clinton Administration?

In 1993, Sega partnered with AT&T to create a new device called the Edge 16. The Edge peripheral plugged into the cartridge slot of the Genesis console, and then a 2-player Sega game fit into the Edge. The device featured a telephone port so that two Edge owners could play against each other. This was possible because button mashes were transmitted over the phone line and the Edge device fooled the game into thinking the remote player was using the second controller on the Genesis. If the opponents plugged a telephone handset or hands-free headset into the Edge, they could call each other names as they played.

The Edge also had memory slots for storage cards capable of saving custom video game characters that could be used on other Edge-enabled consoles. Game makers could even develop special edition memory cards with exclusive characters, levels, or equipment, or make these extras available for download to an existing card.

Despite these advanced features, the Edge 16 never caught on with consumers. It was so unceremoniously canceled that I couldn’t even find any information on its demise. One possible stumbling block was that game makers had to tweak their code for the Edge device to work, adding to production costs.

5. Commercial Brake (1994 Winter CES)

Remember when you got your first TiVo? Remember how awesome it was to be able to easily skip past all those commercials? If you'd been at Winter CES in 1994, you could have been skipping commercials long before TiVo with Arista Technologies' Commercial Brake.

The $160 device sat between the VCR and the TV, and worked by looking for the black frame inserted before and after commercial breaks during the broadcast. The Brake would mark these points on an unused portion of the VHS tape and then, during playback, would blank out the screen and automatically fast-forward between them. Although the Commercial Brake was an add-on peripheral, Arista hoped to have the technology integrated into new VCRs over the coming years.

After CES, the Commercial Brake received a fair amount of buzz in the consumer electronics field. However, it couldn't capitalize on the publicity, because Arista became mired in a lengthy legal battle with the actual inventor of the commercial-sensing technology. The device's release onto the market was delayed until 1996, the same year that DVD debuted to much fanfare at CES, signaling the death knell of the VCR.

6. The Listen Up Player (1997 Winter CES)

At the 1997 Winter CES, the trade show floor was abuzz with excitement about the Listen Up Player from Audio Highway. The $299 gadget even won the CES Innovations '97 Award. And considering you probably use a descendant of the Listen Up every day at the office, at the gym, or during your commute, there's no doubt it was innovative, even if no one remembers it.

With special “AudioWiz” software installed on their desktop PCs, users downloaded prerecorded MP3s, ranging from newspaper and magazine articles to movie and music reviews to even their own emails, rendered as audio by a text-to-voice translator. The MP3s were then copied to the Listen Up, a small, portable, battery-powered device that played the audio back through standard headphones. This all sounds like pretty standard stuff today, but it was groundbreaking in 1997, because the Listen Up was the first portable MP3 player on the market.

While it might have been the first, it wasn't the first successful one. According to Time magazine, only about 25 Listen Up Players were produced, and it's unknown how many were actually sold. It would seem that the Listen Up Player was just a little too early for consumers. Only a year later, the Diamond Rio PMP300 portable MP3 player debuted and went on to sell over 200,000 units.

5 Things We Know About Stranger Things Season 2

Stranger Things seemed to come out of nowhere to become one of television's standout new series in 2016. Netflix's sometimes scary, sometimes funny, and always exciting homage to '80s pop culture was a binge-worthy phenomenon when it debuted in July 2016. Of course, the streaming giant wasn't going to wait long to bring more Stranger Things to audiences, and a second season was announced a little over a month after its debut—and Netflix just announced that we'll be getting it a few days earlier than expected. Here are five key things we know about the show's sophomore season, which kicks off on October 27.

1. WE'LL BE GETTING EVEN MORE EPISODES.

The first season of Stranger Things consisted of eight hour-long episodes, which proved to be a solid length for the story Matt and Ross Duffer wanted to tell. While season two won't increase in length dramatically, we will be getting at least one extra hour when the show returns in 2017 with nine episodes. Not much is known about any of these episodes, but we do know the titles:

"Madmax"
"The Boy Who Came Back To Life"
"The Pumpkin Patch"
"The Palace"
"The Storm"
"The Pollywog"
"The Secret Cabin"
"The Brain"
"The Lost Brother"

There's a lot of speculation about what each title means and, as usual with Stranger Things, there's probably a reason for each one.

2. THE KIDS ARE RETURNING (INCLUDING ELEVEN).

Stranger Things fans should gear up for plenty of new developments in season two, but that doesn't mean your favorite characters aren't returning. A November 4 photo sent out by the show's Twitter account revealed that most of the kids from the first season will be back in 2017, including the enigmatic Eleven, played by Millie Bobby Brown (the #elevenisback hashtag used by series regular Finn Wolfhard should really drive the point home).

3. THE SHOW'S 1984 SETTING WILL LEAD TO A DARKER TONE.

A year will have passed between the first and second seasons of the show, allowing the Duffer brothers to catch up with a familiar cast of characters that has matured since we last saw them. With the story taking place in 1984, the brothers are looking at the pop culture zeitgeist at the time for inspiration—most notably the darker tone of blockbusters like Gremlins and Indiana Jones and the Temple of Doom.

"I actually really love Temple of Doom, I love that it gets a little darker and weirder from Raiders, I like that it feels very different than Raiders did," Matt Duffer told IGN. "Even though it was probably slammed at the time—obviously now people look back on it fondly, but it messed up a lot of kids, and I love that about that film—that it really traumatized some children. Not saying that we want to traumatize children, just that we want to get a little darker and weirder."

4. IT'S NOT SO MUCH A CONTINUATION AS IT IS A SEQUEL.

When you watch something like The Americans season two, it's almost impossible to follow unless you've seen the previous episodes. Stranger Things season two will differ from the modern TV approach by being more of a sequel than a continuation of the first year. That means a more self-contained plot that doesn't leave viewers hanging at the end of nine episodes.

"There are lingering questions, but the idea with Season 2 is there's a new tension and the goal is can the characters resolve that tension by the end," Ross Duffer told IGN. "So it's going to be its own sort of complete little movie, very much in the way that Season 1 is."

Don't worry about season two of Stranger Things being too similar to or too different from the original, though, because when speaking with Entertainment Weekly about the influences on the show, Matt Duffer said, "I guess a lot of this is James Cameron. But he’s brilliant. And I think one of the reasons his sequels are as successful as they are is he makes them feel very different without losing what we loved about the original. So I think we kinda looked to him and what he does and tried to capture a little bit of the magic of his work.”

5. THE PREMIERE WILL TRAVEL OUTSIDE OF HAWKINS.

Everything about the new Stranger Things episodes will be kept secret until they finally debut later this year, but we do know one thing about the premiere: It won't take place entirely in the familiar town of Hawkins, Indiana. “We will venture a little bit outside of Hawkins,” Matt Duffer told Entertainment Weekly. “I will say the opening scene [of the premiere] does not take place in Hawkins.”

So, should we take "a little bit outside" as literally as it sounds? You certainly can, but in that same interview, the brothers also said they're both eager to explore the Upside Down, the alternate dimension from the first season. Whether the season kicks off just a few miles away, or a few worlds away, you'll get your answer when Stranger Things's second season debuts next month.

The Gooey History of the Fluffernutter Sandwich

Open any pantry in New England and chances are you’ll find at least one jar of Marshmallow Fluff. Not just any old marshmallow crème, but Fluff: the one manufactured by Durkee-Mower of Lynn, Massachusetts, since 1920, and the preferred brand of the Northeast. With its familiar red lid and classic blue label, it's long been a favorite guilty pleasure and a kitchen staple beloved throughout the region.

This gooey, spreadable, marshmallow-infused confection is used in countless recipes and found in a variety of baked goods—from whoopie pies and Rice Krispies Treats to chocolate fudge and beyond. And in the beyond lies perhaps the most treasured concoction of all: the Fluffernutter sandwich—a classic New England treat made with white bread, peanut butter, and, you guessed it, Fluff. No jelly required. Or wanted.

There are several claims to the origin of the sandwich. The first begins with Revolutionary War hero Paul Revere—or, not Paul exactly, but his great-great-great-grandchildren Emma and Amory Curtis of Melrose, Massachusetts. Both siblings were highly intelligent and forward-thinking, and Amory was even accepted into MIT. But when the family couldn’t afford to send him, he instead founded a Boston-based company in the 1890s that specialized in soda fountain equipment.

He sold the business in 1901 and used the proceeds to buy the entire east side of Crystal Street in Melrose. Soon after he built a house and, in his basement, he created a marshmallow spread known as Snowflake Marshmallow Crème (later called SMAC), which actually predated Fluff. By the early 1910s, the Curtis Marshmallow Factory was established and Snowflake became the first commercially successful shelf-stable marshmallow crème.

Although other companies were manufacturing similar products, it was Emma who set the Curtis brand apart from the rest. She had a knack for marketing and thought up many different ways to popularize their marshmallow crème, including the creation of one-of-a-kind recipes, like sandwiches that featured nuts and marshmallow crème. She shared her culinary gems in a weekly newspaper column and radio show. By 1915, Snowflake was selling nationwide.

During World War I, when Americans were urged to sacrifice meat one day a week, Emma published a recipe for a peanut butter and marshmallow crème sandwich. She named her creation the "Liberty Sandwich," as a person could still obtain his or her daily nutrients while simultaneously supporting the wartime cause. Some have pointed to Emma’s 1918 published recipe as the earliest known example of a Fluffernutter, but the earliest recipe mental_floss can find comes from three years prior. In 1915, the confectioners trade journal Candy and Ice Cream published a list of lunch offerings that candy shops could advertise beyond hot soup. One of them was the "Mallonut Sandwich," which involved peanut butter and "marshmallow whip or mallo topping," spread on lightly toasted whole wheat bread.

Another origin story comes from Somerville, Massachusetts, home to entrepreneur Archibald Query. Query began making his own version of marshmallow crème and selling it door-to-door in 1917. Due to sugar shortages during World War I, his business began to fail. Query quickly sold the rights to his recipe to candy makers H. Allen Durkee and Fred Mower in 1920. The cost? A modest $500 for what would go on to become the Marshmallow Fluff empire.

Although the business partners promoted the sandwich treat early in the company’s history, the delicious snack wasn’t officially called the Fluffernutter until the 1960s, when Durkee-Mower hired a PR firm to help them market the sandwich, which resulted in a particularly catchy jingle explaining the recipe.

So who owns the bragging rights? While some anonymous candy shop owner was likely the first to actually put the two together, Emma Curtis created the early precursors and brought the concept to a national audience, and Durkee-Mower added the now-ubiquitous crème and catchy name. And the Fluffernutter has never lost its popularity.

In 2006, the Massachusetts state legislature spent a full week deliberating over whether or not the Fluffernutter should be named the official state sandwich. On one side, some argued that marshmallow crème and peanut butter added to the epidemic of childhood obesity. The history-bound fanatics who stood against them contended that the Fluffernutter was a proud culinary legacy. One state representative even proclaimed, "I’m going to fight to the death for Fluff." True dedication, but the bill has been stalled for more than a decade despite several revivals and subsequent petitions from loyal fans.

But Fluff lovers needn’t despair. There’s a National Fluffernutter Day (October 8) for hardcore fans, and the town of Somerville, Massachusetts still celebrates its Fluff pride with an annual What the Fluff? festival.

"Everyone feels like Fluff is part of their childhood," said self-proclaimed Fluff expert and the festival's executive director, Mimi Graney, in an interview with Boston Magazine. "Whether born in the 1940s or '50s, or '60s, or later—everyone feels nostalgic for Fluff. I think New Englanders in general have a particular fondness for it."

Today, the Fluffernutter sandwich is as much of a part of New England cuisine as baked beans or blueberry pie. While some people live and die by the traditional combination, the sandwich now comes in all shapes and sizes, with the addition of salty and savory toppings as a favorite twist. Wheat bread is as popular as white, and many like to grill their sandwiches for a touch of bistro flair. But don't ask a New Englander to swap out their favorite brand of marshmallow crème. That’s just asking too Fluffing much.
