Why Do We Eat Candy on Halloween?

Jupiterimages/iStock via Getty Images

On October 31, hordes of children armed with jack-o'-lantern-shaped buckets and pillowcases will take to the streets in search of sugar. Trick-or-treating for candy is synonymous with Halloween, but the tradition went through a centuries-long evolution to become what it is today. So how did the holiday become an opportunity for kids to get free sweets? You can blame pagans, Catholics, and candy companies.

Historians agree that a Celtic autumn festival called Samhain was the precursor to modern Halloween. Samhain was a time to celebrate the last harvest of the year and the approach of the winter season. It was also a festival for honoring the dead. One way the Celts may have appeased the spirits they believed still walked the Earth was by leaving treats on their doorsteps.

When Christian missionaries arrived in Ireland around the 5th century CE, they rebranded many pagan holidays to fit their religion. November 1 became the feast of All Saints (with All Souls' Day following on November 2), and the night before it was dubbed All Hallows' Eve. The new holidays looked quite different from the original Celtic festival, but many traditions stuck around, including the practice of honoring the dead with food. The food of choice for Christians became "soul cakes," small pastries usually baked with expensive ingredients and spices like currants and saffron.

Instead of being left outside for passing ghosts, soul cakes were distributed to beggars who went door-to-door promising to pray for the souls of the deceased in exchange for something to eat. Sometimes they wore costumes to honor the saints—something pagans originally did to avoid being harassed by evil spirits. The ritual, known as souling, is believed to have planted the seeds for modern-day trick-or-treating.

Souling didn't survive the holiday's migration from Europe to the United States. In America, the first Halloween celebrations were a way to mark the end-of-year harvest season, and the food that was served mainly consisted of homemade seasonal treats like caramel apples and mixed nuts. There were no soul cakes—or candies, for that matter—to be found.

It wasn't until the 1950s that trick-or-treating gained popularity in the U.S. Following the Great Depression and World War II, the suburbs were booming, and people were looking for excuses to have fun and get to know their neighbors. The old practice of souling was resurrected and made into an excuse for kids to dress up in costumes and roam their neighborhoods. Common trick-or-treat offerings included nuts, coins, and homemade baked goods ("treats" that most kids would turn their noses up at today).

That changed when the candy companies got their hands on the holiday. They had already convinced consumers that they needed candy on Christmas and Easter, and they were looking for an equally lucrative opportunity to market candy in the fall. The new practice of trick-or-treating was almost too good to be true. Manufacturers downsized candies into smaller, bite-sized packages and began marketing them as treats for Halloween. Adults were grateful to have a convenient alternative to baking, kids loved the sweet treats, and the candy companies made billions.

Today, it's hard to imagine Halloween without Skittles, chocolate bars, and the perennial candy corn debates. But when you're digging through a bag or bowl of Halloween candy this October, remember that you could have been eating soul cakes instead.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Has An Element Ever Been Removed From the Periodic Table?

lucadp/iStock via Getty Images

Barry Gehm:

Yes, didymium, or Di. It was discovered by Carl Gustaf Mosander in 1841, and he named it didymium from the Greek word didymos, meaning twin, because it was almost identical to lanthanum in its properties. In 1879, a French chemist showed that Mosander's didymium contained samarium as well as an unknown element. In 1885, Carl Auer von Welsbach showed that the unknown element was actually two elements, which he isolated and named praseodidymium and neodidymium (although the di syllable was soon dropped). Ironically, the twin turned out to be twins.

The term didymium filter is still used to refer to welding glasses colored with a mixture of neodymium and praseodymium oxides.

One might cite as other examples various claims to have created/discovered synthetic elements. Probably the best example of this would be masurium (element 43), which a team of German chemists claimed to have discovered in columbium (now known as niobium) ore in 1925. The claim was controversial and other workers could not replicate it, but some literature from the period does list it among the elements.

In 1937, Emilio Segrè and Carlo Perrier isolated element 43 from molybdenum foil that had been used in a cyclotron; they named it technetium. Even the longest-lived isotopes of technetium have a short half-life by geological standards (millions of years), and it has only ever been found naturally in minute traces as a product of spontaneous uranium fission. For this reason, the original claim of discovery (as masurium) is almost universally regarded as erroneous.

As far as I know, in none of these cases involving synthetic elements has anyone actually produced a visible, weighable quantity of the substance that later turned out not to be an element, in contrast to the case of didymium. (In the case of masurium, for instance, the only evidence of its existence was a faint x-ray signal at a specific wavelength.)

This post originally appeared on Quora.

Can You Ever Truly Lose Your Accent?

DGLimages, iStock via Getty Images

You may be able to pull off a Spanish accent when showing off your Antonio Banderas impression, but truly losing your native accent and replacing it with a new one is a lot harder to do. The way you speak now will likely stick with you for life.

According to Smithsonian, our accent develops as early as 6 months old—accents being the pronunciation conventions of a language shaped by factors like region, culture, and class. When a baby is learning the words for nap and dad and play, they're also learning how to pronounce the sounds in those words from the people around them. Newborn brains are wired to recognize and learn languages just from being exposed to them. By the time babies start talking, they know the "right" pronunciations to use for their native language or languages.

As you get older, your innate ability to perceive and reproduce unfamiliar accents and languages gets weaker. If you're an English speaker raised in Boston, you may think that the way someone from Dallas speaks English sounds "wrong" without being able to articulate what makes them sound different. This is why pulling off a convincing foreign accent can be so difficult, even if you've heard it many times before.

Around age 18, your ability to learn a second language takes a steep nosedive. The same may be true of your ability to speak in a new accent. If you immerse yourself in a foreign environment for long enough, you may pick up some tics of the local accent, but totally adopting a non-native accent without making a conscious effort to maintain it is unlikely as an adult.

There is one exception to this rule, and that's Foreign Accent Syndrome. Following a head injury or stroke, some people have reported suddenly speaking in accents they didn't grow up using. The syndrome is incredibly rare, with only around 100 documented cases worldwide, and medical experts aren't sure why brain injuries cause it. But while patients may be pronouncing their words differently, they aren't exactly using foreign accents in the way most people think of them; the culprit may be subtle changes to muscle movements in the jaw, tongue, lips, and larynx that alter the way patients pronounce certain vowels.

[h/t Smithsonian]
