The Spiritual Purpose Behind Shrunken Heads

Kenzo Tribouillard/AFP/Getty Images

If you’ve ever visited a museum like New York's American Museum of Natural History, you may have noticed a strange—yet oddly fascinating—relic on display: a shrunken human head. Artifacts like these may appear to be bloody battle trophies, but as the Smithsonian Channel explains, they once served as protective talismans for the Shuar people of Ecuador.

The Shuar are an indigenous people who live in the remote jungles of the Amazon. Long ago, they beheaded their enemies and shrank the severed heads by defleshing, simmering, and searing them with hot stones and sand. They also sewed the eyes closed and pegged or sewed the mouth and nostrils shut. These creations were known as tsantsas.

“The Shuar believe in spirits,” explains Anna Dhody, a forensic anthropologist and curator of the Mütter Museum in Philadelphia. “They believed that the spirit of their enemy could still harm them after death, and that they had to take preventative measures. So by taking the head of their enemy and creating these very special tsantsas, they could actually, effectively seal the spirit of their defeated enemy in the head.”

Has An Element Ever Been Removed From the Periodic Table?

lucadp/iStock via Getty Images

Barry Gehm:

Yes, didymium, or Di. It was discovered by Carl Mosander in 1841, and he named it didymium from the Greek word didymos, meaning twin, because it was almost identical to lanthanum in its properties. In 1879, the French chemist Paul-Émile Lecoq de Boisbaudran showed that Mosander’s didymium contained samarium as well as an unknown element. In 1885, Carl Auer von Welsbach showed that the unknown element was actually two elements, which he isolated and named praseodidymium and neodidymium (although the di syllable was soon dropped). Ironically, the twin turned out to be twins.

The term didymium filter is still used to refer to welding glasses colored with a mixture of neodymium and praseodymium oxides.

One might cite as other examples various claims to have created/discovered synthetic elements. Probably the best example of this would be masurium (element 43), which a team of German chemists claimed to have discovered in columbium (now known as niobium) ore in 1925. The claim was controversial and other workers could not replicate it, but some literature from the period does list it among the elements.

In 1937, Emilio Segrè and Carlo Perrier isolated element 43 from molybdenum foil that had been used in a cyclotron; they named it technetium. Even the longest-lived isotopes of technetium have half-lives that are short by geological standards (millions of years, versus billions for the age of the Earth), so any primordial element 43 decayed away long ago; it has only ever been found naturally in minute traces as a product of spontaneous uranium fission. For this reason, the original claim of discovery (as masurium) is almost universally regarded as erroneous.
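To see how decisive that half-life argument is, here is a minimal back-of-the-envelope sketch in Python. The specific figures (Tc-98’s roughly 4.2-million-year half-life and an Earth age of about 4.54 billion years) are assumptions added for illustration, not values stated in the original answer:

    import math

    # Back-of-the-envelope: why no primordial element 43 survives on Earth.
    # Assumed figures (not from the original answer):
    HALF_LIFE_TC98_YEARS = 4.2e6   # Tc-98, the longest-lived technetium isotope
    AGE_OF_EARTH_YEARS = 4.54e9    # ~4.54 billion years

    half_lives_elapsed = AGE_OF_EARTH_YEARS / HALF_LIFE_TC98_YEARS

    # The surviving fraction is 2**(-half_lives_elapsed). That number underflows
    # double-precision floats, so report its base-10 logarithm instead.
    log10_surviving_fraction = -half_lives_elapsed * math.log10(2)

    print(f"Half-lives since Earth formed: {half_lives_elapsed:,.0f}")        # ~1,081
    print(f"Surviving fraction of primordial Tc ~ 10^{log10_surviving_fraction:,.0f}")  # ~10^-325

A surviving fraction on the order of 10^-325 means not a single atom of primordial element 43 could remain in any ore body, which is why a weighable quantity turning up in 1925-era ore is implausible.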

As far as I know, in none of these synthetic-element cases has anyone actually produced a quantity of the element that one could see and weigh and that later turned out not to be an element, in contrast to the case of didymium. (In the case of masurium, for instance, the only evidence of its existence was a faint X-ray signal at a specific wavelength.)

This post originally appeared on Quora.

Graham Crackers Were Invented to Combat the Evils of Coffee, Alcohol, and Masturbation

tatniz/iStock via Getty Images

Long before they were used to make s’mores or the tasty crust of a Key lime pie, graham crackers served a more puritanical purpose in 19th-century America. The crackers were invented by Sylvester Graham, an American Presbyterian minister whose views on food, sex, alcohol, and nutrition would seem a bit extreme to today’s cracker-snackers. Much like the mayor in the movie Chocolat, Graham and his thousands of followers—dubbed Grahamites—believed it was sinful to eat decadent foods. To combat this moral decay, Graham devised a diet regimen of his own.

In the 1830s, Graham ran health retreats promoting a bland diet that banned sugar and meat. According to Refinery29, Graham’s views ultimately inspired veganism in America as well as the “first anti-sugar crusade.” He condemned alcohol, tobacco, spices, seasoning, butter, and “tortured” refined flour. Caffeine was also a no-no; in fact, Graham believed that coffee and tea were just as bad as tobacco, opium, or alcohol because they created a “demand for stimulation.” The worst vice of all, in Graham’s opinion, was overeating. “A drunkard sometimes reaches old age; a glutton never,” he once wrote.

Graham’s austere philosophy was informed by the underlying belief that eating habits affect people’s behaviors, and vice versa. He thought certain foods were "overstimulating" and led to impure thoughts and passions, including masturbation—or “self-pollution,” as he called it—which he believed to be an epidemic that caused both blindness and insanity.

Illustration of Sylvester Graham
Library of Congress, Public Domain, Wikimedia Commons

Graham's views directly influenced Victorian-era corn flake inventor John Harvey Kellogg, who was born a year after Graham died. Like his predecessor, Kellogg also believed that meat and some flavorful foods led to sexual impulses, so he advocated for the consumption of plain foods, like cereals and nuts, instead. (Unsurprisingly, the original recipes for both corn flakes and graham crackers were free of sinful sugar.)

In one lecture, Graham told young men they could stop their minds from wandering to forbidden places if they avoided “undue excitement of the brain and stomach and intestines.” This meant swearing off improper foods and substances like tobacco, caffeine, pepper, ginger, mustard, horseradish, and peppermint. Even milk was banned because it was “too exciting and too oppressive.”

So what could Graham's followers eat? The core component of Graham’s diet was bread made of coarsely ground wheat or rye, unlike the refined white flour loaves that were sold in bakeries at that time. From this same flour emerged Graham's crackers and muffins, both of which were common breakfast foods. John Harvey Kellogg was known to have eaten the crackers and apples for breakfast, and one of his first attempts at making cereal involved soaking twice-baked cracker bits in milk overnight.

Slices of rye bread, a jug of milk, apples and ears of corn on sackcloth, wooden table
SomeMeans/iStock via Getty Images

However, Kellogg was one of the few remaining fans of Graham’s diet, which began to fall out of favor in the 1840s. At Ohio’s Oberlin College, a Grahamite was hired in 1840 to strictly enforce the school’s meal plans. One professor was fired for bringing a pepper shaker to the dining hall, and the hunger-stricken students organized a protest the following year, arguing that the Graham diet was “inadequate to the demands of the human system as at present developed.” Ultimately, the Grahamite and his tyrannical nutrition plan were kicked out.

As happened with Kellogg’s corn flakes, someone else eventually stepped in and corrupted Graham’s crackers, molding them into the edible form we now know—and, yes, love—today. In Graham’s case, it was the National Biscuit Company, which eventually became Nabisco; the company started manufacturing graham crackers in the 1880s. But Graham would likely be rolling in his grave if he knew they contained sugar and white flour—and that they’re often topped with marshmallows and chocolate for a truly decadent treat.
