6 Terrifying Beauty Practices from History

Photo illustration, Mental Floss. Woman, iStock. Corset X-Ray, Public Domain.

Chemical peels that burn layers of skin from your face. Appetite suppressants that come with a risk of heart failure. Cosmetic surgeries that change the appearance of a woman’s most intimate parts. There are plenty of modern cosmetic practices that run the gamut from physically painful to medically risky. But most don’t hold a candle to the hazardous cosmetic techniques of yore. Check out these historic beauty practices that are even scarier than modern ones.

1. WEARING CORSETS

You know what really turns men off? When women take deep breaths. In the 1800s, the invention of metal eyelets allowed women to cinch their corsets tighter than ever before, with acute medical consequences. In fairness, not all women tightened their corsets to the point of injury, and probably none of them achieved the 14-inch waist advertised in 19th-century fashion magazines. But the stylish undergarments were often laced so tightly that they restricted women’s breathing. In the long term, wearing corsets caused muscle atrophy, deformed the ribcage, and misaligned the spine. And extreme corset use wasn’t limited to women, as indicated by the warped ribs of a 19th-century Englishman whose body was excavated in the early 2000s. The study authors thought he had likely worn an orthopedic corset, but noted that “corset use to obtain a fashionable silhouette cannot be ruled out.”

2. EATING ARSENIC

In the 19th century and earlier, some people (mainly in Styria, a region that encompassed parts of modern Austria and Slovenia) consumed arsenic to “produce a blooming complexion, a brilliant eye, and an appearance of embonpoint [sexy stoutness],” according to one 1857 magazine article on the practice. There were safety rules, of course: You were only supposed to take it while the moon was waxing, and at first you could only eat a dose as big as a single grain of millet. If you took more than that before you built up a tolerance, you could die. Once you began eating arsenic regularly, though, if you ever stopped, you’d suffer from painful withdrawal symptoms like vomiting and muscle spasms. But wait, there was another downside—because arsenic interferes with the iodine necessary for thyroid function, eating it gave people goiters. Blooming, brilliant, embonpoint goiters.

3. FOOT BINDING

A tradition that likely started around the late 10th century, foot binding was designed to turn a woman’s feet into 3-inch-long “golden lotuses” by folding the toes under and binding them tightly. The extremely painful practice began when a child was as young as 3 to 4 years old and continued into adulthood. The resulting wobbly walk and doll-like feet were considered highly attractive and vital to a woman’s marriage prospects. This one isn’t limited to the distant past, either: Foot binding wasn’t completely stamped out until China’s Communist Revolution in 1949, and there are still living Chinese women whose feet were bound as children.

4. APPLYING RADIOACTIVE FACE CREAM

In the early 20th century, before anyone knew about the health risks of radiation, radioactive consumer products were all the rage. In the 1930s, an enterprising doctor named Alfred Curie capitalized on his association with the famous radioactivity researchers (to whom he definitely wasn’t related) to launch Tho-radia, a French cosmetics brand whose products featured radioactive chemicals like thorium chloride and radium bromide. Advertisements for his face cream claimed that the radioactive formula could stimulate “cellular vitality,” firm up skin, cure boils and pimples, even out redness and pigmentation, erase wrinkles, stop aging, and help retain the “freshness and brightness of the complexion.” It’s all vitality and brightness until someone’s jaw falls off.

5. MAKING EYEDROPS OUT OF DEADLY NIGHTSHADE

Deadly nightshade is also called belladonna, or “beautiful woman,” a likely reference to its role in the cosmetic routines of ladies in Renaissance Italy and beyond. Italian women—and later, women in Victorian England—would squeeze drops of deadly nightshade into their eyes to dilate their pupils for a striking, wide-eyed look they thought was seductive. Unfortunately, the side effects included blurry vision, vertigo, and headaches. And the blindness reported to result from its extended use? Worth it, as long as you got the watery-eyed look of a consumptive. The active ingredient in deadly nightshade, atropine, is still used today to dilate the eyes during eye exams, but unlike the cosmetic belladonna drops of the past, the highly diluted modern versions won’t blind you.

6. USING LEAD MAKEUP

The 1700s were rough on the complexion. Even if you don’t count the miasmic filth in which even the richest people lived, there was smallpox to contend with—by the end of the 18th century, an estimated 400,000 Europeans were dying of it every year. If you were lucky enough to survive, the disease left severe scarring. The best way to cover these pockmarks and other cosmetic imperfections was lead face powder, and both men and women took advantage of it. It was great stuff—inexpensive and easy to make, it coated well and had a silky finish. Except that, even then, it was known to be wildly toxic. Not only did it cause eye inflammation, tooth rot, and baldness, but it also made the skin blacken over time, requiring yet more of the noxious powder to achieve the pure white face, shoulders, and chest that were so fashionable. Ah yes, and then there was the fact that using it could eventually kill you.

BONUS: EATING TAPEWORMS (MAYBE)

This controversial fad diet—which may or may not have actually existed—was not only dangerous, but also really gross. In the early 1900s, several newspaper accounts reported that women were eating pills filled with tapeworm eggs as a way to lose weight. The tapeworm eggs would supposedly hatch and take up residence in the intestine of their poor, plump host, consuming the nutrients that would otherwise be digested. This would keep the person malnourished and thin. However, even a century ago, doctors doubted people would subject themselves to this kind of pain to look good. In 1912, The Washington Post ran an article called “Tapeworm Pills For Fat People Merely A Wild Yarn, Say Experts.” But as we know, people have done crazier things in the name of beauty.

A version of this story ran in 2013.

Has An Element Ever Been Removed From the Periodic Table?

lucadp/iStock via Getty Images

Barry Gehm:

Yes, didymium, or Di. It was discovered by Carl Mosander in 1841, and he named it didymium from the Greek word didymos, meaning twin, because it was almost identical to lanthanum in its properties. In 1879, the French chemist Paul-Émile Lecoq de Boisbaudran showed that Mosander’s didymium contained samarium as well as an unknown element. In 1885, Carl Auer von Welsbach showed that the unknown element was actually two elements, which he isolated and named praseodidymium and neodidymium (although the di syllable was soon dropped). Ironically, the twin turned out to be twins.

The term didymium filter is still used to refer to welding glasses colored with a mixture of neodymium and praseodymium oxides.

One might also cite various claims to have created or discovered synthetic elements. Probably the best example is masurium (element 43), which a team of German chemists (Walter Noddack, Ida Tacke, and Otto Berg) claimed to have discovered in columbium (now known as niobium) ore in 1925. The claim was controversial and other workers could not replicate it, but some literature from the period does list it among the elements.

In 1936, Emilio Segrè and Carlo Perrier isolated element 43 from molybdenum foil that had been used in a cyclotron; they named it technetium. Even the longest-lived isotopes of technetium have a short half-life by geological standards (millions of years) and it has only ever been found naturally in minute traces as a product of spontaneous uranium fission. For this reason, the original claim of discovery (as masurium) is almost universally regarded as erroneous.

As far as I know, in none of these synthetic-element cases has anyone actually produced a visible, weighable quantity of a substance that later turned out not to be an element, in contrast to the case with didymium. (In the case of masurium, for instance, the only evidence of its existence was a faint x-ray signal at a specific wavelength.)

This post originally appeared on Quora.

Graham Crackers Were Invented to Combat the Evils of Coffee, Alcohol, and Masturbation

tatniz/iStock via Getty Images

Long before they were used to make s’mores or the tasty crust of a Key lime pie, graham crackers served a more puritanical purpose in 19th-century America. The cookies were invented by Sylvester Graham, an American Presbyterian minister whose views on food, sex, alcohol, and nutrition would seem a bit extreme to today's cracker-snackers. Much like the mayor in the movie Chocolat, Graham and his thousands of followers—dubbed Grahamites—believed it was sinful to eat decadent foods. To combat this moral decay, Graham started a diet regimen of his own.

In the 1830s, Graham ran health retreats promoting a bland diet that banned sugar and meat. According to Refinery29, Graham’s views ultimately inspired veganism in America as well as the “first anti-sugar crusade.” He condemned alcohol, tobacco, spices, seasoning, butter, and "tortured" refined flour. Caffeine was also a no-no; in fact, Graham believed that coffee and tea were just as bad as tobacco, opium, or alcohol because they created a “demand for stimulation.” The worst vice of all, in Graham’s opinion, was overeating. “A drunkard sometimes reaches old age; a glutton never,” he once wrote.

Graham’s austere philosophy was informed by the underlying belief that eating habits affect people’s behaviors, and vice versa. He thought certain foods were "overstimulating" and led to impure thoughts and passions, including masturbation—or “self-pollution,” as he called it—which he believed to be an epidemic that caused both blindness and insanity.

Illustration of Sylvester Graham
Library of Congress, Public Domain, Wikimedia Commons

Graham's views directly influenced Victorian-era corn flake inventor John Harvey Kellogg, who was born a year after Graham died. Like his predecessor, Kellogg also believed that meat and some flavorful foods led to sexual impulses, so he advocated for the consumption of plain foods, like cereals and nuts, instead. (Unsurprisingly, the original recipes for both corn flakes and graham crackers were free of sinful sugar.)

In one lecture, Graham told young men they could stop their minds from wandering to forbidden places if they avoided “undue excitement of the brain and stomach and intestines.” This meant swearing off improper foods and substances like tobacco, caffeine, pepper, ginger, mustard, horseradish, and peppermint. Even milk was banned because it was “too exciting and too oppressive.”

So what could Graham's followers eat? The core component of Graham’s diet was bread made of coarsely ground wheat or rye, unlike the refined white flour loaves that were sold in bakeries at that time. From this same flour emerged Graham's crackers and muffins, both of which were common breakfast foods. John Harvey Kellogg was known to have eaten the crackers and apples for breakfast, and one of his first attempts at making cereal involved soaking twice-baked cracker bits in milk overnight.

However, Kellogg was one of the few remaining fans of Graham’s diet, which began to fall out of favor in the 1840s. At Ohio’s Oberlin College, a Grahamite was hired in 1840 to strictly enforce the school’s meal plans. One professor was fired for bringing a pepper shaker to the dining hall, and the hunger-stricken students organized a protest the following year, arguing that the Graham diet was “inadequate to the demands of the human system as at present developed.” Ultimately, the Grahamite and his tyrannical nutrition plan were kicked out.

As with Kellogg’s corn flakes, someone else eventually stepped in and corrupted Graham’s crackers, molding them into the edible form we now know—and, yes, love—today. In Graham’s case, it was the National Biscuit Company, which eventually became Nabisco; the company started manufacturing graham crackers in the 1880s. But Graham would likely be rolling in his grave if he knew they contained sugar and white flour—and that they’re often topped with marshmallows and chocolate for a truly decadent treat.
