7 Very Victorian Ways to Die

A circa 1860s lithograph titled "Fire: The horrors of crinoline & the destruction of human life."

In the 19th century, the Grim Reaper was seemingly around every corner. A glass of water, a beautiful dress, or a brightly colored piece of wallpaper could all spell your doom. Poor sanitation, dangerous working practices, and widespread poisons meant that even those in the prime of life were not immune to sudden death. Thankfully, today's scientific advances—and better regulation—have massively improved life expectancy, although some of these dangers still lurk.

1. Flammable Fashion

In the 1850s and '60s, the trend for huge crinoline skirts boomed. These large structured petticoats covered with fabric gave the impression of a voluminous skirt, whereas previously, the look had been achieved by wearing numerous layers of skirts, which was both hot and cumbersome. Crinolines became popular in part because they were light and easy to maneuver.

There was, however, a downside to their design—crinolines, often made of diaphanous materials such as silk and muslin, were highly flammable. Numerous newspapers reported on the scores of women who had the misfortune to get too close to a naked flame. Fanny Longfellow, wife of Henry Wadsworth Longfellow, died in 1861 after her dress went up in flames when a lighted match or small piece of paper fell on her. Longfellow himself attempted to extinguish the flames, but his wife's skirts were so flammable it proved impossible to save her life. Another sad example was Archduchess Mathilde of Austria, who in 1867 is said to have pulled the classic teenage move of hiding a cigarette from her father behind her back and inadvertently set her dress ablaze.

Newspapers abounded with editorials on the perils of flouncy fashion and offered various solutions (some perhaps in jest). The Tablet in 1858 recommended, “We would … suggest that every lady wearing a crinoline, should be accompanied by a footman with a pail of water.” Needless to say, this was not a practical solution, but trends soon moved away from crinolines and the threat of fire lessened.

2. Opium Overdoses

A satirical engraving of an unscrupulous chemist selling a child arsenic and laudanum (tincture of opium)

Quieting fractious babies has always proved a challenge, but in the 19th century a seemingly wonderful solution was offered: opium. Tinctures of opium, such as Godfrey’s Cordial, were widely used as a method to soothe sickly or teething infants. Although it might seem horrifying by modern standards to drug children into listlessness, in the 19th century opium was an extremely popular medicine and, before the days of aspirin, was commonly used as a painkiller and sleeping aid.

Godfrey’s Cordial was especially popular among working-class mothers, who often had to return to work soon after the birth of a child. It was not uncommon to dose babies with Godfrey’s to make sure the child remained in a stupor until the mother returned from work. Unfortunately, accidental overdoses were frequent—in 1854 it was estimated that, in Britain, three-quarters of all deaths attributed to opium were of children under 5 years old. Fortunately, children’s medicines are now tightly regulated.

3. Cholera Contamination

Many of us take it for granted that we can turn on the faucet and drink a glass of clean water. However, in the 19th century, as populations in Europe and America ballooned and increasing numbers of people moved to cities, the infrastructure struggled to cope. Many slums had open sewers in the streets and an unreliable water supply, and communal wells and water pumps were often contaminated with raw sewage. This meant that water-borne diseases such as cholera and typhoid became rife.

The cholera outbreaks of the 19th century originated in India, but with the growth of global trade networks the disease soon spread around the world. A pandemic around 1832 ensued when the disease reached Britain and America for the first time. Several other pandemics swept the world, killing 23,000 people in Britain in 1854 alone. Physician John Snow mapped the cases of cholera in London's Soho that year, and traced the cause to a single water pump that was located near a cesspool. The pump was removed, and cholera cases dropped dramatically. As scientific understanding of the spread of water-borne diseases improved, public water supplies were cleaned up, and the last documented cholera outbreak in the U.S. was in 1911.

4. Arsenic Poisoning

A jar of poisonous Paris Green
Chris goulet, Wikimedia // CC BY-SA 3.0

Colorful green wallpaper was the height of fashion in the Victorian era, a trend spearheaded in large part by Pre-Raphaelite artists and designers. The green pigment often used, known as Scheele’s Green, had first been developed in 1775 by German-Swedish chemist Carl Wilhelm Scheele, and the key to its vibrant shade was the use of arsenic. Although arsenic was known to be poisonous if eaten, at the time it was thought to be safe as a color pigment.

In 1862 an investigation was carried out after several children from the same family sickened and died within weeks of each other in Limehouse, London. Dr. Thomas Orton investigated the case and concluded that the children had been poisoned by the arsenic in their bedroom's green wallpaper. Arsenic coloring was also used for dresses, hats, upholstery, and cravats. The poison was sprayed on vegetables as insecticide, and even added to beer. Restrictions on its use in food and drink were only added in 1903. Today, historic houses have had their arsenic wallpaper removed, and arsenic-dyed clothes in museum collections are generally kept safely behind glass.

5. Fatal Factories

By the 19th century, rapid industrialization across Europe and America had led to thousands of factories producing everything from fabric to munitions. Numerous adults—and children—were employed in these factories, providing ample opportunity for death and injury.

The cotton factories of Manchester, England, for example, could kill you in a number of ways. First, the air was thick with cotton fibers, which over time built up in workers’ lungs, causing breathing difficulties and lung disease. Then there were the whirling, grinding machines that might catch your sleeve or hair, dragging you into the loom. Children were employed to clean under the machines and retrieve dropped spindles because their small size allowed them to move about under the moving machines—but a trip or a loss of concentration often proved fatal. The huge number of accidents and deaths in factories eventually led to increased regulation—reducing working hours, restricting child labor, and making the machines themselves safer.

6. Sudden Spontaneous Combustion

Some Victorian scientists believed that alcoholism could cause spontaneous combustion. The idea caught the public imagination, and Charles Dickens used the theory in Bleak House (1853) to explain the death of the drunken rag-and-bone man Mr. Krook. In Victorian accounts, the victims were typically overweight and heavy drinkers, and their bodies had seemingly burst into flame, leaving only their legs intact. Needless to say, the threat of spontaneous combustion was soon seized upon by the temperance movement, which used the supposed link to alcoholism to scare people away from the demon drink.

For example, The Anatomy of Drunkenness by Robert Macnish (1834) described the various types of drunk and devoted a whole chapter to the risk of spontaneous combustion. Macnish recounted a number of case studies, including that of Mary Clues—an inveterate drinker who was found almost entirely incinerated except for one leg, while the room around her was more or less undamaged. Despite the widespread discussion of spontaneous combustion in the Victorian era, it's now generally considered highly unlikely if not impossible. Modern forensic science has in part explained the phenomenon through the “wick effect,” wherein a body on fire produces melted fat that seeps into the clothes, causing a long, slow, self-contained burn that may look like the result of spontaneous combustion—but almost certainly began with an external source.

7. Pestilent Pox

Smallpox has been around for over 12,000 years. Europeans brought the disease to North and South America in the Age of Exploration, killing up to 90 percent of indigenous populations. Smallpox was still prevalent in the 19th century and killed about 30 percent of its victims. Those who survived were often blinded or badly scarred by the virulent pustules. To give some idea of the scale of fatalities, in just one year, 1871, over 50,000 people died of smallpox in Great Britain and Ireland alone.

In 1796 the English doctor Edward Jenner noticed that milkmaids who had caught cowpox appeared to be immune to smallpox. This led Jenner to create the world’s first vaccine. As with many new developments, it took a number of years for vaccination to catch on, but once it did the incidence of smallpox began to fall. In 1980 the World Health Organization declared the disease eradicated—the first time a human disease had been completely wiped out worldwide—thanks to a sustained program of vaccination.

Why Do We Eat Pumpkin Pie at Thanksgiving?

gjohnstonphoto/iStock via Getty Images

While it’s possible—even probable—that pumpkins were served at the 1621 harvest festival that’s now considered the predecessor to Thanksgiving, attendees definitely didn’t dine on pumpkin pie (there was no butter or wheat flour to make crust).

The earliest known recipes for pumpkin pie actually come from 17th-century Europe. Pumpkins, like potatoes and tomatoes, were first introduced to Europe in the Columbian Exchange, but Europeans were more comfortable cooking with pumpkins because they were similar to their native gourds.

By the 18th century, however, Europeans on the whole lost interest in pumpkin pie. According to HowStuffWorks, Europeans began to prefer apple, pear, and quince pies, which they perceived as more sophisticated. But at the same time pumpkin pie was losing favor in Europe, it was gaining true staple status in America.

In 1796, Amelia Simmons published American Cookery, the first known cookbook written by an American and published in the United States. Simmons included two recipes for “pompkin pudding” cooked in pastry crust. Simmons’s recipes call for “stewed and strained” pumpkin, combined with a mixture of nutmeg, allspice, and ginger (yes, it seems our pumpkin spice obsession dates back to at least the 18th century).

But how did pumpkin pie become so irrevocably tied with the Thanksgiving holiday? That has everything to do with Sarah Josepha Hale, a New Hampshire-born writer and editor who is often called the “Godmother of Thanksgiving.” In her 1827 abolitionist novel Northwood, Hale described a Thanksgiving meal complete with “fried chicken floating in gravy,” broiled ham, wheat bread, cranberry sauce, and—of course—pumpkin pie. For more than 30 years, Hale advocated for Thanksgiving to become a national holiday, writing regular editorials and sending letters to five American presidents. Thanksgiving was a symbol of unity in an increasingly divided country, she argued.

Abraham Lincoln eventually declared Thanksgiving a national holiday in 1863 (to near-immediate outcry from Southerners, who viewed the holiday as an attempt to enforce Yankee values). Southern governors reluctantly complied with the presidential proclamation, but cooks in the South developed their own unique regional traditions. In the South, sweet potato pie quickly became more popular than New England’s pumpkin pie (mostly because sweet potatoes were easier to come by than pumpkins). Now, pumpkin pie reigns supreme as the most popular holiday pie across most of the United States, although the Northeast prefers apple and the South is split between apple and pecan, another Southern staple.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

Anthony Blunt: The Art Historian/Russian Spy Who Worked at Buckingham Palace

Samuel West portrays Anthony Blunt in The Crown.
Des Willie, Netflix

Mild spoilers for season 3 of The Crown on Netflix ahead.

Viewers of the third season of The Crown on Netflix will likely have their curiosity piqued by Anthony Blunt, the art historian who is revealed to be a spy for the Russians during his 19 years of service to the Queen at Buckingham Palace. Instead of getting the boot once he was discovered, however, Blunt remained in Her Majesty's employ for eight more years—until his official retirement. While treason never looks good on a resume, the royal household had good reason to keep him on.

Blunt, who was born and raised in England, visited the Soviet Union in 1933 and was recruited as a spy after being convinced of the benefits of Communism in fighting fascism. He began recruiting his university classmates at Cambridge before serving during World War II and leaking information about the Germans to Soviet intelligence. Blunt was one of five Cambridge graduates under Soviet direction. Two of them, diplomats Donald Maclean and Guy Burgess, relocated to the Soviet Union in 1951. Another, Kim Philby, went undetected until 1963. John Cairncross escaped notice, too, but was eventually outed.

However, it was Blunt who had a post at Buckingham Palace. After being tipped off by American intelligence, MI5 interrogated Blunt. He confessed to his treachery in 1964 and was granted immunity from prosecution. Why was he able to remain employed? One theory has it that British intelligence was so embarrassed by Blunt's ability to circulate in the upper levels of the monarchy that firing him would have raised too many questions. Another thought has Blunt having knowledge of some bizarrely congenial wartime correspondence between Adolf Hitler and the Duke of Windsor (a.k.a. King Edward VIII, whose abdication led to Elizabeth's eventual ascension to the throne).

Whatever the case, the Queen was advised by MI5 to keep Blunt around. In his role as art curator, he had no access to classified information. Blunt was at the Palace through 1972 and spent another seven years roaming London giving lectures. His actions remained a tightly guarded secret until Margaret Thatcher disclosed his treason in 1979.

As for that speech seen in The Crown, where Olivia Colman's Queen Elizabeth makes some not-so-subtle digs at Blunt at the opening of a new exhibition, there's no record of such a takedown ever happening. While the two reportedly kept their distance from each other in private, according to Miranda Carter's Anthony Blunt: His Lives:

“Blunt continued to meet the Queen at official events. She came to the opening of the Courtauld’s new galleries in 1968, and in 1972 she personally congratulated Blunt on his retirement, when the Lord Chamberlain, knowing nothing of his disgrace, offered him the honorary post of Adviser on the Queen’s pictures—inadvertently continuing his association with the Palace for another six years.”

Stripped of his knighthood as a result of the truth about his actions being made known, Blunt became a recluse and died of a heart attack in 1983. His memoirs, which were made public by the British Library in 2009, indicated his regret, calling his spy work "the biggest mistake of my life."
