Ice Cream Lickers Are Prompting Grocery Stores to Fight Back

Jamie Squire, Getty Images

While many retail foods come in packaging that shows evidence of tampering, ice cream is often an exception. In some brands, carton tops can easily come off and be put back on, a design flaw that has enabled a recent spate of “ice cream lickers” who flick their tongues across the frozen treats and then place them back in the freezer aisle. Now, some grocery stores are fighting back.

According to Thrillist, a number of stores around the country have taken preventive measures to avoid falling victim to ice cream vandalism. Some have been locking their freezer doors and putting up signs directing customers to find an employee for assistance.

One Walmart in Corpus Christi, Texas, stationed an employee in the ice cream aisle over the July 4 holiday weekend, though it wasn’t entirely clear whether he was there for an entire shift or whether the store was staging his presence for a social media post.

The concern over tampering stems from a viral video shot at a Walmart in Lufkin, Texas, that shows a woman licking a tub of Blue Bell ice cream and then putting it back on the shelf. She was later identified by Lufkin police, and her case was turned over to the Texas Juvenile Justice Department. Blue Bell has said that its cartons are sealed in a way that makes it evident if the lid has been removed, but the company is still facing criticism for not putting a seal or wrapper on its cartons.

Ice cream lickers are taking a considerable risk. Tampering with a consumer product is a crime that can carry a prison term of up to 20 years and a $10,000 fine.

[h/t Thrillist]

Maine Man Catches a Rare Cotton Candy Lobster—For the Second Time

RnDmS/iStock via Getty Images

Just three months after a cotton candy lobster was caught off the coast of Maine, another Maine resident has reeled in one of the rare, colorful creatures.

Kim Hartley told WMTW that her husband caught the cotton candy lobster off Cape Rosier in Penobscot Bay—and it’s not his first time. Four years ago, he caught another one, which he donated to an aquarium in Connecticut. While the Hartleys decide what to do with their pretty new foster pet, it’s relaxing in a crate on land.

Though the chances of finding a cotton candy lobster are supposedly one in 100 million, Maine seems to be crawling with the polychromatic crustaceans. Lucky the lobster gained quite a cult following on social media after being caught near Canada’s Grand Manan Island (close to the Canada-Maine border) last summer, and the Portland restaurant Scales came across one during the same season.

According to National Geographic, these lobsters’ cotton candy-colored shells could be the result of a genetic mutation, or they could be related to the animals’ diet. Lobsters get their usual greenish-blue hue when crustacyanin—a protein they produce—combines with astaxanthin, a bright red carotenoid found in their food. If the lobsters aren’t eating their usual astaxanthin-rich fare, such as crabs and shrimp, the lack of pigment could give them a pastel appearance. It’s possible that cotton candy lobsters have been relying on fishermen’s bait as their main food source rather than finding their own.

While these vibrant specimens may look more beautiful than their dull-shelled relatives, even regular lobsters are cooler than you think—find out 25 fascinating facts about them here.

[h/t WMTW]

7 Very Victorian Ways to Die

A circa 1860s lithograph titled "Fire: The horrors of crinoline & the destruction of human life."

In the 19th century, the Grim Reaper was seemingly around every corner. A glass of water, a beautiful dress, or a brightly colored piece of wallpaper could all spell your doom. Poor sanitation, dangerous working practices, and widespread poisons meant that even those in the prime of life were not immune to sudden death. Thankfully, today's scientific advances—and better regulation—have massively improved life expectancy, although some of these dangers still lurk.

1. Flammable Fashion

In the 1850s and '60s, the trend for huge crinoline skirts boomed. These large structured petticoats covered with fabric gave the impression of a voluminous skirt, whereas previously, the look had been achieved by wearing numerous layers of skirts, which was both hot and cumbersome. Crinolines became popular in part because they were light and easy to maneuver.

There was, however, a downside to their design—crinolines, often made of diaphanous materials such as silk and muslin, were highly flammable. Numerous newspapers reported on the scores of women who had the misfortune to get too close to a naked flame. Fanny Longfellow, wife of Henry Wadsworth Longfellow, died in 1861 after her dress went up in flames when a lighted match or small piece of paper fell on her. Longfellow himself attempted to extinguish the flames, but his wife's skirts were so flammable it proved impossible to save her life. Another sad example was Archduchess Mathilde of Austria, who in 1867 is said to have pulled the classic teenage move of hiding a cigarette from her father behind her back and inadvertently set her dress ablaze.

Newspaper reports abounded with editorials on the perils of flouncy fashion, and offered various solutions (sometimes perhaps in jest). The Tablet in 1858 recommended, “We would … suggest that every lady wearing a crinoline, should be accompanied by a footman with a pail of water.” Needless to say, this was not a practical solution, but trends soon moved away from crinolines and the threat of fire lessened.

2. Opium Overdoses

A satirical engraving of an unscrupulous chemist selling a child arsenic and laudanum (tincture of opium)

Quieting fractious babies has always proved a challenge, but in the 19th century a seemingly wonderful solution was offered: opium. Tinctures of opium, such as Godfrey’s Cordial, were widely used as a method to soothe sickly or teething infants. Although it might seem horrifying by modern standards to drug children into listlessness, in the 19th century opium was an extremely popular medicine and, before the days of aspirin, was commonly used as a painkiller and sleeping aid.

Godfrey’s Cordial was especially popular among working-class mothers, who often had to return to work soon after the birth of a child. It was not uncommon to dose babies with Godfrey’s to make sure the child remained in a stupor until the mother returned from work. Unfortunately, accidental overdoses were frequent—in 1854 it was estimated that, in Britain, three-quarters of all deaths attributed to opium were of children under 5 years old. Fortunately, children’s medicines are now tightly regulated.

3. Cholera Contamination

Many of us take it for granted that we can turn on the faucet and drink a glass of clean water. However, in the 19th century, as the populations of Europe and America ballooned and increasing numbers of people moved to cities, the infrastructure struggled to cope. Many slums had open sewers in the streets and an unreliable water supply, and communal wells and water pumps were often contaminated with raw sewage. This meant that water-borne diseases such as cholera and typhoid became rife.

The cholera outbreaks of the 19th century originated in India, but with the growth of global trade networks the disease soon spread around the world. A pandemic ensued around 1832, when cholera reached Britain and America for the first time. Several other pandemics swept the world, killing 23,000 people in Britain in 1854 alone. Physician John Snow mapped the cases of cholera in London's Soho that year and traced the cause to a single water pump located near a cesspool. The pump was removed, and cholera cases dropped dramatically. As scientific understanding of the spread of water-borne diseases improved, public water supplies were cleaned up, and the last documented cholera outbreak in the U.S. was in 1911.

4. Arsenic Poisoning

A jar of poisonous Paris Green
Chris goulet, Wikimedia // CC BY-SA 3.0

Colorful green wallpaper was the height of fashion in the Victorian era, largely spearheaded by pre-Raphaelite artists and designers. The green pigment often used, known as Scheele’s Green, had first been developed in 1775 by German-Swedish chemist Carl Wilhelm Scheele, and the key to its vibrant shade was the use of arsenic. Although arsenic was known to be poisonous if eaten, at the time it was thought to be safe as a color pigment.

In 1862 an investigation was carried out after several children from the same family sickened and died within weeks of each other in Limehouse, London. Dr. Thomas Orton investigated the case and concluded that the children had been poisoned by the arsenic in their bedroom's green wallpaper. Arsenic coloring was also used for dresses, hats, upholstery, and cravats. The poison was sprayed on vegetables as insecticide, and even added to beer. Restrictions on its use in food and drink were only added in 1903. Today, historic houses have had their arsenic wallpaper removed, and arsenic-dyed clothes in museum collections are generally kept safely behind glass.

5. Fatal Factories

By the 19th century, rapid industrialization across Europe and America had led to thousands of factories producing everything from fabric to munitions. Numerous adults—and children—were employed in these factories, providing ample opportunity for death and injury.

The cotton factories of Manchester, England, for example, could kill you in a number of ways. First, the air was thick with cotton fibers, which over time built up in workers’ lungs, causing breathing difficulties and lung disease. Then there were the whirling, grinding machines that might catch your sleeve or hair, dragging you into the loom. Children were employed to clean under the machines and retrieve dropped spindles because their small size allowed them to move about under the moving machines—but a trip or a loss of concentration often proved fatal. The huge number of accidents and deaths in factories eventually led to increased regulation—reducing working hours, restricting child labor, and making the machines themselves safer.

6. Sudden Spontaneous Combustion

Some Victorian scientists believed that alcoholism could cause spontaneous combustion. The idea caught the public imagination, and Charles Dickens used the theory in Bleak House (1853) to explain the death of the drunken rag and bone man Mr. Krook. In Victorian accounts, the victims were typically overweight heavy drinkers whose bodies had seemingly burst into flame, leaving only their legs intact. Needless to say, the threat of spontaneous combustion was soon seized upon by the temperance movement, which used the supposed link to alcoholism to scare people away from the demon drink.

For example, The Anatomy of Drunkenness by Robert Macnish (1834) described the various types of drunk and devoted a whole chapter to the risk of spontaneous combustion. Macnish recounted a number of case studies, including that of Mary Clues—an inveterate drinker who was found almost entirely incinerated except for one leg, while the room around her was more or less undamaged. Despite the widespread discussion of spontaneous combustion in the Victorian era, it's now generally considered highly unlikely if not impossible. Modern forensic science has in part explained the phenomenon through the “wick effect,” wherein a burning body produces melted fat that seeps into the clothes, causing a long, slow, self-contained burn that may look like the result of spontaneous combustion—but almost certainly began with an external ignition source.

7. Pestilent Pox

Smallpox has been around for over 12,000 years. Europeans brought the disease to North and South America during the Age of Exploration, killing up to 90 percent of indigenous populations. Smallpox was still prevalent in the 19th century and killed about 30 percent of its victims. Those who survived were often blinded or badly scarred by the virulent pustules. To give some idea of the scale of fatalities: in just one year, 1871, over 50,000 people died of smallpox in Great Britain and Ireland alone.

In 1796 the English doctor Edward Jenner noticed that milkmaids who had caught cowpox appeared to be immune to smallpox. This observation led Jenner to create the world’s first vaccine. As with many new developments, it took a number of years for vaccination to catch on, but once it did, the incidence of smallpox began to fall. In 1980 the World Health Organization declared smallpox eradicated—the first disease ever to be completely wiped out worldwide—thanks to a sustained program of vaccination.
