4 Diseases Caused by a Lack of Essential Vitamins and Minerals
Companies pushing products with added vitamins and minerals can fool people into thinking that they’re eating a “healthy” food when they’re not—but it’s not like those vitamins and minerals are there for no reason. For much of human history, diseases of nutrient deficiency were the norm, and in some parts of the world, they still persist. Even into the 20th century, conditions caused by a lack of certain vitamins or minerals were endemic to North America and Europe. Artificially added nutrients may not make a food “healthy,” but they do stave off several debilitating, and sometimes fatal, diseases of malnutrition. Here are a few of those maladies.
The disease of pirates: the grey-death. Scurvy is caused by a lack of vitamin C, whose chemical name, ascorbic acid, is derived from the Latin term for scurvy, scorbutus. Although the disease had been known since ancient times (Hippocrates described it around 400 BCE), it was not a great scourge to largely land-bound populations. Even though its causes were unknown, many cultures realized that eating certain herbs could reverse the symptoms, and as long as there was access to fresh food, the disease was generally kept under control.
Scurvy didn’t become a significant problem until the Age of Discovery (beginning in the 15th century), when people at sea were not able to access that much-needed fresh food for months at a time. Preserved meats and carbohydrates contained no vitamin C, and unlike most animals, the human body is not able to create vitamin C on its own.
The early symptoms of scurvy include spongy gums, pain in the joints, and blood spots appearing under the skin. As the disease progressed, the teeth would become loose and extreme halitosis (bad breath) would develop; the afflicted would become too weak to walk or work, be in too much pain to eat, and would die "mid-sentence," often from a burst blood vessel. Many of the early explorers lost great numbers of men to scurvy: Vasco da Gama lost 116 of his 170 men in 1499, and in 1520, Magellan lost 208 out of 230. A few of those deaths were attributable to other causes, but the vast majority were due to scurvy.
Despite not being able to pinpoint the exact cause of scurvy, in the 18th century, naval physician James Lind was able to prove, in what's considered to be one of the first controlled clinical experiments, that scurvy could be prevented (and cured) by incorporating citrus fruits such as lemons and oranges into the diet of sailors. Although his findings weren't widely accepted at first, the British Navy eventually began issuing standard rations of lemon juice, and later, lime juice, to their sailors—which gave rise to the term "limey" in reference to the British.
These days, scurvy is an extremely rare condition, almost exclusively caused by someone eating a completely unvaried diet. In most cases, high levels of oral supplementation of vitamin C are enough to reverse the condition in a matter of weeks, and death by scurvy is almost unheard of.
Rickets is brought on by a lack of vitamin D, which leaves the body unable to absorb or deposit calcium. Less commonly, it can also be caused by a lack of calcium or phosphorus, but vitamin D deficiency is by far the most common cause. Unlike vitamin C, vitamin D can be produced by the human body, but only if the body has the metabolic precursors available to it.
When the skin is exposed to ultraviolet light (such as from the sun), cholesterol in the skin reacts to form cholecalciferol, which is then processed in the liver and kidneys to create the active form of vitamin D. Even with a nominally healthy diet, without enough sun exposure the body can't produce cholecalciferol on its own. Vitamin D deficiency is actually re-emerging as a health concern among some increasingly indoor groups of people, and is one of the few hypovitaminosis (vitamin-deficiency) conditions not considered a "disease of the past." Luckily, once the deficiency is recognized, cholecalciferol can be taken directly as a vitamin supplement or acquired by eating organ meats and oils, such as cod liver oil, allowing the body to resume producing active vitamin D.
Rickets is a condition of children, as the deficiency's most severe effects are on developing bones; in adults, the same vitamin deficiency can cause "bone-softening," or osteomalacia. In adults, however, the condition both takes significantly longer to develop and tends to produce tip-off signs that something is wrong, such as extreme pain in the bones and unexplained muscle weakness, before bone warping sets in. In children, especially those who don't or can't receive regular check-ups, the deformity and debilitation caused by the deficiency are often noticed only after significant damage has been done to their developing skeletons.
The most telling symptoms of rickets appear at the epiphyses (growth plates) of the bones: the body is unable to lengthen bones by depositing calcium, and the bones end up flaring outward in a "cupping" appearance. This leads to costochondral swelling, known as the "rachitic rosary," along the child's ribcage, as well as widened wrists and "thick" joints. Before the widened wrists or rachitic rosary appear, softening of the skull bones can lead to caput quadratum, a square-headed appearance that is often the first sign of skeletal growth problems. If left untreated, rickets can also cause an extremely curved back, stunted growth, and frequent fractures, all of which can lead to permanent and debilitating deformity.
Beriberi is largely confined to Asia, especially countries where boiled rice is a staple. The Sinhalese term "beri-beri" means "I cannot, I cannot," a reference to the inability, in the disease's end stage, to perform even the simplest of tasks once the polyneuritis (nerve inflammation) caused by a deficiency of vitamin B1 (thiamine) has permanently damaged the nerves.
Although beriberi had been known in rice-eating countries for several centuries, its prevalence boomed with the introduction of steam-driven rice-polishing mills from Europe. The superior taste of the milled white rice led many locals to abandon the local (unpolished) brown rice, and in doing so, to abandon their primary source of thiamine. From the 1860s to the turn of the 20th century, people whose plant consumption was limited to polished white rice would often come down with weakness, pain, weight loss, difficulty walking, and emotional disturbances, and beriberi became one of the leading causes of mortality in the region.
In the 1880s, a doctor named Christiaan Eijkman began researching the causes of this epidemic at a laboratory in Batavia, in the Dutch East Indies (now Jakarta, Indonesia), and initially believed that the condition was caused by a bacterial infection. However, after years of study, he came to the conclusion that "white rice is poisonous." He discovered this by feeding one group of chickens solely white rice and another group unpolished brown rice: the chickens that ate the white rice came down with beriberi-like symptoms, while the others stayed healthy. Eijkman also discovered that when the chickens fed white rice were subsequently fed brown rice, they recovered from their illness. Later dietary testing on prisoners confirmed his results. Even though he didn't know the underlying cause of the condition, Eijkman proved that white rice was the culprit, and he shared the 1929 Nobel Prize in Medicine for his discovery.
Beriberi is occasionally seen in the modern world, but its primary cause today is chronic alcoholism: the poor diets of some chronic alcoholics, combined with decreased absorption of what thiamine is consumed, lead to symptoms that are sometimes left undiagnosed until it's too late. Recently, beriberi also appeared in Haitian prisons after the prison system began buying imported polished rice from the United States and stopped feeding inmates the local brown rice.
What causes blistering of the skin in the sun, pale skin, a craving for raw meat, blood dripping from the mouth, aggression, and insanity? If you answered “vampirism,” you’re close—the myth of the vampire may have its roots in the condition known as “pellagra.”
Pellagra is caused by a lack of vitamin B3 (niacin). First identified and commonly diagnosed in the Spanish region of Asturias, it was originally called "Asturian leprosy." However, the condition was seen throughout Europe, the Middle East, and North Africa, wherever a large percentage of food energy was derived from corn and fresh meat was not available. The area of highest prevalence was Northern Italy, where Francesco Frapoli of Milan called it "pelle agra," meaning "sour skin."
It was initially believed that either the corn itself, or some insect associated with corn, was causing pellagra. This belief was reinforced when much of France eliminated corn as a food staple and virtually eradicated the condition. Between the time corn was introduced to Europe (the early 16th century) and the late 19th century, pellagra was found almost everywhere that poor people subsisted on cornmeal and little else.
Around the turn of the 20th century, people began to notice that despite subsisting on just as much corn as poor Europeans, poor Mesoamerican natives didn't come down with the condition. It was eventually discovered that this was because the traditional processing of corn in the Americas involved "nixtamalization," in which the kernels were soaked in limewater before being hulled. The alkali solution freed up the niacin present in the grain, which was otherwise nutritionally inaccessible.
Despite the extensive work of Dr. Joseph Goldberger in the 1910s and 1920s, which proved that pellagra wasn't caused by a germ but by a dietary deficiency, the condition continued to occur in epidemic proportions in the rural Southern US until the 1940s.
Today, pellagra is most common in the poorest regions of the world, especially places that rely upon food aid programs. Some countries still ship unfortified cornmeal rather than corn masa (nixtamalized corn) or fortified cornmeal to developing countries or to their own impoverished populations. China, parts of Africa, Indonesia, and North Korea all have endemic pellagra among their lowest classes.
The discovery of important vitamins and how to produce them has been so significant to human health that many of those who were integral to the discoveries have been awarded the Nobel Prize in Medicine; more than 10 Nobel Prizes have been divided among almost 20 eminent scientists for the discovery or isolation of vitamins A, B1, B12, C, D, E, and K. Over the second half of the 20th century, after the beginning of widespread supplementation of everyday food items, the incidence of the conditions covered here dropped dramatically across much of the world.
Of course, the minerals essential to the human body play similarly important roles in maintaining health. However, humans have not historically had a widespread problem acquiring these nutrients, as most plants absorb many minerals from the soil. With the increased processing of our food throughout the 20th century, however, some of these minerals have been lost and have had to be re-added to the average Western diet through supplementation. In the rest of the world, displacement due to war, along with unfortified food from aid programs, has left survivors with enough calories but not enough nutrients. Supplementation of assistance food and local fortification of salt and flour are beginning to give displaced people (especially displaced children) a new chance at life without these and other nutritional diseases.
In the developed world, you won’t be the healthiest bloke on the block if you eat nothing but breakfast cereal and cartons of juice—but the food industry has ensured that you at least won’t die of malnutrition. Even people with healthy diets benefit from the supplementation of vitamins and minerals in common foodstuffs, and adding the nutrients costs next to nothing. Doctors and nutritionists still agree that the healthiest way to acquire your necessary vitamins and minerals is by eating a balanced diet and spending time outdoors each day, but in the course of modern life, that’s not always possible, and if people are going to eat poorly either way, we may as well keep them from dropping dead of scurvy!