4 Foods That Shaped Human History

Which food has done the most to shape human development? Like a lot of big questions, it’s nearly impossible to answer, but it raises a number of interesting questions of its own. How do you define human development? How do you define food?

For our purposes, food is something you eat—as in, no beverages. But even sticking to solids leaves a crowded field of contenders. We batted ideas around and asked the AskFoodHistorians subreddit for their insight. As a result, we’re focusing on just four types of food—tubers, meat, sugar, and cereal grains (especially wheat and barley)—but the choices were made with a healthy dose of humility. You could make a convincing argument for rice or maize, salt or pepper.

No single ingredient could ever tell the entire history of human development, or of food’s role in it, but hopefully each one of our choices tells us something interesting about the way that food and humanity have influenced one another.

1. and 2. Tubers and Meat

Tubers helped shape human history. / Sean Gallup/GettyImages

In June 1822, a man named Alexis St. Martin was accidentally shot in the stomach at Fort Mackinac, Michigan. When a surgeon named William Beaumont arrived on the scene, the situation was dire. As Beaumont described it, “A large portion of the side was blown off … [there was a] perforation made directly into the cavity of the stomach, through which food was escaping.” 

Beaumont eventually nursed his patient back to health, so successfully that St. Martin was one day able to paddle his family in a canoe from what is now Wisconsin to Montreal.

Though St. Martin was able to live a relatively unencumbered life after the accident, the wound never completely closed up. For the rest of his life, as Richard Wrangham describes in his captivating book, Catching Fire, “the inner workings of his stomach were visible from the outside.”

Other than morbid curiosity, why should we care about a guy whose guts were publicly available information? In addition to being an effective surgeon, Beaumont knew an opportunity when it arose. In St. Martin, he had an almost-literal window into the workings of the human body. The information he gleaned from observing his unique subject gave us insights into the digestive system that might otherwise have been impossible to obtain.

Alexis St. Martin. / Wikimedia Commons // Public Domain

Beaumont studied his hole-y subject on and off for years, often introducing different foodstuffs tied to a string directly into St. Martin’s fistula. He’d then pull the strings back out to note, among other things, the time it took to digest various items. He drew a couple conclusions that would prove illuminating even decades later. For one, tender food with greater surface area—what he called “minuteness of division”—was digested faster. And cooked food, including potatoes and meat, was processed dramatically faster than raw food. 

Almost two centuries later, Professor Richard Wrangham came to St. Martin’s story from the perspective of a biological anthropologist—before he became fascinated with cooking, much of his work focused on the differences between human beings and other primates, especially chimpanzees. Wrangham uses Beaumont’s takeaways as one piece of what he would eventually call the “cooking hypothesis,” a fairly persuasive argument that cooking is perhaps the defining difference between us and our Homo habilis ancestors. 

“The cooking hypothesis says an ape became human because it learned to cook. Cooking transformed us—our ancestors, I should say—partly because it gave a lot of energy, and that energy was available for new activities, like traveling farther, like having babies faster, like having a better immune system that gave us better defense against diseases,” Wrangham says. “It also made our food softer, which meant that the mouth could be smaller, the teeth could be smaller, the gut could be smaller, because the food was also more digestible. And at the same time, by the way, cooking meant that fire was being used to heat the food, and the acquisition of the control of fire meant that our ancestors could for the first time safely sleep on the ground defended by the fire, so they no longer had to be adapted to climbing in trees. And that meant that they could really fully adapt for the first time to walking and running on the ground.”

In a 1999 piece published in Current Anthropology, Wrangham and his colleagues began to make this argument with a particular focus on tubers, rhizomes, and corms—think potatoes, cassava, taro root, and yams—which they call “underground storage organs.”

They were basically arguing that everyone had placed too much emphasis on meat-eating to explain the physiological and social changes that happened around 1.8 million years ago and led to the emergence of Homo erectus.

Wrangham’s argument is really about focusing on a technology (fire) over an ingredient (meat). Fire isn’t a food, but tubers are basically a way to discuss cooking-with-fire separately from meat-eating. And it’s not like the researchers chose “underground storage organs” arbitrarily. For one, by the researchers’ estimation, tubers would have been a much more plentiful food source than meat—one that was unavailable to many other animals, since they likely would have required digging sticks to access. They also pointed toward evidence that ancestors a million years before the emergence of Homo erectus may have consumed meat without the corresponding physiological changes. Finally, they analyzed the hypothetical impact of increased meat consumption versus cooking underground storage organs and determined that cooking would provide a greater potential increase in daily energy intake, even over a dramatic rise in meat-eating.

The piece also suggested that cooking tubers could help explain the emergence of male-female bonding. Cooking changes the site of consumption. Rather than foraging and eating what you find in the place you find it, cooking involves bringing food back to a central location. It trades the security of eating your food immediately for the extra efficiency of letting the fire do some of the digestion for you. It’s a system fraught with risk for the one doing the cooking. Cooked food, the researchers argue, would be a more tempting target for would-be thieves, given its increased tenderness and how easy it would be to find. They outline a hypothetical path to male-female bonding that centers on females protecting their tuber-centric resources by forming alliances with males.

This also, in their understanding, helps explain the persistent division of labor found between the sexes. The most traditional division—men hunt, women gather—is ultimately unworkable for Wrangham with a raw food diet. He imagines an unsuccessful hunter returning empty-handed at nightfall. Even if his female partner had gathered tubers for him to eat, he would have an entire night of chewing ahead of him. Amongst apes—and by Wrangham’s extension, amongst our pre-cooking apelike ancestors—a large part of the day is spent chewing. (His interest in methods to ease this burden led Wrangham to conduct “an informal experiment in which friends and I chewed raw goat meat” with avocado leaves to help accelerate the breakdown of the raw meat.) Wrangham sees the use of fire, and the reduction in chewing time it entails, as a way to free up time in the day (and, indeed, time at night, now illuminated by those fires). Hunting could then grow from a sporadic activity brought about by opportunity, as it generally functions amongst modern primates like chimpanzees, into a consistent enterprise diversifying the hominid larder.

Reading Wrangham’s counter-narrative is fascinating, but it received far from universal acceptance upon its publication. In a devastating academic putdown in the comments of that 1999 article, Professor C. Loring Brace thanks the study’s authors for inviting his comments despite the fact that “they were fully aware of the fact that I look upon their gambit as belonging more to the realm of anthropological folklore than to that of science” [PDF]. He goes on to damn them with the faint praise that their hypothesis “may not be science, but ... has the makings of an absolutely charming story.”  

Critically, Brace argues, the researchers ignored evidence for the use of certain tools that the hominids they’re discussing did, in fact, have access to. Those tools would provide an alternate means of externalizing digestion. Basically, instead of requiring fire to help digest meat, you could make a proto-steak-tartare by cutting meat off a scavenged aurochs and bashing it with a club. Evolutionary biologists Katherine D. Zink and Daniel E. Lieberman published a study in 2016 looking into exactly this kind of mechanical processing, by having people eat, among other things, more raw goat.

They concluded that cooking wasn’t necessary to bring about the physiological changes that Wrangham was trying to account for, and that “meat eating was largely dependent on mechanical processing made possible by the invention of slicing technology.” So in their understanding, cooking might still have a place in this pivotal moment of human development, but our definition of cooking might have to expand to encompass preparation methods outside of applying heat. 

There’s a big reason this explanation is appealing. The cooking hypothesis requires a degree of fire control by hominids dating back 1.8 million years. At the time of that 1999 piece, the most compelling evidence for the human control of fire dated back only about 250,000 years. Intriguingly, the intervening years have seen new archaeological discoveries that push evidence for the controlled use of fire much further back, with compelling finds dating to around a million years ago. Unfortunately for Wrangham, that still leaves about 800,000 years of supposed fire use unaccounted for in the archaeological record. Even if we accept earlier estimates going back as far as 1.6 million years, there’s still a gap in the record.

Wrangham suggests that the absence of evidence isn’t necessarily evidence of absence. And while he’s logically correct, fire does have a habit of leaving visual traces of its existence—burnt ground, rings of stones, that kind of thing. As Anna K. Behrensmeyer, paleoecologist at the Smithsonian National Museum of Natural History, said back in 1999, “I think there would be evidence if it were [behind] as important an evolutionary leap as [Wrangham’s team] suggests.”

That doesn’t mean we have to throw out Wrangham’s insights, or his fascinating book. But it does mean that we might be better off thinking about a range of technological advances, from the development of tools to better hunting practices to the use of fire. That makes identifying a single food’s impact a bit tricky. Tubers and other underground storage organs were probably important, perhaps critically so, but they may not deserve any type of singularly exalted status in our discussion of cooking. 

Meat is obviously incredibly important to our development as a species—along with cooking, broadly defined, we might very well say it’s what made us human. But on a qualitative level, the life of a hominid from 1.8 million years ago seems to share more with the life of its apelike predecessors than it does with ours. Maybe we need to come closer to the present to identify a food that shaped life as we know it today.

3. Sugar

Sugar’s impact on history was more horrific. / Al Barry/GettyImages

When considering which food has most shaped human development, one of the first places your mind probably goes is the spice trade. The search for spices like pepper (which is a dried berry) has undoubtedly had a huge impact on human development and exploration. It’s arguably responsible for permanently connecting Europe to its neighbors in the East and West. To give just a small taste of the ways the spice trade shaped the world: The Dutch famously traded Manhattan to the British for an Indonesian island with some nutmeg trees on it. That’s an oversimplification, but it speaks to the enormous value placed on spices at the time—though it’s hard to say that any one spice jumps out far above the others in importance.

Salt could also make a super-convincing case for being one of the most important foods ever, but its impact on the world is so far-ranging and constant throughout history that it’s a bit hard to tell a single story of its role in human development. We can’t live without it, but is there one pivotal change it’s responsible for? It’s hard to say.

Like salt, sugar is both an ingredient we can buy in the store and a naturally occurring chemical compound. If we want to be really annoying, we could crown sugar our most important food champion hands-down simply for the vital role photosynthesis plays in the food chain. Super-simplified sixth grade science version: Plants take in sunlight, create sugar, provide oxygen. Animals eat plants, life thrives, everybody’s happy. As biochemist Albert Szent-Györgyi described it, “What drives life is thus a little electric current, kept up by the sunshine.” But that’s not why sugar makes the list.

Sidney Mintz became the “father of food anthropology” largely through his seminal work on sugar, Sweetness and Power. As the title suggests, Mintz looks at sugar not through a primarily culinary lens but, in his words, as “an old commodity, basic to the emergence of a global market.” A major part of that global market was the triangle trade connecting Western Europe, Africa, and the so-called “New World” of the Americas. Sugar was an incredibly valuable resource for centuries, and the riches it produced are inextricable from the labor of enslaved Africans on sugar plantations.

Though evidence of sugarcane domestication dates back as far as 10,000 years ago in New Guinea, it didn’t make its way to Europe in a big way until around the time of the Crusades, when Christian soldiers returned home with “sweet salt.” By then, large swaths of Asia and the Middle East had mastered the art of growing and refining sugar, using it to create desserts and medicinal concoctions, and even sugar sculptures that functioned as saccharine status symbols.

“The true age of sugar” began when it was introduced to the New World, according to Marc Aronson and Marina Budhos’s book, Sugar Changed the World. And while sugar was certainly serving an important cultural role in Asia before that, it’s hard to deny that its role changed immeasurably when it arrived in the “New World” via Christopher Columbus. In fact, according to historian Jack A. Goldstone, “The first documented revolt of African slaves in the Americas broke out around Christmas 1521 in southeastern Hispaniola on a sugar estate owned by the eldest son of Christopher Columbus.”

Growing and harvesting sugar is a labor-intensive process, and the rise of sugar plantations caused an intensification of the African slave trade. Millions of human beings were enslaved and brought to the New World to work on sugar plantations—some 5 million to the Caribbean alone. By the estimate of a white Barbados planter named Edward Littleton, someone forced to work on the sugar plantations had a lifespan of somewhere between 10 and 17 years on average.

The impact on Indigenous populations was also devastating. Though some Indigenous people initially were pressed into labor on the sugar plantations, “In the Caribbean the [Indigenous] population became virtually extinct within a generation” of European contact, according to Professor Linda A. Newson [PDF], through a combination of brutal treatment and the introduction of Old World diseases.

Loading cane, sugar plantation, Louisiana, USA. / Print Collector/GettyImages

According to a piece by Khalil Gibran Muhammad in The New York Times Magazine, in the United States, Louisiana’s sugar industry rose in step with its reliance on the labor of enslaved people, making the state the country’s second-richest in per capita wealth. There, the inhumane working conditions led to a pattern of “deaths exceeding births,” historian Michael Tadman found. Even after slavery was abolished, “plantation labor overshadowed Black people’s lives in the sugar region until well into the 20th century,” in the words of author and Stonehill College professor John C. Rodrigue. 

The interplay of sugar, wealth, and power wasn’t limited to the Caribbean or American South. Many historians argue that the United States’ annexation of Hawaii was tied closely to sugar production and the cheap labor it relied on. The McKinley Tariff, passed in 1890, made Hawaiian sugar uncompetitive in the American market. Whether the white Hawaiian planters’ motivation was purely monetary or was tied into a fear of losing their dominance over the poorly paid and numerous Asian laborers working in Hawaii, by 1893, Liliʻuokalani, the queen of Hawaii, had been deposed, and by 1898, Hawaii was annexed by the United States.

We continue to feel the impact of sugar in our world today, whether we look to the inequality faced by the descendants of enslaved people or to the deleterious effects of obesity and other health problems associated with excessive sugar consumption (health problems, it’s worth pointing out, that disproportionately affect Black Americans).

It’s clear that sugar changed the modern world—perhaps irrevocably so. But that leaves a question unanswered: How did the modern world arise? 

4. Cereal Grains

Cereal grains like wheat undeniably changed human history. / HAUKE-CHRISTIAN DITTRICH/GettyImages

In his book Against the Grain, James C. Scott lays out the generally accepted story of civilization: sedentism, or the practice of living in one place for a long time, arose from the cultivation of cereal grains like wheat and barley—especially the need to irrigate arid climates, which takes lots of time and labor. Scott then spends a couple hundred pages dunking on most of the assumptions underpinning that story, and questions whether it’s at all appropriate to view the “civilizing process” as one of generally uninterrupted progress. 

Archaeological evidence actually indicates that sedentism predates the field cultivation of domesticated grains by several millennia. Early settlers probably cultivated wild grains for thousands of years as part of a diverse food production strategy, but the shift to a near monoculture of deliberately domesticated cereal grains seems to have arisen much later. (Cereal grains, by the way, are just the edible seeds of grasses. They account for about 50 percent of worldwide caloric consumption today, for both human beings and livestock.)

Research from landscape archaeologists like Jennifer Pournelle suggests that the “arid” Southern Mesopotamia was, in fact, vastly different during the first instances of sedentism in the region. Rather than a dry area surrounded by rivers and in need of labor-intensive irrigation, it was, thanks to higher sea levels at the time, “a forager’s wetland paradise,” in Scott’s understanding, full of diverse food sources. He notes that other early settlements, from coastal China to Teotihuacan near present-day Mexico City, also benefited from natural wetland abundance.

In those regions, incidentally, the dominant cereal grains would have been rice and maize, respectively. Today, more people rely on rice for sustenance than any other grain, and corn is the most produced grain worldwide by tonnage. So while we’re going to follow Scott’s book and focus on wheat and barley for the insights they can give us about the emergence of the first known states in Mesopotamia, it’s worth remembering that the story has near analogues throughout the world with different cereal grains. You can practically add a parenthetical “and rice and corn” to almost everything here about wheat.

Scott’s central premise is that grain domestication didn’t lead to sedentism so much as it led to statehood, which he defines along a “stateness” continuum consisting of things like city walls, social hierarchy, soldiers, and—critically—taxation. Grains like wheat are uniquely suited to taxation, in Scott’s telling. They are “visible, divisible, assessable, storable, transportable, and ‘rationable.’” Tubers can provide comparable calories, but they can be safely hidden underground for years from the tax-man’s prying eyes. Lentils were grown widely, but don’t have a determinate harvest—because they can be picked at various times, they’re worse candidates for taxation.

Against the Grain draws out some fascinating connections between grains, taxation, and statehood. The earliest writings from Mesopotamia, for example, were almost single-mindedly concerned with state administration, especially the rationing and taxation of barley. More than 500 years separate this type of administrative writing from literary or religious writing in the archaeological record, suggesting the critical role state-based accounting played in the emergence of written language.

The other most frequent topics from early Mesopotamian tablets pertain to population. Here, too, Scott shows how grain might have influenced societal priorities. While acknowledging that slavery and war predate the early grain-based states, Scott sees highly organized farming and the rise of states as an incentive for both. In a society based around agriculture, an increase in population can provide a more or less direct increase in food production. Early legal codes are filled with injunctions discouraging and punishing people fleeing the state, and warfare of the time seems to have been aimed less at conquering territory than at increasing the population available to produce for the state.

Scott sees city walls as serving a twofold purpose in a society based around agriculture. On the one hand, large quantities of stored grain would need to be protected from the so-called “barbarians” outside the city walls. Just as importantly, though, the walls kept the productive laborers of a city in. When Scott uses the term barbarian, he does so with tongue in cheek, aware that hunter-gatherer societies generally existed alongside early agricultural states, interacting and trading with them and enjoying a quality of life that was not necessarily any worse, in his estimation. Scott views the frequent collapse of early states not necessarily as a tragedy, but at times perhaps even as an emancipation from the control of elite rulers within the state.

Beyond the hours of labor required to maintain a state store of grain and the drawbacks of being forced to give up a portion of your wheat, these early states likely contributed to the spread of so-called “crowding diseases” like cholera, smallpox, and measles. Scott points to a confluence of factors arising from domesticated grain cultivation that would carry greater risk of disease: increased population density (and the greater concentration of feces that it entails); an increase of potentially disease-carrying domesticated animals; and a relatively monocultural diet, whose effects we can see in comparing early farmers’ skeletal remains with their hunter-gatherer contemporaries (according to one study, for example, adult height decreased during the transition from hunter-gatherer to agriculture). 

Given all these drawbacks, why would people allow themselves to be “civilized” in the first place? Scott points to climate change occurring around 3500 to 2500 BCE as one explanation. As the Mesopotamian region dried up, relatively low-labor “flood retreat” agriculture, making use of annual river flooding, was no longer a viable method of farming. There were fewer animals to hunt, and fewer crops to forage. Water now had to be carried or transported through dug canals, incentivizing people to live closer to the source, and by extension to one another. As people relied increasingly on grain and the security it could afford, it became a nearly self-fulfilling prophecy for the state to tax their production, use the proceeds to help develop new methods to increase productivity, and continually grow until its eventual collapse due to disease, drought, warfare, or natural disaster.

Scott doesn’t go as far as an author like Jared Diamond, who called the move away from nomadism “the worst mistake in human history,” but his critics have accused him and his state-skeptical colleagues of romanticizing the life of the hunter-gatherer. While modern hunter-gatherer societies can be seen, in some ways, to evince more egalitarian principles than their state-bound counterparts, there is also compelling evidence of high homicide rates and infant mortality amongst these populations. And while some interesting research has indicated that hunter-gatherers didn’t suffer from famine as frequently as previously thought, it’s hard to be certain how food-secure hunter-gatherers were at the time of early statehood. 

For our purposes, though, a moral reading of the move to agriculture is beside the point. Whether you view it as an unmitigated boon to humanity or, in the words of historian Yuval Noah Harari, “history’s biggest fraud,” agriculture may not have caused sedentism, but grain farming, and the taxation that was born from it, certainly accelerated it immeasurably.

This story was adapted from an episode of Food History on YouTube.