Through thousands of years of knowledge and learning, we’ve developed extremely advanced intelligence as a species, especially when compared to other animals. But what made us unique? What evolutionary paths did we take that others did not?
That, of course, is one of the million-dollar questions of early human development. There's no concrete way for us to know for sure (at least until we build time machines), but we can make some educated guesses, and they get pretty weird...
1. It All Came From One Human
In evolution, there are two separate paths that changes can take. One is microevolution: small changes that accumulate over a long time. The other is macroevolution: large, abrupt changes that transform a species entirely.
To date, scientists have multiple theories on how the two interact, but one of the older theories that’s starting to make a comeback is what's known as macromutation, aka the "hopeful monster." Basically, it's a genetic aberration that is so different from its relatives that it's essentially a whole new species. (Think of the mutants in X-Men.)
Colin Blakemore, a neurobiologist at Oxford University, believes this very thing happened to humans. Some ancestor, somewhere (he posits it might even be Mitochondrial Eve) was born with a severe genetic mutation that made him or her way smarter than other early humans. It was a total accident that just happened to be highly beneficial from a survival perspective, and this person (who could presumably still mate with other humans) passed on the mutation to his or her offspring.
2. It’s Because of a DNA Glitch
Scientists going over the results of the Human Genome Project found that humans have something completely unique: duplicated copies of a gene named SRGAP2. Don't worry about the weird name; just know that it's responsible for brain development. No other primate (or any other animal, for that matter) has these duplicates, which could pretty much only have arisen as a "glitch" at some point in human history. Gene duplications happen all the time, but the copies are almost always benign.
As a matter of fact, we have a few benign copies of SRGAP2 ourselves. They’re called SRGAP2B and SRGAP2D, and they’re just some of the random genetic junk that makes up a large portion of our DNA. SRGAP2C, however, is a fully functional (and enhanced) copy of SRGAP2.
That doesn't mean we simply have double the brain development power, though, because SRGAP2C actually supersedes the original gene. When introduced into mice, SRGAP2C shuts down the activity of the original gene and kind of supercharges their brains. If you think of it like computer software, SRGAP2C is brain development version 2.0, and it has to uninstall version 1.0 to work properly.
3. It’s an Accident Caused by Walking Upright
One of the unique things about humans is that our skulls aren't fused when we're born. Baby skulls don't solidify until around the age of two because otherwise it'd be way harder to push them out of the birth canal. No other primate has this, because no other primate is bipedal: their wider birth canals mean a solid skull isn't an issue.
Recently, scientists studying the well-preserved cranium of an Australopithecus child discovered that this genus, among the first of our ancestors to walk bipedally, had larger brains than expected and was also born with the soft skulls we have today. It was originally thought that we didn't develop non-fused skulls until much later in human evolution.
Scientists had always assumed that we developed bipedal locomotion as a result of our intelligence, since it's more efficient. Now it looks like the exact opposite may be true—we became bipedal on our own, which necessitated a reconfiguration of the birth canal, which led to the evolution of soft skulls in babies, and that accidentally led to us growing bigger brains, since the brain could now continue to grow until two years of age.
4. Our Human Ancestors Used a Lot of Drugs
One highly controversial (and definitely strange) theory about early human brains comes from Terence McKenna, an American philosopher, ecologist, and drug advocate. In the early 1990s, McKenna developed a theory popularly referred to as the “Stoned Ape” theory.
According to McKenna, early man, upon leaving the jungle and moving into the grasslands of North Africa, saw mushrooms growing on cow dung (something they hadn't seen in the jungle) and decided to give them a try. He points out that modern apes will frequently eat dung beetles, so it's not completely unheard of for primates to eat things typically found on or around excrement.
McKenna believes that those mushrooms, ancestors of today's “magic” mushrooms, probably increased visual ability at low doses (much like modern mushrooms), making them biologically useful. Further, at moderate doses, those same mushrooms are sexual stimulants, also handy for a burgeoning species. Lastly, large doses would promote conscious thinking and possibly assist brain growth. Thus, it was evolutionarily beneficial for humans to consume these mushrooms.
Don’t get too excited, though. McKenna’s theory has never been taken seriously by scientists or heavily studied, so there’s currently no real evidence to support it.
5. Meat and Fire Made Our Brains Grow
While it's obvious that fire and meat-eating were a large part of everyday life for our ancestors, it appears likely that cooked meat may have also played a huge role in our brain development. Harvard University biological anthropologist Richard Wrangham has developed a theory that he thinks explains exactly how it worked.
Because brains like ours use up as much as 20 percent of our caloric intake, they require high-calorie foods to keep working. Since Twinkies weren't around yet, cooked meat was the next best thing for early man. Cooking meat makes more of its calories available, making it even better than raw meat, which we were probably already eating (judging from our appendixes).
Cooking also makes meat faster to eat and easier to digest. Our primate cousins, meanwhile, spent significantly more time eating fewer calories by consuming fruits and veggies. Those extra calories helped grow our brains.
But even an argument as straightforward as this one is contentious: science has yet to find evidence that humans could control fire during the time period Wrangham's theory requires.
6. Early Humans Were Schizophrenic
Back in the 1970s, psychologist Julian Jaynes was fascinated by the idea of consciousness: how it came to exist, and why human beings seem to have a much more advanced self-awareness than other animals.
The theory he developed in his 1976 book, The Origin of Consciousness in the Breakdown of the Bicameral Mind, was, to put it mildly, controversial. Jaynes's Bicameral Mind Theory (as it came to be known) claimed that ancient humans actually weren't self-aware at all. Instead, man's brain operated sort of like two separate organs. The left brain was responsible for everyday actions, while the right brain supplied memories and problem-solving derived from experience.
The only problem with this system, Jaynes thought, is that unlike in modern humans, there was no direct link between the two hemispheres, and thus no consciousness or reflection was available to our ancestors. Instead, the right half communicated with the left through a now-vestigial portion of the brain's language center, which expressed itself as auditory hallucinations.
Jaynes believed that early humans may have treated these hallucinations as the voices of their ancestors or even the gods. He used two famous ancient books as examples: The Iliad and the Old Testament of the Bible. Both refer frequently to hearing voices (of the Muses and God, respectively), while their follow-ups, The Odyssey (which was probably not actually written by the same person as The Iliad) and the New Testament, reference far fewer instances of this. This led Jaynes to believe that the change in our brains must have occurred very recently in human history, probably a few centuries after we formed complex societies and consciousness became more beneficial.
Jaynes didn't just pull this theory out of thin air, either. His specialty as a psychologist was working with schizophrenic patients, and he based Bicameralism on the way a schizophrenic's mind works. That aforementioned vestigial language center in the brain appears to be fully functional in people with schizophrenia. Most interesting of all, recent advances in neuroimaging seem to support Jaynes's theory.