Your Brain’s Memory Capacity May Be as Big as the World Wide Web

In an attempt to understand and measure the brain’s synapses, whose shape and size have remained mysterious to scientists, researchers at the University of Texas at Austin and the Salk Institute worked together to determine that the brain’s memory capacity is much larger than previously understood. The results, published in the journal eLife, estimate that an individual human brain may store as much as a petabyte of information—perhaps 10 times more than previously estimated, and about the equivalent of the World Wide Web.

The study was the first attempt “to reconstruct in three dimensions every single synapse and associated structure in a brain region,” to try to understand “basic synaptic structure and local connectivity among neurons,” Kristen Harris, co-senior author of the study and professor of neuroscience at UT Austin, tells mental_floss.

Synapses communicate signals between neurons. They're formed when the cable-like axon of one neuron connects with a "spine" on a dendrite of another; a dendrite is a branch-like structure extending from the neural cell body. To better understand the way synaptic storage is measured, consider that a computer’s memory is measured in bits, each of which can have a value of 0 or 1. "In the brain, information is stored in the form of synaptic strength, a measure of how strongly activity in one neuron influences another neuron to which it is connected,” write the authors. “The number of different strengths can be measured in bits. The total storage capacity of the brain therefore depends on both the number of synapses and the number of distinguishable synaptic strengths."
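The authors' framing is really just counting. As a purely illustrative sketch (none of this is code from the study, and the synapse and strength counts below are made-up placeholders), the snippet converts a number of distinguishable synaptic strengths into bits with a base-2 logarithm and multiplies by the number of synapses, which is all the quoted capacity argument amounts to.

```python
import math

def bits_per_synapse(distinguishable_strengths: int) -> float:
    """Bits of information one synapse holds if it can take one of N distinguishable strengths."""
    return math.log2(distinguishable_strengths)

def total_capacity_bits(num_synapses: float, distinguishable_strengths: int) -> float:
    """Total storage if every synapse independently encodes one of N strengths."""
    return num_synapses * bits_per_synapse(distinguishable_strengths)

# A synapse with only two distinguishable strengths stores 1 bit, like a 0-or-1 computer bit.
print(bits_per_synapse(2))            # 1.0
# Hypothetical numbers for illustration: a trillion synapses, four strengths apiece.
print(total_capacity_bits(1e12, 4))   # 2e12 bits, i.e. 2 terabits
```

Treating every synapse as an independent storage slot is the simplification behind the quote's "number of synapses times number of distinguishable strengths" logic; the study's actual estimate rests on the measured synapse sizes described next.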

Researchers were able to see these synapses by analyzing thin slices of tissue from the hippocampus—the brain region connected to learning and memory—from three male adult rats using electron microscopy. Then, over several years, they used computer software to reconstruct in 3D every “structural process” and roughly 500 synapses found in a tiny section of brain tissue the size of a single red blood cell.

They identified places where two neurons were connected to each other through two synapses, called "axon-coupled pairs," which allowed them to compare the sizes of synapses carrying the same signals and so gauge how finely synaptic strength is graded. Sorting the synapses by size, the researchers found 26 distinct size categories, or "bins," which means each synapse can store about 4.7 bits of information (log2 of 26 is roughly 4.7).

Not only is the diversity of synapses observed in such a small brain region surprising, but the storage capacity of each is “markedly higher than previous suggestions,” write the authors. Prior to this, researchers believed an individual synapse was capable of storing only 1 to 2 bits of information. This suggests we may have underestimated the memory capacity of the brain, which has trillions of synapses, "by an order of magnitude."

According to co-senior author Terry Sejnowski, in whose Salk Institute lab the study was conducted, "Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web."
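As a hedged back-of-envelope check on that ballpark (not the paper's own calculation), one can ask how many synapses storing 4.7 bits apiece it would take to fill a petabyte.

```python
BITS_PER_BYTE = 8
PETABYTE_IN_BYTES = 1e15      # decimal petabyte
BITS_PER_SYNAPSE = 4.7        # per-synapse figure reported in the study

synapses_for_one_petabyte = PETABYTE_IN_BYTES * BITS_PER_BYTE / BITS_PER_SYNAPSE
print(f"{synapses_for_one_petabyte:.1e}")   # ~1.7e+15 synapses
```

That works out to roughly 1.7 quadrillion synapses, which is of the same order as common estimates of the brain's total synapse count (the article itself says only "trillions"), and it helps explain why the comparison is framed as a ballpark rather than an exact figure.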

The work gives scientists who study memory and learning a deeper understanding of the brain’s storage capacity, as well as a new dataset to work with. “This is just the beginning—a tiny chink in the mysterious armor of the structure and function of synapses in the brain,” Harris says.