Brain's Storage Capacity May Be 10 Times Greater Than Previously Thought

Scientists Discover Surprising Insight into Information Storage in the Brain

A new study suggests the brain can store up to 10 times more information than previously believed.

As with computers, the brain's memory capacity can be measured in "bits"; in the brain, this measure depends on the connections between brain cells, known as synapses.

Previously, scientists thought the number and size of synapses were limited, suggesting the brain's storage capacity was constrained. However, recent research has revealed the brain may be able to hold 10 times the amount of information we once assumed.

For the new study, scientists developed a more precise method to assess synaptic connections in a portion of a mouse brain. These synapses are fundamental to learning and memory, as brain cells transmit and receive information through these junctions to store and share data.

Understanding the strengthening and weakening of synapses helps scientists more precisely quantify the amount of information these connections can hold. The study's findings could lead to enhancements in learning capabilities, as well as a deeper understanding of aging and of diseases that impair brain connectivity.

Synapses: The Information Exchange Highway of the Brain

The human brain has over 100 trillion synapses connecting its neurons. Neurotransmitters released across these junctions enable communication within the brain.

During learning, synaptic transmission increases, facilitating the storage of new information. Generally, synapses strengthen or weaken in response to the activity levels of the neurons they connect. The more they are used, the stronger they become.
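The use-dependent strengthening described above is often illustrated with a Hebbian-style learning rule. The sketch below is a toy model for intuition only, with made-up learning-rate and decay parameters; it is not the model used in the study.

```python
# Toy Hebbian-style plasticity: a synaptic weight grows when the two
# neurons it connects fire together, and slowly decays otherwise.
# Illustrative sketch only; parameters (lr, decay) are arbitrary.

def update_weight(w: float, pre_active: bool, post_active: bool,
                  lr: float = 0.1, decay: float = 0.01) -> float:
    if pre_active and post_active:
        w += lr            # co-activation strengthens the synapse
    else:
        w -= decay * w     # unused synapses gradually weaken
    return w

w = 0.5
for _ in range(10):        # repeated co-activation strengthens the connection
    w = update_weight(w, True, True)
print(round(w, 2))         # 1.5
```

The key behavior is that repeated use drives the weight up, while inactivity lets it drift back down, mirroring "the more they are used, the stronger they become."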

Synaptic Decay in Aging and Disease

However, as we age or develop neurological conditions such as Alzheimer's, synapses become less active and weaken, leading to declines in cognitive function and the ability to store and retrieve memories.

Scientists can measure the strength of synapses through their physical properties, but this has not been straightforward in the past. The new study has changed that.

Measuring Synaptic Strength and Capacity

The research team harnessed information theory to measure the strength and plasticity of synapses, that is, their ability to transmit and store information. Their approach allowed them to quantify the information transmitted across synapses in bits. One bit can distinguish between 2 possible states, 2 bits can distinguish 4, and in general n bits can distinguish 2^n states.
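The bits-to-states relationship above is simple to verify. A minimal sketch, using the study's own 4.1 to 4.6 bit estimate to show how many distinguishable synaptic strength levels that range implies:

```python
import math

def states_from_bits(bits: float) -> float:
    """Number of distinguishable states encodable by a given number of bits."""
    return 2 ** bits

def bits_from_states(states: int) -> float:
    """Bits of information needed to distinguish a given number of states."""
    return math.log2(states)

print(states_from_bits(1))           # 2 states
print(states_from_bits(2))           # 4 states

# The study's 4.1-4.6 bits per synapse implies roughly 17 to 24
# distinguishable synaptic strength levels.
print(round(states_from_bits(4.1)))  # 17
print(round(states_from_bits(4.6)))  # 24
```

Because the relationship is logarithmic, each additional bit doubles the number of strength levels a synapse can represent.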

The team analyzed pairs of synapses in the hippocampus of mice, a brain region central to learning and memory formation. Their analysis revealed that synapses in the hippocampus can store between 4.1 and 4.6 bits of information.
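Combining this per-synapse figure with the earlier estimate of over 100 trillion synapses in the human brain allows a rough back-of-envelope calculation. The study itself makes no such extrapolation, since it measured only a small region of mouse hippocampus, so the numbers below are purely illustrative:

```python
# Back-of-envelope, illustrative only: the article's two figures
# (~100 trillion synapses, 4.1-4.6 bits each) give a crude estimate
# of total storage capacity. This extrapolation is NOT part of the study.

SYNAPSES = 100e12                # ~100 trillion synapses (human brain)
BITS_LOW, BITS_HIGH = 4.1, 4.6   # bits per synapse (hippocampal estimate)

low_bits = SYNAPSES * BITS_LOW
high_bits = SYNAPSES * BITS_HIGH

# Convert bits to terabytes (1 TB = 8e12 bits) for a familiar scale:
# roughly 51 to 58 TB under these assumptions.
print(f"{low_bits / 8e12:.1f} to {high_bits / 8e12:.1f} TB")
```

Even as a crude estimate, this conveys why a per-synapse capacity of more than 1 bit multiplies into a much larger whole-brain figure than older models implied.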

A previous study had come to a similar conclusion, but at the time, the data was processed using a less precise method. The new study corroborates what many neuroscientists had suspected: that individual synapses are capable of storing more than 1 bit of information.

Limitations and Future Applications

It is important to note that these findings are based on experiments in a small region of a mouse's hippocampus, so their applicability to the entire mouse brain or the human brain is yet to be determined.

In the future, the team's method could be utilized to compare the storage capacities of different brain regions and to assess these capabilities in both healthy and diseased brains. This opens up the possibility of elucidating the information storage capabilities of the brains of various animal species.

Summary

A groundbreaking study has revealed that the brain's information storage capacity may far exceed previous estimates. By developing a more accurate method to measure synaptic connections, scientists have determined that synapses may be able to hold up to 10 times more information than previously believed. This discovery has implications for understanding learning, aging, neurological diseases, and the information processing capabilities of the brain across species.