New technologies are allowing us to explore the brain as never before. We are entering a new era in neuroscience where our knowledge of the brain is beginning to match the urgent need to prevent and treat diseases of the brain.

Memory capacity of the brain is 10 times greater than previously thought

Terry Sejnowski and colleagues published work in eLife in January 2016 offering critical insight into the size of neural connections and putting the memory capacity of the brain far higher than common estimates.

A key part of brain activity happens when branches of neurons, much like electrical wire, interact at certain junctions, known as synapses. Neurotransmitters travel across the synapse to tell the receiving neuron to convey an electrical signal to other neurons. Synapses are still a mystery, though their dysfunction can cause a range of neurological diseases.

While building a 3D reconstruction of rat hippocampus tissue, the Salk team observed that a single axon (output ‘wire’) from one neuron formed two synapses reaching out to a single dendrite (input ‘wire’) of a second neuron, suggesting that the first neuron was sending a duplicate message to the receiving neuron. The team measured the difference in size between these two synapses in the hope of gleaning insight into the range of synaptic sizes, which so far had only been classified as small, medium and large. They were surprised to find that the synapses differed in size by only 8 percent.
Because the memory capacity of neurons is dependent upon synapse size, this 8 percent difference turned out to be a key number the team then plugged into algorithmic models of the brain to measure how much information could potentially be stored in synaptic connections.

It was known before that the range in sizes between the smallest and largest synapses was a factor of 60, and that most synapses were simply classified as small. But armed with the knowledge that synapses of all sizes could vary in increments as small as 8 percent across that 60-fold range, the team determined there could be about 26 categories of synapse sizes, rather than just a few. In computer terms, 26 sizes of synapses correspond to about 4.7 “bits” of information. Previously, it was thought that the brain was capable of just one to two bits for short- and long-term memory storage in the hippocampus.
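The jump from 26 distinguishable synapse sizes to roughly 4.7 bits follows from basic information theory: a unit that can sit in any one of N distinguishable states can encode log2(N) bits. A minimal sketch of that arithmetic (the helper name is illustrative, not from the paper):

```python
import math

def bits_per_synapse(n_sizes: int) -> float:
    """Information capacity, in bits, of a synapse that can occupy
    one of n_sizes distinguishable size states."""
    return math.log2(n_sizes)

# 26 distinguishable sizes, as reported by the Salk team
print(round(bits_per_synapse(26), 1))  # prints 4.7

# The earlier view of only a few size categories gives far less:
print(round(bits_per_synapse(3), 1))   # prints 1.6
```

The same formula explains the earlier one-to-two-bit estimates: two to four distinguishable states yield only 1 to 2 bits per synapse.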

Aside from helping to better understand the brain, the discovery could also aid computer scientists in building ultraprecise yet energy-efficient computers, particularly ones that employ “deep learning” and artificial neural networks, techniques capable of sophisticated learning and analysis.
