What's the memory capacity of human brains in bytes?


Some people think that our brain stores information merely by switching neurons or synapses on and off. It's not like that. Our brain is a network, and networks store information in the configurations of the relationships between their nodes.

Although the entropy (and therefore the amount of information stored) of a network can be computed mathematically (see my attempt here: Graph entropy: a definition and its relation to information), it's much more sensible, in my opinion, to derive the amount of information from the amount of energy the brain is able to extract from the environment.
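
To make "entropy of a network" concrete, here is a minimal sketch of one simple formalization: the Shannon entropy of a graph's degree distribution. This particular choice is mine, for illustration only; the definition in the linked post may differ.

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Shannon entropy (in bits) of a graph's degree distribution.

    Count how many nodes have each degree, turn the counts into
    probabilities, and apply H = -sum(p * log2(p)).
    """
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    counts = Counter(degree.values())  # degree value -> number of nodes with it
    n = sum(counts.values())
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# Two networks with the same number of nodes but different wiring:
star = [(0, 1), (0, 2), (0, 3)]          # one hub, three leaves
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]  # every node has degree 2
print(degree_entropy(star))  # ~0.81 bits: two degree classes
print(degree_entropy(ring))  # 0.0 bits: a single degree class
```

The point of the toy example: the star and the ring have the same nodes, but their different configurations carry different amounts of information.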

Information measures, in the end, how much you decrease uncertainty. And that is not easy to define. How much uncertainty does our brain decrease? Let's say enough to exploit systems in an unstable state, where a small input of energy releases a much larger amount of energy, so-called metastable states (e.g. a match, which with just a scratch gives you fire).

Deduction from a physical principle

In physics, information and the use of energy are related to each other: if we know the latter, we know the former. Therefore, if we know how much energy we use during our whole life, we can know how much information we store.
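
In symbols, the usual form of this equivalence (the Szilárd/Landauer bound; the notation here is mine, not the paper's) says that acquiring or erasing one bit costs at least k_B T \ln 2 joules, so the number of bits an energy budget E can pay for at temperature T is:

N_{bits} = \frac{E}{k_B T \ln 2}

where k_B is Boltzmann's constant and T the absolute temperature.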

This principle is used by Leó Szilárd in his "On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings", where he quantifies the equivalence between information and the "ability to extract energy": the conclusion is that one needs about 40 exabytes of information to extract one calorie from a system at room temperature. As we burn about 2,000 kilocalories a day, and live for about 80 years, the amount of energy burnt in a lifetime is:

a life's energy = 2,000 \cdot 365 \cdot 80 kilocalories \approx 60M kilocalories

That's the minimum our body (from proteins, to cells and organs and up) must be able to get from the environment. Since 60M kilocalories = 6 \cdot 10^{10} calories, the body must store at least 40 \cdot 10^{18} bytes/calorie \cdot 6 \cdot 10^{10} calories = 2.4 \cdot 10^{30} bytes, or:

capacity of brain (physics) = 2.4 million yottabytes
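
As a sanity check, here's a short Python sketch that recomputes the bytes-per-calorie figure directly from the k_B T \ln 2 bound and then redoes the lifetime arithmetic above. The physical constants and the choice of room temperature are my assumptions; the 40 exabytes/calorie figure is the one quoted above.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (assumed)
T = 293.0            # room temperature, K (assumed)
J_PER_CAL = 4.184    # one (small) calorie in joules

# Szilard/Landauer bound: one bit "costs" k_B * T * ln 2 joules.
bytes_per_cal = J_PER_CAL / (k_B * T * math.log(2)) / 8
print(f"{bytes_per_cal:.2e} bytes/cal")  # ~1.9e20, within an order of
                                         # magnitude of the 40 EB (4e19) quoted

# The post's lifetime arithmetic, using its 40 exabytes/calorie figure:
lifetime_cal = 2_000 * 365 * 80 * 1_000   # ~6e10 small calories
capacity_bytes = 40e18 * lifetime_cal
print(f"{capacity_bytes:.2e} bytes = "
      f"{capacity_bytes / 1e24 / 1e6:.1f} million yottabytes")
# ~2.3e30 bytes; the post's 60M-kcal rounding gives 2.4e30 (~2.4M yottabytes)
```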

As said, this is the information stored in the whole body, not just the brain. By the same principle, each of the 10^{12} cells of our body actually stores about 1 Gb of information.

Books

Here are some books you can read if you are interested in the subject. Minds Behind the Brain is a history of neuroscience told through the people, from the ancient Greeks to today, who studied the brain. Enjoyable and rich in information.

Information Theory, Evolution, and the Origin of Life, by Hubert P. Yockey, is a must for anyone wanting to apply information theory to biology. I'll just quote a paragraph from the epilogue:

Galileo believed that the language of Nature is inherently mathematical and is essential to describing natural phenomena. Although there are many fields of biology that are essentially descriptive, with the application of information theory, theoretical biology can now take its place with theoretical physics without apology.

Finally, a biography of Claude Shannon, which (shame on me!) I haven't read. Yet. If you do, please leave a review in the comments :)

  1. Neuronal Connections and the Mind: The Connectome
  2. erg/bit: http://link.springer.com/article/10.1007%2FBF02477767#page-1
  3. Energy consumption in C. elegans: http://www.ncbi.nlm.nih.gov/pubmed/15809072
  4. Weight and lifespan in C. elegans: http://bmcecol.biomedcentral.com/articles/10.1186/1472-6785-9-14
  5. Energy consumption in chimps: http://www.ncbi.nlm.nih.gov/books/NBK53561/

