## What's the memory capacity of human brains in bytes?

Some people think that our brain stores information merely by switching on and off neurons or synapses. It's not like that. Our brain is a network, and networks store information through different configurations of the relationships between nodes. Although the entropy (and therefore the amount of information stored) of a…

## Kullback-Leibler Divergence and Cross-entropy Loss

Science is all about data and theories explaining those data. The theory behind coin tossing says that heads and tails are equally likely, and that in N tosses we expect exactly k heads with probability $$\binom{N}{k}\left(\frac{1}{2}\right)^{N}$$. In Bayesian statistics, probability has a nice meaning –it means how strongly…
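To make the binomial formula concrete, here's a minimal sketch for a fair coin, using the standard library's `math.comb`:

```python
from math import comb

def binomial_pmf(N, k, p=0.5):
    """Probability of exactly k successes in N independent trials."""
    return comb(N, k) * p**k * (1 - p)**(N - k)

# A fair coin tossed 10 times: 5 heads is the single most likely outcome,
# yet it happens only about a quarter of the time.
print(binomial_pmf(10, 5))  # 252/1024 ≈ 0.2461
```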

## Cyberchondria: Why Bayes is a must when looking for web-based diagnoses

The NY Times (and a few others) wrote a story entitled "Microsoft Finds Cancer Clues in Search Queries": Microsoft scientists have demonstrated that by analyzing large samples of search engine queries they may in some cases be able to identify internet users who are suffering from pancreatic cancer, even before they have received…
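The base-rate issue at the heart of this is easy to see with Bayes' theorem. A sketch with purely illustrative numbers –the prior, sensitivity, and false-positive rate below are my assumptions, not figures from the Microsoft study:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' theorem: P(disease | positive signal)."""
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# Pancreatic cancer is rare (assumed prior: 1 in 10,000). Even a search-query
# signal that catches 90% of cases and fires on only 1% of healthy users
# leaves the posterior probability under 1%.
p = posterior(prior=0.0001, sensitivity=0.9, false_positive_rate=0.01)
print(f"{p:.4f}")
```

The rarer the disease, the more the prior dominates –which is exactly why a worried web search is a poor substitute for a diagnosis.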

## How big can an error be when we estimate something?

Often, when you make an estimate based on many assumptions, people say "There might be errors in all your assumptions, and the error on the result, being the sum of all these errors, is going to be huge". In reality, errors tend to compensate each other. You might overestimate one variable, but…
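A quick Monte Carlo sketch of this idea –the nine factors and the ±30% per-factor error below are illustrative assumptions:

```python
import random

random.seed(0)

def relative_error_of_product(n_factors, per_factor_error, trials=100_000):
    """Monte Carlo: multiply n_factors estimates, each off by a uniform
    ±per_factor_error; return the average absolute relative error."""
    total = 0.0
    for _ in range(trials):
        product = 1.0
        for _ in range(n_factors):
            product *= 1 + random.uniform(-per_factor_error, per_factor_error)
        total += abs(product - 1)
    return total / trials

# Worst case, nine 30% errors all compound (1.3**9 ≈ 10.6x), but in a
# typical draw some overshoot and some undershoot, and they largely cancel.
print(relative_error_of_product(9, 0.30))
```

Independent errors add in quadrature, roughly like $$\sqrt{n}$$ rather than $$n$$ –which is why Fermi estimates work at all.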

## Floating Points –Rounding Errors in Algebraic Processes

The floating point representation of numbers is dangerous. With floating points, the computer stores the digits and the position of the point. Imagine a "6-digit decimal computer". This is a computer that uses base 10 numbers, as we humans do, but can only store 6 digits, the position of the dot,…
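Python's `decimal` module can mimic such a machine: capping the context precision at 6 significant digits makes the rounding visible.

```python
from decimal import Decimal, getcontext

getcontext().prec = 6  # a "6-digit decimal computer"

a = Decimal("100000")
b = Decimal("0.12345")
print(a + b)        # 100000 — the small addend is rounded away entirely
print((a + b) - a)  # 0, instead of the 0.12345 we added
```

Adding a small number to a big one and then subtracting the big one back is a classic way such rounding destroys information.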

## If heat is released from a system, will entropy increase?

Let's ask ourselves this question –will our knowledge about a system increase or decrease when the system cools down? When it's hot, there is high uncertainty about the positions and speeds of the molecules composing the system. The more the temperature goes down, the more precise your knowledge about position and speed will…
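A sketch of this for an ideal gas: each velocity component is Gaussian with variance $$kT/m$$ (Maxwell-Boltzmann), so its differential entropy shrinks as the gas cools. The molecular mass below, roughly that of an argon atom, is just an illustrative choice.

```python
from math import log, pi, e

def velocity_entropy(T, m=6.6e-26, k=1.38e-23):
    """Differential entropy (in nats) of one velocity component of an
    ideal-gas molecule: a Gaussian with variance kT/m."""
    sigma2 = k * T / m
    return 0.5 * log(2 * pi * e * sigma2)

# Cooling from 300 K to 30 K narrows the velocity distribution,
# so our uncertainty about each molecule's speed goes down.
print(velocity_entropy(300), velocity_entropy(30))
```

Each factor-of-10 drop in temperature removes the same amount of entropy per component, $$\frac{1}{2}\ln 10$$ nats.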

## Taming Complexity –The Magnum Ice Cream

The Magnum: As Simple as Possible Giacomo, in his "Cremeria Castiglione", explains to me how he is able to make one of the best gelatos in Bologna, and possibly in Italy. This means, for Italians at least, one of the best ice creams in the world. "Making gelato is…

## Can a lion jump 36 feet?

Using data from Rory Young's Quora answer: yes, a lion definitely can jump 10+ meters. The horizontal distance D of a projectile with speed V, launched at an angle $$\theta$$ is: $$D = V^2 \cdot \sin(2\theta) / g$$ Which gives, for a lion running at 54 km/h, i.e. V=15 m/s, jumping…
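The range formula in a few lines of Python. A 45° take-off maximizes the range; real take-off angles are lower, but even at 20° the distance comfortably clears 36 ft (~11 m).

```python
from math import sin, radians

def jump_distance(v, theta_deg, g=9.81):
    """Horizontal range of a projectile: D = v^2 * sin(2*theta) / g."""
    return v**2 * sin(radians(2 * theta_deg)) / g

# A lion at 54 km/h (15 m/s):
print(f"{jump_distance(15, 45):.1f} m")  # 22.9 m at the optimal angle
print(f"{jump_distance(15, 20):.1f} m")  # still well over 11 m at 20 degrees
```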

## How can entropy both decrease predictability and promote even distributions?

Consider this image: There is no uniformity, and predictability is high. You immediately see that the person on the left is rich, the one on the right is poor. You know who'll have a good meal tonight, who has higher life expectancy, and so on. Consider this…

## Graph entropy: a definition and its relation to information

A common question is whether entropy and information theory are related. The answer is yes. How? And how are they related in graph theory? Von Neumann explains the relationship between entropy and information in a crystalline definition: "Entropy is the difference between the information provided by the macroscopic description and…