## Kullback-Leibler Divergence and Cross-entropy Loss

Science is all about data and theories explaining those data. The theory behind coin tossing says that the probability of tails and heads is the same, and that given N tosses we expect k heads with probability $$\mathrm{Binomial}(N, k) = \binom{N}{k}\, 2^{-N}$$. In Bayesian statistics, probability has a nice meaning –it means how strongly…
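The fair-coin probability above can be checked directly with the standard library (the function name is my own):

```python
from math import comb

def prob_heads(N, k):
    """Probability of exactly k heads in N tosses of a fair coin."""
    return comb(N, k) / 2**N

print(prob_heads(10, 5))  # 0.24609375
```

For 10 tosses, 5 heads is the most likely outcome, yet it still happens less than a quarter of the time.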

## Cyberchondria: Why Bayes is a must when looking for web-based diagnoses

The NY Times (and a few others) ran a story entitled "Microsoft Finds Cancer Clues in Search Queries": Microsoft scientists have demonstrated that, by analyzing large samples of search engine queries, they may in some cases be able to identify internet users who are suffering from pancreatic cancer, even before they have received…

## How big can an error be when we estimate something?

Often, when you make an estimate based on many assumptions, people say: "There might be errors in all your assumptions, and the error on the result, being the sum of all these errors, is going to be huge". In reality, errors compensate each other. You might overestimate one variable, but…
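A quick simulation (my own sketch, with made-up numbers, not from the post) shows the effect: independent errors partly cancel, so the combined error is usually far below the pessimistic sum:

```python
import random

random.seed(42)

# Toy numbers (my own illustration): combine 16 independent inputs,
# each off by up to ±10%. If the errors simply added up, the worst
# case would be 160%; in practice they partly cancel, and the typical
# combined error is far smaller.
N, trials = 16, 10_000
worst_case = N * 0.10
typical = sum(
    abs(sum(random.uniform(-0.10, 0.10) for _ in range(N)))
    for _ in range(trials)
) / trials
print(f"worst case: {worst_case:.0%}, typical combined error: {typical:.1%}")
```

The typical combined error grows roughly like the square root of the number of inputs, not linearly with it.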

## Floating Points –Rounding Errors in Algebraic Processes

The floating point representation of numbers is dangerous. With floating points, the computer stores the digits and the position of the point. Imagine a "6-digit decimal computer". This is a computer that uses base-10 numbers, as we humans do, but can only store 6 digits, the position of the dot,…
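Python's `decimal` module can mimic such a machine; here is a minimal sketch of the 6-digit decimal computer (the specific numbers are my own illustration):

```python
from decimal import Decimal, getcontext

# A sketch of the "6-digit decimal computer" described above, using
# Python's decimal module restricted to 6 significant digits.
getcontext().prec = 6

a = Decimal("123456")      # six digits, stored exactly
b = Decimal("0.123456")    # also six digits, stored exactly
print(a + b)               # 123456 -- b is rounded away entirely
print((a + b) - a)         # 0, even though b was not zero
```

Adding a tiny number to a large one silently loses it: the classic absorption error.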

## If heat is released from a system, will entropy increase?

Let's ask ourselves this question –will our knowledge about a system increase or decrease when the system cools down? When it's hot, there is high uncertainty about the position and speed of the molecules composing the system. The more the temperature goes down, the more precise your knowledge about position and speed will…
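The same idea can be made quantitative with a toy model (my own sketch, not from the post): the Gibbs entropy S = -sum(p_i ln p_i) of a two-level system with Boltzmann weights shrinks as the temperature drops:

```python
from math import exp, log

# Toy model (my own sketch): Gibbs entropy S = -sum(p_i * ln p_i)
# of a two-level system, with Boltzmann weights p_i proportional to
# exp(-E_i / kT). As T drops, the probability concentrates in the
# ground state, and the entropy -- our uncertainty -- shrinks.
def entropy_two_level(kT, gap=1.0):
    weights = [1.0, exp(-gap / kT)]
    Z = sum(weights)
    probs = [w / Z for w in weights]
    return -sum(p * log(p) for p in probs)

print(entropy_two_level(kT=10.0))  # hot: close to ln 2, about 0.693
print(entropy_two_level(kT=0.1))   # cold: close to 0
```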

## Complexity Theory and Organizations –The Magnum Ice Cream

You can very successfully use network theory to analyse complexity in business. It can be fun, and easy to visualize. I personally understood why businesses can become more complex, yet more successful, by analysing the evolution of the network behind one of the most successful European brands of the past 25 years –the Magnum…

## Can a lion jump 36 feet?

Using data from Rory Young's answer: yes, a lion definitely can jump 10+ meters. The horizontal distance D of a projectile launched with speed V at an angle θ is: $$D = \frac{V^2 \sin(2\theta)}{g}$$ which gives, for a lion running at 54 km/h, i.e. V = 15 m/s, jumping at an angle of 45 degrees, and assuming…
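Plugging the numbers into the range formula (neglecting air resistance and launch height, as the estimate above does):

```python
from math import sin, radians

# Range formula: D = V^2 * sin(2*theta) / g,
# neglecting air resistance and the height of the launch point.
g = 9.81        # gravitational acceleration, m/s^2
V = 15.0        # lion's top speed: 54 km/h = 15 m/s
theta = 45.0    # launch angle, degrees

D = V**2 * sin(radians(2 * theta)) / g
print(f"range: {D:.1f} m")  # about 23 m, well over 36 ft (~11 m)
```

This is a theoretical upper bound, since a real lion cannot convert all of its running speed into a 45-degree launch.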

## How can entropy both decrease predictability and promote even distributions?

Consider this image: There is no uniformity, and the level of predictability is high. You immediately see that the person on the left is rich, the one on the right is poor. You know who'll have a good meal tonight. Who has higher life expectancy. And so on. Consider this…

## Graph entropy: a definition and its relation to information

A common question is whether entropy and information theory are related. The answer is yes. How? And how are they related in graph theory? Von Neumann explains the relationship between entropy and information in a crystalline definition: "Entropy is the difference between the information provided by the macroscopic description and…
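One common convention for graph entropy (an assumption on my part, not necessarily the one the post uses) is the Shannon entropy of the degree distribution:

```python
from math import log2
from collections import Counter

# One common convention (an assumption on my part): graph entropy as
# the Shannon entropy of the degree distribution. The graph is given
# as an adjacency list.
def degree_entropy(adj):
    degrees = [len(neighbours) for neighbours in adj.values()]
    counts = Counter(degrees)
    n = len(degrees)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A star graph: one hub of degree 3, three leaves of degree 1.
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
print(degree_entropy(star))
```

A regular graph, where every node has the same degree, scores zero: its macroscopic degree description tells you everything, leaving no residual uncertainty.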

## Why are the future and the past so different?

The "directionality" of time can be easily explained in terms of entropy, as you point out –and not vice versa. It's also easy to understand if you consider the statistical-mechanics definition of entropy. Slightly formal: systems evolve from the past, where they find themselves in an improbable state, to the…