Kullback–Leibler divergence is a useful way to measure the difference between two probability distributions. In this post we'll work through a simple example to build intuition for this tool from information theory.
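As a quick reference while you read, the divergence between two discrete distributions $P$ and $Q$ is defined as

$$D_{\mathrm{KL}}(P \parallel Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)},$$

which is zero exactly when the two distributions agree everywhere.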
Your friends probably don't have a food allergy, but how sure are you?
How likely is it that your friends really have food allergies? More importantly, should you believe them? In this post we use Bayes' theorem to model this everyday question.
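As a quick reference, Bayes' theorem gives the posterior probability in terms of the prior and the likelihood:

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}.$$

In this setting, $A$ might stand for "actually has a food allergy" and $B$ for "claims to have one."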