In this post we’ll explore how we can derive logistic regression from Bayes’ Theorem. Starting with Bayes’ Theorem, we’ll work our way to computing the log odds of our problem and then arrive at the inverse logit function. After reading this post you’ll have a much stronger intuition for how logistic regression works!
Kullback–Leibler divergence is a very useful way to measure the difference between two probability distributions. In this post we'll go over a simple example to help you better grasp this interesting tool from information theory.
Your friends probably don't have a food allergy, but how sure are you?
How likely is it that your friends really have food allergies? More importantly, should you believe them? In this post we look at using Bayes' theorem to model this everyday question.