# Prior Probability in Logistic Regression

When you train a logistic regression model, it learns the prior probability of the target class from the ratio of positive to negative examples in the training data. If the real-world prior differs from the prior in your training data, this can lead to unexpected predictions from your model. Read this post to learn how to correct for this even after the model has been trained!
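One standard way to make this correction is to shift the model's log odds by the difference between the log odds of the true prior and of the training prior. The sketch below is illustrative (the function name and the example priors are made up for this post, not taken from any library):

```python
import numpy as np

def adjust_prior(p_model, train_prior, true_prior):
    """Correct a trained classifier's predicted probability for a new class prior.

    Shifts the model's log odds by the difference between the log odds
    of the true prior and the log odds of the training prior, then maps
    back to a probability with the inverse logit (sigmoid) function.
    """
    log_odds = np.log(p_model / (1 - p_model))
    correction = (np.log(true_prior / (1 - true_prior))
                  - np.log(train_prior / (1 - train_prior)))
    return 1 / (1 + np.exp(-(log_odds + correction)))

# A model trained on balanced data (50% positive) predicts 0.8, but
# suppose the real-world positive rate is only 5%: the corrected
# probability drops to roughly 0.17.
adjusted = adjust_prior(0.8, train_prior=0.5, true_prior=0.05)
```

Note that when `train_prior == true_prior` the correction term is zero and the prediction is unchanged, as you'd expect.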

# Logistic Regression from Bayes' Theorem

In this post we’ll explore how we can derive logistic regression from Bayes’ Theorem. Starting with Bayes’ Theorem, we’ll work our way to computing the log odds of our problem and then arrive at the inverse logit function. After reading this post you’ll have a much stronger intuition for how logistic regression works!
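The derivation described above can be sketched for the standard two-class setup (assuming class-conditional densities $P(x \mid y)$; divide numerator and denominator by the numerator to get the second equality):

```latex
P(y=1 \mid x)
  = \frac{P(x \mid y=1)\,P(y=1)}{P(x \mid y=1)\,P(y=1) + P(x \mid y=0)\,P(y=0)}
  = \frac{1}{1 + e^{-z}},
\qquad
z = \log\frac{P(x \mid y=1)\,P(y=1)}{P(x \mid y=0)\,P(y=0)}
```

Here $z$ is the log odds, and $1/(1+e^{-z})$ is the inverse logit (sigmoid) function at the heart of logistic regression.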

# A Deeper look at Mean Squared Error

In this post we're going to take a deeper look at Mean Squared Error. Despite the relatively simple nature of this metric, it contains a surprising amount of insight into modeling.
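As a minimal sketch of the metric itself, here is MSE in a few lines of Python, along with one of the insights the post alludes to: among all constant predictions, the mean of the targets minimizes MSE, and the resulting minimum is the variance of the targets (the helper name is my own):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean of the squared residuals between targets and predictions
    return np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)

# Predicting the mean of the targets for every point yields an MSE
# equal to the variance of the targets.
y = np.array([1.0, 2.0, 4.0, 5.0])
constant_prediction = np.full_like(y, y.mean())
```
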

# Kullback-Leibler Divergence Explained

Kullback–Leibler divergence is a very useful way to measure the difference between two probability distributions. In this post we'll go over a simple example to help you better grasp this interesting tool from information theory.
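For discrete distributions, KL divergence is just a weighted sum of log ratios. A small self-contained sketch (the function and the coin example are illustrative, not from the post):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions.

    Terms where p_i == 0 contribute nothing, by convention.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(np.where(p > 0, p * np.log(p / q), 0.0))

# KL divergence is not symmetric: approximating a fair coin with a
# biased one is not the same "distance" as the reverse.
fair = [0.5, 0.5]
biased = [0.9, 0.1]
d_fair_biased = kl_divergence(fair, biased)
d_biased_fair = kl_divergence(biased, fair)
```

The asymmetry is one reason KL divergence is called a divergence rather than a distance.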

# A Guide to Bayesian Statistics

A guide to getting started with Bayesian statistics! Start with Bayes' Theorem and work your way up to building your own Bayesian hypothesis test!

# Use Bayes' Theorem to Investigate Food Allergies

## Your friends probably don't have a food allergy, but how sure are you?

How likely is it that your friends really have food allergies? More importantly, should you believe them? In this post we'll use Bayes' theorem to model this everyday question.
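The core calculation is a direct application of Bayes' theorem. The numbers below are purely illustrative placeholders (not from the post): an assumed 5% base rate of true food allergy, a test or self-report that flags 90% of real allergies, and a 20% false-positive rate.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(allergy | positive report) via Bayes' theorem.

    prior: P(allergy), the base rate before any evidence
    sensitivity: P(positive | allergy)
    false_positive_rate: P(positive | no allergy)
    """
    numerator = sensitivity * prior
    evidence = numerator + false_positive_rate * (1 - prior)
    return numerator / evidence

# With a low base rate, even a fairly reliable positive report
# leaves the posterior probability well below 50%.
p = posterior(prior=0.05, sensitivity=0.9, false_positive_rate=0.2)
```

This is the classic base-rate effect: because allergies are rare under the assumed prior, most positive reports still come from the much larger non-allergic group.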