# Popular Posts

Confused by the hypothesis testing used in A/B tests? In this post we take a look at building an A/B test from scratch the Bayesian way!

Monte Carlo simulations are very fun to write and can be incredibly useful for solving tricky math problems. In this post we explore how to write six very useful Monte Carlo simulations in R to get you thinking about how to use them on your own.

There is an iconic probability problem in The Empire Strikes Back! Han Solo is told that navigating an asteroid field is extremely unlikely to be successful. However, not only does he navigate the asteroid field successfully, but we know he will. Learn how we can use Bayesian Priors to reconcile C-3PO's frequentist views on probability with our natural reasoning.

Find Bayes' Theorem confusing? Let's break down this famous formula using Lego to help you build up a better intuition for this foundational concept!

## All Posts

When you train a logistic model it learns the prior probability of the target class from the ratio of positive to negative examples in the training data. If the real-world prior is not the same as in your training data, this can lead to unexpected predictions from your model. Read this post to learn how to correct this even after the model has been trained!
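
As a rough sketch of this kind of correction (the function names here are hypothetical, not from the post): since logistic regression works in log odds, you can shift a trained model's predicted log odds by the difference between the real-world prior's log odds and the training prior's log odds.

```python
import math

def logit(p):
    # Convert a probability to log odds
    return math.log(p / (1 - p))

def inv_logit(x):
    # Convert log odds back to a probability
    return 1 / (1 + math.exp(-x))

def correct_prior(p_pred, prior_train, prior_real):
    # Shift the model's log odds by the gap between the
    # real-world prior and the training-data prior.
    return inv_logit(logit(p_pred) + logit(prior_real) - logit(prior_train))
```

For example, a model trained on balanced classes (prior 0.5) that predicts 0.5 would, under a real-world prior of 0.1, be corrected down to 0.1.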

In this post we’ll explore how we can derive logistic regression from Bayes’ Theorem. Starting with Bayes’ Theorem we’ll work our way to computing the log odds of our problem and then arrive at the inverse logit function. After reading this post you’ll have a much stronger intuition for how logistic regression works!

In this post we're going to take a deeper look at Mean Squared Error. Despite the relatively simple nature of this metric, it contains a surprising amount of insight into modeling.

Kullback–Leibler divergence is a very useful way to measure the difference between two probability distributions. In this post we'll go over a simple example to help you better grasp this interesting tool from information theory.
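
As a tiny illustration of the idea (a sketch, not the post's own example): for discrete distributions, KL divergence sums each outcome's probability weighted by the log ratio of the two distributions.

```python
import math

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_i p_i * log(p_i / q_i), measured in nats.
    # Terms with p_i == 0 contribute nothing by convention.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

Comparing a fair coin to a heavily biased one gives a positive divergence, while comparing a distribution with itself gives exactly zero.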

A guide for getting started with Bayesian Statistics! Start with Bayes' Theorem and work your way up to building your own Bayesian Hypothesis test!

## Bayesian Reasoning with the Mystic Seer: Bayes' Factor, Prior Probabilities and Psychic Powers!

## Your friends probably don't have a food allergy, but how sure are you?

How likely is it that your friends really have food allergies? More importantly, should you believe them? In this post we look at using Bayes' theorem to model this everyday question.

## Use Markov Chains and Linear Algebra to solve this holiday puzzle!

In this post we look at the classic Coupon Collector's problem as well as a more interesting variant of it!

Part 2 in our exploration of the Lebesgue Integral.

In this post we use a strange tea party to explore a practical understanding of the Lebesgue integral.

Using pictures to explain the idea of Probability Spaces in Rigorous Probability Theory.

In this post we discuss an intuitive, high level view of measure theory and why it is important to the study of rigorous probability.

The Fundamental Theorem of Calculus is a beautiful thing, but it fails to describe some very simple integrals. In this post we look at how the Riemann Integral can solve some of these problems.

In this post we build an intuition for the Fundamental Theorem of Calculus by using computation rather than analytical models of the problem.

People often discuss the idea of being "90% certain" or "99% sure that this is true", but how big is the difference between these two values? It turns out that the difference between these values is similar to the difference between having $10 and $100 in your wallet.
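
The wallet comparison can be checked directly by converting each probability to odds (a hypothetical helper, just for illustration):

```python
def odds(p):
    # Convert a probability to odds in favor: p / (1 - p)
    return p / (1 - p)
```

90% certainty corresponds to 9-to-1 odds while 99% corresponds to 99-to-1, roughly a tenfold gap, much like $10 versus $100.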

We have looked at working with a variety of analytical priors, but how can you sample from a prior probability that is not so mathematically pleasant to work with? In this post we learn about Rejection Sampling as one method of solving this problem.
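
A minimal sketch of the idea (the names and the Beta(2, 2) target below are illustrative assumptions, not the post's example): draw candidates from an easy proposal distribution and keep each one with probability proportional to the target density.

```python
import random

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, m, n):
    # Draw n samples from target_pdf using a proposal distribution
    # and an envelope constant m with target_pdf(x) <= m * proposal_pdf(x).
    samples = []
    while len(samples) < n:
        x = proposal_sample()
        u = random.uniform(0, 1)
        # Accept x with probability target_pdf(x) / (m * proposal_pdf(x))
        if u < target_pdf(x) / (m * proposal_pdf(x)):
            samples.append(x)
    return samples

# Example: sample from Beta(2, 2), density 6x(1 - x) on [0, 1],
# using a uniform proposal with envelope constant m = 1.5.
beta_samples = rejection_sample(
    target_pdf=lambda x: 6 * x * (1 - x),
    proposal_sample=lambda: random.uniform(0, 1),
    proposal_pdf=lambda x: 1.0,
    m=1.5,
    n=1000,
)
```

The accepted samples concentrate around 0.5, matching the shape of the Beta(2, 2) distribution.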

Why is Variance related to X squared? To understand this we need to understand Moments of a Random Variable. Read on to learn about the relationship between Variance, Moments of a Random Variable and Jensen's inequality.

A short post playing with the idea of using a Recurrent Neural Network to automatically generate text from James Joyce's Finnegans Wake.

Some people like "light" beach reading during the summer. I prefer sinking my teeth into a massive, challenging, summer-long book. Here are a few of my favorites.

Confused by the hypothesis testing used in A/B tests? In this post we take a look at building an A/B test from scratch the Bayesian way!

Learn about the Bayes Factor by recreating the famous Voight-Kampff test from the movie Blade Runner.

We've already looked at Bayesian Parameter Estimation, now we'll learn how to use Prior Probabilities in our Parameter Estimation to get better results.

Discover the fundamentals of Bayesian Parameter Estimation. Learn to use the Probability Density Function, Cumulative Distribution Function and Quantile Function to estimate unknown values in our data.

Many people find the ideas of Expectation and Variance confusing. In part this is because the way we view these concepts changes as our understanding grows in sophistication. In this post we'll look at the way these definitions change from their basic High School intro to the view of Rigorous Probability Theory.

Learn about Discrete and Continuous probability distributions as well as the types of questions that they can both answer. This post also discusses the relationship between the Binomial and Beta distributions.

Monte Carlo simulations are very fun to write and can be incredibly useful for solving tricky math problems. In this post we explore how to write six very useful Monte Carlo simulations in R to get you thinking about how to use them on your own.

There is an iconic probability problem in The Empire Strikes Back! Han Solo is told that navigating an asteroid field is extremely unlikely to be successful. However, not only does he navigate the asteroid field successfully, but we know he will. Learn how we can use Bayesian Priors to reconcile C-3PO's frequentist views on probability with our natural reasoning.

Euler's number (the mathematical constant e) shows up in a variety of unexpected places. One of them is in Probability. A common way of expressing probabilities is to say "there's a 1 in a million chance!". In this post we find out how that way of viewing probabilities eventually leads us to Euler's number.
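
One way to see the connection (a sketch of the kind of reasoning the post explores, not its exact derivation): if an event has a 1-in-n chance and you try n times, the chance of at least one success approaches 1 - 1/e as n grows.

```python
import math

def prob_at_least_one(n):
    # Chance of at least one success in n independent trials,
    # each with probability 1/n of succeeding.
    return 1 - (1 - 1 / n) ** n
```

With n of one million, the result is already extremely close to 1 - 1/e, about 0.632.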

An explanation of Variance, Covariance and Correlation in rigorous yet clear terms providing a more general and intuitive look at these essential concepts.
