import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import uniform, norm, laplace, beta, bernoulli, expon
#import pandas as pd
from statsmodels.graphics.tsaplots import plot_acf
import warnings 
import pystan
import arviz
warnings.filterwarnings('ignore')

Introduction to Bayesian Modeling (Part 1b)

In this session, we will perform Bayesian inference on simple examples. Instead of relying on conjugate priors, we will explore MCMC-based inference. Tomorrow, we will learn more about MCMC itself.

Generally, we will proceed in three steps:

  1. We construct a model. That is, we decide on the distribution of the data.
  2. We choose a prior distribution convenient for inference of the unknown model parameters.
  3. We run estimation and analyze results.
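As a minimal illustration of these three steps (using hypothetical simulated data, and a conjugate Beta prior so that step 3 has a closed form; the session itself replaces this analytic step with MCMC):

```python
import numpy as np
from scipy.stats import bernoulli, beta

# Step 1: the model -- binary data assumed to be Bernoulli(p)
np.random.seed(0)  # hypothetical seed, for reproducibility only
data = bernoulli.rvs(0.7, size=50)

# Step 2: the prior -- a flat Beta(1, 1) prior on p
a0, b0 = 1.0, 1.0

# Step 3: estimation -- the Beta prior is conjugate to the
# Bernoulli likelihood, so the posterior is again a Beta:
# Beta(a0 + #successes, b0 + #failures)
post = beta(a0 + data.sum(), b0 + len(data) - data.sum())
print("posterior mean of p:", post.mean())
```

With MCMC (as in the rest of this session) step 3 is carried out by sampling from the posterior instead of writing it down analytically.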

Two additional software packages will be used:

  • Stan and its Python interface pystan;
  • arviz for visualization of the results.

Example 1: Coin tossing

The first example introduces the basics of stan and pystan. The task is:

  1. simulate ndat tosses of a coin (pings, detections of something...) using a known probability of heads (success, presence of a phenomenon...);
  2. try to infer this (here known) probability from the data.

We already know that binary 0-1 data, where 1 occurs with probability $p$, follow the Bernoulli distribution

$$x \sim \mathrm{Bernoulli}(p).$$

The scipy.stats package contains the bernoulli class, whose rvs() method produces (pseudo)random sampling:
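A minimal sketch of such sampling (the variable names ndat and p_true, and the seed, are assumptions for illustration):

```python
import numpy as np
from scipy.stats import bernoulli

ndat = 100       # number of simulated coin tosses
p_true = 0.6     # known (true) probability of heads
np.random.seed(42)  # hypothetical seed, for reproducibility

# rvs() draws pseudo-random samples from Bernoulli(p_true)
x = bernoulli.rvs(p_true, size=ndat)
print(x[:10])
print("empirical frequency of heads:", x.mean())
```

The empirical frequency of ones should lie close to p_true, and it is this probability that we will later try to recover with Stan.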