import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import uniform, norm, laplace, beta, bernoulli, expon
#import pandas as pd
from statsmodels.graphics.tsaplots import plot_acf
import warnings
import pystan
import arviz
warnings.filterwarnings('ignore')
Introduction to Bayesian Modeling (Part 1b)
In this session, we will perform Bayesian inference on simple examples. We won't rely on conjugate priors; instead, we will explore the possibilities of MCMC-based inference. Tomorrow, we will learn more about MCMC.
Generally, we will proceed in three steps:
- We construct a model. That is, we decide on a distribution for the data.
- We choose a convenient prior distribution for the unknown model parameters.
- We run estimation and analyze results.
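Before turning to Stan, the three steps can be illustrated with a simple grid approximation of the posterior (pure NumPy/SciPy; this is a sketch for intuition, not the MCMC approach used in this session — the true probability 0.7 and grid size are illustrative values):

```python
import numpy as np
from scipy.stats import bernoulli, beta

# Step 1: model -- the tosses follow a Bernoulli(p) distribution;
# simulate data with a known p that we then pretend not to know
data = bernoulli.rvs(0.7, size=100, random_state=0)

# Step 2: prior -- a flat Beta(1, 1) prior over p, evaluated on a grid
p_grid = np.linspace(0.001, 0.999, 999)
prior = beta.pdf(p_grid, 1, 1)

# Step 3: estimation -- unnormalized posterior = likelihood * prior,
# then normalize over the grid
heads = data.sum()
log_like = heads * np.log(p_grid) + (len(data) - heads) * np.log(1 - p_grid)
posterior = np.exp(log_like) * prior
posterior /= posterior.sum()

# the posterior mean should sit close to the sample frequency of heads
p_mean = (p_grid * posterior).sum()
print(p_mean)
```

With a flat prior the posterior concentrates around the observed frequency of heads; MCMC replaces the grid with samples, which scales to models where a grid is infeasible.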
Two additional software packages will be used:
- stan and its Python interface pystan
- arviz for visualization of the results.
Example 1: Coin tossing
The first example introduces the basics of stan and pystan. The task is:
- simulate ndat tosses of a coin (pings, detections of something...) using a known probability of head (success, presence of a phenomenon...);
- try to infer this (here known) probability from the data.
We already know that binary 0-1 data where 1 occurs with probability $p$ follow the Bernoulli distribution, with probability mass function $f(x|p) = p^x (1-p)^{1-x}$ for $x \in \{0, 1\}$.
The scipy.stats package contains the bernoulli class, whose rvs() method produces (pseudo)random samples:
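A minimal sketch of such a simulation (the values of ndat and the success probability are illustrative assumptions, not fixed by the session):

```python
from scipy.stats import bernoulli

ndat = 20      # number of simulated tosses (illustrative value)
p_true = 0.6   # known probability of head used for simulation

# rvs() draws ndat independent 0/1 samples, 1 with probability p_true
tosses = bernoulli.rvs(p_true, size=ndat, random_state=1)
print(tosses)
print(tosses.mean())  # sample frequency of heads
```

The random_state argument fixes the seed so the simulated data are reproducible across runs.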