# Nonparametric Bayes and Exchangeability

This tutorial took place as part of the Bootcamp preceding the MIFODS Workshop on Graphical models, exchangeable models, and graphons. See this link for the latest versions of all tutorials.

Part I: Sunday, August 18, 11:30 AM–12:15 PM

Part II: Wednesday, August 18, 1:45 PM–2:30 PM

**Instructor**: Professor Tamara Broderick

Email:

## Description

This tutorial introduces Bayesian nonparametrics (BNP) as a tool for
modern data science and machine learning, and then examines exchangeability
as a tool for understanding when to use BNP (and other Bayesian) methods, as well as when not to use them.
BNP methods are useful in a variety of data analyses, including density estimation without
parametric assumptions and clustering models that adaptively determine
the number of clusters. We will demonstrate that BNP allows the data
analyst to learn more from a data set as the size of the data set
grows, and we will see how this feat is accomplished. We will describe one of the most popular
BNP models: the Dirichlet process.
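
As a small taste of the Dirichlet process's adaptive clustering (our own sketch, not part of the official demo materials), the code below simulates its predictive rule, the Chinese restaurant process; the function name and concentration parameter `alpha` are our own choices:

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample a partition of n items from the Chinese restaurant
    process with concentration alpha (the DP's predictive rule)."""
    rng = random.Random(seed)
    counts = []  # counts[k] = size of cluster k so far
    for i in range(n):
        # item i joins existing cluster k with prob. counts[k]/(i + alpha),
        # or starts a new cluster with prob. alpha/(i + alpha)
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                counts[k] += 1
                break
        else:
            counts.append(1)  # a brand-new cluster
    return counts

sizes = crp_partition(1000, alpha=1.0)
print(len(sizes), sum(sizes))
```

Note that the number of clusters is not fixed in advance: it grows (roughly logarithmically in `n`) as more data arrive, which is the "learn more as the data set grows" behavior discussed above.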
In the second part, we will look more closely at (infinite) exchangeability: the assumption
that the distribution over our data is unchanged by permutations of the data, for any finite data size.
In particular, we will see how an exchangeability assumption implies the Kingman paintbox characterization of clustering (partition) models, and how an analogous assumption implies the Aldous--Hoover theorem for network (graph-valued) data. We will use these observations to critique models from the first part of the tutorial, as well as other models, and we will briefly discuss potential solutions.

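
To make the exchangeability assumption concrete, here is a small illustration of our own (not from the tutorial materials): under a standard Beta-Bernoulli model, the joint probability of a binary sequence, computed as a chain of posterior predictives, depends only on the counts of ones and zeros, so every reordering of the data has the same probability:

```python
from fractions import Fraction
from itertools import permutations

def beta_bernoulli_prob(seq, a=1, b=1):
    """Joint probability of a binary sequence under a Beta(a, b)
    prior on the Bernoulli parameter, via posterior predictives."""
    p = Fraction(1)
    heads, tails = 0, 0
    for x in seq:
        total = a + b + heads + tails
        if x == 1:
            p *= Fraction(a + heads, total)
            heads += 1
        else:
            p *= Fraction(b + tails, total)
            tails += 1
    return p

seq = (1, 1, 0, 1)
probs = {beta_bernoulli_prob(p) for p in set(permutations(seq))}
print(len(probs))  # 1: every ordering of the data has the same probability
```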
## Materials

- README for demos
- [Slides for Part I]
- Demo 1 [code]: Beta random variable and random distribution intuition
- Demo 2 [code]: Dirichlet random variable and random distribution intuition
- Demo 3 [code]: K large relative to N intuition; empty components
- Demo 4 [code]: K large relative to N intuition; growth of number of clusters
- Demo 5 [code]: GEM random distribution intuition

- [Slides for Part II]
- Demo 6 [code]: An exact DPMM simulator
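
For a flavor of the GEM distribution featured in Demo 5, here is a minimal stick-breaking sketch of our own (not the demo code); the truncation tolerance `tol` is a hypothetical choice for finite computation:

```python
import random

def gem_weights(alpha, tol=1e-10, seed=0):
    """Stick-breaking (GEM) weights: repeatedly break off a
    Beta(1, alpha) fraction of the remaining stick."""
    rng = random.Random(seed)
    weights, stick = [], 1.0
    while stick > tol:
        v = rng.betavariate(1.0, alpha)
        weights.append(stick * v)
        stick *= 1.0 - v
    return weights

w = gem_weights(alpha=3.0)
print(abs(sum(w) - 1.0) < 1e-8)  # the weights sum to (almost) 1
```

These are the mixing weights underlying the Dirichlet process: countably many components, with almost all of the probability mass concentrated on the first few.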

## Prerequisites

Working knowledge of Bayesian data analysis: you should know how
to use Bayes' theorem to calculate a posterior for both discrete and
continuous parametric distributions.

### What we won't cover

Gaussian processes are an important branch of nonparametric Bayesian modeling, but we won't have time to cover them here. We'll be focusing on the discrete, or Poisson point process, side of nonparametric Bayesian inference.