About this Jupyter Book
Course overview
Objectives
Topics
1. Basics of Bayesian statistics
1.1. Lecture 1
1.2. Checking the sum and product rules, and their consequences
1.3. Exploring PDFs
1.4. Lecture 2
1.5. Interactive Bayesian updating: coin flipping example
1.6. Lecture 3
1.7. Parameter estimation example: Gaussian noise and averages I
1.8. Radioactive lighthouse problem
1.9. Standard medical example by applying Bayesian rules of probability
1.10. Lecture 4: A couple of frequentist connections
1.11. Visualization of the Central Limit Theorem
2. Bayesian parameter estimation
2.1. Lecture 5: Parameter estimation
2.2. Parameter estimation example: fitting a straight line
2.3. Lecture 6
2.4. Amplitude of a signal in the presence of background
2.5. Linear Regression and Model Validation demonstration
2.6. Assignment: Follow-ups to Parameter Estimation notebooks
2.7. Linear Regression exercise
2.8. Linear algebra games including SVD for PCA
2.9. Follow-up: fluctuation trends with # of points and data errors
3. MCMC sampling I
3.1. Lecture 7
3.2. Metropolis-Hastings MCMC sampling of a Poisson distribution
3.3. Lecture 8
3.4. Parameter estimation example: Gaussian noise and averages II
3.5. Exercise: Random walk
3.6. Overview: MCMC Diagnostics
3.8. Assignment: 2D radioactive lighthouse location using MCMC
4. Why Bayes is better
4.1. Lecture 9
4.2. A Bayesian Billiard game
4.3. Lecture 10
4.4. Parameter estimation example: fitting a straight line II
4.5. Lecture 11
4.6. Error propagation: Example 3.6.2 in Sivia
4.7. Building intuition about correlations (and a bit of Python linear algebra)
4.8. Lecture 12
4.9. Lecture 13
4.10. Dealing with outliers
5. Model selection
5.1. Lecture 14
5.2. Lecture 15
5.3. Evidence calculation for EFT expansions
5.4. Lecture 16
5.5. Example: Parallel tempering for multimodal distributions
5.6. Example: Parallel tempering for multimodal distributions vs. zeus
6. MCMC sampling II
6.1. Lecture 17
6.2. Quick check of the distribution of normal variables squared
6.3. Liouville Theorem Visualization
6.4. Solving orbital equations with different algorithms
6.5. Lecture 18
6.6. PyMC Introduction
6.7. Getting started with PyMC3
6.8. Comparing samplers for a simple problem
6.9. zeus: Sampling from multimodal distributions
7. Gaussian processes
7.1. Lecture 19
7.2. Gaussian processes demonstration
7.3. Learning from data: Gaussian processes
7.4. Exercise: Gaussian Process models with GPy
7.5. Lecture 20
8. Assigning probabilities
8.1. Lecture 21
8.2. Ignorance PDFs: Indifference and translation groups
8.3. MaxEnt for deriving some probability distributions
8.4. Maximum Entropy for reconstructing a function from its moments
8.5. Making figures for Ignorance PDF notebook
9. Machine learning: Bayesian methods
9.1. Lecture 22
9.2. Bayesian Optimization
9.3. Lecture 23
9.4. What Are Neural Networks?
9.5. Neural networks
9.6. Neural network classifier demonstration
9.7. Bayesian neural networks
9.8. Lecture 24
9.9. Variational Inference: Bayesian Neural Networks
9.10. What is a convolutional neural network?
10. PCA, SVD, and all that
10.1. Lecture 25
10.2. Linear algebra games including SVD for PCA
Mini-projects
Mini-project I: Parameter estimation for a toy model of an EFT
Mini-project IIa: Model selection basics
Mini-project IIb: How many lines are there?
Mini-project IIIa: Bayesian optimization
Mini-project IIIb: Bayesian Neural Networks
Reference material
Bibliography
Related topics
Using Anaconda
Using GitHub
Python and Jupyter notebooks
Python and Jupyter notebooks: part 01
Python and Jupyter notebooks: part 02
Examples: Jupyter jb-book
Notebook keys
Checking the sum and product rules, and their consequences
Key
Standard medical example by applying Bayesian rules of probability
Key
Index