
Commit 9ef6be4

committed
final slides
1 parent efbb8a2 commit 9ef6be4

File tree

4 files changed (+154, -0 lines)

4 files changed

+154
-0
lines changed

slides/bayes.bib

Lines changed: 29 additions & 0 deletions
@@ -57,6 +57,15 @@ @article{Degroot1973
   year={1973},
   publisher={Taylor \& Francis Group}
 }
+@incollection{Jaynes1976,
+  title={Confidence intervals vs Bayesian intervals},
+  author={Jaynes, Edwin T},
+  editor={Kempthorne, Oscar},
+  booktitle={Foundations of probability theory, statistical inference, and statistical theories of science},
+  pages={175--257},
+  year={1976},
+  publisher={Springer}
+}
 @article{Bernardo1979,
   title={Reference posterior distributions for Bayesian inference},
   author={Bernardo, Jose M},
@@ -83,6 +92,16 @@ @article{Diaconis1980
   year={1980},
   publisher={JSTOR}
 }
+@article{Efron1986,
+  title={Why isn't everyone a Bayesian?},
+  author={Efron, Bradley},
+  journal={The American Statistician},
+  volume={40},
+  number={1},
+  pages={1--5},
+  year={1986},
+  publisher={Taylor \& Francis Group}
+}
 @article{Raftery1988,
   title={Inference for the binomial N parameter: A hierarchical Bayes approach},
   author={Raftery, Adrian E},
@@ -188,6 +207,16 @@ @article{Gelman2017
   year={2017},
   publisher={Multidisciplinary Digital Publishing Institute}
 }
+@article{Hennig2017,
+  title={Beyond subjective and objective in statistics},
+  author={Gelman, Andrew and Hennig, Christian},
+  journal={Journal of the Royal Statistical Society: Series A (Statistics in Society)},
+  volume={180},
+  number={4},
+  pages={967--1033},
+  year={2017},
+  publisher={Wiley Online Library}
+}
 @article{Simpson2017,
   title={Penalising model component complexity: A principled, practical approach to constructing priors},
   author={Simpson, Daniel and Rue, H{\aa}vard and Riebler, Andrea and Martins, Thiago G and S{\o}rbye, Sigrunn H},

slides/bayes_stats.pdf

17.1 KB
Binary file not shown.

slides/bayes_stats.tex

Lines changed: 1 addition & 0 deletions
@@ -125,6 +125,7 @@
 \include{lecture_6}
 \include{lecture_7}
 \include{lecture_8}
+\include{lecture_11}
 \include{lecture_extra}
 %%%%%%%
 \begin{frame}[t, allowframebreaks]

slides/lecture_11.tex

Lines changed: 124 additions & 0 deletions
@@ -0,0 +1,124 @@
\section*{Bayesian rules}
\begin{frame}{Why be Bayesian I: probabilistic representation}

We already reduce our uncertainty about phenomena to probability distributions when we specify sampling distributions (likelihoods).

\begin{idea}[Probabilisation of uncertainty]
\label{id:prob_uncertainty}
Our statistical models are \textit{interpretations} of reality, rather than \textit{explanations} of it.
Moreover,
\begin{quote}
``...the representation of unknown phenomena by a probabilistic model, at the observational level as well as at the parameter level, does not need to correspond effectively—or physically—to a generation from a probability distribution, nor does it compel us to enter a supradeterministic scheme, fundamentally because of the nonrepeatability of most experiments.''
\end{quote}
\cite{Robert2007}, p.~508.
\end{idea}
\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{Why be Bayesian II: conditioning on OBSERVED data}
Remember Idea~\ref{id:soul}: conditioning is the soul of (Bayesian) Statistics.
\begin{idea}[Conditioning on what is actually observed]
\label{id:obs_data}
A quantitative analysis of the parameter(s) $\theta$ that conditions \textit{only} on the observed data $x$ unavoidably requires a distribution over $\theta$.
The \textbf{only} coherent way to achieve this starting from a distribution $\pi(\theta)$ is to use Bayes's theorem.
\end{idea}
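In symbols (generic notation, with $f(x \mid \theta)$ denoting the sampling density), the update that conditions only on the realised $x$ is
\[
\pi(\theta \mid x) = \frac{f(x \mid \theta)\,\pi(\theta)}{\int_{\Theta} f(x \mid t)\,\pi(t)\,\mathrm{d}t},
\]
which involves no data set other than the one actually observed.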

Frequentist arguments are, necessarily, about procedures that behave well under a given data-generating process, and thus forcibly make reference to unobserved data sets that could, in theory, have been observed.
\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{Why be Bayesian III: priors as inferential tools}

A refreshing break from the strictly subjectivist view of Bayesianism can be had if we think about inference functionally.

\begin{idea}[The prior as a regularisation tool]
\label{id:prior_tool}
If one adopts a mechanistic view of Bayesian inference, the prior can be seen as an additional regularisation or penalty term that enforces certain model behaviours, such as sparsity or parsimony.
A good prior both \textit{summarises} substantive knowledge about the process and rules out unlikely model configurations.
\end{idea}
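A familiar instance of this correspondence (a standard result; the regression notation here is illustrative, not from earlier lectures): a $\mathcal{N}(0, \tau^{2} I)$ prior on regression coefficients $\beta$ makes maximum a posteriori estimation coincide with $\ell_{2}$-penalised (ridge) estimation,
\[
\arg\max_{\beta}\left\{ \log f(y \mid \beta) + \log \pi(\beta) \right\}
= \arg\min_{\beta}\left\{ -\log f(y \mid \beta) + \frac{1}{2\tau^{2}} \lVert \beta \rVert_{2}^{2} \right\},
\]
with the prior scale $\tau$ acting as the regularisation constant.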

In other words, sometimes it pays to use the prior to control what the model \textit{does}, rather than which specific values the parameter takes.

\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{Why be Bayesian IV: embracing subjectivity}
The common notion of ``objectivity'' is a ruse.
There is no such thing as a truly objective analysis, and taking objectivity as a premise might hinder our ability to focus on actual discovery and explanation~\citep{Hennig2017}.
\begin{idea}[The subjective basis of knowledge]
\label{id:subjective}
Knowledge arises from a confrontation between \textit{a prioris} and experiments (data).
Let us hear what Poincaré\footnote{Jules Henri Poincaré (1854--1912) was a French mathematician; the quote is from \textit{La Science et l'Hypothèse} (1902).} had to say:
\begin{quote}
``It is often stated that one should experiment without preconceived ideas.
This is simply impossible; not only would it make every experiment sterile, but even if we were ready to do so, we could not implement this principle.
Everyone stands by [their] own conception of the world, which [they] cannot get rid of so easily.''
\end{quote}
\end{idea}
\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{Why be Bayesian V: principled inference}
As we saw in the first lectures of this course, the Bayesian approach is coherent with a few very compelling principles, namely the Sufficiency, Conditionality and Likelihood principles.
\begin{idea}[Bayesian inference follows from strong principles]
\label{id:principled_inference}
Starting from a few desiderata, namely conditioning on the \textbf{observed} data, independence of stopping criteria, and respect for the sufficiency, conditionality and likelihood principles, one arrives at a single approach: Bayesian inference using proper priors.
\end{idea}
\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{Why be Bayesian VI: universal inference}
Bayesian Statistics provides a universal procedure for drawing inferences about probabilistic models, something Frequentists can only dream of.
\begin{idea}[Bayesian inference is universal]
\label{id:universal}
Starting from a sampling model, a (proper) prior and a loss (or utility) function, the Bayesian analyst can always derive an estimator.
Moreover, and importantly, many optimal frequentist estimators can be recovered as Bayesian estimators or as limits of Bayesian estimators.
Paradoxically, this means that one can be a staunch advocate of Frequentism and still employ Bayesian methods (see, e.g., least favourable priors).
\end{idea}
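In decision-theoretic terms (generic notation), the recipe reads: given a loss $L(\theta, d)$, the Bayes estimator minimises the posterior expected loss,
\[
\delta^{\pi}(x) = \arg\min_{d} \int_{\Theta} L(\theta, d)\,\pi(\theta \mid x)\,\mathrm{d}\theta,
\]
and under squared-error loss $L(\theta, d) = (\theta - d)^{2}$ this yields the posterior mean, $\delta^{\pi}(x) = \operatorname{E}[\theta \mid x]$.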
\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{How to be Bayesian I: clarity and openness}
Being subjective does not mean ``anything goes''.
As a scientist, you are still bound by the laws of logic and reason.

\begin{idea}[State your prior elicitation clearly and openly]
As we have seen, prior information does not always translate into one unique prior choice.
In other words, the same prior information can be adequately represented by two or more probability distributions.
Make sure your exposition \textbf{clearly} separates which features of the prior come from substantive domain expertise and which are arbitrary constraints imposed by, for instance, a particular choice of parametric family.
An effort must be made to state all modelling choices \textbf{openly}.
Openly stating limitations is not a bug; it is a feature.
\end{idea}
\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{How to be Bayesian II: ``noninformativeness'' requires care}
Mathematically, Bayesian Statistics is all-encompassing (see Idea~\ref{id:universal}).
One must be careful\footnote{Personally, I'm not opposed to reference priors and the like, and I gladly employ them in my own research, but I do think one needs to know very well what one is doing in order to employ them properly.} when employing so-called ``objective'' Bayesian methods.
\begin{idea}[Beware of objective priors]
\label{id:careful}
In a functional sense, non-informative priors are a welcome addition to Bayesian Statistics because they provide~\textit{closure} and confer universality on it.
On the other hand, reference priors and the like cannot be justified as summaries of prior information.
From a technical standpoint, many noninformative priors are also improper, and thus impose the need to check the propriety of the resulting posterior distribution.
\end{idea}
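The propriety check amounts to verifying (generic notation) that the normalising constant is finite at the observed $x$:
\[
m(x) = \int_{\Theta} f(x \mid \theta)\,\pi(\theta)\,\mathrm{d}\theta < \infty;
\]
otherwise $\pi(\theta \mid x) \propto f(x \mid \theta)\,\pi(\theta)$ cannot be normalised into a probability distribution.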
\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{A word of caution}

A strong defence of the Bayesian paradigm should not cloud our view of the bigger picture.
Statistics is the grammar of Science; whatever grammatical tradition you choose, be sure to employ it properly.
\begin{idea}[Do not become a zealot!]
\label{id:not_zealot}
Statistics is about learning from data and making decisions under uncertainty.
The key to a good statistical analysis is not which ideology underpins it, but how helpful it is in answering the scientific questions at hand.
Ideally, you should know both\footnote{Here we are pretending for a second that there are only two schools of thought in Statistics.} schools well enough to be able to analyse any problem under each approach.
\end{idea}
\end{frame}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{So long, and thanks for all the fish!}
Remember, kids:
\begin{center}
{\Huge Bayes rules!}
\end{center}
\end{frame}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{frame}{Recommended reading}
\begin{itemize}
\item[\faBook] \cite{Jaynes1976},~\cite{Efron1986} and Ch.~11 of~\cite{Robert2007}.
% \item
\end{itemize}
\end{frame}
