
Commit 0a9d411

Updates from Overleaf
1 parent 83d2691 commit 0a9d411

8 files changed: +35 -7 lines changed

slides/bayes_stats.tex

Lines changed: 1 addition & 1 deletion
@@ -118,7 +118,7 @@
 \begin{document}
 \include{lecture_0}
 \include{lecture_1}
-% \include{lecture_2}
+\include{lecture_2}
 \include{lecture_3}
 \include{lecture_4}
 \include{lecture_5}

slides/compile.sh

File mode changed from 100755 to 100644.

slides/lecture_0.tex

Lines changed: 1 addition & 0 deletions
@@ -18,6 +18,7 @@ \section{Part I: Foundations}
 \begin{itemize}
 \item \cite{Robert2007};
 \item \cite{Hoff2009};
+\item \cite{Schervish1995};
 \item \cite{Bernardo2000}.
 \end{itemize}
 \end{itemize}

slides/lecture_11.tex

Lines changed: 1 addition & 1 deletion
@@ -44,7 +44,7 @@ \section*{Bayesian rules}
 There is no such thing as a truly objective analysis, and taking objectivity as a premise might hinder our ability to focus on actual discovery and explanation~\citep{Hennig2017}.
 \begin{idea}[The subjective basis of knowledge]
 \label{id:subjective}
-Knowledge arises from a confrontation between \texit{a prioris} and experiments (data).
+Knowledge arises from a confrontation between \textit{a prioris} and experiments (data).
 Let us hear what Poincaré\footnote{Jules Henri Poincaré (1854--1912) was a French mathematician; the quote is from \textit{La Science et l'Hypothèse} (1902).} had to say:
 \begin{quote}
 ``It is often stated that one should experiment without preconceived ideas.

slides/lecture_2.tex

Lines changed: 26 additions & 0 deletions
@@ -0,0 +1,26 @@
+\subsection{Decision Theory basics}
+\begin{frame}{The decision-theoretic foundations of the Bayesian paradigm}
+\begin{defn}[Loss function]
+\label{def:loss_fn}
+\end{defn}
+\end{frame}
+%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
+\begin{frame}{Utility functions}
+Properties:
+\end{frame}
+%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
+\begin{frame}{}
+\begin{theo}[]
+\end{theo}
+See Proposition 4.3 in \cite{Bernardo2000} for a proof outline.
+Here we shall prove the version from~\cite{DeFinetti1931}.
+\begin{idea}[]
+\end{idea}
+\end{frame}
+%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
+\begin{frame}{Recommended reading}
+\begin{itemize}
+\item[\faBook] \cite{Robert2007} Ch.~2 and $^\ast$\cite{Schervish2012} Ch.~3;
+\item[\faForward] Next lecture: \cite{Hoff2009} Ch.~2 and $^\ast$\cite{Schervish2012} Ch.~1;
+\end{itemize}
+\end{frame}
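The defn, theo, and idea environments in this new file are committed as empty stubs. A minimal body for the loss-function definition, following the standard decision-theoretic setup of \cite{Robert2007} Ch.~2 (a sketch of what could go there, not part of this commit; the symbol $\mathcal{D}$ for the decision space is my assumption):

\begin{defn}[Loss function]
\label{def:loss_fn}
A loss function is a map $L : \boldsymbol{\Theta} \times \mathcal{D} \to [0, \infty)$, where $\boldsymbol{\Theta}$ is the parameter space and $\mathcal{D}$ is the space of decisions (actions); $L(\theta, \delta)$ quantifies the penalty incurred by taking decision $\delta$ when the true state of nature is $\theta$.
\end{defn}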

slides/lecture_3.tex

Lines changed: 3 additions & 2 deletions
@@ -5,7 +5,7 @@ \subsection{Belief functions and exchangeability}
 \begin{itemize}
 \item [F] = \{votes for a left-wing candidate\};
 \item [G] = \{is in the 10\% lower income bracket\};
-\item [H] = \{lives in a large\};
+\item [H] = \{lives in a large city\};
 \end{itemize}

 \begin{defn}[Belief function]
@@ -19,7 +19,7 @@ \subsection{Belief functions and exchangeability}
 \begin{itemize}
 \item $\be(F) > \be(G)$ means we would bet on $F$ being true over $G$ being true;
 \item $\be(F\mid H) > \be(G \mid H)$ means that, \textbf{conditional} on knowing $H$ to be true, we would bet on $F$ over $G$;
-\item $\be(F\mid G) > \be(F \mid H)$ means that if we were forced to be on $F$, we would be prefer doing so if $G$ were true than $H$.
+\item $\be(F\mid G) > \be(F \mid H)$ means that if we were forced to bet on $F$, we would prefer doing so if $G$ were true rather than $H$.
 \end{itemize}
 \end{frame}
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
@@ -184,6 +184,7 @@ \subsection{Belief functions and exchangeability}
 As the exchangeability results above clearly demonstrate, being able to use conditional independence is a handy tool.
 More specifically, knowing on what to condition so as to make things exchangeable is key to statistical analysis.
 \begin{idea}[Conditioning is the soul of Statistics\footnote{This idea is due to Joe Blitzstein, who did his PhD under none other than the great Persi Diaconis.}]
+\label{idea:conditioning_soul}
 Knowing on what to condition can be the difference between an unsolvable problem and a trivial one.
 When confronted with a statistical problem, always ask yourself ``What do I know for sure?'' and then ``How can I create a conditional structure to include this information?''.
 \end{idea}

slides/lecture_5.tex

Lines changed: 1 addition & 1 deletion
@@ -117,7 +117,7 @@ \section*{Bayesian point estimation}
 for $\boldsymbol{G}$ a $p \times p$ non-negative symmetric matrix.
 In this case, we get
 $$ \delta_\pi = \frac{E_p[w(\theta)\theta]}{E_p[w(\theta)]}. $$
-Please \textbf{note} that there is no universal justification for quadratic loss other than (sometimes leading to increased) mathematical tractability
+Please \textbf{note} that there is no universal justification for quadratic loss other than (sometimes leading to increased) mathematical tractability.
 \end{frame}
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 \begin{frame}{Loss estimation}
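For reference, the estimator in this hunk follows from minimizing posterior expected loss; a short derivation, assuming the weighted quadratic loss $L(\theta, \delta) = w(\theta)(\theta - \delta)^2$ with $w(\theta) \geq 0$ implied by the surrounding slide:

\begin{align*}
\rho(\delta) &= E_p\!\left[w(\theta)(\theta - \delta)^2\right], \\
\frac{d\rho}{d\delta} &= -2\,E_p\!\left[w(\theta)(\theta - \delta)\right] = 0
\quad\Longrightarrow\quad
\delta_\pi = \frac{E_p[w(\theta)\,\theta]}{E_p[w(\theta)]}.
\end{align*}

The second-order condition $2\,E_p[w(\theta)] > 0$ confirms this stationary point is a minimum.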

slides/lecture_7.tex

Lines changed: 2 additions & 2 deletions
@@ -82,9 +82,9 @@ \section*{Bayesian model choice}
 A nice consequence of the formulation we just saw is that the predictive distribution looks quite intuitive:
 \begin{align}
 \nonumber
-p(\tilde{x} \mid \boldsymbol{x}) &= \sum_{j} w_j \int_{\boldsymbol{\Theta}_j} f_j(\tilde{x} \mid t_j) f_j(\boldsymbol{x}\mid t_j)\pi_j(t_j)\,dt_j,\\
+p(\tilde{x} \mid \boldsymbol{x}) &= \sum_{j} w_j \frac{1}{m_j(\boldsymbol{x})}\int_{\boldsymbol{\Theta}_j} f_j(\tilde{x} \mid t_j) f_j(\boldsymbol{x}\mid t_j)\pi_j(t_j)\,dt_j,\\
 \label{eq:predictive_1}
-&= \sum_{j} \pr(\mathcal{M}_j \mid \boldsymbol{x}) m_j(\tilde{x}).
+&= \sum_{j} \pr(\mathcal{M}_j \mid \boldsymbol{x}) p_j(\tilde{x} \mid \boldsymbol{x}).
 \end{align}
 \end{frame}
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
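The corrected equation can be checked in one line, assuming (as in the surrounding slides) that $m_j(\boldsymbol{x}) = \int_{\boldsymbol{\Theta}_j} f_j(\boldsymbol{x} \mid t_j)\pi_j(t_j)\,dt_j$ is the model-$j$ marginal likelihood and $w_j = \pr(\mathcal{M}_j \mid \boldsymbol{x})$: dividing by $m_j(\boldsymbol{x})$ turns the prior $\pi_j(t_j)$ into the posterior $\pi_j(t_j \mid \boldsymbol{x})$, so each integral is exactly the model-specific posterior predictive,

\begin{align*}
\frac{1}{m_j(\boldsymbol{x})}\int_{\boldsymbol{\Theta}_j} f_j(\tilde{x} \mid t_j)\, f_j(\boldsymbol{x} \mid t_j)\,\pi_j(t_j)\,dt_j
= \int_{\boldsymbol{\Theta}_j} f_j(\tilde{x} \mid t_j)\,\pi_j(t_j \mid \boldsymbol{x})\,dt_j
= p_j(\tilde{x} \mid \boldsymbol{x}),
\end{align*}

which is why the prior predictive $m_j(\tilde{x})$ on the old right-hand side gives way to $p_j(\tilde{x} \mid \boldsymbol{x})$.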
