<!DOCTYPE html>
<html lang="en">
<head>
<!-- Basic Page Needs
–––––––––––––––––––––––––––––––––––––––––––––––––– -->
<meta charset="utf-8">
<title>Shravan Vasishth's Intro Bayes course home page</title>
<meta name="description" content="">
<meta name="author" content="Shravan Vasishth">
<!-- Mobile Specific Metas
–––––––––––––––––––––––––––––––––––––––––––––––––– -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<!-- FONT
–––––––––––––––––––––––––––––––––––––––––––––––––– -->
<link href="//fonts.googleapis.com/css?family=Raleway:400,300,600" rel="stylesheet" type="text/css">
<!-- CSS
–––––––––––––––––––––––––––––––––––––––––––––––––– -->
<link rel="stylesheet" href="css/normalize.css">
<link rel="stylesheet" href="css/skeleton.css">
<!-- Favicon
–––––––––––––––––––––––––––––––––––––––––––––––––– -->
<link rel="icon" type="image/png" href="images/lorikeet.jpg">
</head>
<body>
<section>
<div class="container">
<div class="column">
<p>
<h3>Introduction to Bayesian data analysis (<a href="https://vasishth.github.io/smlp2019/">SMLP 2019</a>)</h3>
<br>
<h4>Instructor</h4>
<a href="http://www.ling.uni-potsdam.de/~vasishth/">Shravan Vasishth</a>
<br><br>
<h4>Dates and location</h4>
Taught at <a href="https://vasishth.github.io/smlp/">SMLP</a>.<br>
Every year in September. Haus 6, Griebnitzsee campus, University of Potsdam.
<br><br>
<h4>Overview</h4>
In recent years, Bayesian methods have come to be widely adopted in all areas of science. This is in large part due to the development of sophisticated software for probabilistic programming; a recent example is the astonishing computing capability afforded by the language Stan (<a href="https://mc-stan.org">mc-stan.org</a>). However, the underlying theory needed to use this software sensibly is often inaccessible, because end-users don't necessarily have the statistical and mathematical background to read the primary textbooks (such as Gelman et al.'s classic <em>Bayesian Data Analysis</em>, 3rd edition). This course aims to fill that gap by providing a relatively accessible and technically non-demanding introduction to the basic workflow for fitting different kinds of linear models using Stan. To illustrate what Bayesian modeling can do, we will use the R package RStan and a powerful front-end R package for Stan called brms.
<br><br>
<h4>Prerequisites</h4>
We assume familiarity with R. Participants will benefit most if they have previously fit linear models and linear mixed models (using lme4) in R, in any scientific domain within linguistics and psychology. No knowledge of calculus or linear algebra is assumed (although it will be helpful), but basic school-level mathematics is assumed (and will be quickly revisited in class).
<br><br>
<h4>Please install the following software before coming to the course</h4>
We will be using the software <a href="http://cran.r-project.org/">R</a>,
and <a href="https://www.rstudio.com/">RStudio</a>,
so make sure you install these on your computer.
You should also install the R packages <a href="https://github.com/stan-dev/rstan/wiki/RStan-Getting-Started">rstan</a> and <a href="https://github.com/paul-buerkner/brms">brms</a>.
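<br>
A quick way to install both packages from CRAN (a minimal sketch; rstan additionally needs a working C++ toolchain, as described on the RStan Getting Started page linked above):
<pre><code># Install the course packages from CRAN
install.packages(c("rstan", "brms"))

# Optional check from the RStan Getting Started guide:
# compile and run a bundled example model
library(rstan)
example(stan_model, package = "rstan", run.dontrun = TRUE)
</code></pre>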
<br><br>
<h4>Outcomes</h4>
After completing this course, the participant will have become familiar with the foundations of Bayesian inference using Stan (RStan and brms), and will be able to fit a range of multiple regression models and hierarchical models, for normally distributed data as well as for lognormal and binomially distributed data. They will know how to calibrate their models using prior and posterior predictive checks, and they will be able to establish true and false discovery rates to validate discovery claims. If there is time, we will discuss how to carry out model comparison using Bayes factors and k-fold cross-validation.
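<br>
To give a flavor of this workflow, a hierarchical model of the kind covered in the course can be fit with brms in a few lines. This is only a sketch: the data frame <code>dat</code> and the column names (<code>rt</code>, <code>condition</code>, <code>subject</code>, <code>item</code>) are hypothetical.
<pre><code>library(brms)

# Hypothetical data: reading times (rt) by condition,
# with by-subject and by-item varying intercepts and slopes
fit &lt;- brm(rt ~ condition + (1 + condition | subject) + (1 + condition | item),
           data = dat,                 # hypothetical data frame
           family = lognormal(),       # lognormal likelihood for reading times
           prior = c(prior(normal(6, 1.5), class = Intercept),
                     prior(normal(0, 1), class = b)))
summary(fit)
</code></pre>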
<br><br>
<h4>Course materials</h4>
<a href="https://github.com/vasishth/IntroductionBayes/archive/master.zip">Click here to download everything</a>. If you use github, you can clone this repository: <a href="https://github.com/vasishth/IntroductionBayes">https://github.com/vasishth/IntroductionBayes</a><br>
Solutions to exercises are not publicly available; they will only be provided to participants.<br>
<strong>Draft textbook:</strong>
See
<a href="https://vasishth.github.io/Bayes_CogSci/">here</a>. PDF version available on request.
<br>
<strong>Slides and exercises:</strong><br>
<strong>Part 1</strong>
<ol>
<li><a href="slides/00_FrequentistReview.pdf">00 Frequentist Foundations (optional review)</a></li>
<li><a href="slides/01_foundations.pdf">01 Foundations</a></li>
<li><a href="slides/02_bayes.pdf">02 Introduction to Bayesian methods</a></li>
<li><a href="slides/02_Sampling.pdf">02 Sampling</a></li>
</ol>
<strong>Part 2</strong>
<ol>
<li><a href="slides/03_linearmodeling.pdf">03 Linear Modeling</a></li>
<li><a href="slides/04_hlm.pdf">04 Hierarchical Linear Models</a></li>
<li><a href="slides/05_modelcomparison.pdf">05 Model Comparison using Bayes Factors</a></li>
</ol>
<strong>Case studies:</strong>
<a href="case_studies.zip">Three case studies (zip archive): meta-analysis, measurement error models, and example of pre-registration</a>.
<br><br>
<h4>Tentative schedule</h4>
Depending on the class, I may go faster or slower, so I may not adhere to this exact schedule.
<ol>
<li><strong>Monday: Foundations of Bayesian inference</strong><br>
Probability theory and Bayes' rule, Probability distributions, Understanding and eliciting priors, Analytical Bayes: Beta-Binomial, Poisson-Gamma, Normal-Normal
</li>
<li><strong>Tuesday: Linear models</strong><br>
Basic theory of linear modeling; generating prior predictive distributions using RStan and R; fake-data simulation for model evaluation. Sampling methods will be skipped in class, but please read the lecture notes later, which cover inverse sampling, Gibbs sampling, random-walk Metropolis, and Hamiltonian Monte Carlo.
</li>
<li><strong>Wednesday: Hierarchical linear models</strong><br>
HLMs using RStan and brms, fake-data generation, true and false discovery rates, logistic mixed-effects models, individual differences, shrinkage.
</li>
<li><strong>Thursday: HLMs continued, exercises</strong><br>
Here we will get hands-on experience with real-life problems.
</li>
<li><strong>Friday: keynote lectures</strong> <br>
Please see the <a href="https://github.com/vasishth/smlp2019/blob/master/SMLP2019_schedule_general.pdf">SMLP schedule</a>.</li>
</ol>
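The sampling methods mentioned under Tuesday can be illustrated with a minimal random-walk Metropolis sampler. This is a sketch for intuition only, targeting a standard normal density rather than a real posterior:
<pre><code># Minimal random-walk Metropolis targeting a standard normal density
set.seed(1)
n_iter  &lt;- 5000
chain   &lt;- numeric(n_iter)
current &lt;- 0
for (i in seq_len(n_iter)) {
  proposal  &lt;- rnorm(1, mean = current, sd = 1)  # symmetric proposal
  log_ratio &lt;- dnorm(proposal, log = TRUE) - dnorm(current, log = TRUE)
  if (log(runif(1)) &lt; log_ratio) current &lt;- proposal  # accept/reject
  chain[i] &lt;- current
}
mean(chain); sd(chain)  # should be close to 0 and 1
</code></pre>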
<br><br>
<h4>Additional readings</h4>
<strong>R programming</strong>
<ol>
<li><a href="http://ilustat.com/shared/Getting-Started-in-R.pdf">Getting started with R</a></li>
<li><a href="https://r4ds.had.co.nz/">R for data science</a></li>
<li><a href="https://csgillespie.github.io/efficientR/">Efficient R programming</a>.</li>
</ol>
<strong>Books</strong>
<ol>
<li>
<a href="https://www.amazon.co.uk/Students-Guide-Bayesian-Statistics/dp/1473916364">A Student's Guide to Bayesian Statistics, by Ben Lambert</a>: A good, non-technical introduction to Stan and Bayesian modeling.</li>
<li><a href="https://xcelab.net/rm/statistical-rethinking/">Statistical Rethinking, by Richard McElreath</a>: A classic introduction.</li>
<li><a href="http://www.indiana.edu/~kruschke/DoingBayesianDataAnalysis/">Doing Bayesian Data Analysis, Second Edition: A Tutorial with R, JAGS, and Stan, by John Kruschke</a>: A good introduction specifically for psychologists.</li>
</ol>
<strong>Tutorial articles</strong>
<ol>
<li>
<a href="https://econpapers.repec.org/article/jssjstsof/v_3a080_3ai01.htm">brms tutorial by the author of the package, Paul Buerkner.</a></li>
<li><a href="https://psyarxiv.com/x8swp/">Ordinal regression models in psychological research: A tutorial, by Buerkner and Vuorre.</a></li>
<li>
<a href="https://arxiv.org/abs/1807.10451">Contrast coding tutorial, by Schad, Hohenstein, Vasishth, Kliegl.</a>
</li>
<li>
<a href="https://osf.io/b2vx9/">Bayesian workflow tutorial, by Schad, Betancourt, Vasishth</a>.
</li>
<li>
<a href="http://www.tqmp.org/RegularArticles/vol12-3/p175/p175.pdf">Linear mixed models tutorial, by Sorensen, Hohenstein, Vasishth.</a>
</li>
<li>
<a href="https://osf.io/g4zpv/">brms tutorial for phonetics/phonology, Vasishth, Nicenboim, Beckman, Li, Kong.</a>
</li>
<li><a href="https://betanalpha.github.io/writing/">Michael Betancourt's resources</a>: These are a must if you want to get deeper into Stan and Bayesian modeling.</li>
<li><a href="https://chi-feng.github.io/mcmc-demo/app.html">MCMC animations/visualizations</a>, <a href="http://elevanth.org/blog/2017/11/28/build-a-better-markov-chain/">McElreath's blog post on MCMC</a></li>
</ol>
<strong>Some example articles from our lab and other groups that use Bayesian methods</strong>
<ol>
<li>
<a href="https://osf.io/g5ndw/">Example random-effects meta-analysis.</a>
</li>
<li>
<a href="https://mc-stan.org/events/stancon2017-notebooks/stancon2017-nicenboim-vasishth-retrieval-models.html">Example of finite mixture models using Stan.</a>
</li>
<li><a href="https://osf.io/eyphj/">Replication attempt of a published study.</a></li>
<li><a href="https://osf.io/mmr7s/">Bayesian analysis of a relatively large-sample psycholinguistic experiment.</a></li>
<li><a href="https://avehtari.github.io/RAOS-Examples/">Examples of regression analyses by Vehtari and colleagues</a></li>
</ol>
</p>
</div>
</div>
</section>
</body>
</html>