Short Course: Bayesian Statistical Learning
Published: 2021-09-28 | Editor: School of Statistics and Mathematics

Course outline

The Bayesian approach to statistical inference has gained widespread use in Statistics and Machine Learning. This course aims to provide a solid introduction to Bayesian data analysis via a mixture of theoretical and methodological concepts, with an emphasis on computer implementation of modern simulation algorithms. The course is presented through four modules (see details below). The first module introduces the Bayesian paradigm and develops inferential tools for some simple models. The second module considers more advanced models, such as linear regression, spline regression and classification models. The third module focuses on computation and presents state-of-the-art algorithms for carrying out Bayesian inference. Finally, the fourth module presents model comparison techniques and advanced topics such as Bayesian variable selection and hierarchical models. In each module, students will be introduced to the underlying theory and methodology, which will then be practised through an assignment that students carry out individually; the computer lab session will then discuss possible solutions. Students will gain knowledge of modern state-of-the-art Bayesian methods that can be applied to solve complex problems in Statistics and Machine Learning. The course places strong emphasis on computer implementation of the presented material. The recommended programming language for the computer labs is R, because the presented material will use it; however, students may use any software they want (e.g. Python or Julia).
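To give a taste of the single-parameter models in the first module, the sketch below works through a Beta-Binomial conjugate update in Python (one of the languages the course permits). The prior hyperparameters and the data are invented for illustration; this is not course material.

```python
# Beta-Binomial model: prior Beta(a, b), likelihood Binomial(n, theta).
# Conjugacy means the posterior is again a Beta distribution, obtained
# by simple parameter arithmetic rather than numerical integration.

a, b = 2, 2            # prior hyperparameters (illustrative choice)
k, n = 7, 10           # observed successes out of n trials (invented data)

# Posterior: Beta(a + k, b + n - k)
a_post, b_post = a + k, b + n - k
post_mean = a_post / (a_post + b_post)   # posterior mean of theta

print(a_post, b_post, round(post_mean, 3))   # 9 5 0.643
```

The posterior mean (9/14, about 0.643) sits between the prior mean (0.5) and the sample proportion (0.7), illustrating how the prior shrinks the estimate toward prior beliefs.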

Schedule

The course consists of 32 hours in total: 12 lectures of 2 hours each (24 hours of lecture time) and 4 computer labs of 2 hours each (8 hours of lab time). The proposed weekly schedule is 3 lectures followed by 1 computer lab, so the course can be taught over 4 weeks, with one of the 4 modules covered each week. The lecturer will deliver the lectures, and the computer labs will be interactive sessions.

Lecturer

Dr Matias Quiroz is a Senior Lecturer at the School of Mathematical and Physical Sciences at the University of Technology Sydney. He is also an Associate Investigator in the ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS). Dr Quiroz's research interests lie in the area of Bayesian statistics. In particular, he is interested in Bayesian computation, such as Monte Carlo methods and variational Bayes. His research has appeared in top-tier journals such as the Journal of the American Statistical Association, Journal of Computational and Graphical Statistics, Journal of Machine Learning Research, and the Proceedings of the International Conference on Machine Learning (ICML).

Prerequisites

This course is suitable for students who have taken at least one course in statistics above the introductory level and have some programming experience. Students should be confident in their knowledge of concepts such as integrals, derivatives, probability distributions, conditional probability and expectations. They should also have some basic knowledge of linear algebra, including matrix operations such as multiplication and inverses, and concepts such as eigenvalues and eigenvectors. A good and compact text on the mathematical tools used in this subject can be found here: https://gwthomas.github.io/docs/math4ml.pdf. However, note that we will only encounter a subset of these tools, so students are not expected to know the whole content.
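As a quick self-check of the linear-algebra prerequisites, the snippet below computes a matrix inverse and eigenvalues with NumPy (a minimal sketch; the matrix is an arbitrary example and NumPy itself is not required for the course).

```python
import numpy as np

# A small symmetric matrix: invertible, with real eigenvalues
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

A_inv = np.linalg.inv(A)                 # matrix inverse, so A @ A_inv is the identity
eigvals, eigvecs = np.linalg.eigh(A)     # eigen-decomposition for symmetric matrices

print(np.allclose(A @ A_inv, np.eye(2)))   # True
print(eigvals)                             # ascending eigenvalues of A
```

If statements like "the columns of `eigvecs` are the eigenvectors of A" feel familiar, the prerequisites are likely in place.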

Course registration

Please scan this QR code to register. Registration is open until 24:00 on October 8. If you have any questions, please contact Feng Li.

This course is supported by the Central University of Finance and Economics overseas talent introduction (引智) programme.

Schedule and contents

Lecture | Time (Beijing time) | Topic

Module 1: Bayesian basics
1 | October 10 (Sunday), 10:00-12:00 | The Bayesian paradigm. Single-parameter models.
2 | October 10 (Sunday), 13:00-15:00 | Conjugate priors. Prior elicitation. Noninformative priors. Jeffreys' prior.
3 | October 10 (Sunday), 15:00-17:00 | Multi-parameter models. Bayesian computation via simulation. Analytic marginalisation.
Lab 1 | October 13 (Wednesday), 12:00-14:00

Module 2: Regression models
4 | October 13 (Wednesday), 12:00-14:00 | Bayesian prediction. Bayesian inference as a decision theory problem.
5 | October 17 (Sunday), 10:00-12:00 | Bayesian linear regression. Shrinkage priors. Bayesian spline regression.
6 | October 17 (Sunday), 12:00-14:00 | Asymptotics. Normal approximation. Bayesian classification. Generative models (naïve Bayes). Discriminative models (logistic regression).
Lab 2 | October 20 (Wednesday), 12:00-14:00

Module 3: Bayesian computation
7 | October 23 (Saturday), 10:00-12:00 | More on Bayesian computation. Monte Carlo integration. Importance sampling. Inverse CDF method. Rejection sampling.
8 | October 23 (Saturday), 13:00-15:00 | Markov processes. The Gibbs sampler. Data augmentation.
9 | October 24 (Sunday), 10:00-12:00 | The Metropolis and Metropolis-Hastings samplers. Efficiency of simulation. Assessing convergence.
Lab 3 | October 27 (Wednesday), 12:00-14:00

Module 4: Model inference and hierarchical models
10 | October 27 (Wednesday), 14:00-16:00 | Bayesian model comparison. Marginal likelihoods. Bayesian model averaging.
11 | October 30 (Saturday), 10:00-12:00 | Bayesian variable selection. Posterior predictive model evaluation.
12 | October 30 (Saturday), 13:00-15:00 | Hierarchical models. Pooling estimates. MCMC sampling with RStan.
Lab 4 | October 31 (Sunday), 12:30-14:30
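To give a flavour of the Module 3 material, the following is a minimal random-walk Metropolis sampler for a standard normal target, written in Python. This is an illustrative toy, not the course's own code; the step size, iteration count and seed are arbitrary choices.

```python
import math
import random

random.seed(1)

def log_target(x):
    # Log density of N(0, 1), up to an additive constant
    return -0.5 * x * x

def metropolis(n_iter=5000, step=1.0, x0=0.0):
    """Random-walk Metropolis: symmetric Gaussian proposals around the current state."""
    x, draws = x0, []
    for _ in range(n_iter):
        prop = x + random.gauss(0.0, step)            # symmetric proposal
        log_alpha = log_target(prop) - log_target(x)  # log acceptance ratio
        if math.log(random.random()) < log_alpha:     # accept with prob min(1, ratio)
            x = prop
        draws.append(x)
    return draws

draws = metropolis()
mean = sum(draws) / len(draws)
print(round(mean, 2))   # should be close to 0, the mean of the N(0, 1) target
```

Because the proposal is symmetric, the Hastings correction cancels and only the ratio of target densities appears in the acceptance probability; the asymmetric case is covered in Lecture 9.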


Copyright: School of Statistics and Mathematics, Central University of Finance and Economics
Address: Building 1, Shahe Campus, Central University of Finance and Economics, Shahe Higher Education Park, Changping District, Beijing. Postal code: 102206. Tel: (010) 61776184
Email: samofcufe@cufe.edu.cn
