Course detail

Bayesian Models for Machine Learning (in English)

FIT-BAYa, Acad. year: 2022/2023

Probability theory and probability distributions, Bayesian Inference, Inference in Bayesian models with conjugate priors, Inference in Bayesian Networks, Expectation-Maximization algorithm, Approximate inference in Bayesian models using Gibbs sampling, Variational Bayes inference, Stochastic VB, Infinite mixture models, Dirichlet Process, Chinese Restaurant Process, Pitman-Yor Process for Language modeling, Expectation propagation, Gaussian Process, Auto-Encoding Variational Bayes, Practical applications of Bayesian inference

Language of instruction

English

Number of ECTS credits

5

Mode of study

Not applicable.

Offered to foreign students

Of all faculties

Learning outcomes of the course unit

Not applicable.

Prerequisites

Not applicable.

Co-requisites

Not applicable.

Planned learning activities and teaching methods

Not applicable.

Assessment methods and criteria linked to learning outcomes

  • Mid-term exam (24 points)

  • Submission and presentation of a project (25 points)

  • Final exam (51 points)

To receive any points from the final exam, a minimum of 20 points is required; otherwise the exam is graded 0 points.

Course curriculum

Not applicable.

Work placements

Not applicable.

Aims

To demonstrate the limitations of Deep Neural Networks (DNNs), which have become a very popular and successful machine learning tool in many areas but excel only when a sufficient amount of well-annotated training data is available. To present Bayesian models (BMs), which allow robust decisions to be made even when training data are scarce, as they take into account the uncertainty in the model parameter estimates. To introduce the concept of latent variables, which makes BMs modular (i.e. more complex models can be built out of simpler ones) and well suited to cases with missing data (e.g. unsupervised learning, where annotations are missing). To develop basic skills and intuitions about BMs and to cover more advanced topics such as approximate inference methods necessary for more complex models, infinite mixture models based on non-parametric BMs, and Auto-Encoding Variational Bayes. The course is taught in English.
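
For illustration, here is a minimal sketch of the kind of Bayesian reasoning described above, using a conjugate Beta-Bernoulli model. This is not course material; the prior and the data below are assumptions chosen purely for the example.

    # Illustrative sketch only (not course material): a conjugate Beta-Bernoulli
    # model, showing how the posterior quantifies parameter uncertainty when
    # training data are scarce. Prior and data are assumed for the example.
    from scipy.stats import beta

    prior_a, prior_b = 1.0, 1.0   # uniform Beta(1, 1) prior (an illustrative choice)
    heads, tails = 3, 1           # a scarce dataset: only 4 coin flips

    post_a, post_b = prior_a + heads, prior_b + tails   # conjugate posterior update
    posterior = beta(post_a, post_b)

    print("posterior mean:", posterior.mean())                 # ~0.67 vs. ML estimate 0.75
    print("95% credible interval:", posterior.interval(0.95))  # wide interval = high uncertainty

With more data the credible interval narrows; this uncertainty information is exactly what a point estimate discards.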

Specification of controlled education, way of implementation and compensation for absences

Not applicable.

Recommended optional programme components

Not applicable.

Prerequisites and corequisites

Not applicable.

Basic literature

Not applicable.

Recommended reading

http://www.fit.vutbr.cz/study/courses/BAYa/public/ (EN)
C. Bishop: Pattern Recognition and Machine Learning, Springer, 2006 (EN)
P. Orbanz: Tutorials on Bayesian Nonparametrics, http://stat.columbia.edu/~porbanz/npb-tutorial.html (EN)
S. J. Gershman and D. M. Blei: A tutorial on Bayesian nonparametric models, Journal of Mathematical Psychology, 2012. (EN)

Classification of course in study plans

  • Programme IT-MGR-1H Master's

    branch MGH, 1 year of study, winter semester, recommended course

  • Programme IT-MSC-2 Master's

    branch MGMe, 0 year of study, winter semester, compulsory-optional

  • Programme MIT-EN Master's 0 year of study, winter semester, compulsory-optional

  • Programme MITAI Master's

    specialization NADE, 0 year of study, winter semester, elective
    specialization NBIO, 0 year of study, winter semester, elective
    specialization NCPS, 0 year of study, winter semester, elective
    specialization NEMB, 0 year of study, winter semester, elective
    specialization NGRI, 0 year of study, winter semester, elective
    specialization NHPC, 0 year of study, winter semester, elective
    specialization NIDE, 0 year of study, winter semester, elective
    specialization NISD, 0 year of study, winter semester, elective
    specialization NISY up to 2020/21, 0 year of study, winter semester, elective
    specialization NMAL, 0 year of study, winter semester, compulsory
    specialization NMAT, 0 year of study, winter semester, elective
    specialization NNET, 0 year of study, winter semester, elective
    specialization NSEC, 0 year of study, winter semester, elective
    specialization NSEN, 0 year of study, winter semester, elective
    specialization NSPE, 0 year of study, winter semester, elective
    specialization NVER, 0 year of study, winter semester, elective
    specialization NVIZ, 0 year of study, winter semester, elective
    specialization NISY, 0 year of study, winter semester, elective
    specialization NEMB up to 2021/22, 0 year of study, winter semester, elective

Type of course unit


Lecture

26 hrs, optional

Teacher / Lecturer

Syllabus

  1. Probability theory and probability distributions 
  2. Bayesian Inference (priors, uncertainty of the parameter estimates, posterior predictive probability) 
  3. Inference in Bayesian models with conjugate priors 
  4. Inference in Bayesian Networks (loopy belief propagation) 
  5. Expectation-Maximization algorithm (with application to Gaussian Mixture Model; see the illustrative sketch after this list) 
  6. Approximate inference in Bayesian models using Gibbs sampling 
  7. Variational Bayes inference, Stochastic VB 
  8. Infinite mixture models, Dirichlet Process, Chinese Restaurant Process 
  9. Pitman-Yor Process for Language modeling 
  10. Expectation propagation 
  11. Gaussian Process 
  12. Auto-Encoding Variational Bayes 
  13. Practical applications of Bayesian inference
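
As a taste of the material, here is a minimal sketch of the EM algorithm from lecture 5 applied to a one-dimensional two-component Gaussian mixture. This is illustrative toy code on assumed synthetic data, not the course's demonstration code.

    # Minimal EM for a 1-D two-component Gaussian mixture (illustrative toy code,
    # not the course's demonstrations). E-step: posterior responsibilities;
    # M-step: closed-form re-estimation of weights, means, and variances.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 0.5, 100)])

    w = np.array([0.5, 0.5])      # mixture weights
    mu = np.array([-1.0, 1.0])    # component means
    var = np.array([1.0, 1.0])    # component variances

    for _ in range(50):
        # E-step: responsibility of each component for each data point
        dens = np.stack([wk * norm.pdf(x, m, np.sqrt(v))
                         for wk, m, v in zip(w, mu, var)])
        resp = dens / dens.sum(axis=0)
        # M-step: update parameters from responsibility-weighted statistics
        nk = resp.sum(axis=1)
        w = nk / len(x)
        mu = (resp * x).sum(axis=1) / nk
        var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk

    print("weights:", w, "means:", mu, "variances:", var)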

Fundamentals seminar

13 hrs, optional

Teacher / Lecturer

Syllabus

Each lecture will be immediately followed by a demonstration exercise in which examples in Python are presented. The code and data of all demonstrations will be made available to the students and will form the basis of the project.

Project

13 hrs, optional

Teacher / Lecturer

Syllabus

The project will build on the demonstration exercises and will have students work on provided (simulated or real) data. Students will work in teams in "evaluation" mode and present their results at the final lecture/exercise.