42048 - Probabilistic Graphical Models (MGP) [UB]


Type: S3 Course
Semester: Spring
ECTS: 6
Teaching Points:
Offer: Annual
Responsible Unit: UB
Responsible: Karina Gibert (UPC)
Language: English
Requirements:

GOALS

Many problems in artificial intelligence, statistics, computer systems, computer vision, natural language processing, and computational biology, among many other fields, can be viewed as the search for a coherent global conclusion from local information. The general graphical-models framework provides a unified view of this wide range of problems, enabling efficient inference, decision-making, and learning in problems with a very large number of attributes and huge datasets. This course will provide you with a strong foundation both for applying graphical models to complex problems and for addressing core research topics in graphical models.

The class will cover three aspects: the core representations, including Bayesian networks, Markov networks, and relational models; probabilistic inference algorithms, both exact and approximate; and learning methods for both the parameters and the structure of graphical models. Students entering the class should have a pre-existing working knowledge of probability, statistics, and algorithms.
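
As a small taste of the first aspect (this sketch is illustrative only and not part of the course material), a directed graphical model is just a factorization of a joint distribution into local conditional tables. The snippet below encodes the classic Sprinkler network, P(C, S, R, W) = P(C) P(S|C) P(R|C) P(W|S,R), with hand-picked CPT numbers, and checks that the factorization defines a valid distribution:

```python
# Illustrative sketch: the Sprinkler Bayesian network as plain dictionaries.
# All CPT values are made up for the example.
from itertools import product

p_c = {True: 0.5, False: 0.5}                       # P(Cloudy)
p_s = {True: {True: 0.1, False: 0.9},               # P(Sprinkler | Cloudy)
       False: {True: 0.5, False: 0.5}}
p_r = {True: {True: 0.8, False: 0.2},               # P(Rain | Cloudy)
       False: {True: 0.2, False: 0.8}}
p_w = {(True, True): {True: 0.99, False: 0.01},     # P(WetGrass | Sprinkler, Rain)
       (True, False): {True: 0.9, False: 0.1},
       (False, True): {True: 0.9, False: 0.1},
       (False, False): {True: 0.0, False: 1.0}}

def joint(c, s, r, w):
    """Joint probability via the chain-rule factorization of the DAG."""
    return p_c[c] * p_s[c][s] * p_r[c][r] * p_w[(s, r)][w]

# The local conditionals each normalize, so the product sums to 1.
total = sum(joint(c, s, r, w) for c, s, r, w in product([True, False], repeat=4))
print(round(total, 10))  # 1.0
```

The point of the course is precisely that one never enumerates this joint table explicitly; inference and learning exploit the factorization instead.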


CONTENTS
See detailed current content here.

1. Introduction

2. Directed graphical models (Bayesian networks)

2.1. Representation

2.2. Semantics

2.3. MLE parameter learning

2.4. Structure learning for BNs - complete data

2.4.1. Constraint-based

2.4.2. Chow-Liu

2.4.3. Fixed-order

2.4.4. Structure search
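
As a concrete illustration of the structure-learning ideas above (not part of the official syllabus), the Chow-Liu procedure of 2.4.2 can be sketched in a few lines: estimate pairwise empirical mutual information from complete data, then keep a maximum-weight spanning tree over those edge weights. The tiny dataset and all variable names below are made up for the example:

```python
# Hedged sketch of Chow-Liu tree learning on a toy binary dataset.
import math
from collections import Counter
from itertools import combinations

data = [  # rows: samples; columns: variables X0, X1, X2 (illustrative)
    (0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1),
    (0, 0, 0), (1, 1, 1), (0, 1, 0), (1, 0, 1),
]
n, n_vars = len(data), len(data[0])

def mutual_info(i, j):
    """Empirical mutual information I(X_i; X_j) in nats."""
    pij = Counter((row[i], row[j]) for row in data)
    pi = Counter(row[i] for row in data)
    pj = Counter(row[j] for row in data)
    mi = 0.0
    for (a, b), c in pij.items():
        mi += (c / n) * math.log(c * n / (pi[a] * pj[b]))
    return mi

# Kruskal-style maximum spanning tree over the MI edge weights.
edges = sorted(((mutual_info(i, j), i, j)
                for i, j in combinations(range(n_vars), 2)), reverse=True)
parent = list(range(n_vars))

def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

tree = []
for w, i, j in edges:
    ri, rj = find(i), find(j)
    if ri != rj:
        parent[ri] = rj
        tree.append((i, j))
print(tree)  # the n_vars - 1 edges of the Chow-Liu tree
```

In this dataset X0 and X1 are strongly correlated while X2 is independent of X1, so the learned tree attaches X2 through X0 rather than through the zero-information edge (X1, X2).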

2.5. Exact inference

2.5.1. Variable elimination

2.5.2. Junction trees

2.5.3. Context-specific independence
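
To give a flavor of 2.5.1 (again, an illustrative sketch rather than course material), variable elimination can be written with factors as dictionaries: multiply the factors mentioning a variable, then sum that variable out. The example below computes the marginal P(C) on a chain A → B → C with made-up CPT numbers:

```python
# Hedged sketch of variable elimination on a binary chain A -> B -> C.
from itertools import product

def make_factor(scope, table):
    return {"scope": scope, "table": table}

fA = make_factor(("A",), {(0,): 0.6, (1,): 0.4})                       # P(A)
fBA = make_factor(("A", "B"),                                          # P(B | A)
                  {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8})
fCB = make_factor(("B", "C"),                                          # P(C | B)
                  {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6})

def multiply(f, g):
    """Factor product over the union of the two scopes."""
    scope = tuple(dict.fromkeys(f["scope"] + g["scope"]))
    table = {}
    for assign in product((0, 1), repeat=len(scope)):
        env = dict(zip(scope, assign))
        table[assign] = (f["table"][tuple(env[v] for v in f["scope"])] *
                         g["table"][tuple(env[v] for v in g["scope"])])
    return make_factor(scope, table)

def sum_out(f, var):
    """Marginalize var out of factor f."""
    idx = f["scope"].index(var)
    scope = f["scope"][:idx] + f["scope"][idx + 1:]
    table = {}
    for assign, v in f["table"].items():
        key = assign[:idx] + assign[idx + 1:]
        table[key] = table.get(key, 0.0) + v
    return make_factor(scope, table)

# Eliminate A, then B, leaving the marginal P(C).
fB = sum_out(multiply(fA, fBA), "A")
fC = sum_out(multiply(fB, fCB), "B")
print(fC["table"])
```

The elimination ordering matters only for efficiency here, not correctness; on the chain it keeps every intermediate factor over a single variable.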

3. Undirected graphical models (Markov Random Fields, Factor graphs)

3.1. Undirected models - Representation

3.2. Factor graphs - unifying representation

3.3. Exponential family
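
The unifying representation of 3.2 can be sketched in a few lines (illustrative only, with made-up factor values): an undirected model is a product of nonnegative factors, P(x) ∝ f1(x0, x1) f2(x1, x2), normalized by the partition function Z. For a toy model, Z can be computed by brute-force enumeration:

```python
# Hedged sketch of a factor graph over three binary variables.
from itertools import product

f1 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}  # factor on (x0, x1)
f2 = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}  # factor on (x1, x2)

def unnormalized(x0, x1, x2):
    """Product of factors: an unnormalized measure, not yet a probability."""
    return f1[(x0, x1)] * f2[(x1, x2)]

# Partition function Z by enumeration (feasible only for tiny models).
Z = sum(unnormalized(*x) for x in product((0, 1), repeat=3))
prob = {x: unnormalized(*x) / Z for x in product((0, 1), repeat=3)}
print(Z, round(sum(prob.values()), 10))
```

Computing Z is exactly what makes undirected models harder than directed ones: in a Bayesian network the local tables normalize by construction, while here normalization is a global sum, which motivates the approximate-inference methods of the next section.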

4. Approximate inference

4.1. Sampling

4.1.1. Importance sampling

4.1.2. MCMC, Gibbs

4.2. Variational inference

4.3. Loopy belief propagation

4.3.1. Generalized belief propagation

4.3.2. Kikuchi
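
As an illustration of 4.1.2 (a sketch under toy assumptions, not course material), a Gibbs sampler for a tiny pairwise MRF with coupling φ(x, y) = exp(θ·[x = y]) repeatedly resamples each variable from its conditional given the other, and uses the visited states to estimate a query such as P(X = Y):

```python
# Hedged Gibbs-sampling sketch on a two-variable binary MRF.
import math
import random

theta = 1.0       # coupling strength (assumed, for illustration)
random.seed(0)    # fixed seed for reproducibility

def conditional_prob_one(other):
    """P(variable = 1 | other) under phi = exp(theta * [values equal])."""
    w1 = math.exp(theta * (1 == other))
    w0 = math.exp(theta * (0 == other))
    return w1 / (w0 + w1)

x, y = 0, 0
agree = 0
n_iter, burn_in = 20000, 1000
for t in range(n_iter):
    # Resample each variable from its full conditional in turn.
    x = 1 if random.random() < conditional_prob_one(y) else 0
    y = 1 if random.random() < conditional_prob_one(x) else 0
    if t >= burn_in:
        agree += (x == y)

estimate = agree / (n_iter - burn_in)
# Exact answer by enumeration: equal states have weight e^theta, others 1.
exact = math.exp(theta) / (math.exp(theta) + 1)
print(round(estimate, 3), round(exact, 3))
```

On a two-variable model exact enumeration is of course trivial; Gibbs sampling earns its keep when the model has thousands of variables and only the one-variable conditionals stay tractable.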

5. Learning revisited

5.1. Parameter estimation in BNs with missing data

5.1.1. EM

5.1.2. Gradient descent

5.2. Structure learning for BNs - missing data

5.3. Learning undirected graphical models

5.3.1. Gradient algorithms

5.3.2. IPF for tabular MRFs

5.3.3. Structure Learning
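
The EM idea of 5.1.1 can be sketched on the smallest possible hidden-variable network, Z → X with Z never observed, i.e. a two-component Bernoulli mixture (an illustrative example with made-up data, not course material). The E-step computes responsibilities P(Z | x); the M-step re-estimates the CPT parameters from expected counts:

```python
# Hedged EM sketch for the network Z -> X with Z fully hidden.
data = [1, 1, 1, 0, 1, 0, 0, 1, 1, 1]   # observations of X (illustrative)

pi = 0.4          # P(Z = 1), arbitrary starting point
p = [0.3, 0.8]    # P(X = 1 | Z = z) for z = 0, 1

for _ in range(50):
    # E-step: responsibilities r_i = P(Z = 1 | x_i) under current parameters.
    resp = []
    for x in data:
        l1 = pi * (p[1] if x else 1 - p[1])
        l0 = (1 - pi) * (p[0] if x else 1 - p[0])
        resp.append(l1 / (l0 + l1))
    # M-step: re-estimate parameters from expected sufficient statistics.
    pi = sum(resp) / len(data)
    p[1] = sum(r for r, x in zip(resp, data) if x) / sum(resp)
    p[0] = sum(1 - r for r, x in zip(resp, data) if x) / sum(1 - r for r in resp)

# The fitted marginal P(X = 1) matches the empirical frequency 7/10.
marginal = pi * p[1] + (1 - pi) * p[0]
print(round(marginal, 3))  # 0.7
```

This toy model is deliberately unidentifiable (many parameter settings give the same marginal), which previews a recurring theme of learning with missing data: EM climbs the likelihood, but the likelihood surface may have flat regions and multiple optima.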

6. Advanced topics


BIBLIOGRAPHY

  • Daphne Koller and Nir Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
  • M. I. Jordan. An Introduction to Probabilistic Graphical Models. In preparation.
  • S. Lauritzen. Graphical Models. Oxford University Press, 1996.
  • David MacKay. Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2003.