Course Detail

Course Name: Probabilistic Graphical Models
Course Code: 24AI633
Program: M. Tech. in Artificial Intelligence
Semester: Soft Core
Credits: 4
Campus: Amritapuri, Coimbatore

Syllabus

Introduction: Probability distributions, random variables, joint distributions, random processes, graphs, undirected and directed graphical models. Representation: Bayesian networks – independence in graphs – d-separation, I-equivalence, minimal I-maps. Undirected graphical models: Gibbs distributions and Markov networks, Markov models and hidden Markov models. Conversion from Bayesian to Markov networks and from Markov to Bayesian networks, triangulation and chordal graphs. Directed Gaussian graphical models. Exponential family models. Factor graph representation. Conditional random fields. Other special cases: chains, trees.
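
As a quick illustration of the representation theme above, the following sketch (not part of the syllabus; the network, variable names, and probabilities are made-up assumptions) shows how a small Bayesian network encodes a joint distribution as a product of conditional probability tables:

```python
# A minimal sketch (illustrative only; the network and all numbers are assumed):
# a three-node Bayesian network Rain -> Sprinkler, Rain -> GrassWet,
# Sprinkler -> GrassWet. The graph encodes the factorization
# P(R, S, G) = P(R) * P(S | R) * P(G | R, S).

P_R = {True: 0.2, False: 0.8}                      # P(Rain)
P_S_given_R = {True: {True: 0.01, False: 0.99},    # P(Sprinkler | Rain)
               False: {True: 0.40, False: 0.60}}
P_G_given_RS = {(True, True): 0.99, (True, False): 0.80,   # P(GrassWet=True | Rain, Sprinkler)
                (False, True): 0.90, (False, False): 0.00}

def joint(r, s, g):
    """P(Rain=r, Sprinkler=s, GrassWet=g) from the chain-rule factorization."""
    p_g = P_G_given_RS[(r, s)]
    return P_R[r] * P_S_given_R[r][s] * (p_g if g else 1.0 - p_g)

# Any marginal follows by summing the factorized joint, e.g. P(GrassWet=True):
p_wet = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(f"P(GrassWet=True) = {p_wet:.4f}")
```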

Inference: Variable elimination (sum-product and max-product). Junction tree algorithm. Forward-backward algorithm (for HMMs). Loopy belief propagation. Markov chain Monte Carlo – Metropolis-Hastings. Importance sampling. Gibbs sampling. Variational inference.
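
For a concrete flavour of one of these algorithms, here is a minimal sketch of the forward pass of the forward-backward algorithm for a discrete HMM; the transition and emission probabilities are toy numbers assumed for illustration:

```python
# A minimal sketch (toy numbers assumed): the forward pass of the
# forward-backward algorithm for a discrete HMM. It computes the likelihood
# P(o_1, ..., o_T) by summing over all hidden state paths in O(T * S^2) time.
import numpy as np

pi = np.array([0.6, 0.4])            # initial state distribution P(s_1)
A = np.array([[0.7, 0.3],            # A[i, j] = P(s_{t+1} = j | s_t = i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],            # B[i, k] = P(o_t = k | s_t = i)
              [0.2, 0.8]])
obs = [0, 1, 0]                      # observed symbol indices

alpha = pi * B[:, obs[0]]            # alpha_1(i) = pi_i * B[i, o_1]
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]    # alpha_{t+1}(j) = (sum_i alpha_t(i) * A[i, j]) * B[j, o]
print("P(observation sequence) =", alpha.sum())
```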

Learning graphical models: Discriminative vs. generative learning, density estimation, learning as optimization, maximum likelihood estimation for Bayesian networks, structure learning in Bayesian networks, parameter estimation in Markov networks. Structure learning. Learning undirected models – EM: handling missing data. Applications in vision, web/IR, NLP, and biology. Advanced topics: statistical relational learning, Markov logic networks.
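
As a small example of the learning theme, the sketch below (hypothetical data, assumed for illustration) shows that maximum likelihood parameter estimation for a Bayesian network with complete data reduces to counting and normalizing:

```python
# A minimal sketch (hypothetical data, assumed): with complete data, maximum
# likelihood estimation of Bayesian network parameters reduces to counting:
# theta_{x|pa} = N(x, pa) / N(pa) for each node X and parent assignment pa.
from collections import Counter

# Complete observations of (Rain, GrassWet) for a single edge Rain -> GrassWet.
data = [(True, True), (True, True), (True, False),
        (False, False), (False, False), (False, True)]

pair_counts = Counter(data)                     # N(rain, wet)
parent_counts = Counter(r for r, _ in data)     # N(rain)

# Estimated CPT P(GrassWet = w | Rain = r) from relative frequencies.
cpt = {(r, w): pair_counts[(r, w)] / parent_counts[r]
       for r in (True, False) for w in (True, False)}
print(cpt)   # e.g. P(GrassWet=True | Rain=True) = 2/3
```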

CO-PO Mapping

 

| COs | Description | PO1 | PO2 | PO3 | PO4 | PO5 |
|-----|-------------|-----|-----|-----|-----|-----|
| CO1 | Understand the process of encoding probability distributions using graphs | 3 | 2 | 3 | 1 | 2 |
| CO2 | Analyze the independence properties of the graph structure | 3 | 3 | 3 | 1 | 2 |
| CO3 | Understand and analyze Markov networks for the graphical modeling of probability distributions | 3 | 2 | 3 | 2 | 2 |
| CO4 | Familiarize methods that approximate joint distributions | 3 | 2 | 2 | | |
| CO5 | Study and evaluate methods to learn the parameters of networks with known and unknown structures using real-life data sets | 3 | 3 | 4 | 2 | 4 |

Objectives and Outcomes

Preamble

Probabilistic graphical models use a graph-based representation for encoding complex distributions over a high-dimensional space. This course deals with the representation, inference, and learning of probabilistic graphical models. Students will gain an in-depth understanding of several types of graphical models, the basic ideas underlying exact inference in probabilistic graphical models, and learning probabilistic models from data.

 

Course Objectives

  • To enable students to model problems using graphical models
  • To design inference algorithms
  • To learn the structure of the graphical model from the data set

 

Course Outcomes

 

| COs | Description |
|-----|-------------|
| CO1 | Understand the process of encoding probability distributions using graphs |
| CO2 | Analyze the independence properties of the graph structure |
| CO3 | Understand and analyze Markov networks for the graphical modeling of probability distributions |
| CO4 | Familiarize methods that approximate joint distributions |
| CO5 | Study and evaluate methods to learn the parameters of networks with known and unknown structures |

 

Prerequisites

  • Probability and Statistics
  • Programming Languages
  • Algorithm Design

Evaluation Pattern

Evaluation Pattern – 70:30

 

  • Midterm Exam – 30%
  • Continuous Evaluation – 20%
  • End Semester Exam – 50%

Text Books / References

  1. Daphne Koller and Nir Friedman, “Probabilistic Graphical Models: Principles and Techniques”, First Edition, MIT Press, 2009.
  2. Michael I. Jordan (Ed.), “Learning in Graphical Models”, MIT Press, 1998 (collection of papers).
  3. Judea Pearl, “Probabilistic Reasoning in Intelligent Systems”, Morgan Kaufmann, 1988.
  4. Kevin P. Murphy, “Machine Learning: A Probabilistic Perspective”, The MIT Press, Cambridge, Massachusetts, 2012.
  5. Adnan Darwiche, “Modeling and Reasoning with Bayesian Networks”, Cambridge University Press, 2009.
