About this Course
Ratings: 204
Reviews: 31

100% online

Start instantly and learn at your own schedule.

Flexible deadlines

Set deadlines to fit your own schedule.

Advanced level

Approx. 24 hours to complete

Subtitles: English

Skills you will gain

Algorithms, Expectation–Maximization (EM) Algorithm, Graphical Model, Markov Random Field


Syllabus: What you will learn

16 minutes to complete

Learning: Overview

This module presents some of the learning tasks for probabilistic graphical models that we will tackle in this course....
1 video (Total 16 min)
1 hour to complete

Review of Machine Learning Concepts from Prof. Andrew Ng's Machine Learning Class (Optional)

This module contains some basic concepts from the general framework of machine learning, taken from Professor Andrew Ng's Stanford class offered on Coursera. Many of these concepts are highly relevant to the problems we'll tackle in this course....
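The train/validation/test methodology this review module covers can be sketched as follows. This is a minimal illustration, not course material: the 1-D ridge model, synthetic data, and lambda grid are all invented for the example.

```python
import random

# Hypothetical sketch of model selection with a train/validation/test
# split: fit 1-D ridge regression y ~ w*x for several regularization
# strengths, pick lambda on the validation set, and report error once
# on the held-out test set. Data is synthetic (true w = 2.0).
random.seed(0)
data = [(x, 2.0 * x + random.gauss(0, 0.2))
        for x in [random.uniform(-1, 1) for _ in range(60)]]
train, valid, test = data[:36], data[36:48], data[48:]

def fit_ridge(pts, lam):
    # closed-form minimizer of sum (y - w*x)^2 + lam * w^2
    sxy = sum(x * y for x, y in pts)
    sxx = sum(x * x for x, _ in pts)
    return sxy / (sxx + lam)

def mse(w, pts):
    return sum((y - w * x) ** 2 for x, y in pts) / len(pts)

# Select lambda on the validation set only -- never on the test set.
lams = [0.0, 0.1, 1.0, 10.0]
val_err = {lam: mse(fit_ridge(train, lam), valid) for lam in lams}
best_lam = min(val_err, key=val_err.get)
test_err = mse(fit_ridge(train, best_lam), test)
```

The key design point reviewed in the lectures is that the test set is touched exactly once, after the model has been chosen, so the reported error is unbiased by the selection step.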
6 videos (Total 59 min)
Regularization: Cost Function 10 min
Evaluating a Hypothesis 7 min
Model Selection and Train Validation Test Sets 12 min
Diagnosing Bias vs Variance 7 min
Regularization and Bias Variance 11 min
2 hours to complete

Parameter Estimation in Bayesian Networks

This module discusses the simplest and most basic of the learning problems in probabilistic graphical models: parameter estimation in a Bayesian network. We discuss maximum likelihood estimation and the issues with it. We then discuss Bayesian estimation and how it can ameliorate these problems....
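As a rough illustration of the two estimators discussed above, here is a sketch for a single tabular CPD P(X | U) of one node in a Bayesian network. The variable names, data set, and pseudocount value are invented for the example.

```python
from collections import Counter

# Hypothetical sketch: estimating the CPD P(X | U) of one node from
# complete data. MLE divides the joint counts M[u, x] (the sufficient
# statistics) by the parent counts M[u]; Bayesian estimation adds
# Dirichlet pseudocounts alpha, which smooths zero-count entries.
data = [("u0", "x0"), ("u0", "x0"), ("u0", "x1"),
        ("u1", "x1"), ("u1", "x1"), ("u1", "x1")]
x_vals = ["x0", "x1"]

counts = Counter(data)                       # sufficient statistics M[u, x]
parent_counts = Counter(u for u, _ in data)  # M[u]

def mle(u, x):
    return counts[(u, x)] / parent_counts[u]

def bayesian(u, x, alpha=1.0):
    # posterior mean under a Dirichlet(alpha, ..., alpha) prior
    return (counts[(u, x)] + alpha) / (parent_counts[u] + alpha * len(x_vals))
```

Note that the MLE assigns probability 0 to the never-observed event (u1, x0), one of the "issues" the module refers to, while the Bayesian estimate keeps it strictly positive.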
5 videos (Total 77 min), 2 quizzes
Maximum Likelihood Estimation for Bayesian Networks 15 min
Bayesian Estimation 15 min
Bayesian Prediction 13 min
Bayesian Estimation for Bayesian Networks 17 min
2 practice exercises
Learning in Parametric Models 18 min
Bayesian Priors for BNs 8 min
21 hours to complete

Learning Undirected Models

In this module, we discuss the parameter estimation problem for Markov networks - undirected graphical models. This task is considerably more complex, both conceptually and computationally, than parameter estimation for Bayesian networks, due to the issues presented by the global partition function....
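A minimal sketch of where the global partition function enters, assuming a toy Ising-style chain A - B - C with made-up edge weights (this is an illustration, not code from the course):

```python
import itertools, math

# Hypothetical sketch: why the partition function makes Markov network
# learning hard. For a tiny chain A - B - C with pairwise log-potentials
# theta, the likelihood couples all parameters through Z(theta), which
# requires summing over every joint assignment.
theta = {("A", "B"): 0.5, ("B", "C"): -0.3}  # made-up edge weights

def score(assign):
    # unnormalized log-measure: an edge potential "fires" when both
    # of its endpoints are 1 (a simple Ising-style sketch)
    return sum(w * assign[i] * assign[j] for (i, j), w in theta.items())

# Z sums over all 2^3 joint assignments -- exponential in general.
states = [dict(zip("ABC", bits)) for bits in itertools.product([0, 1], repeat=3)]
Z = sum(math.exp(score(a)) for a in states)

def prob(assign):
    return math.exp(score(assign)) / Z

total = sum(prob(a) for a in states)  # normalizes only thanks to Z
```

Because every parameter in theta appears inside Z, changing one edge weight changes the probability of every assignment, which is exactly the global coupling the module contrasts with the local, per-CPD estimation in Bayesian networks.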
3 videos (Total 52 min), 2 quizzes
Maximum Likelihood for Conditional Random Fields 13 min
MAP Estimation for MRFs and CRFs 9 min
1 practice exercise
Parameter Estimation in MNs 6 min
17 hours to complete

Learning BN Structure

This module discusses the problem of learning the structure of Bayesian networks. We first discuss how this problem can be formulated as an optimization problem over a space of graph structures, and what makes a good scoring function for trading off fit to the data against model complexity. We then talk about how the optimization problem can be solved: exactly in a few cases, approximately in most others....
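One way to see the fit-versus-complexity trade-off is to score two candidate structures over a pair of binary variables with the BIC score. The data set and numbers below are invented for illustration, not taken from the course:

```python
import math
from collections import Counter

# Hypothetical sketch: BIC = loglik - (log M / 2) * dim, where dim is
# the number of free parameters, used to compare two structures over
# binary variables X and Y. Data is made up and strongly correlated.
data = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10
M = len(data)
cx = Counter(x for x, _ in data)
cxy = Counter(data)

def ll_independent():
    # structure with no edge: P(x) P(y), 2 free parameters
    cy = Counter(y for _, y in data)
    return sum(math.log(cx[x] / M) + math.log(cy[y] / M) for x, y in data)

def ll_edge():
    # structure X -> Y: P(x) P(y | x), 3 free parameters
    return sum(math.log(cx[x] / M) + math.log(cxy[(x, y)] / cx[x])
               for x, y in data)

def bic(loglik, dim):
    return loglik - 0.5 * math.log(M) * dim

score_indep = bic(ll_independent(), 2)
score_edge = bic(ll_edge(), 3)
# The edge earns its extra parameter only when the likelihood gain
# beats the complexity penalty, as it does for this correlated data.
```

With weakly correlated data the penalty term would dominate instead and the edgeless structure would win, which is the consistency behavior the BIC lecture analyzes.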
7 videos (Total 106 min), 3 quizzes
Likelihood Scores 16 min
BIC and Asymptotic Consistency 11 min
Bayesian Scores 20 min
Learning Tree Structured Networks 12 min
Learning General Graphs: Heuristic Search 23 min
Learning General Graphs: Search and Decomposability 15 min
2 practice exercises
Structure Scores 10 min
Tree Learning and Hill Climbing 8 min
22 hours to complete

Learning BNs with Incomplete Data

In this module, we discuss the problem of learning models in cases where some of the variables in some of the data cases are not fully observed. We discuss why this situation is considerably more complex than the fully observable case. We then present the Expectation Maximization (EM) algorithm, which is used in a wide variety of problems....
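A compact sketch of EM on a classic latent-variable toy problem, a mixture of two biased coins where the coin identity is unobserved. This is an illustration only; the flip counts and initial parameters are invented, not an example from the course.

```python
# Hypothetical sketch of EM: two biased coins with heads probabilities
# p_a and p_b, each data point is (heads, tosses) for one trial, and
# which coin produced each trial is the unobserved variable.
flips = [(5, 10), (9, 10), (8, 10), (4, 10), (7, 10)]

def em(p_a, p_b, iters=20):
    for _ in range(iters):
        # E-step: soft-assign each trial to a coin in proportion to
        # the likelihood of its outcome under each coin
        stats = {"a": [0.0, 0.0], "b": [0.0, 0.0]}  # expected [heads, tosses]
        for h, n in flips:
            la = p_a ** h * (1 - p_a) ** (n - h)
            lb = p_b ** h * (1 - p_b) ** (n - h)
            wa = la / (la + lb)
            stats["a"][0] += wa * h
            stats["a"][1] += wa * n
            stats["b"][0] += (1 - wa) * h
            stats["b"][1] += (1 - wa) * n
        # M-step: maximum likelihood on the expected sufficient statistics
        p_a = stats["a"][0] / stats["a"][1]
        p_b = stats["b"][0] / stats["b"][1]
    return p_a, p_b

p_a, p_b = em(0.6, 0.5)  # asymmetric init so the coins can separate
```

The structure, alternating a soft completion of the hidden variable with an ordinary fully-observed M-step, is exactly what makes EM applicable to the Bayesian-network setting this module develops.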
5 videos (Total 83 min), 3 quizzes
Expectation Maximization - Intro 16 min
Analysis of EM Algorithm 11 min
EM in Practice 11 min
Latent Variables 22 min
2 practice exercises
Learning with Incomplete Data 8 min
Expectation Maximization 14 min
1 hour to complete

Learning Summary and Final

This module summarizes some of the issues that arise when learning probabilistic graphical models from data. It also contains the course final....
1 video (Total 20 min), 1 quiz
1 practice exercise
Learning: Final Exam 24 min
25 minutes to complete

PGM Wrapup

This module contains an overview of PGM methods as a whole, discussing some of the real-world tradeoffs when using this framework in practice. It refers to topics from all three of the PGM courses....
1 video (Total 25 min)
Reviews: 31


started a new career after completing these courses


got a tangible career benefit from this course


got a pay increase or promotion

Top Reviews

by LL, Jan 30th 2018

very good course for PGM learning and machine learning programming concepts. Just the description for some final exam quiz questions is somewhat unclear, which leads to a bit of confusion.

by ZZ, Feb 14th 2017

Great course! Very informative course videos and challenging yet rewarding programming assignments. Hope that the mentors can be more helpful in responding to questions in a timely manner.



Daphne Koller

School of Engineering

About Stanford University

The Leland Stanford Junior University, commonly referred to as Stanford University or Stanford, is an American private research university located in Stanford, California on an 8,180-acre (3,310 ha) campus near Palo Alto, California, United States....

About the Probabilistic Graphical Models Specialization

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems....
Probabilistic Graphical Models

Frequently Asked Questions

  • When you enroll for a Certificate, you get access to all videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you take the course without paying, some assignments may not be available to you.

  • When you enroll in the course, you get access to all of the courses in the Specialization, and you can earn a certificate of completion. After you successfully complete the course, an electronic certificate will appear on your Accomplishments page; from there you can print it or attach it to your LinkedIn profile. If you only want to explore the course content, you can audit it for free.

  • Compute the sufficient statistics of a data set that are necessary for learning a PGM from data

  • Implement both maximum likelihood and Bayesian parameter estimation for Bayesian networks

  • Implement maximum likelihood and MAP parameter estimation for Markov networks

  • Formulate a structure learning problem as a combinatorial optimization task over a space of network structures, and evaluate which scoring function is appropriate for a given situation

  • Utilize PGM inference algorithms in ways that support more effective parameter estimation for PGMs

  • Implement the Expectation Maximization (EM) algorithm for Bayesian networks

  • Honors track learners will get hands-on experience implementing both EM and structure learning for tree-structured networks, and applying them to real-world tasks
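The first objective above, computing sufficient statistics, can be sketched for tabular CPDs in a single pass over the data. The toy network and samples below are invented for illustration:

```python
from collections import Counter

# Hypothetical sketch: for table CPDs, the sufficient statistics of a
# discrete Bayesian network are just joint counts over each variable
# and its parents, collectable in one pass over the data set.
parents = {"D": [], "I": [], "G": ["D", "I"]}   # a made-up 3-node network
samples = [{"D": 0, "I": 1, "G": 1},
           {"D": 0, "I": 0, "G": 0},
           {"D": 1, "I": 1, "G": 0},
           {"D": 0, "I": 1, "G": 1}]

def sufficient_statistics(parents, samples):
    stats = {v: Counter() for v in parents}
    for s in samples:
        for v, pa in parents.items():
            # key = (value of v, values of v's parents)
            key = (s[v],) + tuple(s[p] for p in pa)
            stats[v][key] += 1
    return stats

stats = sufficient_statistics(parents, samples)
# e.g. stats["G"][(1, 0, 1)] counts samples with G=1, D=0, I=1
```

Once these counts are in hand, both the maximum likelihood and the Bayesian estimators from the parameter-estimation module are simple arithmetic over them.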

Still have questions? Visit the Learner Help Center.