About this Course
4.0
Ratings: 420
Reviews: 90
Specialization
100% online

Start instantly and learn at your own schedule.

Flexible deadlines

Reset deadlines in accordance with your schedule.

Intermediate Level

Approx. 18 hours to complete

Suggested workload: 4 weeks of study, 4-5 hours/week...

Available languages

English

Subtitles: English...

Skills You Will Gain

Python Programming, Principal Component Analysis (PCA), Projection Matrix, Mathematical Optimization

Syllabus: What You Will Learn

Week 1
5 hours to complete

Statistics of Datasets

Principal Component Analysis (PCA) is one of the most important dimensionality reduction algorithms in machine learning. In this course, we lay the mathematical foundations to derive and understand PCA from a geometric point of view. In this module, we learn how to summarize datasets (e.g., images) using basic statistics, such as the mean and the variance. We also look at properties of the mean and the variance when we shift or scale the original dataset. We will provide mathematical intuition as well as the skills to derive the results. We will also implement our results in code (Jupyter notebooks), which will allow us to practice our mathematical understanding by computing averages of image datasets....
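The shift and scale properties described above can be checked numerically. Below is a minimal NumPy sketch (illustrative only, not the course's notebook code; the toy dataset `X` is made up): shifting the data moves the mean but leaves the covariance unchanged, while scaling by `a` scales the mean by `a` and the covariance by `a**2`.

```python
import numpy as np

# Toy dataset: 5 two-dimensional data points (rows are data points).
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0],
              [7.0, 8.0],
              [9.0, 10.0]])

mean = X.mean(axis=0)                       # per-dimension mean
cov = np.cov(X, rowvar=False, bias=True)    # covariance matrix (divide by N)

# Shifting by a constant vector moves the mean, covariance is unchanged.
shift = np.array([10.0, -5.0])
assert np.allclose((X + shift).mean(axis=0), mean + shift)
assert np.allclose(np.cov(X + shift, rowvar=False, bias=True), cov)

# Scaling by a scalar a scales the mean by a and the covariance by a**2.
a = 3.0
assert np.allclose((a * X).mean(axis=0), a * mean)
assert np.allclose(np.cov(a * X, rowvar=False, bias=True), a**2 * cov)

print(mean)  # [5. 6.]
```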
8 videos (total 27 min.), 5 readings, 4 quizzes

8 videos:
- Welcome to module 1
- Mean of a dataset (4 min)
- Variance of one-dimensional datasets (4 min)
- Variance of higher-dimensional datasets (5 min)
- Effect on the mean (4 min)
- Effect on the (co)variance (3 min)
- See you next module!

5 readings:
- About Imperial College & the team (5 min)
- How to be successful in this course (5 min)
- Grading policy (5 min)
- Additional readings & helpful references (5 min)
- Symmetric, positive definite matrices (10 min)

3 practice exercises:
- Mean of datasets (15 min)
- Variance of 1D datasets (15 min)
- Covariance matrix of a two-dimensional dataset (15 min)
Week 2
4 hours to complete

Inner Products

Data can be interpreted as vectors. Vectors allow us to talk about geometric concepts, such as lengths, distances and angles, to characterise similarity between vectors. This will become important later in the course when we discuss PCA. In this module, we will introduce and practice the concept of an inner product. Inner products allow us to talk about geometric concepts in vector spaces. More specifically, we will start with the dot product (which we may still know from school) as a special case of an inner product, and then move toward a more general concept of an inner product, which plays an integral part in some areas of machine learning, such as kernel machines (this includes support vector machines and Gaussian processes). We have a lot of exercises in this module to practice and understand the concept of inner products....
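As a small illustrative sketch (not course material), the lengths and angles mentioned above can be computed for a general inner product ⟨x, y⟩ = xᵀAy. The matrix `A` below is an arbitrary symmetric positive definite example: with the standard dot product the vectors x and y are orthogonal, but under this inner product their angle is 60°.

```python
import numpy as np

# General inner product <x, y> = x^T A y, with A symmetric positive definite.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def inner(x, y):
    return x @ A @ y

def length(x):
    return np.sqrt(inner(x, x))

def angle(x, y):
    cos = inner(x, y) / (length(x) * length(y))
    return np.arccos(np.clip(cos, -1.0, 1.0))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

print(inner(x, y))               # 1.0  (nonzero: not orthogonal under A)
print(length(x))                 # sqrt(2) ~ 1.414
print(np.degrees(angle(x, y)))   # 60.0
```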
8 videos (total 36 min.), 1 reading, 5 quizzes

8 videos:
- Dot product (4 min)
- Inner product: definition (5 min)
- Inner product: length of vectors (7 min)
- Inner product: distances between vectors (3 min)
- Inner product: angles and orthogonality (5 min)
- Inner products of functions and random variables (optional) (7 min)
- Heading for the next module!

1 reading:
- Basis vectors (20 min)

4 practice exercises:
- Dot product (10 min)
- Properties of inner products (20 min)
- General inner products: lengths and distances (20 min)
- Angles between vectors using a non-standard inner product (20 min)
Week 3
4 hours to complete

Orthogonal Projections

In this module, we will look at orthogonal projections of vectors, which live in a high-dimensional vector space, onto lower-dimensional subspaces. This will play an important role in the next module when we derive PCA. We will start off with a geometric motivation of what an orthogonal projection is and work our way through the corresponding derivation. We will end up with a single equation that allows us to project any vector onto a lower-dimensional subspace. However, we will also understand how this equation came about. As in the other modules, we will have both pen-and-paper practice and a small programming example with a Jupyter notebook....
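The single projection equation mentioned above takes the standard form P = B(BᵀB)⁻¹Bᵀ for a subspace spanned by the columns of B (under the dot product). A minimal NumPy sketch, with a made-up basis `B` and vector `x`, projecting a 3D vector onto a 2D subspace:

```python
import numpy as np

# Basis of a 2D subspace of R^3, stacked as columns of B.
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])

# Projection matrix onto span(B): P = B (B^T B)^{-1} B^T.
P = B @ np.linalg.inv(B.T @ B) @ B.T

x = np.array([6.0, 0.0, 0.0])
proj = P @ x

# Sanity checks: P is idempotent, and the residual x - proj is
# orthogonal to every basis vector of the subspace.
assert np.allclose(P @ P, P)
assert np.allclose(B.T @ (x - proj), 0.0)

print(proj)  # [ 5.  2. -1.]
```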
6 videos (total 25 min.), 1 reading, 3 quizzes

6 videos:
- Projection onto 1D subspaces (7 min)
- Example: projection onto 1D subspaces (3 min)
- Projections onto higher-dimensional subspaces (8 min)
- Example: projection onto a 2D subspace (3 min)
- This was module 3!

1 reading:
- Full derivation of the projection (20 min)

2 practice exercises:
- Projection onto a 1-dimensional subspace (25 min)
- Project 3D data onto a 2D subspace (40 min)
Week 4
5 hours to complete

Principal Component Analysis

We can think of dimensionality reduction as a way of compressing data with some loss, similar to JPEG or MP3. Principal Component Analysis (PCA) is one of the most fundamental dimensionality reduction techniques used in machine learning. In this module, we use the results from the first three modules of this course and derive PCA from a geometric point of view. Within this course, this module is the most challenging one, and we will go through an explicit derivation of PCA plus some coding exercises that will make us proficient users of PCA. ...
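As a hedged sketch (not the course's own notebook) of where the derivation lands: PCA centers the data, then projects it onto the eigenvectors of the covariance matrix with the largest eigenvalues. The synthetic dataset and the helper `pca` below are illustrative assumptions.

```python
import numpy as np

def pca(X, n_components):
    """PCA via eigendecomposition of the data covariance matrix.

    X: (N, D) data matrix, rows are data points.
    Returns the projected data (N, k) and the principal axes (D, k).
    """
    X_centered = X - X.mean(axis=0)
    S = np.cov(X_centered, rowvar=False)   # (D, D) covariance matrix
    eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]      # sort descending by variance
    W = eigvecs[:, order[:n_components]]   # top-k principal axes
    return X_centered @ W, W

# Synthetic correlated 2D data: most variance lies along one direction.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0], [0.0, 0.5]])

Z, W = pca(X, n_components=1)
print(Z.shape, W.shape)  # (200, 1) (2, 1)
```

Since the covariance matrix is symmetric, `np.linalg.eigh` is the appropriate (and numerically stable) eigensolver here.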
10 videos (total 52 min.), 5 readings, 2 quizzes

10 videos:
- Problem setting and PCA objective (7 min)
- Finding the coordinates of the projected data (5 min)
- Reformulation of the objective (10 min)
- Finding the basis vectors that span the principal subspace (7 min)
- Steps of PCA (4 min)
- PCA in high dimensions (5 min)
- Other interpretations of PCA (optional) (7 min)
- Summary of this module
- This was the course on PCA

5 readings:
- Vector spaces (20 min)
- Orthogonal complements (10 min)
- Multivariate chain rule (10 min)
- Lagrange multipliers (10 min)
- Did you like the course? Let us know! (10 min)

1 practice exercise:
- Chain rule practice (20 min)
4.0

Top Reviews

by JS, Jul 17th 2018

This is one hell of an inspiring course that demystified the difficult concepts and math behind PCA. Excellent instructors in imparting this knowledge with easy-to-understand illustrations.

by JV, May 1st 2018

This course was definitely a bit more complex, not so much in assignments but in the core concepts handled, than the others in the specialisation. Overall, it was fun to do this course!

Instructor

Marc P. Deisenroth

Lecturer in Statistical Machine Learning
Department of Computing

About Imperial College London

Imperial College London is a world top ten university with an international reputation for excellence in science, engineering, medicine and business, located in the heart of London. Imperial is a multidisciplinary space for education, research, translation and commercialisation, harnessing science and innovation to tackle global challenges. Imperial students benefit from a world-leading, inclusive educational experience, rooted in the College's world-leading research. Our online courses are designed to promote interactivity, learning and the development of core skills, through the use of cutting-edge digital technology....

About the 'Mathematics for Machine Learning' Specialization

For a lot of higher level courses in Machine Learning and Data Science, you find you need to freshen up on the basics in mathematics - stuff you may have studied before in school or university, but which was taught in another context, or not very intuitively, such that you struggle to relate it to how it’s used in Computer Science. This specialization aims to bridge that gap, getting you up to speed in the underlying mathematics, building an intuitive understanding, and relating it to Machine Learning and Data Science. In the first course on Linear Algebra we look at what linear algebra is and how it relates to data. Then we look through what vectors and matrices are and how to work with them. The second course, Multivariate Calculus, builds on this to look at how to optimize fitting functions to get good fits to data. It starts from introductory calculus and then uses the matrices and vectors from the first course to look at data fitting. The third course, Dimensionality Reduction with Principal Component Analysis, uses the mathematics from the first two courses to compress high-dimensional data. This course is of intermediate difficulty and will require basic Python and numpy knowledge. At the end of this specialization you will have gained the prerequisite mathematical knowledge to continue your journey and take more advanced courses in machine learning....
Mathematics for Machine Learning

Frequently Asked Questions

  • Once you enroll for a Certificate, you will get access to all videos, quizzes, and programming assignments (if applicable). Peer-review assignments can only be submitted and reviewed once your session has begun. If you take the course without paying, some assignments may not be available to you.

  • When you enroll in the course, you get access to all of the courses in the Specialization, as well as the chance to earn a certificate upon completion. After you successfully complete the course, your electronic Certificate will appear on your Accomplishments page. From there, you can print it or add it to your LinkedIn profile. If you only want to explore the course content, you can audit it for free.

More questions? Visit the Learner Help Center.