About this Course
4.8
3,861 ratings
751 reviews
Specialization
100% online
Start now and learn on your own schedule.
Flexible deadlines
Set deadlines in accordance with your schedule.
Approx. 27 hours to complete
Suggested workload: 6 weeks of study, 5-8 hours/week...
Language: English
Subtitles: English, Arabic...

Skills you will gain

Linear Regression, Ridge Regression, Lasso (Statistics), Regression Analysis

Syllabus: what you will learn

Week 1
1 hour to complete

Welcome

Regression is one of the most important and broadly used machine learning and statistics tools out there. It allows you to make predictions from data by learning the relationship between features of your data and some observed, continuous-valued response. Regression is used in a massive number of applications ranging from predicting stock prices to understanding gene regulatory networks. This introduction to the course provides you with an overview of the topics we will cover and the background knowledge and resources we assume you have...
5 videos (Total 20 min), 3 readings
5 videos
Welcome! 1 min
What is the course about? 3 min
Outlining the first half of the course 5 min
Outlining the second half of the course 5 min
Assumed background 4 min
3 readings
Important Update regarding the Machine Learning Specialization 10 min
Slides presented in this module 10 min
Reading: Software tools you'll need 10 min
3 hours to complete

Simple Linear Regression

Our course starts from the most basic regression model: Just fitting a line to data. This simple model for forming predictions from a single, univariate feature of the data is appropriately called "simple linear regression". In this module, we describe the high-level regression task and then specialize these concepts to the simple linear regression case. You will learn how to formulate a simple regression model and fit the model to data using both a closed-form solution as well as an iterative optimization algorithm called gradient descent. Based on this fitted function, you will interpret the estimated model parameters and form predictions. You will also analyze the sensitivity of your fit to outlying observations. You will examine all of these concepts in the context of a case study of predicting house prices from the square feet of the house...
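Since this module centers on the two fitting approaches (a closed-form solution and gradient descent on the residual sum of squares), here is a minimal sketch of both in NumPy, assuming a handful of made-up (square feet, price) pairs; it illustrates the idea only and is not the course's notebook code.

```python
import numpy as np

# A rough sketch of both fitting approaches for simple linear regression
# (price ~ w0 + w1 * sqft). The data points, step size, and iteration count
# are invented for illustration; this is not the course's notebook code.
sqft = np.array([1000.0, 1500.0, 2000.0, 2500.0, 3000.0])
price = np.array([200000.0, 270000.0, 340000.0, 412000.0, 500000.0])

# Approach 1: closed-form least squares solution for slope and intercept.
w1 = np.sum((sqft - sqft.mean()) * (price - price.mean())) / np.sum((sqft - sqft.mean()) ** 2)
w0 = price.mean() - w1 * sqft.mean()

# Approach 2: gradient descent on the residual sum of squares. The feature is
# rescaled so a single hand-picked step size behaves reasonably.
x = (sqft - sqft.mean()) / sqft.std()
a = b = 0.0                     # intercept and slope in the rescaled feature
eta = 0.01                      # step size
for _ in range(2000):
    resid = price - (a + b * x)
    a += eta * resid.mean()           # move along the negative gradient
    b += eta * (resid * x).mean()

print("closed-form fit:", w0, w1)
print("predicted price for 1800 sqft:", w0 + w1 * 1800)
```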
25 videos (Total 122 min), 5 readings, 2 quizzes
25 videos
Regression fundamentals: data & model 8 min
Regression fundamentals: the task 2 min
Regression ML block diagram 4 min
The simple linear regression model 2 min
The cost of using a given line 6 min
Using the fitted line 6 min
Interpreting the fitted line 6 min
Defining our least squares optimization objective 3 min
Finding maxima or minima analytically 7 min
Maximizing a 1d function: a worked example 2 min
Finding the max via hill climbing 6 min
Finding the min via hill descent 3 min
Choosing stepsize and convergence criteria 6 min
Gradients: derivatives in multiple dimensions 5 min
Gradient descent: multidimensional hill descent 6 min
Computing the gradient of RSS 7 min
Approach 1: closed-form solution 5 min
Approach 2: gradient descent 7 min
Comparing the approaches 1 min
Influence of high leverage points: exploring the data 4 min
Influence of high leverage points: removing Center City 7 min
Influence of high leverage points: removing high-end towns 3 min
Asymmetric cost functions 3 min
A brief recap 1 min
5 readings
Slides presented in this module 10 min
Optional reading: worked-out example for closed-form solution 10 min
Optional reading: worked-out example for gradient descent 10 min
Download notebooks to follow along 10 min
Reading: Fitting a simple linear regression model on housing data 10 min
2 practice exercises
Simple Linear Regression 14 min
Fitting a simple linear regression model on housing data 8 min
Week 2
3 hours to complete

Multiple Regression

The next step in moving beyond simple linear regression is to consider "multiple regression", where multiple features of the data are used to form predictions. More specifically, in this module, you will learn how to build models of more complex relationships between a single variable (e.g., 'square feet') and the observed response (like 'house sales price'). This includes things like fitting a polynomial to your data, or capturing seasonal changes in the response value. You will also learn how to incorporate multiple input variables (e.g., 'square feet', '# bedrooms', '# bathrooms'). You will then be able to describe how all of these models can still be cast within the linear regression framework, but now using multiple "features". Within this multiple regression framework, you will fit models to data, interpret estimated coefficients, and form predictions. Here, you will also implement a gradient descent algorithm for fitting a multiple regression model...
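As a rough illustration of the matrix formulation described above, the sketch below stacks the features into a matrix H with a constant column for the intercept, solves the least squares problem in closed form, and also runs gradient descent using the RSS gradient -2 H^T(y - Hw). The feature names and values are invented; this is not the course's assignment code.

```python
import numpy as np

# Sketch of multiple regression in matrix notation, y ≈ H w, where each row
# of H holds the features of one house. Features and values are invented for
# illustration, not taken from the course's dataset.
sqft     = np.array([1000.0, 1500.0, 2000.0, 2500.0, 3000.0])
bedrooms = np.array([2.0, 3.0, 3.0, 4.0, 4.0])
price    = np.array([200000.0, 280000.0, 330000.0, 420000.0, 480000.0])

# Feature matrix with a constant column for the intercept.
H = np.column_stack([np.ones_like(sqft), sqft, bedrooms])

# Approach 1: closed-form solution w = (H^T H)^{-1} H^T y
# (solved with lstsq rather than an explicit inverse for numerical stability).
w_closed, *_ = np.linalg.lstsq(H, price, rcond=None)

# Approach 2: gradient descent on RSS; the gradient is -2 H^T (y - H w).
# Features are standardized so one fixed step size works for all of them.
std = np.where(H.std(axis=0) > 0, H.std(axis=0), 1.0)
Hs = (H - H.mean(axis=0)) / std
Hs[:, 0] = 1.0                      # keep the intercept column as ones
w = np.zeros(H.shape[1])
eta = 0.05
for _ in range(5000):
    grad = -2 * Hs.T @ (price - Hs @ w)
    w -= eta * grad / len(price)

print("closed-form weights:", w_closed)
print("gradient-descent weights (standardized features):", w)
```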
19 videos (Total 87 min), 5 readings, 3 quizzes
19 videos
Polynomial regression 3 min
Modeling seasonality 8 min
Where we see seasonality 3 min
Regression with general features of 1 input 2 min
Motivating the use of multiple inputs 4 min
Defining notation 3 min
Regression with features of multiple inputs 3 min
Interpreting the multiple regression fit 7 min
Rewriting the single observation model in vector notation 6 min
Rewriting the model for all observations in matrix notation 4 min
Computing the cost of a D-dimensional curve 9 min
Computing the gradient of RSS 3 min
Approach 1: closed-form solution 3 min
Discussing the closed-form solution 4 min
Approach 2: gradient descent 2 min
Feature-by-feature update 9 min
Algorithmic summary of gradient descent approach 4 min
A brief recap 1 min
5 readings
Slides presented in this module 10 min
Optional reading: review of matrix algebra 10 min
Reading: Exploring different multiple regression models for house price prediction 10 min
Numpy tutorial 10 min
Reading: Implementing gradient descent for multiple regression 10 min
3 practice exercises
Multiple Regression 18 min
Exploring different multiple regression models for house price prediction 16 min
Implementing gradient descent for multiple regression 10 min
Week 3
2 hours to complete

Assessing Performance

Having learned about linear regression models and algorithms for estimating the parameters of such models, you are now ready to assess how well your considered method should perform in predicting new data. You are also ready to select amongst possible models to choose the best-performing one. This module is all about these important topics of model selection and assessment. You will examine both theoretical and practical aspects of such analyses. You will first explore the concept of measuring the "loss" of your predictions, and use this to define training, test, and generalization error. For these measures of error, you will analyze how they vary with model complexity and how they might be utilized to form a valid assessment of predictive performance. This leads directly to an important conversation about the bias-variance tradeoff, which is fundamental to machine learning. Finally, you will devise a method to first select amongst models and then assess the performance of the selected model. The concepts described in this module are key to all machine learning problems, well beyond the regression setting addressed in this course...
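To make the training/test discussion above concrete, here is a small sketch that fits polynomials of increasing degree to synthetic data and reports error separately on the training points and on held-out test points. The data, split, and degrees are arbitrary illustrative choices, not the course's case study.

```python
import numpy as np

# A rough sketch of training vs. test error as model complexity grows.
# Synthetic data and polynomial fits stand in for the course's house-price
# case study; none of this is the course's actual code or data.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 40)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=40)

# Training/test split: fit on one subset, assess predictions on held-out data.
train, test = np.arange(30), np.arange(30, 40)

def errors(degree):
    # Fit a polynomial of the given degree on the training set only.
    coeffs = np.polyfit(x[train], y[train], degree)
    train_mse = np.mean((y[train] - np.polyval(coeffs, x[train])) ** 2)
    test_mse = np.mean((y[test] - np.polyval(coeffs, x[test])) ** 2)
    return train_mse, test_mse

for degree in (1, 3, 6, 9):
    tr, te = errors(degree)
    # Training error keeps shrinking as the degree grows; test error typically
    # starts rising again, the overfitting symptom discussed in this module.
    print(f"degree {degree}: train MSE {tr:.3f}  test MSE {te:.3f}")
```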
14 videos (Total 93 min), 2 readings, 2 quizzes
14 videos
What do we mean by "loss"? 4 min
Training error: assessing loss on the training set 7 min
Generalization error: what we really want 8 min
Test error: what we can actually compute 4 min
Defining overfitting 2 min
Training/test split 1 min
Irreducible error and bias 6 min
Variance and the bias-variance tradeoff 6 min
Error vs. amount of data 6 min
Formally defining the 3 sources of error 14 min
Formally deriving why 3 sources of error 20 min
Training/validation/test split for model selection, fitting, and assessment 7 min
A brief recap 1 min
2 readings
Slides presented in this module 10 min
Reading: Exploring the bias-variance tradeoff 10 min
2 practice exercises
Assessing Performance 26 min
Exploring the bias-variance tradeoff 8 min
Week 4
3 hours to complete

Ridge Regression

You have examined how the performance of a model varies with increasing model complexity, and can describe the potential pitfall of complex models becoming overfit to the training data. In this module, you will explore a very simple, but extremely effective technique for automatically coping with this issue. This method is called "ridge regression". You start out with a complex model, but now fit the model in a manner that not only incorporates a measure of fit to the training data, but also a term that biases the solution away from overfitted functions. To this end, you will explore symptoms of overfitted functions and use this to define a quantitative measure to use in your revised optimization objective. You will derive both a closed-form and gradient descent algorithm for fitting the ridge regression objective; these forms are small modifications from the original algorithms you derived for multiple regression. To select the strength of the bias away from overfitting, you will explore a general-purpose method called "cross validation". You will implement both cross-validation and gradient descent to fit a ridge regression model and select the regularization constant...
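As a rough illustration of the ideas above, the sketch below fits ridge regression in closed form by adding an L2 penalty to the least squares objective and uses a simple k-fold cross-validation loop to select the penalty strength. The synthetic data, candidate penalty values, and the decision to skip the intercept handling discussed in the module are all illustrative assumptions, not the course's assignment code.

```python
import numpy as np

# A rough sketch of ridge regression on synthetic data: the closed-form
# solution with an L2 penalty, plus a simple k-fold cross-validation loop to
# choose the penalty strength. Data and penalties are illustrative only.
rng = np.random.default_rng(1)
n, d = 60, 8
H = rng.normal(size=(n, d))
w_true = np.array([3.0, -2.0, 0.5, 0.0, 0.0, 0.0, 0.0, 0.0])
y = H @ w_true + rng.normal(scale=0.5, size=n)

def ridge_fit(H, y, l2):
    # Closed form: w = (H^T H + lambda * I)^{-1} H^T y
    return np.linalg.solve(H.T @ H + l2 * np.eye(H.shape[1]), H.T @ y)

def cv_error(l2, k=5):
    # Average held-out mean squared error over k folds for a given penalty.
    folds = np.array_split(np.arange(n), k)
    errs = []
    for val in folds:
        tr = np.setdiff1d(np.arange(n), val)
        w = ridge_fit(H[tr], y[tr], l2)
        errs.append(np.mean((y[val] - H[val] @ w) ** 2))
    return np.mean(errs)

penalties = [0.0, 0.01, 0.1, 1.0, 10.0, 100.0]
best = min(penalties, key=cv_error)
print("selected lambda:", best)
print("ridge coefficients:", ridge_fit(H, y, best))
```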
16 videos (Total 85 min), 5 readings, 3 quizzes
16 videos
Overfitting demo 7 min
Overfitting for more general multiple regression models 3 min
Balancing fit and magnitude of coefficients 7 min
The resulting ridge objective and its extreme solutions 5 min
How ridge regression balances bias and variance 1 min
Ridge regression demo 9 min
The ridge coefficient path 4 min
Computing the gradient of the ridge objective 5 min
Approach 1: closed-form solution 6 min
Discussing the closed-form solution 5 min
Approach 2: gradient descent 9 min
Selecting tuning parameters via cross validation 3 min
K-fold cross validation 5 min
How to handle the intercept 6 min
A brief recap 1 min
5 readings
Slides presented in this module 10 min
Download the notebook and follow along 10 min
Download the notebook and follow along 10 min
Reading: Observing effects of L2 penalty in polynomial regression 10 min
Reading: Implementing ridge regression via gradient descent 10 min
3 practice exercises
Ridge Regression 18 min
Observing effects of L2 penalty in polynomial regression 14 min
Implementing ridge regression via gradient descent 16 min
4.8
751 reviews
Career direction

38%

started a new career after completing these courses
Career benefit

83%

got a tangible career benefit from this course
Career promotion

12%

got a pay increase or promotion

Top reviews

by PD, Mar 17th 2016

I really enjoyed all the concepts and implementations I did along this course....except during the Lasso module. I found this module harder than the others but very interesting as well. Great course!

by RS, Nov 30th 2016

Excelent course, I highly recommend for those who are willing to learn machine learning from the basis, this module (linear regression) covered very important parts that I used to struggle to learn

Instructors


Emily Fox

Amazon Professor of Machine Learning
Statistics

Carlos Guestrin

Amazon Professor of Machine Learning
Computer Science and Engineering

About the University of Washington

Founded in 1861, the University of Washington is one of the oldest state-supported institutions of higher education on the West Coast and is one of the preeminent research universities in the world....

About the 'Machine Learning' Specialization

This Specialization from leading researchers at the University of Washington introduces you to the exciting, high-demand field of Machine Learning. Through a series of practical case studies, you will gain applied experience in major areas of Machine Learning including Prediction, Classification, Clustering, and Information Retrieval. You will learn to analyze large and complex datasets, create systems that adapt and improve over time, and build intelligent applications that can make predictions from data....
Machine Learning

Frequently asked questions

  • Once you enroll for a Certificate, you will get access to all videos, quizzes, and programming assignments (if any). Peer-reviewed assignments can only be submitted and reviewed once your session has begun. If you take the course without paying, some assignments may not be available to you.

  • When you enroll in the course, you get access to all of the courses in the Specialization, as well as the opportunity to earn a certificate of completion. After you successfully complete the course, an electronic certificate will appear on your Accomplishments page; from there you can print it or add it to your LinkedIn profile. Simply viewing the course content is free.

More questions? Visit the Learner Help Center.