
# Learner Reviews for Mathematics for Machine Learning: Multivariate Calculus from partner Imperial College London

4.7
stars
4,775 ratings
850 reviews

## About the Course

This course offers a brief introduction to the multivariate calculus required to build many common machine learning techniques. We start at the very beginning with a refresher on the “rise over run” formulation of a slope, before converting this to the formal definition of the gradient of a function. We then start to build up a set of tools for making calculus easier and faster. Next, we learn how to calculate vectors that point uphill on multidimensional surfaces, and even put this into action using an interactive game. We take a look at how we can use calculus to build approximations to functions, as well as how it helps us quantify how accurate we should expect those approximations to be. We also spend some time talking about where calculus comes up in the training of neural networks, before finally showing you how it is applied in linear regression models. This course is intended to offer an intuitive understanding of calculus, as well as the language necessary to look up concepts yourself when you get stuck. Hopefully, without going into too much detail, you’ll still come away with the confidence to dive into some more focused machine learning courses in the future.
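The “rise over run” formulation the description opens with can be sketched in a few lines of Python. This is an illustrative example written for this page, not code from the course materials:

```python
# "Rise over run": the slope estimate converges to the exact derivative
# as the run (step size h) shrinks.

def rise_over_run(f, x, h):
    """Forward-difference slope of f at x with step h."""
    return (f(x + h) - f(x)) / h

f = lambda x: x ** 2          # example function; exact derivative is 2x
exact = 2.0 * 3.0             # f'(3) = 6

for h in (1.0, 0.1, 0.001):
    approx = rise_over_run(f, 3.0, h)
    print(h, approx, abs(approx - exact))
```

Shrinking `h` drives the estimate toward the formal definition of the gradient that the course then builds on.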

## Top Reviews

JT
Nov 12, 2018

Excellent course. I completed this course with no prior knowledge of multivariate calculus and was successful nonetheless. It was challenging and extremely interesting, informative, and well designed.

SS
Aug 3, 2019

Very Well Explained. Good content and great explanation of content. Complex topics are also covered in very easy way. Very Helpful for learning much more complex topics for Machine Learning in future.

Filter by:

## 651–675 of 853 Reviews for Mathematics for Machine Learning: Multivariate Calculus

By Harsh D

Jun 26, 2018

good

By Roberto

Mar 25, 2021

thx

By Omar D

May 5, 2020

gd

By Aidana P B

Apr 26, 2021

щ

By Naga V B G

Aug 7, 2020

.

By Rinat T

Aug 1, 2018

The part about neural networks needs improvement (some more examples of simple networks, and an explanation of where the sigmoid function comes from). Exercises on partial derivatives should focus more on the various aspects of partial differentiation rather than on taking partial derivatives of complicated functions. I felt there was too much of the latter, which is not very efficient, because the idea of partial differentiation is easy to master, but its applications are not always. Just taking partial derivatives of sophisticated functions (be it for the sake of a Jacobian or Hessian calculation) turns into doing lots of algebra whose underlying idea has long been understood. So while some of the existing exercises on partial differentiation, the Jacobian, and the Hessian should be retained, about 50 percent of them should be replaced with exercises that are not heavy on algebra but instead demonstrate the different ways and/or applications in which partial differentiation is used. Otherwise, all good.
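For readers unfamiliar with the terms in this review: a partial derivative is just an ordinary derivative taken one variable at a time, and stacking them gives the Jacobian. A minimal numerical sketch, written for this page rather than taken from the course exercises:

```python
# Numerical partial derivatives, one variable at a time; stacking them
# forms the Jacobian of a scalar function of several variables.
import math

def partial(f, point, i, h=1e-6):
    """Forward-difference partial derivative of f with respect to variable i."""
    bumped = list(point)
    bumped[i] += h
    return (f(*bumped) - f(*point)) / h

# f(x, y) = x^2 * y + sin(y); exact partials are 2xy and x^2 + cos(y)
f = lambda x, y: x ** 2 * y + math.sin(y)

jacobian = [partial(f, (2.0, 0.0), i) for i in range(2)]
print(jacobian)   # close to [0.0, 5.0]: 2*2*0 = 0 and 2^2 + cos(0) = 5
```

The algebra-heavy exercises the reviewer describes are the symbolic analogue of this loop, carried out by hand.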

By yarusx

Apr 8, 2020

1) The course is delivered in thoroughly British English, with a number of words and phrases that are rarely used globally. 2) The pace of the course just wasn't suitable for me. If you don't have a strong math or engineering background, you will need to look for explanations elsewhere (Khan Academy is a great resource, etc.). Closer to the end of the course I stopped having a full understanding of what was going on and why. I could calculate things, but I don't feel I'll still be able to in 1-2 weeks, because I didn't have the time or opportunity to consolidate the skills I gained. 3) I also don't understand why the instructors (especially David) don't visualize what they're saying the way Sal or Grant do, drawing on the board and on the plots and so on. Sometimes it feels like you're just listening to an audiobook about math.

I will take the Stanford ML course after this one and also review what I've learned here with Khan Academy.

By Vitor R C

Sep 18, 2020

Another great introduction to the very hard subject that is Multivariate Calculus, including derivatives, yet still accessible enough for someone with a very limited mathematical background to understand.

One critique I have is the lack of a smooth progression between the examples used in the videos and the ones presented in the quizzes; sometimes the quiz questions are of an entirely different order of difficulty than the ones in the videos.

Another critique is the apparent dip in the quality of the video content in the last two "weeks" of the course. You can see this very clearly because these weeks have at most 20 minutes' worth of videos each, even though each is supposed to span an entire week, and the content is shallow, rushed, and hard to understand.

By JustsaiyanHS

Jan 4, 2021

A lot of the material Sam taught (the first 4 weeks) felt very intuitive; his metaphors before introducing each concept and the subsequent extrapolations into multivariate calc were easy to grasp. David teaches the last 2 weeks, and I could no longer use the course as a starting point. I felt he overestimated students' prior knowledge and paced the lectures a bit too fast, often introducing 3-4 concepts in a short tangent.

That being said, I made it through with relative ease. The examples and labs were great, and I used 3blue1brown / Khan Academy / calcworkshop (just the free lectures) to supplement my learning. I do have a good amount of prior CS experience, but most students should feel comfortable enough in the Jupyter environment.

By Jack C

May 31, 2020

Great course! It was a pleasure to learn Multivariate Calculus, and Sam Cooper was great! I was even able to understand Neural Networks, which I had always found confusing! However, surprisingly, the final two weeks taught by David Dye, on Optimisation and Regression, were not taught well. I did not understand how to use them in practice, mainly because Gradient Descent, an important algorithm, was not explained very well. This was so surprising because David Dye was amazing in Linear Algebra, where I understood everything very well. Thank you, Imperial College London, for this great course, and I hope you edit it to explain Gradient Descent better.

By Deborah S

Mar 25, 2021

It was really fun to actually see backpropagation and gradient descent working - thanks for a really fun experience. I'm sure some thought went into that. I DO regret that the Jupyter Notebooks aren't made available for download. I like to work in my local environment, and I spent a lot of time copying code, etc. I usually was able to get this working OK, but not for the backpropagation network from lesson 3 (the "learn to draw a heart" exercise).

Any chance you could send the notebook for that one lesson???

Anyway, thanks. I'm sure it's not easy designing courses where the audience is "assume knows nothing" coupled with "must teach something substantial". NOT easy!!

By Matteo L

Apr 20, 2020

This course is a great refresher for someone who has already studied these topics previously. The topics were very well illustrated and the objective of getting a good intuition of the math is achieved in my opinion.

I thought the examples like the neural network and the sandpits were great. That being said, I'd have liked to go a little bit deeper on the subject of optimization.

In general, I do feel that it would have been nice to have more practice on the topics (e.g. linear approximation and its use were not covered very thoroughly in my opinion). Also, the notebook assignments are far too easy and therefore don't add enough to the learning experience.

By Ronny A

Jun 27, 2018

Course is pretty good. I like how well thought out the assignments are, and the use of visualizations, even in the assignments, to enrich intuitive understanding. There were a couple of instances where the content wasn't clear and I turned to Khan Academy to clarify things for myself. The reason I give this course 4 stars rather than 5 is that the teachers or TAs did not seem responsive. Specifically, another person and I had posted in the discussion forum that one of the slides appeared to have a typo in the Jacobian contour plot. There was no official response to this.

By Tuan Q N

Feb 5, 2021

The highest level of math I took was Algebra 2 almost ten years ago. The professors are pretty good, but many times their examples would not be very clear in terms of what needs to be done. I had to go watch some extra YouTube videos to understand derivatives, and only then was I able to come back to the course and work my way through the assignments. My recommendation is: when walking students through problems, please provide more details on the steps you're taking. Otherwise, I'm quite happy with this course and I'm looking forward to the PCA module.

By Ryan B

Nov 24, 2020

A background in Mathematics is highly recommended before beginning this course. I learned these concepts 20+ years ago while completing my Engineering degree. They are presented so quickly here that I needed to do a lot of research to truly understand the concepts being presented. A great external resource for mathematics is the 3Blue1Brown channel on YouTube, where these concepts are brilliantly presented in a layman's format. Overall, I thought this course was a good way to link the concepts of Linear Algebra and Calculus to Machine Learning.

By Salem A

Jun 20, 2020

If you do not have a background in programming, some of the assignments will be intimidating and hard to do, but if you work through them sequentially you will get the hang of it, though it will take you time. The lectures are too short, and I feel that some concepts were not clarified enough because of how fast the lecturers go over them. The course, in general, is good for getting an overview of the material, so do not expect to cover these topics deeply. The presentation and the way some concepts were taught were enjoyable and enriching.

By Fang Z

Jul 11, 2019

I really love Samuel's teaching style. He strives to make people understand by showing a lot of graphs, and I could easily follow him step by step. However, I couldn't follow David's teaching as well, maybe because fewer explanations were given during the lectures.

In addition, I found that some quizzes involve a huge amount of calculation, and I really spent a lot of time verifying the answers.

Finally, I hope more detailed explanations could be given when I make mistakes in a quiz, so I could consolidate what I've learned so far.

Thanks,

Fang

By Hermes J D R P

Feb 28, 2020

The first 4 weeks of the course were amazing: great content, clear explanations, and fair, interactive assessment activities. However, the last 2 weeks weren't as good as the previous ones, which is why I don't give this course 5 stars. By and large, the first two courses of this specialization are the best resources available on the internet to learn the foundations of mathematics for Machine Learning. I recommend that instead of doing the last course, you try reading the related book written by Deisenroth.

By Christiano d S

Aug 3, 2020

This course contains good lessons, and the level of the assignments is proportional to what is being taught. There are some minor issues in some of the videos, but it's possible to clear up doubts in the forums. In general, I found this the best course by far compared to other Coursera courses, in which you have to spend a lot of time searching for extra information and content to complete the assignments. For the first time, I felt the instructors actually taught the content.

By Sergio A G

Oct 21, 2020

It starts brilliantly, but the last 2 weeks are quite bad. It has nothing to do with the new teacher taking over that part; I think he is as good as the other one. It's a matter of goals and focus. It seems like everything you learn in those weeks is just random things and little 'magic tricks'; it's hard to see why they're relevant to the subject, and everything seems disconnected.

Still, I really enjoyed the first 4 weeks. Awesome content, they made me realize I love calculus.

By Wu X

Apr 21, 2020

This course teaches multivariate calculus and its applications. In particular, the Jacobian and Hessian matrices are introduced as matrix-valued derivatives (first order and second order), along with gradient descent optimization based on them. The structure of the course is a little loose, so it's not a good choice for those seeking systematically arranged learning materials. But it's still worth taking for a better perspective and ideas.
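The gradient descent idea this review summarizes can be sketched in a few lines. This is an illustrative example written for this page (with an arbitrarily chosen function and learning rate), not code from the course:

```python
# Minimising f(x, y) = (x - 1)^2 + 2*(y + 2)^2 by repeatedly stepping
# against the gradient [2(x - 1), 4(y + 2)].

def grad(x, y):
    return 2.0 * (x - 1.0), 4.0 * (y + 2.0)

x, y = 5.0, 5.0     # arbitrary starting point
lr = 0.1            # step size (learning rate)
for _ in range(200):
    gx, gy = grad(x, y)
    x, y = x - lr * gx, y - lr * gy

print(x, y)         # converges toward the minimum at (1, -2)
```

The second-order information in the Hessian is what more refined optimizers use to choose the step size automatically instead of fixing `lr` by hand.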

By Saras A

Jan 29, 2020

Good course. I wish it had more sections, say a total of 12 sections or weeks, and more steps toward a thorough graphical understanding (and perhaps an even more mathematical/algebraic understanding, although overall that side is much easier for me...).

From a Data Science or Machine Learning perspective, Week 6 (linear and non-linear regression with chi-squared methods, etc.) was the most interesting.

By Donna D C

Apr 25, 2020

Nice balance between rigor and developing intuition (again, as in the previous linear algebra course in this series). I would've liked some "homework" reading about backpropagation for training the simple neural network, to prepare for the future courses, and also more references for additional reading on least-squares minimization techniques, to tie in more of the statistics underlying the methods. I love this stuff, thank you!!

By Dan L

Mar 30, 2019

The course accomplishes its goal of connecting concepts in calculus to machine learning, and is appropriately paced for students who have covered calculus in the past and are seeking a refresher or deeper understanding of its applications to real-world problems. For those who don't already have a certain minimum familiarity with the mathematics, however, the course will probably move at too fast a pace.

By Matt P

Jul 19, 2018

Great class - very informative and eye-opening, even with quite a bit of linear algebra background. Really liked the eigenvector and eigenvalue section - great descriptions. I wish the neural network discussion went on a bit further. I found some of the programming assignments' instructions a bit vague and confusing - what should have taken a few minutes ended up taking half an hour.