Nov 26, 2017
Fantastic introduction to deep NNs starting from the shallow case of logistic regression and generalizing across multiple layers. The material is very well structured and Dr. Ng is an amazing teacher.
May 30, 2019
I have learnt a lot of tricks with numpy and I believe I have a better understanding of what a NN does. Now it does not look like a black box anymore. I look forward to see what's in the next courses!
by: Zaur G•
May 14, 2021
I think the course overall is very bad and discouraging. There is almost no connection between the video lessons and the programming assignments. Instead of writing so many formulas during the lessons, the instructor could have spent time explaining some of the code (it's very difficult to understand the tasks from the descriptions alone). During the second week the instructor explained a little bit of the code, but after that there was no more connection between the videos and the assignments. Overall I'm very disappointed.
by: Domagoj K•
Aug 18, 2017
I am very disappointed with this new course concept where you have to pay $43 a month to be able to solve a quiz. Coursera used to be famous for its free courses, and now it just removes free features over time. It has become another site with expensive courses. I watched the first week's lectures, and this is probably the last time I will enroll in a Coursera course.
by: Manish S•
Dec 31, 2019
This course is more spoon-feeding; I liked the introduction to neural networks in the "Introduction to Machine Learning" course better.
by: Maxence A•
Oct 29, 2017
The programming exercises are nice, but the lectures are mainly about very basic linear algebra.
by: Joseph K•
May 20, 2018
It will be a good course when you dump the Jupyter notebooks.
by: Felix F•
Dec 19, 2017
Giving a low grade for the ongoing delays of Course 5.
Oct 22, 2017
by: Long H N•
Dec 10, 2017
by: Amit W•
Sep 30, 2018
Hello Andrew Ng Sir & Coursera Team,
Tell your instructors about yourself.
My name is Amit Wadhe. I am a software engineer working at Walmart in Bangalore, India, with 4 years of experience. Prior to Walmart I worked for Morgan Stanley. I have a Bachelor of Technology in Computer Science and Engineering. I have been passionate about computers since my school days. Out of curiosity, I took my first C language class in 10th standard (school), commuting a total of 180 km daily by train for a month from my hometown to Akola city, since at that time no computer courses were offered in my hometown. After my schooling, I decided to go into the Computer branch of engineering. I think that is enough about me in short.
Why did you take the course? How has it helped you?
I have worked mainly on Java applications for the last 4 years professionally. In the last couple of years I realized that it's not something that excites me, not something I wanted to work on. I wasn't sure what I wanted to work on or what excites me. I had been hearing bits and pieces about Machine Learning and Artificial Intelligence for a long time from friends and colleagues. My perception of AI was that it's something big, rocket science, something not for a normal professional. But the real trigger came when I saw a video about a self-driving car in Silicon Valley. That's when I felt: yes, I want to work on something like this, something that can be useful in real, day-to-day life. I started searching for ML courses on Google and saw multiple courses on Udemy and Coursera, and I read feedback about some of them. I first started with some Udemy courses on ML for beginners, but they focused only on how to code rather than how things work internally. I was interested in knowing how something works internally rather than just the coding part; as a Java developer, I knew coding was not a big deal. So I was curious about how ML models work internally and what the mathematics behind them is; I have been interested in mathematics since school, though I did not score at the top. Then I started with ML by Andrew Ng on Coursera. After completing the course, I felt: yes, this is what I was looking for. After completion, my curiosity about deep learning took a deep dive, and I started looking for more courses by Andrew Ng on Deep Learning.
This course helped clear up my understanding of how a neural network works mathematically. I knew bits and pieces about neural network steps like forward propagation and backward propagation, but that was partial knowledge. After completing the course I got that satisfied feeling: yes, I know it now, I understand it inside and out.
What did you love about the course? Tell them!
"I loved Andrew Ng Sir's bottom-up approach of explaining concepts and unveiling the treasure."
Irrespective of background, I think anyone can understand the course with some knowledge of matrices and linear algebra. Briefly recalling the required knowledge from previous slides before diving into a concept, and the pace of the course, both help you grasp the concepts easily. Very intuitive examples help you understand concepts faster. The example I liked most is the neural network model for housing price prediction, where Andrew Sir gave an intuition for the hidden layers that really connects to real-life examples.
by: Sarah R•
Dec 29, 2018
This course was insanely clear and meticulously constructed. As someone who does data science work professionally, I so appreciated the thought that went into the design of the videos and the programming assignments. You are seeing really exemplary code and also really sophisticated use of the Jupyter notebook! Also, the test cases are so well-constructed. You really get to *see* all of this stuff working or not with the carefully designed helper functions that allow you to visualize the decision boundaries and view training examples. Of course, the writing of these helper functions is no small feat. IT WILL NOT BE LIKE THIS WHEN YOU CAST OUT ON YOUR OWN. But, what this course does for folks (like me) who didn't have the benefit of a course like this in their formal schooling (perhaps they are too old and this stuff only got well-organized and codified more recently) is provide exemplars. Will your code always look like this for everything you build? No. But it shows you, using the exact technology that you are likely to employ professionally (TensorFlow is coming up in the next course), what is possible. I look forward to the rest of the specialization.
A note on the pacing: Perhaps because I am already very familiar with python, numpy, and Jupyter notebooks, I was able to complete this course in about two days (rather less than 4 weeks). However, I still got a ton out of it. I think it is paced the way it is so as to be viewed as more accessible by everyone, and also not with the assumption that you want to dedicate the majority of a weekend to it. Probably also there is something to the psychology of completing it so very ahead of schedule that the designers of this specialization are not altogether unaware of. But, if you, like me, know that you want a refresher on neural nets that is going to be practical and useful, in that it will help you both implement them AND understand what you're doing, this is a quick and effective way to jump back in.
Finally, since this is such a quick course, I really recommend NOT skipping it, even if you want to get to the more advanced topics in the rest of the specialization quickly. The course is so thoughtfully designed and concepts are introduced in a very specific and intentional way to make sure you understand each step before the course progresses. Based on having experienced this careful design, I expect the notational and programming conventions established in this course will make the next courses in the specialization more accessible.
In conclusion, I think this is the best online course with integrated programming exercises I've ever taken. It might be a standard-bearer for the whole field. Well done!
by: Jeremy W G•
Apr 25, 2018
Copy&Paste from the survey I wrote earlier.
In 2012, I graduated with a statistics degree (BS) from the Midwest, where many companies hire data scientists to do simple analytics work. With my dream to do more predictive modeling work, I decided to go to the west coast and join the University of Washington to learn statistics in the master's program. One reason was that UW offered a great statistics program in which most students chose to continue into the Ph.D. program. The other reason was that Seattle had a few great high-tech companies for me to explore opportunities at. However, although the MS program gave me a strong background in statistics theory, I found the industry moved so fast that my knowledge was falling behind industry needs. In 2013-2014, I took Andrew's ML course on Youtube, and Amazon hired me as a data scientist in the marketing department of the Cloud Computing division (AWS). I figured that as a stats major I didn't have the knowledge in cloud computing or marketing, so in 2015 I took Coursera's big data specialization offered by UC San Diego, and the digital marketing specialization from UIUC. Later, I found another ML job at Amazon, using a lot of big data tools (Hadoop, Spark, etc.) on AWS. After a year of settling down in San Francisco, this year I decided to pick up knowledge in deep learning. The first course of DL was fundamental but contained so much information that I sometimes needed to review it several times, because I had forgotten many statistical theories from school. I thought it'd be a very hard course, but Andrew did a great job designing the curriculum so that the theory and the application have a great balance for working people like me to start with. The amount of homework was much easier than I anticipated. I think students who want the real challenge of coding should hide Andrew's hints and write their own functions. Overall, I like the Coursera courses and will continue to learn.
Jun 11, 2019
This course is great! I wish they would release a new version where the math is explained visually instead of just handwritten by Dr. Ng. I think having to work with a small tablet really hampered his ability to develop the ideas, as he was always trying to pack a lot of information onto one iPad screen. I would think he could just stand in front of a whiteboard and write on it, maybe hiring a sound technician this time, because despite the really high-quality content of this course, the audio is terrible; with the iPad screen not really doing justice to the writing, it takes multiple viewings to figure out what's going on.
I would also suggest that Dr. Ng explain which is which when he uses Y vs. y and X vs. x... I'm sure it's crystal clear in his mind, but for newbies like me it can be confusing at times when they write x but mean X (and vice versa)...
I still think this course is brilliant and it really cleared many concepts in my mind. It answered a lot of questions I've had after watching the fast.ai course. So if you're doing the fast.ai courses, you should definitely at least audit the deep learning.ai specialization courses and tbh, $50/mo is a steal for the calibre of information that is on offer (video/audio and ipad issues notwithstanding )
Work through it and you will find it extremely rewarding! Don't give up; keep going, and if you feel frustrated, take a break and rewatch the videos the next day after a good night's sleep. It really helped me to watch and rewatch the video lectures, take the quiz, fail, and come back to understand why I couldn't answer the questions. Good luck to all, and thank you to Dr. Ng for making this available to us free of charge (if we wish to audit). I would buy the specialization though, since it is worth every penny and then some!
by: Dave J•
Feb 9, 2020
Good introduction to implementing shallow and deep neural networks in Python. If you have no knowledge of neural networks or Python, I'd suggest doing a little preparatory study first so that you know what a neural network is and feel comfortable writing short Python programs.
Theory: the course is not heavy on machine learning theory. I had covered the theoretical parts previously in other courses. This course provided a useful summary of these and left me feeling confident that I had a good overview.
Maths: this course doesn't place great emphasis on the mathematics. It shows you the relevant equations, with the emphasis on understanding the underlying concepts rather than going through detailed derivations. Sometimes there's an optional extra video going through the equations in a little more depth. A frequent message is: don't worry if you don't understand all the mathematical detail, you can still learn to implement neural networks effectively.
Implementation: the course uses the Python NumPy library throughout. It does not go into deep learning frameworks such as TensorFlow or PyTorch. From the outset, you are taught to use NumPy in an efficient ("vectorized") way. The programming exercises are well thought through and I found that they all worked smoothly, a pleasant change from some other courses elsewhere.
Overall I found this to be a gentle but satisfying introductory course to the Deep Learning specialisation. Andrew Ng is an excellent teacher. His manner is both calm and enthusiastic and he clearly cares about equipping students with the skills that they need and doing so in an accessible way. The optional "Heroes of Deep Learning" interviews were particularly interesting, full of gems and hints about what could lie ahead if you decide to go more deeply into the field.
by: Dejan Đ•
Nov 6, 2017
TL;DR: Very much worth taking if you're looking to get into the field, develop (much) deeper understanding of the underlying theory and the necessary infrastructure.
I first gave it 4 stars and then changed to 5; let me tell you why. If you're reading this review, you are most likely considering taking this course and you very likely have some idea of what Deep Learning is supposed to be. You're also probably aware of the "black magic" stigma surrounding the field, and that it is going to take some time to get used to the way of thinking, even if you have some experience in "conventional" machine learning. Well, this course (read: its creators) also understands all of those points extremely well. With that in mind, the course caters to people who are making their first steps in the field of DL, people who are not expected to have a high degree of expertise in dealing with DL models, and especially not in creating them. Students are expected to understand about 85% of the underlying theory in order to get the models working (the rest is mostly calculus needed for deriving certain more difficult gradients), and the coding assignments include a considerable amount of hand-holding. That fact made me want to say the course was trivialized in a certain way, and it really is (but don't let this discourage you; you will still need to implement all of the key parts, and do take your time to really understand what they do). But then I thought about it again and concluded that I most likely would have struggled to complete the course otherwise. Andrew Ng and the deeplearning.ai team had a wonderful approach to teaching this course; it kept me coming back for more, and I cannot wait to start the following courses in the specialization.
by: Yuri C•
Jan 22, 2021
I have recently completed the NLP specialization and decided to get a good introduction to the fundamentals of deep learning. I was very satisfied with the NLP courses, so I chose to do the DL specialization as well. Andrew Ng is an amazing educator. The material is very well composed and thought through. I had already studied DL from diverse sources, and I must say the formalism presented in this course is to date by far the best one I have seen. As a mathematician, I know that notation is power. Good notation will save you a lot of time and help you quickly understand and generalize concepts. Andrew Ng knows that too, and put a lot of effort into making the whole course very precise and without any loose ends. This makes the learning very rewarding and easy to follow. The combination of mathematical formalism and intuition is on point and will help learners from a more programming-oriented background as well as those from a more mathematical one. This is usually hard to achieve: the correct balance between practice and conceptual definitions. Everyone entering the field should be introduced to it with this choice of formalism and presentation. Congratulations on developing this notation! The only critique I must make is the choice of notation for the partial derivatives of the cost function. I plead for reintegrating dL/dW into the notation, because it makes clear with respect to what we are taking derivatives and makes it easier to get your mind around the chain rule. I hope this will be integrated in a new version of the course. But again, this is a small detail in the big picture. Awesome work! Thanks for providing this to the public at such a price.
by: Kevin M•
Apr 10, 2020
Terrific course with 4 solid weeks of learning. The journey includes logistic regression for classification, shallow neural networks, deep neural networks, and building your own picture classification NN.
The virtual classroom lectures, quizzes, and programming assignments test your knowledge every week.
Topics include NN initialization, forward propagation, the cost function, loss, backward propagation, gradient descent, and prediction: using your trained model to classify pictures (in this case, cats).
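The steps listed above fit together as a single training loop. As a rough, hedged sketch (this is not the course's actual assignment code, just a minimal logistic-regression version of the same pipeline with made-up function names):

```python
# Minimal sketch of the pipeline: initialize, forward-propagate,
# compute the loss gradient, back-propagate, update with gradient
# descent, then predict. Function names here are illustrative only.
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def train_logistic(X, y, lr=0.5, iters=2000):
    """X: (n_features, m) examples; y: (1, m) labels in {0, 1}."""
    n, m = X.shape
    w = np.zeros((n, 1))           # initialization
    b = 0.0
    for _ in range(iters):
        A = sigmoid(w.T @ X + b)   # forward propagation
        dZ = A - y                 # backward propagation (dJ/dz)
        dw = (X @ dZ.T) / m
        db = dZ.sum() / m
        w -= lr * dw               # gradient descent update
        b -= lr * db
    return w, b

def predict(w, b, X):
    # prediction with the trained parameters
    return (sigmoid(w.T @ X + b) > 0.5).astype(int)
```

On a tiny separable toy set, e.g. `X = [[0, 1, 2, 3]]` with labels `[[0, 0, 1, 1]]`, the loop recovers a decision boundary between 1 and 2; the course's assignments apply the same structure layer by layer to deep networks.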
The Calculus and Linear Algebra underlying the algorithms are explained in a way that builds a solid foundation without requiring deep knowledge of the math behind the activation functions (sigmoid, ReLU, and tanh). A good understanding of matrix math, especially matrix multiplication, helps in navigating the course.
The programming assignments use Python with NumPy and are conducted in a Coursera-hosted Jupyter notebook. I strongly recommend that beginners take the Python tutorial, as syntax challenges can burn a lot of effort and take away from the NN learning experience. Also be mindful of stability issues that can cause erroneous results (requiring kernel restarts) and lost work due to failed auto-saves.
The volunteers that help on the message board are quite good! Thanks Paul!
Finally, Professor Andrew Ng truly knows his stuff, presents in an understandable way, and communicates the excitement he has for the topic. Having taken a previous machine learning course (Stanford Machine Learning also offered by Coursera), Professor Ng is a world class instructor and Data Scientist
Best of luck!
by: Sebastian S•
Dec 15, 2017
I found it very helpful, as it confirmed most of the things I had already learned by doing deep learning projects on my own, browsing additional literature on machine learning / deep learning, and doing some internships where I had to apply these things. So for me personally, this course did not teach me anything new, but it organized and structured the knowledge in my head nicely by summarizing it very neatly. Also, some of the hints on implementation were helpful (like the NumPy reshaping issue with arrays of shape (n,) as opposed to (n, m)). One thing I found is that deep learning can only really be understood if the coverage of backpropagation includes the low-level derivatives and chain-rule discussion; otherwise, you don't really "understand" what's going on. I appreciate that the course (just like the original "Machine Learning" one, which was excellent) tries to reach a broad audience that does not necessarily know analysis to the extent required for backprop, but maybe it would be a nice idea to include a "mathematician's point of view" on backprop as an optional part. In my personal studies, looking at backprop from the pure analysis point of view helped me a lot in "demystifying" deep learning and seeing it for the optimization approach that it is. Having said that, I found the course very nicely structured, with very clear explanations and relatable applications. Thanks to Coursera and Andrew for providing this great source of knowledge for free; I really appreciate these efforts! Sebastian
PS: I gave it 4/5 stars, but for some reason the rating keeps getting stuck on 5.
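The shape-(n,) reshaping hint the review mentions is easy to demonstrate. A small hedged illustration of why the course warns against rank-1 NumPy arrays:

```python
# Rank-1 arrays of shape (n,) behave unlike explicit (n, 1) column
# vectors: transposing them does nothing, and products collapse to
# scalars instead of outer-product matrices.
import numpy as np

a = np.random.randn(5)         # shape (5,): rank-1 array
col = a.reshape(5, 1)          # shape (5, 1): explicit column vector

assert a.T.shape == (5,)       # transpose is a no-op on rank-1 arrays
assert col.T.shape == (1, 5)   # transpose works as expected here

assert (a @ a).ndim == 0       # collapses to a scalar
assert (col @ col.T).shape == (5, 5)  # the outer product you intended
```

This is why the assignments encourage explicit reshapes (or `keepdims=True`) so every vector has an unambiguous two-dimensional shape.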
by: Sven A•
Dec 24, 2021
Lectures and programming assignments were great! All the math behind Deep Learning was explained intuitively and illustrated with good examples. I liked the fact that the lecturer went into details for those who are interested. I really recommend these additional materials, even if you have a hard time understanding the calculus and linear algebra. The programming assignments had thorough explanations and detailed guidance on how to approach each exercise. I would say that at some point the explanations gave away the answer (if you were well acquainted with the materials, of course). I liked that during the programming assignments I had to implement every function myself. This really helped me gain a general understanding of Deep Learning. And thanks for having tests in the assignments! They helped a lot, because I was able to debug my code before moving on.
There is only one suggestion I would like to make: perhaps it would be easier for students if there were smaller programming assignments right after a topic is covered in a lecture. E.g., the forward propagation lectures would have a small programming assignment where one implements a few examples of forward propagation. The benefit I see is learning through repetition: it is easier to learn material by repeating small chunks at a time. Moreover, the programming assignment at the end of the week would no longer be so difficult to approach, and would serve as a final repetition where the practiced and known material is tested.
by: Ryan F•
Dec 31, 2017
This was a very well-thought-out course for beginners in Neural Networks / Deep Learning. Andrew Ng sets a good pace; I was able to complete each week's lecture videos and assignments in less than 10 hours. Lectures were always clear and often went over things which would not directly be needed for assignments, but which will be useful to anyone planning to do work in this field. Andrew Ng was also very good about explaining where the mathematical equations came from, while stressing that it's not super-important to understand fully where they come from, as long as you're able to implement them.
I should add that I'm probably not the typical audience for this class --- I have an extensive math background but only just started programming a few months ago. The Python code was scaffolded and commented in such a way that even a newbie to programming can follow and complete the coursework, and I can say I've not only learned about NN/DL algorithms, but also a good deal about programming in Python as well. One major topic that still blows me away is the speed boost we get from avoiding for loops and using vectorization instead.
The post-assignment interview videos were also interesting. Andrew Ng would interview a guest 'powerhouse' at the end of each week, and the topics covered there often went way beyond the scope of this individual course, giving a much broader overview of where we are now and where we seem to be heading in the near and long term.
by: Dilip R•
Mar 16, 2020
This is a wonderful course. I have been reading passively for about a year on resources related to ML and DL, but never got a full grasp of the concepts the way Prof. Andrew explained them. The quizzes were entertaining and insightful, as were the programming examples.
I completed this 4-week course in about 2 days straight; some of the quizzes were 70/100 on my first try, but got to 100/100 after 1-2 tries. On the programming assignments I got 100/100 on the first try (except for the first one, which didn't register my last 3 code answers even though I typed and ran them correctly; I had to restart the kernel, launch it again in incognito mode, and re-run all the snippets one last time after I was done, just to be sure).
The hardest part of the course for me was understanding derivatives and the overall calculus development and factorization, because I took the necessary classes a few years ago, and honestly I wasn't very good at it back then either.
One thing I would suggest is improving the audio quality, as well as editing the videos instead of providing a warning message before a video with errors, because sometimes it's hard enough to follow the course material, let alone spot the error itself.
Again, I would like to thank Professor Andrew Ng and the Deeplearning.ai team, as well as the Coursera platform, for providing such great real-time capabilities like the Jupyter notebook and the automatic grading system.
by: Baili O•
Jan 23, 2018
This is a great course which covers some popular machine learning techniques such as regression, and then moves to deep learning with neural networks, with the techniques of forward and backward propagation. It is a good course for beginners, and the homework is fairly easy. However, the course still leaves some unanswered questions which might be covered in future courses, such as: how to select hyperparameters, why we choose this specific cost function, whether there are deep learning frameworks other than the neural network structure, and whether there are applications other than image recognition. In addition, for those who have some background in machine learning, the interview section is a bonus; it talked about GANs.
Some things were not very enjoyable as well. For example, the notebooks can't be downloaded, so people have to write things down; otherwise, once the course expires, it is quite hard to get the course material. The lecture notes are badly organized: you have to download the slides one by one (or I just didn't find the right place to download them all). Thirdly, the course didn't say much about the techniques the industry is using. What I am trying to say is that I don't know whether the techniques in this course are applicable in industry, whether they are too simple, too old, etc.
Overall, it is a very fun and educational course. I can't wait to jump into the next course.
by: Melissa B•
Aug 16, 2020
This course has been very interesting and engaging for me. Dr. Ng explains everything very thoroughly and provides compelling examples of real world application for the material. Occasionally he can be a bit redundant, but I found that helpful, since sometimes it takes more than one pass through the material to understand it clearly. I am also taking his Machine Learning course (Stanford) concurrently and I found the courses to complement each other very well.
Additional practical notes: The linear algebra involved in the course is relatively basic and explained thoroughly enough that it can be picked up along the way if you don't have much math background. Also, the programming assignments help expose you to Python syntax. If you don't have much Python experience, use the discussion forums. If you do, you'll find the assignments to be incredibly easy, as they have ample starter code with a few plug-and-chug "your code here" sections. These are mostly just to demonstrate how the material is applied. The primary value in the course is conceptual. As someone with very little coding experience, I appreciate how thoroughly the code was annotated, so I could get a grasp of what it was doing without having to struggle through producing it myself.
Overall, I really loved this course, and I learned so much!
by: Felipe P C N•
Jul 10, 2020
The course combines a comprehensive approach, starting from intuitive/elementary concepts, with respectable depth, carefully building up more advanced techniques. The lectures are extremely didactic and well crafted, which makes becoming familiar with the concepts of neural networks (and the mathematics behind them) more natural. The assessments have both a theoretical component (quizzes, aimed mostly at checking knowledge) and a practical one (programming exercises). As positive as the practical component is, I would say it could be a little less "guided": in practice, what they ask is for the student to "fill in the blanks" in a program already (very well) structured (and practically completed) by the instructors. I felt the need to complement these exercises with attempts of my own, programming the algorithms from scratch, to really make sure I can implement what I learned. That step is noticeably harder than the course's programs (after all, implementing a project from scratch is inherently challenging). Even so, I see this kind of "initiative", taking what you learned to some application or practice beyond what the course asks, as something that ideally should be done when studying any subject.
Feb 28, 2020
This course goes from the basics to the advanced, leading us to understand deep learning. From the initial logistic regression, to the shallow neural network, and finally to the deep neural network, we gradually learned the representation and computation of neural networks, and finally implemented a binary classifier for cat image recognition. The course is very clear and logical, eliminating tedious mathematical derivation while still letting us understand all the mathematical details, including the calculations and vectorization. The assignments are done step by step, starting from basic functions and gradually encapsulating them until they constitute a complete neural network, which enables students to gain a deep understanding of neural networks and master the knowledge through practice. It is worth mentioning that the course's difficulty gradient is reasonable: details that are hard to understand earlier in the course do not need to be grasped all at once. Later lessons repeatedly deepen the understanding of earlier knowledge points, and thanks to the foreshadowing of other knowledge points, provide a more complete and comprehensive supplement to what came before. Looking forward to series two.