Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9 stars (62,800 ratings)

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will learn the best practices to train and develop test sets and analyze bias/variance for building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop and Adam, and check for their convergence; and implement a neural network in TensorFlow. The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI....
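To make the list of optimization algorithms above concrete, here is a minimal NumPy sketch of the Adam update rule the description mentions (it combines the Momentum and RMSprop ideas). The variable names and default hyperparameters follow common conventions and are not taken from the course notebooks.

    import numpy as np

    def adam_update(w, dw, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        # One Adam step for a parameter array w with gradient dw at iteration t (t >= 1).
        v = beta1 * v + (1 - beta1) * dw             # Momentum-style first moment
        s = beta2 * s + (1 - beta2) * dw ** 2        # RMSprop-style second moment
        v_hat = v / (1 - beta1 ** t)                 # bias correction
        s_hat = s / (1 - beta2 ** t)
        w = w - lr * v_hat / (np.sqrt(s_hat) + eps)  # parameter update
        return w, v, s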

Top reviews

JS

Apr 4, 2021

Fantastic course, and although it guides you through the material (and may feel less challenging to some), it provides all the building blocks for you to later apply them to your own interesting project.

AM

Oct 8, 2019

I really enjoyed this course. Many details are given here that are crucial to gaining experience, along with tips on things that look easy at first sight but are important for faster ML project implementation.



By Brennon B

•

Apr 23, 2018

Walking away from this course, I do *not* feel adequately prepared to implement (end-to-end) everything that I've learned. I felt this way after the first course of this series, but even more so now. Yes, I understand the material, but the programming assignments really don't amount to more than "filling in the blanks"--that doesn't really test whether or not I've mastered the material. I understand that this is terribly hard to accomplish through a MOOC, and having taught university-level courses myself, I understand how much effort is involved in doing so in the "real world". In either case, if I'm paying for a course, I expect to have a solid grasp of the material after completing it, and though you've clearly put effort into assembling the programming exercises, they don't really gauge this on any level.

Perhaps it would be worth considering a higher course price in order to justify the level of effort required to put together assessments that genuinely put the student through their paces, so that a "100%" mark genuinely reflects, both to you and to the learner, that they have truly internalized and mastered the material. It seems to me that this would pay dividends not only for the learner, but also for you as the entity offering such a certificate.

By oli c

•

Dec 9, 2018

Lectures are good. Quizzes and programming exercises are too easy.

By Alan S

•

Sep 30, 2017

As far as the video lectures are concerned, the videos are excellent; they are the same quality as the other courses from the same instructor. This course contains a lot of relevant and useful material, is worth studying, and complements the first course (and the free ML course) very well.

The labs, however, are not particularly useful. While it's good that the focus of the labs is on applying the actual formulas and algorithms taught, and not really on the mechanical aspects of putting the ideas into actual code, the labs provide essentially all of the "glue" and just leave you to translate formulas into the language-specific constructs. This makes the lab material so mechanical as to almost take away the benefit.

The TensorFlow section was disappointing. It's really difficult to learn much from a 15-minute video lecture and a lab that basically does everything (and, oddly, for some things leaves you looking up the documentation yourself). I didn't get anything out of this lab other than a taste of what TensorFlow looks like. What makes it even worse is that the TensorFlow framework uses some different jargon that is not really explained, but the relevant code is almost given to you, so understanding it doesn't matter for getting the "correct" answer. I finished the lab not feeling like I knew very much about it at all. It would have been far better either to spend more time on this or to omit it entirely.

As with the first course, it is somewhat disappointing that lecture notes are not provided. They would be handy as a reference to refer back to.

Despite these flaws, there is still a lot of good material to be learned. This course could have been much better, though.

By Lien C

•

Mar 31, 2019

The course provides very good insight into the practical aspects of implementing neural networks in general. Prof. Ng, as always, delivered very clear explanations of even the difficult concepts, and I have thoroughly enjoyed every single lecture video.

Although I very much appreciate the effort put in by the instructors for the programming assignments, I can't help thinking I could have learnt much more if the instructions were *LESS* detailed and comprehensive. I found myself just "filling in the blanks" and following step-by-step instructions without needing to think too much. I'm also slightly disappointed with the TensorFlow practical assignment, where everything is pretty much written out for you, leaving you with less room to think and learn from mistakes.

All in all, I think the course could have made the programming exercises much more challenging than they are now and allowed students to learn from their mistakes.

By NASIR A

•

Jan 14, 2020

After completing this course, I know which values to look at if my ML model is not performing up to the task. It is a detailed but not overly complicated course for understanding the parameters used in ML.

By Xiao G

•

Oct 31, 2017

Thank you Andrew!! I am now starting to use TensorFlow; however, this tool is not well suited for research purposes. Maybe PyTorch could be considered in the future!! And let us know how to use PyTorch on Windows.

By Alex M

•

Oct 9, 2019

I really enjoyed this course. Many details are given here that are crucial to gaining experience, along with tips on things that look easy at first sight but are important for faster ML project implementation.

By Md. R K S

•

Apr 15, 2019

Excellent course. When I learned about implementing ANNs using Keras in Python, I just followed some tutorials but didn't understand the tradeoffs among the many parameters like the number of layers, nodes per layer, epochs, batch size, etc. This course is helping me a lot to understand them. Great work, Mr. Andrew Ng. :)
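For readers unfamiliar with the hyperparameters this reviewer lists, here is a rough Keras sketch (assuming TensorFlow 2.x and a made-up input size of 20 features) showing where the number of layers, nodes per layer, epochs, and batch size appear. It is an illustration, not code from the course.

    import tensorflow as tf  # assumes TensorFlow 2.x with the Keras API

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),  # 64 nodes in this layer
        tf.keras.layers.Dense(64, activation="relu"),                     # adding layers increases depth
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # epochs and batch_size are the training-loop hyperparameters the reviewer mentions:
    # model.fit(X_train, y_train, epochs=30, batch_size=64, validation_split=0.1)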

By Abhishek S

•

Apr 19, 2020

Very good course that gives you deep insight into how to enhance your algorithm and neural network and improve its accuracy. It also teaches you TensorFlow. Highly recommended, especially after the 1st course.

By Carlos V M

•

Dec 24, 2017

Exceptional course. The hyperparameter explanations are excellent; every tip and piece of advice provided helped me so much to build better models. I also really liked the introduction to TensorFlow.

Thanks.

By Matthew G

•

Apr 17, 2019

Very good course. Andrew really steps it up in part two with lots of valuable information.

By Yuhang W

•

Nov 25, 2018

Programming assignments are too easy.

By Anand R

•

Feb 17, 2018

To set the context, I have a PhD in Computer Engineering from the University of Texas at Austin. I am a working professional (13+ years), but just getting into the field of ML and AI. Apologies for flashing this preamble for every course that I review on coursera.

This course is the 2nd in a 5-part series on deep learning offered by Dr. Andrew Ng on Coursera. I believe it is useful to take this course in order, and it makes sense as part of the series, though technically that is not necessary.

The course covers numerous tuning and optimization strategies to help speed up, as well as improve the quality of, the machine learning output. It is very well planned and comprehensive (to the extent possible) -- and gives the student a very powerful toolbox of strategies for attacking a problem.

The instructor videos are very good, usually about 10 minutes long, and Dr. Ng tries hard to provide intuition using analogies and real-life examples. The quizzes that accompany the lectures are quite challenging and help ensure that the student has understood the material well. The programming exercises are the best part of the course. They help the student practice the strategies and also provide a jump-start for using the code on their own problems at work or in school.

Overall, this is an excellent course. Thank you, Dr. Ng and the teaching assistants. Thank you, Coursera.

By Hernan F D

•

Dec 5, 2019

I enjoyed it; it is really helpful. I'd like to have the opportunity to implement all of this in depth in a real example.

The only thing that wasn't completely clear to me is batch norm; it is so confusing.

By Glenn B

•

May 31, 2018

The course material was great; however, the use of TensorFlow in the exercises requires more background than the short tutorial provides.

I get the dynamic aspect of writing the lecture notes in the videos; however, the lecture notes should be "cleaned up" in the downloadable files (i.e., typos corrected and typed up). Additionally, the notes written in the videos could be written and organized more clearly (e.g., a uniform directional flow across the page/screen rather than being fit randomly wherever there is space on the page).

By Abiodun O

•

Apr 6, 2018

Fantastic course! For the first time, I now have a better intuition for optimizing and tuning the hyperparameters used for deep neural networks. I got motivated to learn more after completing this course.

By Youdinghuan C

•

Dec 28, 2017

This is a logical continuation of the previous course. The 3-week topics were excellently chosen. Andrew did a great job of delivering the lectures. The programming assignments really reinforced my understanding. In particular, essential knowledge and equations from video lectures were reiterated in the programming assignments for review and ease of reference. The amount of work was reasonable, and the level of challenge was appropriate. I especially appreciate the instructional team for making this course open to the public.

By Alessandro T

•

Jan 22, 2018

A right balance between theory (you are required to know the models and code them from scratch) and practice (you get an overview of the frameworks available out there to put your code into production quickly and efficiently, and time is spent on practical aspects of the training phase).

A small "criticism": in the notebook, more than programming you just have to fill a template where a good part of the algorithm is already drafted for you. It is too much, students should be left scratching their heads a bit longer :)

By Hassan S

•

Apr 3, 2018

Andrew Ng and the team of teaching assistants for this class are obviously very, very determined not to leave any single major subject in deep learning without coverage. I have been using deep learning for the past couple of years, but I have to say that by completing the second course of this specialization, they helped me deepen my understanding, overcome the fear of implementing math and equations line by line, fix my intuitions about deep learning, and most importantly erase all the superstitions! Bravo and excellent job.

By Joseph S

•

Apr 5, 2021

Fantastic course, and although it guides you through the material (and may feel less challenging to some), it provides all the building blocks for you to later apply them to your own interesting project.

By Shah Y A

•

Oct 28, 2019

TL;DR: lectures are awesome, notebooks are bad.

The lectures by Prof. Ng are amazing, comprehensive, and intuitive. The prof starts from first principles of simple neural networks and goes on to show concepts like normalization, bias, variance, overfitting, underfitting, regularization, dropout, L1 and L2 regularization, exponentially weighted averages, stochastic, mini-batch and batch gradient descent, momentum, RMSprop, Adam optimization, batch normalization, and an intro to deep learning frameworks. He not only gives the mathematical foundations and code implementations of each concept, but spends a lot of time explaining the intuition behind it, so that we grasp the concept well. It's amazing how he starts from decades-old neural networks in the first video and, within 2-3 hours of lecturing, brings us to state-of-the-art deep learning models. Thank you Prof. Ng!

But the notebooks have many flaws. The lectures don't set you up for the programming needed in the notebooks. The descriptions in the notebooks lack a proper tutorial in many places, leaving students unprepared for the exercises that follow. Example: the Week 3 TensorFlow tutorial, in the sigmoid function exercise; the description above the exercise doesn't really teach you how to effectively use placeholders and variables. I was confused and had to go through the noisy discussion forum. Please fix it, and if you'd really like more constructive criticism from me, contact me at yasser.aziz94 (at da rate ov) gee mail dut com. (lol)
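For context, the exercise this reviewer refers to used the TensorFlow 1.x graph-and-session style. A minimal sketch of that pattern (a placeholder fed at run time, then evaluated in a session) might look like the following; this is an illustration of the old 1.x API, not the notebook's actual code, and it does not run on TensorFlow 2.x without the compatibility module.

    import tensorflow as tf  # assumes TensorFlow 1.x, as used in the original notebooks

    x = tf.placeholder(tf.float32, name="x")  # placeholder: a value supplied at run time
    y = tf.sigmoid(x)                         # a graph node; nothing is computed yet

    with tf.Session() as sess:
        print(sess.run(y, feed_dict={x: 0.0}))  # feeds x and evaluates y -> 0.5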

By Stefaan V w

•

Aug 21, 2019

The video material in this class is excellent, as usual. However, I feel that the coding assignments in this series are vastly inadequate. I already felt that there was a lot of "hand-holding" in the previous course, but the TensorFlow introduction in this class unfortunately takes things to another level. The assignment, which could be a very valuable exercise to get people acquainted with TensorFlow, amounts to copying and pasting a few lines of code, which did not teach me the skills required to approach any problem other than the specific image recognition example given.

By Ethan G

•

Oct 17, 2017

I did not think this was a great course, especially since it's paid. The programming assignment notebooks are very buggy, and the course mentors are of varying quality. It feels more than a bit unfinished. It also covers two completely different topics - tools for improving deep learning nets, and TensorFlow - and doesn't make much of an effort to integrate them at all. The course could have used at least one more week of content and assignments to better explain the point of TensorFlow.

By Hequn W

•

Apr 18, 2018

I can't open the Week 1 Initialization assignment and can't get any help from Coursera. I've completed all five courses but can't get the certification without this assignment....

By Văn H B

•

May 27, 2019

I would rate this course 4.5, but Coursera's system does not allow it.

Regarding the first and second weeks, Prof. Andrew's explanations of deep learning terms are very good, and the exam preparation is quite good for revising the lectures. I think the programming exercises should be more challenging and more thought-provoking for students, but it was okay for me after having some knowledge from the Machine Learning course. I suggest you finish the Machine Learning course before taking this one.

Regarding the third week, I expected a lot more about TensorFlow from Mr. Andrew, or maybe more intuition about it. Moreover, the Batch Norm explanation is quite hard to understand because we do not have any programming exercise for it, so I hope the teachers can prepare one alongside the programming exercise for TensorFlow.
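Since several reviewers found batch norm hard to grasp without a coding exercise, here is a minimal NumPy sketch of the batch-norm forward pass as described in the lectures: normalize the pre-activations Z over the mini-batch, then apply the learnable scale gamma and shift beta. The function name and epsilon default are my own choices, not taken from the course materials.

    import numpy as np

    def batch_norm_forward(Z, gamma, beta, eps=1e-8):
        # Z has shape (n_units, m): pre-activations for one mini-batch of m examples.
        mu = np.mean(Z, axis=1, keepdims=True)   # per-unit mean over the batch
        var = np.var(Z, axis=1, keepdims=True)   # per-unit variance over the batch
        Z_norm = (Z - mu) / np.sqrt(var + eps)   # normalize to zero mean, unit variance
        return gamma * Z_norm + beta             # learnable scale and shift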