
Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9 stars · 62,857 ratings

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will know best practices for setting up train, dev, and test sets and for analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step into the world of AI...
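
Since the description name-checks Adam among the Week 2 optimizers, here is a minimal numpy sketch of a single Adam parameter update; this is an illustration in the usual notation, not course-provided code, and the gradient values are made up:

    import numpy as np

    def adam_step(w, dw, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam update for parameter w given its gradient dw."""
        v = beta1 * v + (1 - beta1) * dw        # momentum (1st moment)
        s = beta2 * s + (1 - beta2) * dw ** 2   # RMSprop term (2nd moment)
        v_hat = v / (1 - beta1 ** t)            # bias correction
        s_hat = s / (1 - beta2 ** t)
        w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
        return w, v, s

    w, v, s = np.array([1.0, -2.0]), np.zeros(2), np.zeros(2)
    dw = np.array([0.1, -0.3])                  # a made-up gradient
    w, v, s = adam_step(w, dw, v, s, t=1)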

Top reviews

NA

Jan 13, 2020

After completion of this course I know which values to look at if my ML model is not performing up to the task. It is a detailed but not too complicated course to understand the parameters used by ML.

AS

Apr 18, 2020

Very good course to give you deep insight about how to enhance your algorithm and neural network and improve its accuracy. Also teaches you Tensorflow. Highly recommend especially after the 1st course


7,001–7,025 of 7,218 Reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By Yashika S

Sep 10, 2019

tough one

By Mor K

Aug 30, 2019

excellent

By Luis E O

May 17, 2019

Excellent

By IURII B

Apr 3, 2018

Thank you

By MD. E k

Apr 30, 2020

was good

By Suman D

Jul 27, 2018

Awesome.

By Davit K

Jul 13, 2018

easy bb

By 刘倬瑞

Nov 2, 2017

helpful

By Suraj P

Jul 17, 2020

Great!

By SUMIT Y

Jul 4, 2020

NICE!!

By qiaohong

Oct 27, 2019

The assignments are too easy.

By Sonia D

Jan 30, 2019

Useful

By DEEPOO M

Jul 18, 2020

great

By Johannes C

Aug 29, 2017

Good!

By Pallavi N

Jun 26, 2022

Nice

By Aditya S

Aug 9, 2019

good

By Łukasz Z

May 2, 2019

bugs

By Aakarapu S P

Jul 3, 2018

good

By Dheeraj M P

Feb 23, 2018

good

By Darwin S

May 19, 2022

ok

By Alexandru I

Jan 31, 2022

ok

By Mohamed S

Oct 20, 2019

e

By Joshua P J

Jun 7, 2018

I've loved Andrew Ng's other courses, but this course was boring and not well-organized. The lectures were unfocused and rambled a lot; they're nearly the opposite of the style of Prof. Ng's other material, which I found extremely well-organized. Most topics could be shortened 33–50% with no loss of clarity.

The course structure itself could use improvement:

The first part of Week 3 (Hyperparameter Tuning) belongs in Week 2.

The third part of Week 3 (Multi-Class Classification) should be its own week and its own assignment and could really be its own course. This is *THE* problem that almost every "applied" machine learning paper I've read is attempting to solve, whether by deep learning or some other class of algorithms. (Context and full disclosure: I'm a Ph.D. Geophysicist and my research is in seismology and volcanology.)

The introduction to TensorFlow needs to explain how objects and data structures work in TF. It really needs to explain the structure and syntax of the feed dictionary.

In the programming assignment for Week 3, there are three issues: (a) the correct use of feed_dict in 1.3 is completely new and cannot be guessed from the instructions or the TF website, and it's not clear why we use float32 for Y instead of int64; (b) in 1.4, "tf.one_hot(labels, depth, axis)" should be "tf.one_hot(labels, depth, axis=axis_number)"; (c) in 2.1, the expected output for Y should have shape (6,?), not (10,?).
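
For context, a minimal TensorFlow 1.x-style sketch of the feed_dict and tf.one_hot usage this review describes; the class count C and the variable names are illustrative assumptions, not the assignment's actual code:

    import numpy as np
    import tensorflow as tf  # TensorFlow 1.x semantics assumed

    labels = np.array([1, 2, 3, 0, 2, 1])
    C = 4  # number of classes; an assumed value for illustration

    # In tf.one_hot the third positional argument is on_value, not axis,
    # so the axis must be passed by keyword -- the review's point (b).
    labels_ph = tf.placeholder(tf.int64, name="labels")
    one_hot = tf.one_hot(labels_ph, depth=C, axis=0)  # shape (C, m)

    # Y is typically declared float32 rather than int64 because it is
    # compared against float32 network activations in the cost function.
    Y = tf.placeholder(tf.float32, shape=(C, None), name="Y")

    with tf.Session() as sess:
        # feed_dict maps placeholder tensors to concrete numpy arrays.
        print(sess.run(one_hot, feed_dict={labels_ph: labels}).shape)  # (4, 6)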

By Francois T

Jun 30, 2020

As an old-school (80s) software developer, I feel uncomfortable about the lack of formal teaching on the structure and principles of TensorFlow. Sure, I can write the code and fly through the programming assignment, and I "kind of" get it, but for a thorough engineer that "kind of" creates a sense of unease. I wish Andrew Ng, being the incredibly practical teacher he is with the theory of Machine Learning, had spent a bit more time reviewing the particularly practical topic of TensorFlow in depth, because an hour on it would bring much more value than, say, understanding the inner workings of batch norm, especially to an engineer ready to onboard a new project and start creating. For example, when should you use a placeholder vs. a variable, and why? Why is there a "name" parameter in the constructor of a variable, and when should I make use of the difference between the name at the TF level and the actual Python variable name? Etc. Unlike Matlab or Numpy, TensorFlow looks to me like it could use a bit more theory before practice. Next class? :)
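
A hedged sketch of the placeholder-vs-variable distinction this review asks about, assuming TensorFlow 1.x; the names and shapes here are made up for illustration:

    import tensorflow as tf  # TensorFlow 1.x semantics assumed

    # Placeholder: a graph input with no stored value; it must be fed at
    # run time through feed_dict. Use it for data (X, Y).
    x = tf.placeholder(tf.float32, shape=(None, 3), name="x_input")

    # Variable: stateful storage that persists across session.run calls
    # and is what the optimizer updates. Use it for parameters (W, b).
    W = tf.Variable(tf.zeros((3, 1)), name="weights")

    y = tf.matmul(x, W)

    # The name= argument labels the node in the graph (what TensorBoard
    # and error messages show); the Python identifier is just a local
    # handle to that node.
    print(W.name)  # "weights:0"

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())  # variables need init
        print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))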

By David C

Jul 22, 2019

Nice explanation of Adam. Extremely minimal introduction to TensorFlow, though; I felt unprepared to deal with all the programming error messages I encountered when using TF. I would also have liked more exposure to softmax outputs, since the multi-class case is new here. My biggest complaint is that quite a bit of time was spent explaining batch normalization with no corresponding programming assignment. Also, in the past I felt I had my hand held a little too much in the programming exercises, whereas when TensorFlow was introduced I felt I'd been thrown by that hand into the abyss; the expected output could not help me debug because it seemingly was designed to remind me over and over that tf.Session.run was needed to give values to TF variables. Yeah... I think you guys have some work to do on this course.
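
Since this review notes that batch normalization got lecture time but no assignment, here is a hedged numpy sketch of the batch-norm forward pass as taught in the lectures; the function name and the (features, batch) layout are assumptions for illustration:

    import numpy as np

    def batchnorm_forward(Z, gamma, beta, eps=1e-8):
        """Normalize Z over the mini-batch axis, then scale and shift."""
        mu = Z.mean(axis=1, keepdims=True)      # per-feature batch mean
        var = Z.var(axis=1, keepdims=True)      # per-feature batch variance
        Z_norm = (Z - mu) / np.sqrt(var + eps)  # zero mean, unit variance
        return gamma * Z_norm + beta            # learnable scale and shift

    Z = np.random.randn(4, 32)                  # 4 features, batch of 32
    gamma, beta = np.ones((4, 1)), np.zeros((4, 1))
    Z_tilde = batchnorm_forward(Z, gamma, beta)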