
Learner reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

Ratings: 61,769

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and systematically generate good results. By the end, you will know the best practices for setting up train, dev, and test sets and analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.
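One of the techniques listed above, gradient checking, can be illustrated in a few lines: an analytic gradient is compared against a two-sided numerical approximation. This is a minimal sketch in plain Python, not code from the course; the function names are illustrative.

```python
def grad_check(f, analytic_grad, theta, eps=1e-7):
    """Compare an analytic gradient of f at theta with a centered difference."""
    numeric = (f(theta + eps) - f(theta - eps)) / (2 * eps)
    # Relative difference, guarded against division by zero
    rel_diff = abs(numeric - analytic_grad) / max(abs(numeric) + abs(analytic_grad), 1e-12)
    return numeric, rel_diff

# Example: f(x) = x^2, so df/dx = 2x; check the gradient 6.0 at theta = 3.0
numeric, rel_diff = grad_check(lambda t: t * t, 6.0, 3.0)
# rel_diff far below 1e-5 signals a correct gradient
```

A large relative difference (say, above 1e-3) would point to a bug in the backpropagation code.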

Top Reviews


Dec 5, 2019

I enjoyed it; it is really helpful. I'd like to have the opportunity to implement all of this deeply in a real example.

The only thing I didn't have completely clear is batch norm; it is so confusing.


Aug 26, 2021

An amazing course which focuses on the theoretical part of parameter tuning, but it needs more explanation of TensorFlow, as I felt a little lost in the last project. Apart from that, it is an amazing course.


76–100 of 7,091 reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By Pantelis D

Dec 27, 2020

Another excellent course by Professor Andrew Ng: short, on-point, and clear videos that go into the subject of optimizing deep neural networks.

As in the previous course of the specialization, the programming assignments are coded and submitted in the browser using Jupyter notebooks; the language used is Python, with the "numpy" library for the math. In the final week of the three, an introduction to TensorFlow is given.

It is worth mentioning that some interviews with influential people in the field of DL are included, and they make the student fall in love with DL even more. Excited to see what's next in this specialization.

By Gerardo M L

Jun 18, 2019

The course is amazing; the instructor explains everything in a very comprehensible way. All the covered topics are easy to understand, and the tips given are valuable. The examples are new but also incorporate information seen in the previous course, so you get a review of parts of the content you have already seen. Although he keeps using the cat example, he introduces other new applications that are useful.

I wish the last assignment were a little bit harder, or that we could use our previous knowledge and complement it with this new material, but I suppose it is this way for pedagogical reasons and to keep the focus on the topic.

By Ronald A R

Jan 16, 2022

I'm so appreciative of the course content and the way it has moved me toward a deeper understanding of NNs, tuning, and optimization. I don't know about other students, but I am a serious student who labors to comprehend all of Dr. Ng's videos. The labs are very helpful and gently focus the student on the code lines that relate directly to the video content. The very last lab compelled me to study the TensorFlow function library and find what I needed to get the one line of code for computing the cost correct. This took time but was valuable. I'm now looking forward to more experience with the TensorFlow framework.

By Jason J D

Aug 6, 2019

This course is wonderful! Hats off to Prof. Andrew. The explanation of each topic is stepwise and well organized. Every detail and piece of reasoning is covered. Even though there is a lot of content in this course, it is easy to remember and understand most of it because of the way it is explained. The programming exercises are well planned and guide you through the code well. This course also has a brief introduction to TensorFlow, which is explained well through its programming exercise. Overall, this course is really good for those looking to master the methods to improve and optimize neural networks.

By Maximiliano B

Oct 27, 2019

The second module of the Deep Learning Specialization is excellent. You will learn best practices for hyperparameter tuning, how regularization can be used in neural networks, and optimization algorithms such as Momentum, RMSProp, and Adam. In addition, you will build your first machine learning model using TensorFlow as part of the practical assignments. Professor Andrew Ng explains the content clearly and it is very pleasant to watch his lectures. I definitely recommend this course because it will give you the confidence to build your own models and will put several additional tools in your tool belt.
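The optimizers this review names all keep running statistics of past gradients; Adam combines the momentum and RMSProp averages in one update. Here is a minimal single-parameter sketch in plain Python, following the usual paper notation rather than any course code.

```python
def adam_step(theta, grad, v, s, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum-style (v) and RMSprop-style (s) moving
    averages with bias correction, then a scaled gradient step."""
    v = beta1 * v + (1 - beta1) * grad        # first moment (momentum)
    s = beta2 * s + (1 - beta2) * grad ** 2   # second moment (RMSprop)
    v_hat = v / (1 - beta1 ** t)              # bias correction
    s_hat = s / (1 - beta2 ** t)
    theta -= lr * v_hat / (s_hat ** 0.5 + eps)
    return theta, v, s

# Minimize f(theta) = theta^2 (gradient 2*theta) starting from theta = 1.0
theta, v, s = 1.0, 0.0, 0.0
for t in range(1, 201):
    theta, v, s = adam_step(theta, 2 * theta, v, s, t)
# theta is driven close to the minimum at 0
```

Setting beta2 to 0 recovers plain momentum with bias correction; setting beta1 to 0 recovers RMSprop.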

By Orson T M

Sep 13, 2020

Anyone who wants to excel in the field of AI must absolutely follow the 5 courses of this deep learning specialization. Indeed, the courses of the specialization will bring you deep knowledge of the field, because it is important that all those who want to embark on a career in AI understand the fundamental concepts very clearly, as this will help them design powerful AI models ready to be put to use. Take it from me: start now, not tomorrow. Tomorrow is a loser's excuse. Many thanks to Coursera for sharing this knowledge... Orson Typhanel MENGARA

By Carson W

Jan 4, 2018

As with the first course in this specialization, the presentation was spot-on and the content was rich. The practical application is a wonderful tool for learning and I feel as though I have learned much more than I might have in a traditional classroom. I also feel that this course was slightly more challenging than the first, and introduced me to a few concepts I hadn't heard of before despite other research and development in AI/ML. Thank you so much for your dedication to sharing your knowledge and introducing new students to some of the brightest minds in the field with the optional interview videos.

By Javier R G

Oct 15, 2020

Having read some of the other reviews I understand why people feel like the course was too easy and that at the end of it they don't feel confident enough to implement some NN models from scratch. Just to be clear, I feel the same way. However, the intuitions I gained from this course, as well as the last one, helped me to understand at every moment what was being done in TensorFlow, and why, during the last programming assignment. For me this assignment was a smooth introduction to TensorFlow. In general, the videos are clear, concise and prepare you well for the quizzes and the assignments.

By Shahin A

Mar 8, 2020

It is very important for students to feel that the instructor and the education system are on their side, not on the opposing side. I feel like the whole course, from design elements to teaching, is on my side; it is here to help me learn. It is great!

The negative side is the use of TensorFlow 1. The Python package is clearly an interface to a lower-level language, so either some background in that language is needed to understand the process better, or it is better to migrate to TensorFlow 2. Why TF1 when version 2 is available and much easier?

By Ali Z

Nov 1, 2018

Small description error in the last project (the TensorFlow tutorial project). Running:

X, Y = create_placeholders(12288, 6)
print("X = " + str(X))
print("Y = " + str(Y))

produces:

X = Tensor("Placeholder:0", shape=(12288, ?), dtype=float32)
Y = Tensor("Placeholder_1:0", shape=(6, ?), dtype=float32)

But the notebook's expected output lists:

X = Tensor("Placeholder_1:0", shape=(12288, ?), dtype=float32) (not necessarily Placeholder_1)
Y = Tensor("Placeholder_2:0", shape=(10, ?), dtype=float32) (not necessarily Placeholder_2)

The Y line should be corrected to:

Y = Tensor("Placeholder_2:0", shape=(6, ?), dtype=float32) (not necessarily Placeholder_2)

By Shahed B S

May 31, 2018

This course goes into the various parameters and hyperparameters of deep neural networks, as well as suggested values we can use. The course is short in duration, but a lot of content is developed here. It touches on TensorFlow. The template-based assignments provide great intuition for getting right on to the topics being taught; however, I feel there should be scope for more programming assignments where the student is able to write more of that template as well. All in all, Andrew Ng is a great teacher and it was a pleasure to learn from him.

By Κώστας Π

Mar 5, 2021

First of all, I was very pleased with the organization and programming of the course. It was compact, with fundamental theories and concepts for improving deep neural networks. In particular, this is the second course I have attended on Coursera, and I strongly believe that it helped me with my bachelor's thesis in mechanical engineering at Aristotle University of Thessaloniki. Above all, it was a fast-paced lesson with a fast "learning rate", which was quite suitable for me as an engineering student trying to finish his first (bachelor's) degree.

By Jong H S

Oct 1, 2017

At the time of writing this review, I have completed 3 of the 5 courses. I personally think these 3 courses are not merely courses to fill up the specialization. It is a journey, an incredible one. To put it metaphorically: Course 1 taught me how to become a magician, in Course 2 I learned from the master magicians with their secrets revealed, and Course 3 covers what to do to put on a good show in Las Vegas trying to fool Penn and Teller. This specialization is my treasure vault. Great job to Prof Andrew Ng and team.

By Ernest S

Nov 5, 2017

This course offers grounding in all major concepts of non-recurrent neural networks and is excellent preparation for further exploration of this topic. Lectures cover a broad range of topics and discuss many problems you might encounter during your journey. Professor Andrew Ng explains the theory in a way that builds good intuition and gives you building blocks to face the challenges of machine learning. If you are fluent with calculus or have an academic background and expect to discover the math behind the scenes, I think you will be content too. I surely was.

By Rob S

Jun 9, 2018

Another very well done course. You do a good job describing the benefits of Batch Norm, a lot more intuitively than in Szegedy's paper, which is pretty math-heavy. However, I did notice one little ERROR on the TensorFlow project page, albeit an insignificant one. Double-check the expected output for the cell that prints the shapes of the training and test sets. One of the expected outputs says the test set should have 10 possible classes, when the dataset is for 0–5 fingers. That would be a very strange-looking hand ;)
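For readers who, like this reviewer, found the Batch Norm intuition valuable: the forward transform itself is short. Here is a sketch over a single mini-batch of scalars in plain Python; in a real network, gamma and beta are learned per feature, and running averages of the statistics are kept for inference.

```python
def batch_norm(batch, gamma=1.0, beta=0.0, eps=1e-8):
    """Normalize a mini-batch to zero mean and unit variance, then
    rescale by gamma and shift by beta."""
    n = len(batch)
    mean = sum(batch) / n
    var = sum((x - mean) ** 2 for x in batch) / n
    return [gamma * (x - mean) / (var + eps) ** 0.5 + beta for x in batch]

out = batch_norm([1.0, 2.0, 3.0, 4.0])
# out has mean ~0 and variance ~1
```

The eps term guards against division by zero when a batch has near-zero variance.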

By David M

Aug 31, 2017

This is a practical course on how to work with neural networks. It covers a collection of "tips" and techniques, all grounded on a solid theoretical framework, to make a classifier train faster and be more accurate. The explanations are all engaging and interesting, and the assignments are rather easy.

The knowledge gained from this course is probably what everybody working in machine learning already knows, but if you are new to the field this is a great way to get up to speed fast and start implementing neural networks for your own projects.

By Jairo J P H

Feb 1, 2020

The course is very good. I am particularly grateful to COURSERA for giving me the opportunity to do the five courses of the Deep Learning Specialization with financial aid and for allowing me access to this type of training and certification. Thank you very much!

By Yash P

Dec 29, 2020

The first course was the easiest of all the tutorials I could have found on the internet. Andrew Ng has taught it very well, and it's best suited for beginners. The second course delves deeper into understanding various optimization algorithms and improving deep learning models by tuning hyperparameters and regularization.

I would strongly recommend that you take this course. It's a very beginner-friendly course, so no need to worry. If you have the guts and passion for it, then what are you waiting for? Just enroll!

By DANTE K

Dec 1, 2020

This course began similarly to the first one in the specialization, repeating lots of material from Andrew's ML course, but after the first week there's a lot of new material introduced. Andrew shows lots of techniques taken from recent papers that have had much success, which is something you probably won't see in ANY other DL course. Loved the intro to TensorFlow in the last week; a really good job of explaining and using the basics without getting too bogged down in the details. Can't wait to do course 3!

By GEORGE A

Mar 5, 2019

Pretty solid class; I learned a lot of basic concepts. The class won't go into a lot of mathematical detail about the algorithms; however, there is enough intuition provided to understand their inner workings and the logic behind them. The only con I have is that some of the programming exercises look outdated with the current versions of the notebook. For example, in my last exercise I couldn't get the NN with TensorFlow to work properly but got 100/100 nevertheless.

By Matei I

Feb 1, 2019

This course covers details about neural network implementations that are extremely useful in practice. In fact, after completing week 1 and learning about vanishing gradients, I was finally able to debug a NN implementation that I had been struggling with. I'm also grateful for the introduction to Tensorflow. As with the previous course in this specialization, expect to be spoon-fed during the programming assignments. The course would be better if it let you think more during assignments.

By Pablo G G

Sep 10, 2020

Awesome introduction and guidance about where to tweak your model... although in my experience Adam is all you are going to need. I missed some teaching about fine-tuning through iterations with schedules! TensorFlow has a function that can adapt your parameters on the go, so your optimization can push that loss lower and lower. The Adam optimizer works like a charm with a schedule for the learning rate!
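As this reviewer notes, schedules that shrink the learning rate over time pair well with Adam. One standard decay formula taught in the course, inverse-time decay, can be sketched in plain Python; the function name here is illustrative.

```python
def decayed_lr(alpha0, decay_rate, epoch):
    """Inverse-time decay: alpha = alpha0 / (1 + decay_rate * epoch)."""
    return alpha0 / (1 + decay_rate * epoch)

# Learning rate over the first few epochs, starting from 0.2
rates = [decayed_lr(0.2, 1.0, e) for e in range(4)]
# rates == [0.2, 0.1, 0.066..., 0.05]
```

Exponential decay (alpha0 * k**epoch) and step-wise halving are common alternatives; the point in all cases is to take large steps early and small steps near convergence.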

By 江小鱼

Feb 12, 2019

This time I finished Regularization. I think this was an interesting experience, because you can implement your algorithms step by step. I picked up some magic (not black magic) algorithms, like RMSprop, momentum, and Adam. Finally, the most fascinating part was building with TensorFlow, just like a pipeline, step by step, where every step takes only one line, from forward propagation (without backward) to the model; TensorFlow is really black magic.

(I have to say TensorFlow is a bit difficult; forgive my poor English, thanks.)

By Nathan Y

Oct 16, 2017

Neural networks are not new. What we learned in this course is some of the critical implementation details and tricks from the past decades of making them work in practice. Going beyond gradient descent to types of regularization and hyperparameter searching, we get a set of robust tools that quickly find good solutions in extremely high-dimensional spaces. As Professor Ng says, our understanding of optimization rules of thumb in low-dimensional spaces doesn't carry over to deep learning.

By José A

Oct 30, 2017

Seamlessly continues the previous course. If you know the basic structures of neural networks (how to initialize weights, sigmoid and tanh activations, and so forth), this will help you understand terms such as L2 regularization, gradient descent with momentum, RMSProp, Adam, exponentially weighted averages, and many others.

Don't let the 3 weeks put you off. It has a lot of micro-content that builds on top of the previous work. Thanks to all the mentors for this great course.
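The exponentially weighted averages mentioned in this review are the building block behind momentum, RMSProp, and Adam. A minimal sketch with bias correction in plain Python, following the course's v and beta notation:

```python
def ewa(values, beta=0.9):
    """Exponentially weighted average with bias correction:
    v_t = beta * v_{t-1} + (1 - beta) * x_t, divided by (1 - beta^t)
    so early estimates are not biased toward the zero initialization."""
    v, out = 0.0, []
    for t, x in enumerate(values, start=1):
        v = beta * v + (1 - beta) * x
        out.append(v / (1 - beta ** t))  # bias-corrected estimate
    return out

smoothed = ewa([10.0, 10.0, 10.0])
# with a constant input, each corrected average is ~10.0
```

Without the 1 - beta**t correction, the first estimate here would be 1.0 rather than 10.0, which is exactly the startup bias the course warns about.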