Great introductory stuff, a great way to keep in touch with TensorFlow's new tools, and the instructor is absolutely phenomenal. Love the enthusiasm; the interactions with Andrew are a joy to watch.
A really good course that builds on the concepts covered in Course 1. All the ideas are applicable in real-world scenarios, and this is what makes the course that much more valuable!
by Antoreep J•
In the workbook section, the question Colab notebook opens the answer notebook; please rectify this.
Got hands-on with lots of CNN material; great content. Thank you, team.
by Raffaele G•
Great course! I can't wait to go further and deeper. Thanks
by Asad A•
Learnt a lot, and believe me, this is the perfect way to teach.
by Egon S•
Easy to follow and very good explanations
by Дим Щ•
Concise notebooks. Clear explanations
by Oliver M•
Great Course! Can't wait for part 3!
by DORA M B•
It's a great course. I enjoyed it!
by Chintada A•
really nice introduction to CNNs
by Zeev S•
Clear, concise, well designed
by NITESH N•
by Nicolas L•
First, I think the course was great, very instructive. Thanks to Andrew and Laurence for putting this together; it is a great source of information for understanding more about DL. Some things I think could improve the course:
I found the transfer learning lessons a bit unclear, and I struggled to generalize them to other cases. I was also a bit confused by the flow of the course. It starts with a multi-class classifier (or actually, the previous course does), then the lessons focus on binary classifiers, and it ends again with multi-class classifiers, presumably because those are the more complex ones.
One last technical thing: only in the last lesson of this course is it mentioned that the classifiers output the probabilities in alphabetical order when using ImageDataGenerators (or at least, that's my impression). I've wondered since the course introduced ImageDataGenerators how the probabilities are assigned to the outputs. I could figure out that with the sigmoid the classifier would look at the first class in the directory and output 1 or 0 based on that, but it would be good to mention this at the point in the video where the ImageDataGenerator is introduced.
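[Editor's note] The alphabetical ordering the reviewer describes is indeed how Keras's `flow_from_directory` assigns labels: class subdirectory names are sorted and numbered from 0, and the mapping is exposed as `generator.class_indices`. A minimal sketch of that rule (the helper name is mine, not a Keras API):

```python
def class_indices(subdirs):
    """Reproduce the label assignment of ImageDataGenerator.flow_from_directory:
    class subdirectory names are sorted alphabetically and numbered from 0.
    (Hypothetical helper; in Keras you would read generator.class_indices.)"""
    return {name: i for i, name in enumerate(sorted(subdirs))}

print(class_indices(["horses", "humans"]))           # {'horses': 0, 'humans': 1}
print(class_indices(["rock", "paper", "scissors"]))  # {'paper': 0, 'rock': 1, 'scissors': 2}
```

So for a binary (sigmoid) model, an output near 1 means the alphabetically later class.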
Thanks again! Great course
by José D•
Following Course 1, we go into deeper detail on Convolutional Neural Networks, using Data Augmentation and Dropout to reduce over-fitting, and with only a few lines of code thanks to Keras (TensorFlow's high-level API). Easy, useful examples. Just like Course 1, there is no math, so you cannot understand what's under the hood, or how and why it works. If you want deeper understanding, you must do the "Deep Learning" specialization, which is harder than this one.
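[Editor's note] For readers curious what those few Keras lines actually do, here is a plain-NumPy sketch of the two mechanics the review mentions: random horizontal flipping (one of the augmentations `ImageDataGenerator` offers) and inverted dropout (what `keras.layers.Dropout` applies at training time). These are illustrative stand-ins I wrote, not the Keras implementations:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment_flip(batch):
    """Randomly mirror each image in a (N, H, W) batch horizontally,
    analogous to ImageDataGenerator(horizontal_flip=True)."""
    out = batch.copy()
    flip = rng.random(len(batch)) < 0.5   # coin flip per image
    out[flip] = out[flip][:, :, ::-1]     # reverse the width axis
    return out

def dropout(activations, rate=0.5, training=True):
    """Inverted dropout: zero a fraction `rate` of units during training
    and rescale the survivors, so inference needs no extra scaling."""
    if not training:
        return activations
    keep = rng.random(activations.shape) >= rate
    return activations * keep / (1.0 - rate)

images = rng.random((8, 4, 4))
aug = augment_flip(images)   # same shape, some images mirrored
dropped = dropout(images)    # roughly half the entries zeroed
```

Augmentation effectively enlarges the training set; dropout forces redundancy in the learned features. Both reduce over-fitting, which is why the course pairs them.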
by Jorge L M B•
I liked the hands-on approach of the course, but felt that the last assignment (Week 4) was a little ambiguous about which parts of the code to write and which not to. Nonetheless, I had a lot of fun!
by Edir G L•
It's great to learn about data augmentation techniques and how to implement them. This is a great complement to deeplearning.ai's course on Convolutional Neural Networks.
by Vedang W•
The course has some great parts, such as augmentation and transfer learning, but my expectation was to understand TensorFlow at a deeper level.
by Oleg K•
The last assignment could have been explained better. Laurence does not talk about ImageDataGenerator.flow, despite it being the only solution.
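[Editor's note] For context, `ImageDataGenerator.flow` yields batches from in-memory NumPy arrays (as in the Week 4 assignment, where the data arrives as numbers rather than image files) instead of from a directory. A tiny pure-Python stand-in I wrote to mimic its batching behaviour, not the Keras implementation:

```python
import numpy as np

def flow(x, y, batch_size=32, shuffle=True, seed=0):
    """Mimic ImageDataGenerator.flow: loop forever over (x, y), yielding
    shuffled mini-batches. model.fit stops the loop via steps_per_epoch."""
    rng = np.random.default_rng(seed)
    idx = np.arange(len(x))
    while True:
        if shuffle:
            rng.shuffle(idx)
        for start in range(0, len(x), batch_size):
            sel = idx[start:start + batch_size]
            yield x[sel], y[sel]

images = np.zeros((100, 28, 28, 1))  # array-style data, e.g. CSV pixel values
labels = np.zeros(100)
gen = flow(images, labels, batch_size=32)
bx, by = next(gen)                   # first mini-batch of 32
```

With arrays there is no directory structure, so labels come straight from `y` rather than from alphabetically sorted folder names.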
by Henrique G•
I'm sad to say that I'm really disappointed with the course. What is even stranger is that professor Andrew is associated with it and endorses it. I like professor Moroney, but honestly, even his free tutorials on the TensorFlow channel on YouTube have more information than this course. It really seems like something put together in haste just to make it available on Coursera. The level of detail and instruction is not on par with the quality of either the Coursera platform or the professors associated with this course.
It seems that as I progress through the courses in this specialization, the instructions get poorer and poorer and the information gets more and more scarce. It got to a point where we are just given notebooks to run; they are not even graded (they barely were in the first course). And even in the notebooks where we are given a chance to complete some code, there are absurd things like "print(#your code here#)" in places that don't even make sense unless we copy and paste from the other notebooks of the course. Really? Print what? The only way we can guess what kind of debug info the notebook is asking for is by looking at other notebooks at that exact same line.
For the reviewers: if you are really reading this, please remember that Coursera is charging $49/month for this specialization. If we consider that an average student will take 4 weeks to complete it, that's almost $200 for something that's barely a tutorial in its current version. $49 may be a reasonable rate for a citizen of the US, for example, but it's an exorbitant amount of money for students from poorer countries using the platform in hopes of acquiring knowledge of decent quality.
by Zoltan S•
After taking Andrew Ng's truly excellent 5-course specialization, I was hoping that this follow-up specialization would be at the same high level. In my view (and I am sad to say this), the present course doesn't live up to that expectation.
Of course, you could still learn something useful, mostly a selected part of the Keras API. The instructor is friendly and explains some of the basics of convolutional neural networks. If you are willing to experiment on your own (run the code longer on Colab, play with the hyperparameters, etc.), then you get more practice and certainly more out of this experience. Keras has a lot of good tools. For more advanced students, going directly to the TensorFlow tutorial website is also an option (and it is free).
Overall the course seems a bit rushed, while it has the potential to be better. Let me suggest adding more basic material to solidify knowledge (for example, practicing hands-on image preprocessing before teaching the Keras preprocessing API, and more experimentation with images overall), as well as more exercises on more diverse topics (GANs, face detection, variational autoencoders, object detection).
There are also some minor issues (easy to fix): for example, right now in the Week 3 HW, the prepared callback teaches the students exactly the wrong approach. It stops the training cycle when the training accuracy improves over a certain threshold, instead of checking the validation accuracy. That is an unfortunate mistake to make in a week that discusses different ways to avoid overfitting.
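[Editor's note] The fix the reviewer suggests is straightforward: have the callback's `on_epoch_end` read `val_accuracy` from `logs` rather than `accuracy`. The sketch below uses a plain class with the same hook so the logic runs without TensorFlow; in real code it would subclass `tf.keras.callbacks.Callback` and set `self.model.stop_training = True`:

```python
class StopOnValAccuracy:
    """Early-stop rule keyed to *validation* accuracy, not training accuracy,
    so it actually guards against overfitting. (Stand-in for a Keras callback.)"""

    def __init__(self, threshold=0.95):
        self.threshold = threshold
        self.stop_training = False

    def on_epoch_end(self, epoch, logs=None):
        # The crucial line: look up val_accuracy, not accuracy.
        val_acc = (logs or {}).get("val_accuracy")
        if val_acc is not None and val_acc >= self.threshold:
            print(f"Reached {val_acc:.0%} validation accuracy, stopping.")
            self.stop_training = True

cb = StopOnValAccuracy(threshold=0.9)
cb.on_epoch_end(0, {"accuracy": 0.99, "val_accuracy": 0.70})  # keeps training
cb.on_epoch_end(1, {"accuracy": 0.99, "val_accuracy": 0.92})  # stops
```

Training accuracy near 100% with low validation accuracy is exactly the overfitting signature, so stopping on the training metric rewards the failure mode the week is about.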
by Kaustubh D•
This course is taught excellently, but there is very little content, at least from a programming point of view. There was no need for an extra week just to specify the differences between binary and multi-class classification in code. Rather, more could have been covered, such as code for different output structures, like object recognition, where the output is not a flat map. If this has been done purposely to keep the course open even to newbies in Machine Learning, then there should have been a course focused on those who have done Andrew Ng's ML/DL specialization.
by Deepak A S•
This course doesn't talk about TensorFlow; it only uses Keras. The title is misleading!
by MD M R S•
Good, but not that good. They could have introduced TensorFlow 2.0's Functional API.
by Paweł D•
Pretty basic level, aimed rather at beginners.
by Jarrod H•
The lectures are really good and quite engaging. The extra course content by Dr. Ng is also generally where I learn the most. This class does a decent job of introducing you to the TensorFlow library.
It feels a bit like a very well done tutorial. After finishing my second class, I don't have any more idea of how to build a neural network than I did before. The data they give you has already been cleaned and is ready to use, which never happens in real life. The data manipulations they ask you to do in the homework have zero explanation of why you are doing them; just add an extra dimension, or split the data this way. I don't know why we need it split that way, and it never says why. Further, exercise 4 in particular uses a different method than was used for the entirety of the first and second classes: you're given a list of numbers rather than real images.
At the end of the class I wanted to understand how to build these models in the real world. If I want to predict cats vs dogs then great! However, if I want to try to categorize financial transactions or predict fraud or literally anything else, this class gives you no understanding of where to start, or how to approach the problem generally.
by Pawel B•
The course does not provide much knowledge. In fact, it is a tiny extension of the first part ("Introduction to TensorFlow for..."). The assignments are trivial: you need to change one or two parameters in the exercise code. But another story is that the code you successfully run in Jupyter needs to be tailored to satisfy the Coursera system (e.g., data cannot be loaded due to out-of-memory errors, the TF version is different, etc.). Instead of working on the model, you lose time experimenting with code unrelated to TensorFlow, or looking at the forum for solutions specifically suited to pass the test (they do not change how the model works). I do not recommend this course.