Nice experience taking this course. A precise and to-the-point introduction of topics and a really nice head start into the practical aspects of Computer Vision using the amazing TensorFlow framework.
Great introductory stuff, a great way to keep in touch with TensorFlow's new tools, and the instructor is absolutely phenomenal. Love the enthusiasm, and the interactions with Andrew are a joy to watch.
by Islam U•
The course definitely teaches interesting techniques (Dropout, Transfer Learning) and tools (use of ImageDataGenerator). What I think would be an improvement is further tips on how to actually achieve state-of-the-art (or really high-quality) models. For example, for the full Cats and Dogs dataset from Kaggle, there was an optional ungraded exercise that asked for over 99.9% accuracy on both the training and validation datasets. It would be great if some tips on how to achieve this were given, perhaps a discussion of network architectures that can reach it, since this subject is not always covered even though it probably plays a dominant role in whether you make it or break it. Otherwise, I liked the course, and thanks for the wonderful explanations.
P.S. The week 4 final graded task is structured suboptimally, so maybe it can be reviewed, as many people are struggling with all sorts of errors.
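[Editor's note: for readers unfamiliar with the techniques this review names, here is a minimal, self-contained sketch of ImageDataGenerator-based augmentation feeding a small CNN that uses Dropout. The layer sizes and augmentation values are illustrative, not the course's actual exercise code.]

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Augmentation pipeline: rescaling plus random flips, shifts, and zooms.
datagen = ImageDataGenerator(
    rescale=1.0 / 255,
    horizontal_flip=True,
    width_shift_range=0.1,
    height_shift_range=0.1,
    zoom_range=0.2,
)

# Random stand-in images; the course exercises instead read real images
# from directories with flow_from_directory.
x = np.random.rand(8, 64, 64, 3).astype("float32")
gen = datagen.flow(x, batch_size=4, shuffle=False)
print(gen[0].shape)  # one augmented batch of shape (4, 64, 64, 3)

# A tiny CNN with Dropout for regularization.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dropout(0.5),  # randomly zeroes 50% of units during training
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```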
by Xiaolong L•
In general a very good course, but it seems the instructor could have put more work into the weekly projects. For example, the project for the 3rd week is almost the same as that of week 2. Also, one of the weeks boasted that the project involves training on the full dogs vs. cats dataset, but it is actually still just a subset. I was able to run on the full dataset by downloading and loading it manually. I can see that the platform has concerns about computational resource usage, but it should at least be accurate in the project descriptions.
by Manutej M•
Week 4 was a rather challenging exercise and was out of left field compared to the pace of the other exercises; this last exercise felt more like a "final exam". Several things not covered in depth in the class could have helped bolster the understanding needed for the labs and for the real world. The class can get a bit repetitive and narrow in its focus, perhaps for simplicity's sake, but I believe people could benefit from more depth being taught in the course.
by João A J d S•
I think I might say this for every course of this specialisation:
Great content all around!
It has some great Colab examples explaining how to put these models into action in TensorFlow, which I know I'm going to revisit time and again.
There's only one thing that I think might not be quite so good: the evaluation of the course. There isn't any, apart from the quizzes. A few more evaluation steps, as in Andrew's Deep Learning Specialisation, would require more commitment from students.
by Anand H•
One challenge I have faced is deploying the trained models; I find very little coverage of that across courses. It's one thing to save a model.h5 or model.pb, but there is some distance between getting these files output and actually deploying them, and the TF documentation is confusing about some of these things. It would be nice if you could add a small piece on deploying these models using TF Serving or something similar; a module on that would be welcome.
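[Editor's note: since the review asks about the gap between a saved model and TF Serving, here is a hedged sketch of the export step. The model, paths, and model name are placeholders; the export uses the long-standing `tf.saved_model.save` API, which writes the SavedModel layout TF Serving consumes.]

```python
import os
import tensorflow as tf

# Stand-in for a trained model; in practice this is the model you trained.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# TF Serving expects a numbered version directory: <model_name>/<version>/
export_dir = "/tmp/my_model/1"
tf.saved_model.save(model, export_dir)
print(sorted(os.listdir(export_dir)))  # includes saved_model.pb and variables/
```

With the SavedModel on disk, one documented route is the official Docker image, roughly `docker run -p 8501:8501 -v /tmp/my_model:/models/my_model -e MODEL_NAME=my_model tensorflow/serving`, after which predictions go via POST to `http://localhost:8501/v1/models/my_model:predict`. Consult the TF Serving docs for the exact request format.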
by AbdulSamad M Z•
Great course! Builds on the concepts of Course 1 in this Specialization although the course can be taken without having completed Course 1. Concepts are explained in a super clear and engaging way and the hands-on exercises give you the experience you need to become proficient. The course covers plenty of practical concepts including some pitfalls for practitioners to avoid, but the theoretical concepts are covered less than I expected.
by Mikhail C•
The content was clear, building upon each topic, but the lab submissions need work. Most of the "write your own code" complexities and issues were around data wrangling, directories, and memory-efficient code, which was not very relevant to the main learning objectives. I spent 90% of the coding exercises fixing or waiting for the data-prep functions instead of experimenting with different layers, dropout rates, and augmentation values.
by Henrique G•
The course is well-paced and the instructor provides good coverage of the main topics in Convolutional Neural Networks. I'd recommend watching Andrew Ng's videos from the Deep Learning Specialization for a better understanding of topics like dropout, transfer learning, and optimization methods. The final exam is quite difficult, as you need a lot of trial and error to get things to work properly - just like the real messy world.
by Jennifer J•
Whilst I very much enjoyed playing around with convolutional neural networks, transfer learning, and using image transformations to augment standard convolution, this course lacked a proper introduction to Python. It will require a Python course, or a good Python language reference book, to help you build the functions necessary for completing the required tasks. Otherwise, this was a great course!
by Bob K•
As another reviewer mentioned, this course is much simpler than Andrew Ng's Deep Learning Specialisation, but even so it has its uses. I'm taking it to prepare for the Google TensorFlow certificate, and it's forcing me to learn more of the API.
Andrew Ng's course was about how to implement the theory from papers, whereas this course is about how to use TensorFlow. Each has its place, although the former is probably more valuable.
by Grzegorz G•
The videos are short but essential and full of practical knowledge. The quizzes are interesting and not obvious. Unfortunately, the weakest part of the course is the final tasks at the end of each week. They are poorly described and sometimes do not even state specific requirements for the target accuracy of the task; you only learn about it when your submission is rejected during grading!
by Tom G•
Overall very helpful. I wish debugging in the Jupyter notebook assignments was better and that it gave pop-up text descriptions, etc. Google Colab is much better that way; I wish the assignments could use that environment instead. Also, the assignments use model.fit_generator, which is now deprecated in TF 2.2. It would be good if the assignments were updated to use model.fit instead.
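[Editor's note: the migration this review asks for is usually a one-line change, since model.fit has accepted generators directly since TF 2.1. A minimal sketch with random stand-in data and an illustrative model:]

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Tiny stand-in model and data, just to show the call-site change.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

x = np.random.rand(8, 32, 32, 3).astype("float32")
y = np.random.randint(0, 2, size=(8,)).astype("float32")
gen = ImageDataGenerator(horizontal_flip=True).flow(x, y, batch_size=4)

# Deprecated: model.fit_generator(gen, steps_per_epoch=2, epochs=1)
# Current:    pass the generator straight to fit.
history = model.fit(gen, steps_per_epoch=2, epochs=1, verbose=0)
```

The same substitution applies to `evaluate_generator` → `evaluate` and `predict_generator` → `predict`.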
by Sourav S•
The assignment in the last week was very poorly designed. Other than that, I really liked the course, especially the parts about augmenting data and using pre-trained models. Perhaps the course could cover more topics on how to use pre-trained models, the different kinds of pre-trained models available out there, and the specific applications in which they should be used.
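[Editor's note: for context on the pre-trained-model workflow this review refers to, here is a hedged sketch of the common freeze-the-backbone pattern. MobileNetV2 is just one illustrative backbone; `weights=None` keeps the sketch offline, whereas real transfer learning would pass `weights="imagenet"` to load pre-trained features.]

```python
import tensorflow as tf

# Pre-trained backbone without its classification head. weights=None here
# avoids a download; use weights="imagenet" in practice so the transferred
# features are actually pre-trained.
base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights=None
)
base.trainable = False  # freeze the backbone; only the new head trains

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # new task-specific head
])
model.compile(optimizer="adam", loss="binary_crossentropy")
out = model(tf.zeros((1, 96, 96, 3)))
```

Which backbone suits which application (size, latency, accuracy trade-offs) is exactly the kind of guidance the review asks the course to add.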
by Danilo B•
The course is very good, but coming from the Deep Learning Specialization, also offered by deeplearning.ai, it feels somewhat like a downgrade to have 15 minutes of video per week, while the other specialization had really extensive and complete explanations with over 2 hours of video. I feel like 10 more minutes of explanation going through the code would make a huge difference.
by Jakub P•
Quite a good basic overview of image classification in TensorFlow. After the course I can implement a basic convolutional neural network using data augmentation and transfer learning techniques. The tasks, however, are very basic and, except for the last lab task, do not provide enough challenge to be meaningful. One of the labs is a copy-paste of the Introduction to AI one...
by Raman S•
The grader's memory availability does not match what is available to us during the exercise. As a result, insufficient memory is reported in the grader remarks even though we do not face such a problem ourselves. This makes things hard to debug and turns submission into trial and error. It could be avoided if we got the same type of warning when we create or update our notebooks.
by Cameron W•
Course material was good. The only issue I found was that the graded exercises are graded by automated systems that have different requirements from the notebook environment used for development. This "black box" strategy by Coursera makes some of the exercises difficult. If you don't have Python debugging skills, don't attempt this course.
by Michael R•
Solid and accessible instruction. It would be remiss not to mention the inconsistency between the instruction and the current TensorFlow codebase. It requires a lot of digging by the student to reconcile the instruction with the exercises, particularly in week 4. However, my intuition for TensorFlow architecture is probably deeper because of that digging.
by Anubhav S•
Short of words to describe this fabulous course by Laurence. Every concept is covered. However, I would have liked him to suggest some extra resources like TensorFlow Playground, TF Hub, and so on. The section on Transfer Learning could have used the newer TF Hub-based syntax. Otherwise, nothing to complain about. Top course.
by Oleksiy S•
Excellent tutorial on using TensorFlow and convolutional networks. Useful usage examples, interesting and challenging exercises. A few minor mistakes prevent a five-star grade, but mistakes happen and we have to live with that :-). Nice work; looking forward to the next course of the specialization.
by Amit M•
Interesting course. I can do exactly what is being taught - no more, no less. It is almost as if we are being taught to solve specific problems rather than to learn the subject. Perhaps it is the nature of the subject itself - there is no systematic theory; it just is. Learn what is done now and what works.
by RUDRA P D•
What I feel about this course is that a lot of the exercises are more about file-handling operations than CNN implementation. Also, the exercises have missing task allotments/comments.
I liked the explanation and implementation part of Transfer Learning, I think it's the best part of this course.
by Stefan B•
The course gives you an eagle-eye view of how to use Keras/TensorFlow for convnets. While the lectures are good, they are very short. I would have loved to hear more about training and storing your own networks for transfer learning, and a bit more on regularization. A bit too shallow and easy for my taste.