
Learner Reviews & Feedback for Sequence Models by DeepLearning.AI

4.8 stars · 29,852 ratings

About the Course

In the fifth course of the Deep Learning Specialization, you will become familiar with sequence models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more. By the end, you will be able to build and train Recurrent Neural Networks (RNNs) and commonly used variants such as GRUs and LSTMs; apply RNNs to character-level language modeling; gain experience with natural language processing and word embeddings; and use HuggingFace tokenizers and transformer models to solve different NLP tasks such as NER and question answering. The Deep Learning Specialization is a foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to take the definitive step into the world of AI by helping you gain the knowledge and skills to level up your career.

Top reviews

WK

Mar 13, 2018

I was really happy because I could learn deep learning from Andrew Ng.

The lectures were fantastic and amazing.

I was able to grasp the really important concepts of sequence models.

Thanks a lot!

JY

Oct 29, 2018

The lectures cover lots of SOTA deep learning algorithms, and they are well designed and easy to understand. The programming assignments are really good for reinforcing the understanding of the lectures.


3351 - 3375 of 3,621 Reviews for Sequence Models

By Yashwanth M

•

Jul 23, 2019

Good

By Rahila T

•

Nov 15, 2018

Good

By savinay s

•

Apr 9, 2018

good

By krishna m s g

•

Mar 22, 2018

good

By Aaradhya S

•

Apr 25, 2020

..

By Natalia O

•

Oct 4, 2019

In comparison to the previous courses in this sequence, this one is even less structured - probably because even broader knowledge is being crammed into only 3 weeks - but I feel like a lot is skipped between the videos (which are OK) and the tasks. In many assignment tasks in this course it is not very well explained what is meant to be done - I mean this especially in the case of Keras objects. In many cases it is quite unclear how those classes are supposed to be handled in the context of our task. There are some hints, but those are mostly links to documentation (by the way, some of the links are no longer up to date), and it is often not well explained which properties those objects have, what one can do with them, etc. So one ends up trying those objects in different configurations, googling around, and looking on the course forum for the right answer, which is very difficult to derive. There should be more precise instructions on handling Keras objects - the examples in the documentation and in blogs are often much simpler than those in the assignments, so one ends up not knowing what is going on.

In summary, there is a big jump and a big gap between the intuitions in the videos (which, by the way, are much fuzzier than those in the first courses of the specialization; the intuitions get more and more superficial as one doesn't go into detail) and what is being done in the assignments. One thing I really liked about the previous assignments was that, when writing the code, one could know very well what was going on. That is no longer the case in this course.

By Mark S

•

Oct 9, 2019

As we head into the last course in the specialization (and the last two courses are the ones that interested me), we have error after error in the assignments, including problems with the kernel that are not obvious until you've struggled with incoherent stack-trace output for a while.

Searching the disorganised discussion centre for the course/week in question, you can find that these errors affect everyone and go back a couple of years, never having been fixed. The mentors there help explain, but they cannot edit the code to fix it, as they do not have permission, and the course supervisors have long since disappeared. So you have to submit incorrect code to pass, then fix the code for your personal private code store - because the fixed code generates the correct numerical answers, which unfortunately do not match the numerical answers the grader requires to pass you!

It feels like, in the rush to get the full specialization out, the final courses went downhill in terms of care & attention. Afterwards, all of the errors and badly designed code in the assignments caused many unexpected headaches - nothing to do with DL - and were never fixed or maintained by the course supervisors.

In the end, the delays in the final (two) course(s) added at least one extra monthly payment to my subscription. Overall I can't complain - the specialization is good - but it feels abandoned by the lecturer & assistant lecturers since early 2018.

By Stephen D

•

May 13, 2018

It's helpful to have this course, since there aren't enough beginner-oriented courses on these topics, especially ones that also get into the actual equations like he does. However, I don't think he provides enough explanation of complicated topics like GRUs and LSTMs. There are lots of confusing aspects of both, and he could afford to spend even more time on explanation than he does.

EDIT: I am now on week 2. This course feels rushed and he doesn't take the time to clarify confusing issues - for example, when he first introduces how to learn word embeddings he calls the neural network you use a "language model" even though the network bears no resemblance to the language model we learned in week 1. This really confused me and he doesn't address this point. Also, he variously describes the embeddings as the "input" and the "parameters" of this neural network, even though those are clearly two different things. There are more issues where that came from.

Unlike all of his previous courses, I've found myself needing to go to Wikipedia and Google to try to fill in various holes in the presentations here.

Also, there is essentially no help on the forums. That isn't the reason for my low rating, since for a cheap course I didn't expect much. Still, it would have been nice if they had tried to do a little bit more there.
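On the "input" vs. "parameters" point this review raises: in the standard formulation, the embedding matrix is a trainable parameter, the word index (or its one-hot vector) is the input, and the looked-up row is just an activation passed forward. A minimal NumPy sketch of that distinction (a hypothetical illustration, not the course's code):

```python
import numpy as np

# Hypothetical sketch: the embedding matrix E is a *parameter* of the model;
# the word index is the *input*; the row E[index] is the activation that
# flows into the rest of the network.
vocab_size, embed_dim = 10, 4
rng = np.random.default_rng(0)
E = rng.standard_normal((vocab_size, embed_dim))  # trainable parameter

def embed(word_index):
    """Forward pass of an embedding layer: a row lookup.

    Equivalent to multiplying E.T by a one-hot vector for word_index.
    """
    return E[word_index]

x = 7          # input: a word's index in the vocabulary
e = embed(x)   # activation handed to the next layer

# Training updates E (the parameter), never x (the input); only the
# looked-up row receives a gradient on this example.
upstream_grad = np.ones(embed_dim)
E[x] -= 0.1 * upstream_grad
```

Gradient descent changes E while the indices fed in never change, which is why the embeddings can be called "parameters" even though their rows appear where inputs usually do.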

By Ramon R

•

May 8, 2018

Unlike the other courses Andrew Ng provides, this one contains many spelling mistakes in the programming assignments, the assignments are less structured and understandable (missing or wrong information in nearly every assignment), and an introduction to Keras is missing. I found it great that the Keras framework is an important component of this course, but unlike the TensorFlow introduction, it is missing here. It is frustrating when you may have the right functions but no information on how to determine and pass the correct arguments to them. Anyway, I found the outline of the course very good, as it gives a good overview of many methods and how they work. To my mind, the consistency of the assignments and also the storytelling need to be improved to reach the level of other courses Andrew is involved in. It appeared more chaotic, and the complexity of the algorithms is overwhelming, so a better introduction to how they work would be appealing. In the end, I worked through it and gained a basic understanding of Keras and RNN algorithms. So it was definitely worth it.

By Luca M B

•

Jul 27, 2018

A nice course after all, but I expected something more. It is valuable if you know nothing about RNN and NLP, or if you know something and want to go a little deeper and work on some guided keras examples.

What it is not, is an in depth RNN course. It's very short: in my case the 3 weeks boiled down to 2.5 fulltime days. Not enough for a full review of the topic.

Homeworks are interesting but:

-very simple, no large scale application

-small dataset

-short rounds of CPU training: no GPU, no access to server clusters

-Keras layers are used without much explanation, which is sad since the Keras docs are really incomplete on usage examples. I'm referring to calling an LSTM layer inside a loop to manually create the whole timestep structure!

-Keras layers are used that were never introduced in the course (such as BatchNormalization)

-often heuristics for network topology and hyperparameter values are not clearly explained, leaving the student with no insights on how to approach different tasks
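On the point about calling an LSTM layer inside a loop: unrolling just means applying one cell, with shared weights, once per timestep. A rough NumPy sketch of that structure using the standard LSTM gate equations (an illustration, not the assignment's Keras code; all names and sizes here are made up):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM timestep: four gates computed from [h_prev; x_t]."""
    n_h = h_prev.shape[0]
    z = W @ np.concatenate([h_prev, x_t]) + b  # all gate pre-activations
    f = sigmoid(z[:n_h])              # forget gate
    i = sigmoid(z[n_h:2 * n_h])       # input (update) gate
    o = sigmoid(z[2 * n_h:3 * n_h])   # output gate
    g = np.tanh(z[3 * n_h:])          # candidate cell state
    c = f * c_prev + i * g            # new cell state
    h = o * np.tanh(c)                # new hidden state
    return h, c

# Unroll over T timesteps by calling the same cell in a loop, which is
# the same shape of code as invoking one shared LSTM layer per timestep.
rng = np.random.default_rng(0)
n_x, n_h, T = 3, 4, 5
W = rng.standard_normal((4 * n_h, n_h + n_x)) * 0.1  # shared weights
b = np.zeros(4 * n_h)
h, c = np.zeros(n_h), np.zeros(n_h)
outputs = []
for t in range(T):
    x_t = rng.standard_normal(n_x)   # stand-in input for timestep t
    h, c = lstm_step(x_t, h, c, W, b)
    outputs.append(h)
```

The loop carries (h, c) forward while W and b stay fixed, which is the whole point the assignments leave implicit.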

By Travis J

•

Nov 24, 2018

The subject matter was a good introduction to various RNN model types and concepts. I have to dock a couple stars, however, as the course leans so heavily on Keras implementations during the assignments that it really should be listed as a firm requirement. While I feel that I'm more experienced with both RNN models and the use of Keras now, it was a struggle with what felt like a lot of cargo culting for me to get through most of the assignments. I don't consider the brief lesson on Keras at the end of the second course to be sufficient training, particularly if much time has passed between taking that course and this one. A brief "Lesson 0" on Keras is sorely needed at the beginning of this course. Otherwise, it should be explicitly and firmly communicated at the start that the programming assignments require a certain familiarity with the Keras framework. Overall, I do highly recommend this course, but be forewarned about the need to be familiar with Keras before starting.

By Joshua P J

•

Jul 31, 2018

The material provides a strong overview of sample problems for which sequence models work well. However, the class doesn't give users the conceptual mastery needed to apply sequence models to new or related problems. The issue is that the motivation and concepts underlying new architectures aren't well-explained (they're often an afterthought at the end of a lecture). This approach to teaching feels backwards.

Specific issues: The Week 2 & 3 homework treats lecture material as mostly black boxes, so it isn't particularly illustrative. The Week 3 Attention Model lectures make no sense, are taught in reverse order, and feel unfocused (with apologies, I know there's a bad pun there; it's not intentional). In Week 3 I ended up skipping to the homework because I found the lectures exasperating; to my surprise, the Markdown comment boxes in the Python notebooks explained the material better than the lectures did.

By Bill F

•

Sep 17, 2019

Toward the end of the specialization, there seemed to be a noticeable drop in both the quality of instruction and the programming assignments. Course 5 on sequence models was much more "hand wavy" than Course 4 on convolutional models. At the end of Course 5, I'm still not sure I learned anything meaningful other than filling in a few blank lines of code to complete the assignments. There was much less intuition provided about the nature of recurrent nets, and translating that to code was foggy. More attention needs to be paid to how and what the framework is actually doing, not just giving hints at filling in the blanks.

Finally, the grader, especially in week 3, caused me many, many hours of wasted time and frustration chasing phantom problems in the notebook. Coursera and/or deeplearning.ai does not pay much attention, if any, to solving the grader or other systemic problems.

By Slobodan C

•

Feb 20, 2018

The best part of the course is the "intuitions" presented by Prof. Ng. The worst parts are the technical problems with Coursera's infrastructure and the insufficient number of mentors available to offer suggestions. For example, in the forums there are some doubts about the optional parts of assignments (bad formulas, etc.), but these quite valid questions are just not addressed by anybody. I would also suggest adding a separate course on Keras to the specialization, because the Keras introduction offered in the specialization is way too basic. This makes it quite difficult to get through the assignments for the sequence models. It would also be helpful to extend the last two courses to five weeks or so, to cover the course material in more detail.

By Julien B

•

Jul 15, 2018

The lectures are great, but the assignments are not: apart from the hours wasted restarting notebooks (!), I've found it very frustrating to swing between "write `j = 0` on the next line" and "figure out the Keras documentation by yourself, the grader will only tell you `it's wrong`" (Keras having such a horrendous API, with many functions taking 20+ arguments, and sometimes the course tells you to specify an argument that's not even in the documentation!). There is no balance between the two (you're mostly told "write this, write that", with no space for thinking as in the first course of the specialisation), and the assignments are primarily a chore you have to get through, even though you won't learn much, if anything, from them.

By Franck B

•

Feb 17, 2018

A really big struggle with dinos, versions of workbooks, and sometimes no logical way to explain why the grader does not validate a working notebook. Pain, frustration, time taken away from proper learning.

On the course itself, some exercises felt like toying (e.g. a very simple function to check if a time segment already exists) in the middle of a Keras deep learning model, where learning debugging and setting up smaller models would have helped me learn more, I think.

I'm still not sure I am at ease with creating models; we experimented with various approaches over the specialisation, and the selection of a model architecture, or even tuning after the first running version, is still mostly guesswork to me. Will need to digest and keep learning.
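For context, the time-segment helper this review mentions (from the trigger-word assignment) boils down to a plain interval-overlap test. A hypothetical sketch of that kind of check, not the assignment's actual code:

```python
def is_overlapping(segment, previous_segments):
    """Return True if the (start, end) segment overlaps any earlier one.

    Two closed intervals overlap exactly when each starts before the
    other ends.
    """
    start, end = segment
    return any(start <= prev_end and end >= prev_start
               for prev_start, prev_end in previous_segments)

# e.g. placing audio clips on a timeline without collisions
placed = [(1000, 1800), (3000, 3300)]
clash = is_overlapping((1700, 2500), placed)  # True: overlaps (1000, 1800)
free = is_overlapping((2000, 2900), placed)   # False: fits in the gap
```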

By Benny P

•

Mar 29, 2018

This course provides a great introduction to RNNs and other sequence models and their application to popular fields such as NLP and audio processing. It does a great job of providing the motivation and intuition behind the creation of such sequence models (e.g. LSTM, GRU, Word2vec, GloVe); however, I feel the theoretical explanations need more depth. During this course I had to refer to other websites to gain more technical understanding of LSTMs and GRUs. The programming exercises are nice, covering many popular topics such as NLP, speech, and music processing, but I struggled when doing them in Keras. I wish some pointers had been provided on where to learn it before doing the assignments.

By José A M

•

Aug 5, 2018

Too many stability issues on the platform to get the notebooks up and running.

There are a few bugs and errors in the lectures and exercises; when they are found by the community, you should update the material, even if it involves recording a video again. Too much time is spent in the notebooks figuring out "side" stuff that is not what I am here to learn.

While the CNN course covered the state of the art of the field, for LSTMs I think there is much more that could have been explained.

I missed examples of other types of problems, like forecasting time series, events, and other more business-like applications.

Still, I learnt a lot and would do it again.

By André L

•

Feb 14, 2023

The content is didactic, as are the explanations - a reasonable course for a real beginner. However, the material is one of the worst I have seen: a lot of errors that are indicated with notes between the classes, and MANY annotations and sketches from Andrew on the slides. It mixes handwritten annotations with digital text - a complete mess. I had to edit the PDF to make something useful out of it, even though a lot of information is either missing or floating somewhere in the slide. Besides that, some videos are not edited properly: you can hear many repetitions of the same phrase.

By Felipe M

•

Feb 24, 2018

Although the course content is very useful, the hurry in which the course was put together does show. The video was clearly under-edited (as is apparent from Andrew repeating some statements in the expectation that the previous take would be edited out), and the autograders caused me to waste many more hours than truly needed to get my assignments into a format that would be accepted. Finally, I was very disappointed that the specialization was launched and then the last course pulled, so I had to pay for two months even though I had budgeted my own time to finish it in one.

By Arjan G

•

Mar 3, 2018

Nice to learn how RNNs work. But too rough around the edges for a 5-star score.

Good points:

- I learned RNNs, language models and many other useful techniques

- Subject matter is mostly well explained in the lectures

- Original authors of a technique are cited

Bad points:

- Some things should be explained more elaborately, while other explanations could be shorter, especially in the assignments

- Mistakes in the editing of the audio clips of the lectures

- Mistakes in the notebooks; sometimes non-intuitive/bad coding practices are used

By Gautam D

•

Jun 17, 2019

To be completely honest, I loved Dr. Andrew's method of teaching. But the assignments just flew over my head because I didn't have enough hours of Keras practice under my belt. I know Keras is there to make things easy, but it's very difficult when you're just trying to pass the grader. The goal of the assignments was fantastic - I mean, generating music, etc. sounds really amazing - but I feel that if some more time had been given to making us better at Keras and other technicalities, I would've loved this course much more!

By Andrei S

•

Sep 19, 2023

This course presents a number of ideas that are used in mainstream sequential models. Unfortunately, the lectures are not as detailed and precise as one may wish (and are not on par with the previous courses in the specialization). Also, the quizzes and programming exercises are not really useful. In particular, the programming exercises are quite superficial, more of a "guess what the test expects" type.

The bottom line: the videos are worth watching (although not perfect), the exercises are useless for learning.

By Javedali S

•

Mar 29, 2018

Good, but I expected more. The main thing I liked about the first 3 courses was that they were really deep. In the last two courses we have skipped backpropagation; now, that is something you could keep optional. I like the way Andrew Ng teaches, going down to the basics, and that is why I came here and paid 40 euros per month. Also, there is some stuff missing, like generative models, adversarial networks (GANs), etc. It would be good if Andrew could offer more courses related to this, and deeper ones (as it is deep learning :))

By Hossein K

•

Jan 3, 2024

The course introduces a lot of new concepts about sequence models. It seems introductory, as it only scratches the surface. There are lab assignments to reinforce the learning. I found the lab assignments a bit overwhelming, as I encountered new implementation details and new reading materials and references while working against the clock to finish within the time limit. Moving those materials out of the lab assignments would give students a chance to review them beforehand.