Week Introduction

Skills You'll Gain

Word Embedding, Sentiment with Neural Nets, Siamese Networks, Natural Language Generation, Named-Entity Recognition

Reviews

4.5 (937 ratings)

  • 5 stars
    70.86%
  • 4 stars
    17.18%
  • 3 stars
    6.18%
  • 2 stars
    2.88%
  • 1 star
    2.88%

BS

Sep 25, 2020

Great course as usual. Tried Siamese models but got very different results; will need to study the concepts and implementation behind them more. But overall, I am glad I got to work with LSTMs.

KT

Sep 24, 2020

The lectures are well planned: very short and to the point. The labs offer immense opportunity for practice, and the assignment notebooks are well written! Overall, the course is fantastic!

From the Lesson

Recurrent Neural Networks for Language Modeling

Learn about the limitations of traditional language models and see how RNNs and GRUs use sequential data for text prediction. Then build your own next-word generator using a simple RNN on Shakespeare text data!
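To give a feel for what such a next-word generator involves, here is a minimal illustrative sketch in plain NumPy: a character-level vanilla RNN forward pass with randomly initialized weights and a sampling step. It is not the course's assignment code (the course works at a larger scale and omits nothing like this training loop), and every name in it (Wxh, Whh, Why, step, the toy text) is hypothetical.

```python
# Illustrative sketch only: a character-level vanilla RNN forward pass for
# next-character prediction. Weights are random and untrained; a real model
# would add a training loop (e.g. cross-entropy loss with backpropagation).
import numpy as np

text = "to be or not to be"            # stand-in for the Shakespeare corpus
chars = sorted(set(text))
char_to_ix = {c: i for i, c in enumerate(chars)}
ix_to_char = {i: c for c, i in char_to_ix.items()}

vocab_size, hidden_size = len(chars), 16
rng = np.random.default_rng(0)

# Randomly initialized parameters (hypothetical names).
Wxh = rng.normal(scale=0.01, size=(hidden_size, vocab_size))   # input -> hidden
Whh = rng.normal(scale=0.01, size=(hidden_size, hidden_size))  # hidden -> hidden
Why = rng.normal(scale=0.01, size=(vocab_size, hidden_size))   # hidden -> output
bh = np.zeros((hidden_size, 1))
by = np.zeros((vocab_size, 1))

def step(x_ix, h_prev):
    """One RNN step: consume a character index, return next-char probabilities."""
    x = np.zeros((vocab_size, 1))
    x[x_ix] = 1.0                                  # one-hot encode the input char
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)       # recurrent hidden-state update
    y = Why @ h + by                               # unnormalized scores
    p = np.exp(y - y.max()) / np.exp(y - y.max()).sum()  # softmax over vocabulary
    return p, h

# Feed a prompt through the (untrained) network and sample the next character.
h = np.zeros((hidden_size, 1))
for c in "to be or not to ":
    p, h = step(char_to_ix[c], h)
next_char = ix_to_char[int(rng.choice(vocab_size, p=p.ravel()))]
print("sampled next character:", next_char)
```

With untrained weights the sample is essentially random; after training on the corpus, the same sampling step starts producing plausible continuations of the prompt.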

Instructors

  • Younes Bensouda Mourri

    Instructor

  • Łukasz Kaiser

    Instructor

  • Eddy Shyu

    Curriculum Architect
