Learner Reviews & Feedback for Natural Language Processing with Sequence Models by DeepLearning.AI

4.5 stars (1,100 ratings)

About the Course

In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
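As background for item d) above: a ‘Siamese’ setup encodes both questions with the same LSTM branch and compares the resulting vectors by cosine similarity. Below is a minimal sketch of that idea in Trax (the framework this course uses, as several reviews below note); the layer sizes, function name, and normalization helper are illustrative assumptions, not the course's own assignment code.

from trax import layers as tl
from trax import fastmath

fastnp = fastmath.numpy  # backend-agnostic NumPy (JAX by default)

def Siamese(vocab_size=40000, d_model=128):
    """Encode two questions the same way so their vectors can be compared."""

    def normalize(x):
        # L2-normalize each encoding so a dot product equals cosine similarity.
        return x / fastnp.sqrt(fastnp.sum(x * x, axis=-1, keepdims=True))

    # One branch: token ids -> embeddings -> LSTM -> mean over time -> unit vector.
    branch = tl.Serial(
        tl.Embedding(vocab_size=vocab_size, d_feature=d_model),
        tl.LSTM(n_units=d_model),
        tl.Mean(axis=1),
        tl.Fn('Normalize', normalize),
    )

    # Apply the same branch to each question of the pair, side by side.
    return tl.Parallel(branch, branch)

Question pairs whose encoded vectors have a cosine similarity above a chosen threshold are then flagged as duplicates.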

Top reviews

By SA

Sep 27, 2020

Overall it was a great course. It is a little weak on theory, but for practical purposes what was covered was sufficient. The question-duplication detection was a very cool model. I enjoyed it a lot.

By AB

Nov 11, 2021

This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, such as LSTMs, GRUs, Siamese networks, etc.

226 - 231 of 231 Reviews for Natural Language Processing with Sequence Models

By Hùng N T

Feb 26, 2024

Everything was good except that this course uses Trax. The framework has had no new releases since 2021, and I could not manage to train deep learning models with Trax on my GPU; it was not even possible in Colab. Trax is also very buggy, and it does not have a large community to turn to for help. Recommendation for learners: take this course only after it has been fully rebuilt on TensorFlow, unless you are willing to take other courses to get useful, working code for NLP.

By Victor N

Oct 26, 2022

The code assignments are poorly documented and don't even follow their own instructions. For example, the last assignment has plain-text instructions that reference certain variables and how to use them, but the comments in the code to be implemented introduce new variables and new guidance that do not seem to match those instructions. This makes it very confusing to work out what is supposed to happen in the code.

By Greg D

Dec 24, 2020

Spends a lot of time going over tedious implementation details rather than teaching interesting NLP topics and nuances, especially in the assignments. The introduction to Trax seems to be the only saving grace, hence one bonus star :)))).

Given that Andrew Ng's course is the suggested background for this one, this is a big step (read: fall) down.

By Miguel Á C T

Mar 19, 2021

The course is good as an example of code that executes tasks correctly; that is, you can see how neural networks are defined and used in Trax. However, from a pedagogical point of view, I find it quite weak. Concepts are poorly explained, and the notebooks consist of little more than copying and pasting previously displayed code.

By Mridul S

Oct 14, 2023

Theory-wise, it's a decent course, but I am paying to access the labs and assignments as well, and all of that practical work is in Trax, which is a chore to learn and of little practical use because hardly anyone uses it. Kindly convert the labs and assignments to TensorFlow or PyTorch.

By Manikandan N

Jul 26, 2022

Useless. I would strongly recommend that you do not enroll in this course. None of the content is properly covered. And for your information, Andrew Ng is not the instructor of this course.