Coursera Natural Language Processing Specialization

NLP with Attention Models:
01 course 4 introduction
02 connect with your mentors and fellow learners on slack instructions
03 seq2seq
04 alignment
05 background on seq2seq instructions
06 optional the real meaning of ich bin ein berliner instructions
07 attention
08 attention instructions
09 setup for machine translation
10 training an nmt with attention
11 training an nmt with attention instructions
12 optional what is teacher forcing instructions
13 evaluation for machine translation
14 evaluation for machine translation instructions
15 sampling and decoding
16 sampling and decoding instructions
17 content resource instructions
01 how to refresh your workspace instructions
01 andrew ng with oren etzioni
01 transformers vs rnns
02 transformers vs rnns instructions
03 transformer applications
04 transformer applications instructions
05 dot product attention
06 dot product attention instructions
07 causal attention
08 causal attention instructions
09 multi head attention
10 multi head attention instructions
11 transformer decoder
12 transformer decoder instructions
13 transformer summarizer
14 transformer summarizer instructions
15 content resource instructions
01 week 3 overview
02 week 3 overview instructions
03 transfer learning in nlp
04 transfer learning in nlp instructions
05 elmo gpt bert t5
06 elmo gpt bert t5 instructions
07 bidirectional encoder representations from transformers bert
08 bidirectional encoder representations from transformers bert instructions
09 bert objective
10 bert objective instructions
11 fine tuning bert
12 fine tuning bert instructions
13 transformer t5
14 transformer t5 instructions
15 multi task training strategy
16 multi task training strategy instructions
17 glue benchmark
18 glue benchmark instructions
19 question answering
20 question answering instructions
21 content resource instructions
01 tasks with long sequences
02 optional ai storytelling instructions
03 transformer complexity
04 lsh attention
05 optional knn lsh review instructions
06 motivation for reversible layers memory
07 reversible residual layers
08 reformer
09 optional transformers beyond nlp instructions
10 acknowledgments instructions
01 andrew ng with quoc le
01 references instructions

NLP with Classification and Vector Spaces:
01 welcome to the nlp specialization
02 connect with your mentors and fellow learners on slack instructions
03 welcome to course 1
04 acknowledgement ken church instructions
05 supervised ml sentiment analysis
06 supervised ml sentiment analysis instructions
07 vocabulary feature extraction
08 vocabulary feature extraction instructions
09 negative and positive frequencies
10 feature extraction with frequencies
11 feature extraction with frequencies instructions
12 preprocessing
13 preprocessing instructions
14 putting it all together
15 putting it all together instructions
16 logistic regression overview
17 logistic regression overview instructions
18 logistic regression training
19 logistic regression training instructions
20 logistic regression testing
21 logistic regression testing instructions
22 logistic regression cost function
23 optional logistic regression cost function instructions
24 optional logistic regression gradient instructions
01 assignment logistic regression instructions
02 how to refresh your workspace instructions
01 andrew ng with chris manning
01 probability and bayes rule
02 probability and bayes rule instructions
03 bayes rule
04 bayes rule instructions
05 naive bayes introduction
06 naive bayes introduction instructions
07 laplacian smoothing
08 laplacian smoothing instructions
09 log likelihood part 1
10 log likelihood part 1 instructions
11 log likelihood part 2
12 log likelihood part 2 instructions
13 training naive bayes
14 training naive bayes instructions
15 testing naive bayes
16 testing naive bayes instructions
17 applications of naive bayes
18 applications of naive bayes instructions
19 naive bayes assumptions
20 naive bayes assumptions instructions
21 error analysis
22 error analysis instructions
01 assignment naive bayes instructions
01 vector space models
02 word by word and word by doc
03 euclidean distance
04 cosine similarity intuition
05 cosine similarity
06 manipulating words in vector spaces
07 visualization and pca
08 pca algorithm
01 assignment word embeddings instructions
01 overview
02 transforming word vectors
03 k nearest neighbors
04 hash tables and hash functions
05 locality sensitive hashing
06 multiple planes
07 approximate nearest neighbors
08 searching documents
01 andrew ng with kathleen mckeown
01 word translation instructions
02 acknowledgements instructions
03 bibliography instructions

NLP with Probabilistic Models:
01 intro to course 2
02 connect with your mentors and fellow learners on slack instructions
03 overview
04 autocorrect
05 building the model
06 building the model ii
07 minimum edit distance
08 minimum edit distance algorithm
09 minimum edit distance algorithm ii
10 minimum edit distance algorithm iii
01 how to refresh your workspace instructions
01 part of speech tagging
02 markov chains
03 markov chains and pos tags
04 hidden markov models
05 calculating probabilities
06 populating the transition matrix
07 populating the emission matrix
08 the viterbi algorithm
09 viterbi initialization
10 viterbi forward pass
11 viterbi backward pass
01 n grams overview
02 n grams and probabilities
03 sequence probabilities
04 starting and ending sentences
05 the n gram language model
06 language model evaluation
07 out of vocabulary words
08 smoothing
09 week summary
01 overview
02 basic word representations
03 word embeddings
04 how to create word embeddings
05 word embedding methods
06 continuous bag of words model
07 cleaning and tokenization
08 sliding window of words in python
09 transforming words into vectors
10 architecture of the cbow model
11 architecture of the cbow model dimensions
12 architecture of the cbow model dimensions 2
13 architecture of the cbow model activation functions
14 training a cbow model cost function
15 training a cbow model forward propagation
16 training a cbow model backpropagation and gradient descent
17 extracting word embedding vectors
18 evaluating word embeddings intrinsic evaluation
19 evaluating word embeddings extrinsic evaluation
20 conclusion
01 acknowledgments instructions

NLP with Sequence Models:
01 course 3 introduction
02 connect with your mentors and fellow learners on slack instructions
03 neural networks for sentiment analysis
04 trax neural networks
05 why we recommend trax
06 reading optional trax and jax docs and code index
06 reading optional trax and jax docs and code instructions
07 trax layers
08 dense and relu layers
09 serial layer
10 other layers
11 training
01 how to refresh your workspace instructions
01 traditional language models
02 recurrent neural networks
03 applications of rnns
04 math in simple rnns
05 cost function for rnns
06 implementation note
07 gated recurrent units
08 deep and bi directional rnns
01 rnns and vanishing gradients
02 optional intro to optimization in deep learning gradient descent instructions
03 introduction to lstms
04 optional understanding lstms instructions
05 lstm architecture
06 introduction to named entity recognition
07 training ners data processing
08 long short term memory deep learning specialization c5 instructions
09 computing accuracy
01 siamese networks
02 architecture
03 cost function
04 triplets
05 computing the cost i
06 computing the cost ii
07 one shot learning
08 training testing
01 acknowledgments instructions
