In this blog you will find the correct answers to the Coursera Introduction to Machine Learning Week 4 Quiz. Mixsaver always tries to bring you the best blogs and the best coupon codes.
Week 4 Comprehensive
Question 1
What is meant by “word vector”?
1 point
- The latitude and longitude of the place a word originated.
- A vector of numbers associated with a word.
- Assigning a corresponding number to each word.
- A vector consisting of all words in a vocabulary.
Question 2
Which word is a synonym for “word vector”?
1 point
- Norm
- Array
- Embedding
- Stack
Question 3
What is the term for a set of vectors, with one vector for each word in the vocabulary?
1 point
- Space
- Array
- Codebook
- Embedding
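
As a quick illustration of the ideas in Questions 1–3: a word vector is just an array of numbers tied to a word, and the full set of such vectors, one per vocabulary word, is the codebook (the embedding matrix). Below is a minimal sketch in Python; the vocabulary, dimensions, and random values are made up for illustration.

```python
import numpy as np

# Hypothetical toy vocabulary; real models use tens of thousands of words.
vocab = ["machine", "learning", "word", "vector"]
embedding_dim = 4  # real embeddings are usually 50-300 dimensions

# The "codebook": one row (word vector / embedding) per vocabulary word.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(len(vocab), embedding_dim))

word_to_index = {w: i for i, w in enumerate(vocab)}

# Looking up a word's vector is just a row lookup into the codebook.
vec = codebook[word_to_index["vector"]]
print("word vector for 'vector':", vec)
```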
Question 4
What is natural language processing?
1 point
- Making natural text conform to formal language standards.
- Translating natural text characters to unicode representations.
- Translating human-readable code to machine-readable instructions.
- Taking natural text and making inferences and predictions.
Question 5
What is the goal of learning word vectors?
1 point
- Find the hidden or latent features in a text.
- Labelling a text corpus, so a human doesn’t have to do it.
- Determine the vocabulary in the codebook.
- Given a word, predict which words are in its vicinity.
Question 6
What function is the generalization of the logistic function to multiple dimensions?
1 point
- Hyperbolic tangent function
- Exponential log likelihood
- Squash function
- Softmax function
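
For Question 6: the softmax function generalizes the logistic (sigmoid) function to a whole vector of scores, turning them into probabilities that sum to one. A minimal numpy sketch:

```python
import numpy as np

def softmax(z):
    """Softmax: exp(z_i) / sum_j exp(z_j), shifted for numerical stability."""
    z = z - np.max(z)      # subtracting the max does not change the result
    exp_z = np.exp(z)
    return exp_z / exp_z.sum()

scores = np.array([2.0, 1.0, 0.1])
print(softmax(scores))     # approximately [0.659 0.242 0.099], sums to 1
```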
Question 7
What is the continuous bag of words (CBOW) approach?
1 point
- Vectors for the neighborhood of words are averaged and used to predict word n.
- Word n is used to predict the words in the neighborhood of word n.
- Word n is learned from a large corpus of words, which a human has labeled.
- The code for word n is fed through a CNN and categorized with a softmax.
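
A rough sketch of the CBOW forward pass described in Question 7, assuming a toy vocabulary and random weights (not the course's exact model): the context word vectors are averaged and scored against every output vector, and a softmax gives the probability of each candidate center word.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab_size, dim = 10, 8                      # toy sizes, purely illustrative
W_in = rng.normal(size=(vocab_size, dim))    # input (context) embeddings
W_out = rng.normal(size=(vocab_size, dim))   # output (center-word) embeddings

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def cbow_predict(context_ids):
    """Average the context vectors, then score every word as the center word."""
    h = W_in[context_ids].mean(axis=0)       # averaged context representation
    scores = W_out @ h                       # one score per vocabulary word
    return softmax(scores)                   # probability of each center word

probs = cbow_predict([2, 4, 5, 7])           # hypothetical neighborhood of word n
print(probs.argmax(), probs.max())
```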
Question 8
What is the Skip-Gram approach?
1 point
- Word n is used to predict the words in the neighborhood of word n.
- The code for word n is fed through a CNN and categorized with a softmax.
- Word n is learned from a large corpus of words, which a human has labeled.
- Vectors for the neighborhood of words are averaged and used to predict word n.
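
Skip-Gram flips CBOW around: the center word's vector is scored against every output vector to predict each word in its neighborhood, which is also the training goal mentioned in Question 5. A short sketch, reusing the toy `W_in`, `W_out`, and `softmax` from the CBOW example above:

```python
def skipgram_predict(center_id):
    """Use word n's vector to predict a distribution over its context words."""
    v = W_in[center_id]                      # embedding of the center word
    scores = W_out @ v                       # score every word as a context word
    return softmax(scores)                   # probability of each neighbor

context_probs = skipgram_predict(3)          # hypothetical center word id
print(context_probs.round(3))
```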
Question 9
What is the goal of the recurrent neural network?
1 point
- Learn a series of images that form a video.
- Predict words more efficiently than Skip-Gram.
- Synthesize a sequence of words.
- Classify an unlabeled image.
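
For Question 9: a recurrent neural network keeps a hidden state that is updated at every step, which is what lets it synthesize a sequence of words one at a time. Below is a bare-bones sketch of that recurrence using a vanilla RNN cell with made-up sizes and weights, not the course's exact model.

```python
import numpy as np

rng = np.random.default_rng(2)
vocab_size, dim = 10, 16                         # toy sizes, purely illustrative
Wxh = rng.normal(scale=0.1, size=(dim, vocab_size))
Whh = rng.normal(scale=0.1, size=(dim, dim))
Why = rng.normal(scale=0.1, size=(vocab_size, dim))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def synthesize(start_id, steps=5):
    """Generate a short word-id sequence by feeding each output back in."""
    h = np.zeros(dim)
    word = start_id
    out = [word]
    for _ in range(steps):
        x = np.eye(vocab_size)[word]             # one-hot input for current word
        h = np.tanh(Wxh @ x + Whh @ h)           # update the hidden state
        probs = softmax(Why @ h)                 # distribution over the next word
        word = int(rng.choice(vocab_size, p=probs))
        out.append(word)
    return out

print(synthesize(start_id=0))
```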
Question 10
Which model is the state-of-the-art for text synthesis?
1 point
- Long short-term memory
- CNN
- Multilayer perceptron
- CBOW
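
Question 10 only names the model, but the reason LSTMs handle text synthesis well is their gating: the forget, input, and output gates control what the cell remembers across long sequences. A rough, illustrative numpy sketch of a single LSTM step follows; the parameter names and sizes are hypothetical, not the course's implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
x_dim, h_dim = 8, 16                            # toy sizes, purely illustrative
def mat():
    return rng.normal(scale=0.1, size=(h_dim, x_dim + h_dim))
params = {name: mat() for name in ("Wf", "Wi", "Wo", "Wg")}
params.update({b: np.zeros(h_dim) for b in ("bf", "bi", "bo", "bg")})

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c):
    """One LSTM step: gates control what the cell forgets, stores, and emits."""
    z = np.concatenate([x, h])
    f = sigmoid(params["Wf"] @ z + params["bf"])   # forget gate
    i = sigmoid(params["Wi"] @ z + params["bi"])   # input gate
    o = sigmoid(params["Wo"] @ z + params["bo"])   # output gate
    g = np.tanh(params["Wg"] @ z + params["bg"])   # candidate cell update
    c = f * c + i * g                              # long-term cell state
    h = o * np.tanh(c)                             # short-term hidden state
    return h, c

h, c = np.zeros(h_dim), np.zeros(h_dim)
h, c = lstm_step(rng.normal(size=x_dim), h, c)
print(h.shape, c.shape)
```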
Important Links:
- Introduction to Machine Learning Coursera Week 1 Quiz
- Introduction to Machine Learning Coursera Week 2 Quiz
- Introduction to Machine Learning Coursera Week 3 Quiz
- Introduction to Machine Learning Coursera Week 4 Quiz