In this blog you will find the correct answers to the Coursera quiz for Mastering Data Analysis in Excel, Week 3. Mixsaver always tries to bring you the best blogs and the best coupon codes.
 

1. Using the Information Gain Calculator, without changing any inputs in the confusion matrix, what is the conditional probability of getting a Positive Test if you have a defective chip? Use the link below to access the spreadsheet. There is also an explanation of how to use the Information Gain Calculator that you may find helpful to review beforehand.

Information Gain Calculator.xlsx

  • 37.5%
  • 50%
  • 25%
  • 14%

2. The conditional probability of getting a Positive Test if you have a defective chip can be written p(Test POS | “+”). What is this probability called on the Confusion Matrix?

  • The False Positive Rate
  • The False Negative Rate
  • The True Negative Rate
  • The True Positive Rate

3. What is the remaining uncertainty or entropy of the test classification if we learn a chip is truly defective?

  • 1 bit
  • .9183 bits
  • .8113 bits
  • .5917 bits
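Questions 3 and 7 both come down to the binary entropy function H(p) = −p·log2(p) − (1−p)·log2(1−p). As a sketch (independent of the spreadsheet), here is how the listed option values map to simple probabilities:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Entropy in bits of a two-outcome event with probabilities p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * log2(p) - (1 - p) * log2(1 - p)

# The option values correspond to these splits:
print(round(binary_entropy(0.5), 4))   # 1.0    -> "1 bit"
print(round(binary_entropy(1/3), 4))   # 0.9183
print(round(binary_entropy(0.25), 4))  # 0.8113
```

So .8113 bits is the entropy of a 25%/75% split and .9183 bits that of a 1/3–2/3 split; which split applies depends on the spreadsheet's confusion-matrix rates.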

4. What is the probability that a chip chosen at random from the assembly line is defective?

  • .2
  • .3
  • .7
  • .8

5. What is the conditional probability of getting a “Negative” Test classification if you have a non-defective chip?

  • 75%
  • 14%
  • 25%
  • 50%

6. The conditional probability of getting a Negative Test if you have a non-defective chip can be written P(Y = “NEG” | X = “-”). What is this probability called on the Confusion Matrix?

  • True Positive Rate
  • True Negative Rate
  • False Negative Rate
  • False Positive Rate

7. Challenging question: What is the remaining uncertainty, or entropy, of the Test Classification if we know that a chip is not defective?

  • 1 bit
  • .9183 bits
  • .5917 bits
  • .8113 bits

8. How frequently will a non-defective chip occur?

  • .8
  • .2
  • .3
  • .7

9. What is the expected, or average, uncertainty (entropy) remaining regarding a Test Outcome, given knowledge of whether or not a chip is defective?

  • .8813 bits
  • .8490 bits
  • 1 bit
  • .0323 bits
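Question 9 asks for a weighted average: the conditional entropy H(Test | Chip) averages the entropy of the test outcome over the two chip states, weighted by how often each state occurs. A sketch with hypothetical rates (not necessarily the spreadsheet's actual inputs):

```python
from math import log2

def binary_entropy(p):
    """Entropy in bits of a yes/no outcome with probability p."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def expected_conditional_entropy(p_defective, tpr, fpr):
    """H(Test | Chip): weight each class's test entropy by the class frequency."""
    return p_defective * binary_entropy(tpr) + (1 - p_defective) * binary_entropy(fpr)

# Hypothetical inputs for illustration only:
print(round(expected_conditional_entropy(0.3, 0.75, 0.25), 4))  # 0.8113
```

Plug in the spreadsheet's defect rate, True Positive Rate, and False Positive Rate to reproduce the quiz value.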

10. The optical scanner breaks down and begins to classify 30% of all chips as defective, completely at random. What are the random test’s True Positive Rate and False Positive Rate?

  • 30% and 70%
  • 70% and 70%
  • 30% and 30%
  • 70% and 30%
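Intuition check for question 10: a scanner that flags chips at random, independently of their true state, flags the same fraction of each class, so its True Positive Rate equals its False Positive Rate. A quick simulation sketch (the defect rate below is an arbitrary assumption for illustration):

```python
import random

random.seed(0)
defect_rate, flag_rate, n = 0.3, 0.3, 200_000
tp = fp = defective = ok = 0
for _ in range(n):
    is_defective = random.random() < defect_rate
    flagged = random.random() < flag_rate  # independent of the chip's true state
    if is_defective:
        defective += 1
        tp += flagged
    else:
        ok += 1
        fp += flagged
print(round(tp / defective, 2), round(fp / ok, 2))  # both come out near 0.30
```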

Information Measures (graded)

1. Suppose we have two coins: one “fair” coin, where p(heads) = p(tails) = .5; and an “unfair” coin, where p(heads) does not equal p(tails). Which coin has the larger entropy prior to observing the outcome?

  • The fair coin
  • The unfair coin

2. If you roll one fair die (6-sided), what is its entropy before the result is observed?

  • 2.58 bits
  • 0.46 bits
  • 0.43 bits
  • 2.32 bits
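A fair n-sided die is a uniform distribution over n outcomes, and a uniform distribution's entropy is simply log2(n):

```python
from math import log2

# Entropy of one roll of a fair 6-sided die, in bits
print(round(log2(6), 2))  # 2.58
```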

3. If your friend picks one number from 1001 to 5000, under the strategy used in the video “Entropy of a Guessing Game,” what is the maximum number of questions you need to ask to find that number?

  • 13
  • 12
  • 10
  • 11
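The halving strategy from the video is binary search: with 4000 equally likely numbers, the worst case needs the ceiling of log2(4000) yes/no questions.

```python
from math import ceil, log2

n = 5000 - 1001 + 1   # 4000 equally likely numbers
print(ceil(log2(n)))  # worst-case number of yes/no questions: 12
```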

4. Use the “Information Gain Calculator” spreadsheet to calculate the “Conditional Entropy” H(X|Y) given a = 0.4, c = 0.5, e = 0.11.

Information Gain Calculator.xlsx

  • 0.97 bits
  • 0.90 bits
  • 1.87 bits
  • 0.87 bits

5. On the “Information Gain Calculator” spreadsheet, given a = 0.3, c = 0.2, suppose now we also know that H(X,Y) = H(X) + H(Y). What is the joint probability e?

Information Gain Calculator.xlsx

  • 0.5
  • 0.04
  • 0.06
  • 0.3
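H(X,Y) = H(X) + H(Y) holds exactly when X and Y are independent, and for independent events the joint probability is the product of the marginals. Assuming a and c are the spreadsheet's two marginal probabilities (my reading of the sheet, not confirmed), e follows in one line:

```python
a, c = 0.3, 0.2
# Independence (H(X,Y) = H(X) + H(Y)) means the joint cell is the
# product of the marginals -- assuming a and c are those marginals.
e = round(a * c, 2)
print(e)  # 0.06
```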

6. Given a = 0.2, c = 0.5 on the Information Gain Calculator spreadsheet, suppose now we also know the true positive rate is 0.18. What is the Mutual Information?

Information Gain Calculator.xlsx

  • 0.13 bits
  • 0.72 bits
  • 1.64 bits
  • 0.08 bits
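A sketch of the mutual information computation via I(X;Y) = H(X) + H(Y) − H(X,Y). The joint table below is my reading of the spreadsheet's inputs (a = 0.2 as the defect rate, c = 0.5 as the positive-test rate, and 0.18 as the joint true-positive cell) and should be treated as an assumption:

```python
from math import log2

def entropy(ps):
    """Shannon entropy in bits of a probability distribution (zeros skipped)."""
    return -sum(p * log2(p) for p in ps if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution given as rows."""
    px = [sum(row) for row in joint]            # marginal over rows (chip state)
    py = [sum(col) for col in zip(*joint)]      # marginal over columns (test result)
    flat = [p for row in joint for p in row]
    return entropy(px) + entropy(py) - entropy(flat)

# Assumed joint table p(chip, test):
joint = [[0.18, 0.02],   # defective chip: (POS test, NEG test)
         [0.32, 0.48]]   # good chip:      (POS test, NEG test)
print(round(mutual_information(joint), 2))  # 0.13
```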

7. Consider the Monty Hall problem, but instead of the usual 3 doors, assume there are 5 doors to choose from. You first choose door #1. Monty opens doors #2 and #3. What is the new probability that there is a prize behind door #4?

  • 0.67
  • 0.5
  • 0.2
  • 0.4
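Intuition for question 7: your door keeps its prior probability of 1/5 = 0.2, and the remaining 0.8 splits evenly over the two unopened doors you didn't pick. A simulation sketch, assuming Monty always opens two prize-free doors from among the four you didn't choose:

```python
import random

random.seed(1)
opens_23 = prize_4 = 0
for _ in range(200_000):
    prize = random.randint(1, 5)                # prize placed uniformly; you pick door 1
    closed = [d for d in (2, 3, 4, 5) if d != prize]
    opened = set(random.sample(closed, 2))      # Monty opens 2 empty, unchosen doors
    if opened == {2, 3}:
        opens_23 += 1
        prize_4 += (prize == 4)
print(round(prize_4 / opens_23, 2))  # estimate of p(prize behind door 4) ~ 0.4
```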

8. Again, consider the Monty Hall problem, but with 5 doors to choose from instead of 3. You pick door #1, and Monty opens 2 of the other 4 doors. How many bits of information are communicated to you by Monty when you observe which two doors he opens?

  • 1.52 bits
  • 2.32 bits
  • 0.80 bits
  • 0.67 bits
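For question 8, one way to count the bits is as the drop in entropy about the prize's location: the prior is uniform over 5 doors, and after Monty opens doors 2 and 3 the posterior is (0.2, 0, 0, 0.4, 0.4). The gap works out to exactly 0.8 bits:

```python
from math import log2

def entropy(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

prior = [0.2] * 5                  # prize equally likely behind any of 5 doors
posterior = [0.2, 0, 0, 0.4, 0.4]  # after Monty opens doors 2 and 3
print(round(entropy(prior) - entropy(posterior), 2))  # 0.8
```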

9. B stands for “the coin is fair”; ~B stands for “the coin is crooked”. p(heads | B) = 0.5, and p(heads | ~B) = 0.4. Your friend tells you that he often tests people to see if they can guess whether he is using the fair coin or the crooked coin, but that he is careful to use the crooked coin 70% of the time. He tosses the coin once and it comes up heads.

What is your new best estimate of the probability that the coin he just tossed is fair?

  • 0.15
  • 0.35
  • 0.40
  • 0.43
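Question 9 is a direct Bayes' rule computation, with prior p(fair) = 0.3 since the crooked coin is used 70% of the time:

```python
p_fair = 0.3                        # crooked coin is used 70% of the time
p_heads_fair, p_heads_crooked = 0.5, 0.4

# Bayes' rule: p(fair | heads) = p(fair) p(heads|fair) / p(heads)
numerator = p_fair * p_heads_fair                      # 0.15
evidence = numerator + (1 - p_fair) * p_heads_crooked  # 0.15 + 0.28 = 0.43
print(round(numerator / evidence, 2))  # 0.35
```

Note that 0.15 (the numerator) and 0.43 (the total probability of heads) both appear among the answer options as distractors.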

10. Suppose you are given either a fair die or an unfair die (6-sided). You have no basis for considering either die more likely before you roll it and observe an outcome. For the fair die, the chance of observing “3” is 1/6. For the unfair die, the chance of observing “3” is 1/3. After rolling the unknown die, you observe the outcome to be 3.

What is the new probability that the die you rolled is the fair one?

  • 0.08
  • 0.23
  • 0.33
  • 0.36
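Question 10 is the same Bayes' rule pattern, now with a uniform 50/50 prior over the two dice:

```python
p_fair = 0.5                   # no prior reason to favor either die
p3_fair, p3_unfair = 1/6, 1/3  # chance of rolling "3" with each die

# Bayes' rule: p(fair | rolled 3)
posterior_fair = (p_fair * p3_fair) / (p_fair * p3_fair + (1 - p_fair) * p3_unfair)
print(round(posterior_fair, 2))  # 0.33
```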

 

Important link: