
Instructor solution
Naïve Bayes Review Assignment
1. What assumptions are made by a naive Bayes classifier?
2. How are naive Bayes classifiers trained?
3. What's the difference between multilabel and multinomial text classification?
4. What is a prior probability?
5. Construct a bag-of-words feature matrix for the following email (a worked sketch follows the question list): "Hi, Dan. Are we still on for Tuesday? We can meet for coffee or we can meet at my office. Best, Alec"
6. What is a "bag-of-words"?
7. Consider a naive Bayes model with the parameters provided in Figure A. How would this model classify the following sentence: "This morning was freezing cold." (An illustrative sketch of the decision rule follows the question list.)
8. What is the joint probability that a given individual with one lottery ticket in Cleveland, OH wins the grand prize on a clear day in the winter?
9. How can a bag-of-words model handle words that appear during testing but were not seen during training? (A short demonstration follows the question list.)
10. What is the sigmoid function and why is it useful? (The formula is restated after the question list.)
11. When training a logistic regression model, a training example yields \( \sigma(w \cdot x + b) = 0.73 \), though the actual label is \( y = 0 \). Using the cross-entropy loss function, what is the loss? (The arithmetic is worked after the question list.)
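
For question 5, here is a minimal sketch of how the bag-of-words counts could be built. It assumes lowercasing and scikit-learn's CountVectorizer with its default tokenization, which drops punctuation; the intended preprocessing for the assignment may differ.

from sklearn.feature_extraction.text import CountVectorizer

# Bag-of-words counts for the email in question 5.
# Assumes lowercasing and default word tokenization (an assumption).
email = ("Hi, Dan. Are we still on for Tuesday? We can meet for coffee "
         "or we can meet at my office. Best, Alec")

vectorizer = CountVectorizer(lowercase=True)
X = vectorizer.fit_transform([email])  # a 1 x |V| count matrix

for word, count in zip(vectorizer.get_feature_names_out(), X.toarray()[0]):
    print(f"{word}: {count}")
# Under these assumptions, "we" appears 3 times; "can", "for", and "meet"
# appear twice; every other vocabulary word appears once.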
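
For question 7, Figure A is not reproduced here, so the priors and likelihoods below are hypothetical placeholders that only illustrate the naive Bayes decision rule: pick the class that maximizes the log prior plus the summed log likelihoods of the observed words. The actual parameters from Figure A would need to be substituted.

import math

# Hypothetical class priors and per-class word likelihoods; NOT the values
# from Figure A, which is not reproduced in this document.
priors = {"hot": 0.5, "cold": 0.5}
likelihoods = {
    "hot":  {"morning": 0.10, "freezing": 0.01, "cold": 0.02},
    "cold": {"morning": 0.10, "freezing": 0.15, "cold": 0.20},
}

tokens = ["this", "morning", "was", "freezing", "cold"]

scores = {}
for c in priors:
    score = math.log(priors[c])
    for t in tokens:
        if t in likelihoods[c]:      # words with no parameter are skipped
            score += math.log(likelihoods[c][t])
    scores[c] = score

print(max(scores, key=scores.get))   # the class with the highest log score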
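
For question 9, one standard answer is that the feature space is fixed by the training vocabulary, so words never seen in training are simply dropped at test time (often alongside smoothing of the probabilities for words that were seen). The snippet below, again assuming scikit-learn's CountVectorizer, demonstrates that behavior.

from sklearn.feature_extraction.text import CountVectorizer

# Out-of-vocabulary words at test time: the vocabulary is fixed during fit,
# so unseen test words are silently ignored by transform.
vectorizer = CountVectorizer()
vectorizer.fit(["the cat sat on the mat"])        # training text fixes the vocabulary

test_vec = vectorizer.transform(["the dog sat"])  # "dog" never appeared in training
print(vectorizer.get_feature_names_out())         # ['cat' 'mat' 'on' 'sat' 'the']
print(test_vec.toarray())                         # "dog" contributes nothing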
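
For question 10, the standard definition is \( \sigma(z) = \frac{1}{1 + e^{-z}} \). It maps any real-valued score \( z \) (such as \( w \cdot x + b \)) into the interval (0, 1) and is smoothly differentiable, which is why logistic regression uses it to turn a linear score into something interpretable as a probability.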
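
For question 11, a worked version of the arithmetic uses the binary cross-entropy loss \( L_{CE}(\hat{y}, y) = -[\, y \log \hat{y} + (1 - y) \log (1 - \hat{y}) \,] \) with \( \hat{y} = \sigma(w \cdot x + b) = 0.73 \) and \( y = 0 \), assuming natural logarithms (the log base is an assumption; a different convention would rescale the result): \( L_{CE} = -[\, 0 \cdot \log(0.73) + 1 \cdot \log(1 - 0.73) \,] = -\log(0.27) \approx 1.31 \).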