🚀 Crack FAANG Interviews: Master Logistic Regression with This 3-Level Quiz Challenge (Score 90%+ to Prove You're Ready!)
Logistic Regression Quiz - Intermediate Level
Q1. Which metric is generally most appropriate for evaluating a logistic regression model on an imbalanced dataset?
A) Accuracy
B) Precision
C) Recall
D) F1-Score
Q2. What is the purpose of the ROC curve in logistic regression?
A) To evaluate regression loss
B) To assess correlation
C) To show trade-off between TPR and FPR
D) To rank features
Q3. What does the area under the ROC curve (AUC) represent?
A) Probability of true negative classification
B) Mean squared error of predictions
C) Ability to rank a randomly chosen positive instance higher than a negative one
D) Distribution of classes
Q4. What does L1 regularization do to logistic regression coefficients?
A) Adds Gaussian noise
B) Shrinks them to zero exactly, encouraging sparsity
C) Forces them to equal values
D) Converts them to binary
Q5. What is one advantage of L2 regularization over L1 in logistic regression?
A) It produces a sparse model
B) It performs automatic binning of variables
C) It encourages smaller, but non-zero coefficients, improving generalization
D) It ignores outliers better
Q6. Which scikit-learn parameter enables multi-class classification in LogisticRegression()?
A) binary=True
B) penalty='multi'
C) multi_class='ovr' or 'multinomial'
D) type='categorical'
Q7. How does logistic regression handle categorical features natively?
A) It converts them internally
B) It requires manual conversion like one-hot encoding
C) It only accepts numeric data
D) It maps strings to binary
Q8. What problem arises if the features are highly correlated in logistic regression?
A) Model becomes more accurate
B) Multicollinearity distorts coefficient estimates
C) Loss function does not converge
D) Sigmoid fails to activate
Q9. In scikit-learn's LogisticRegression, what does increasing the parameter C do?
A) Increases regularization strength
B) Decreases regularization strength
C) Increases model complexity penalty
D) Converts logistic regression to linear
Q10. If your logistic regression model is overfitting, what is one thing you should try first?
A) Use polynomial features
B) Decrease regularization
C) Add more features
D) Increase regularization
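Before the explanations, here is a small, dependency-free sketch that makes Q1-Q3 concrete: precision, recall, and F1 computed from counts, and AUC computed directly from its definition (the probability that a randomly chosen positive is scored above a randomly chosen negative). The labels and scores are made up purely for illustration.

```python
def precision_recall_f1(y_true, y_pred):
    # Count true positives, false positives, false negatives.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

def auc(y_true, scores):
    # AUC = fraction of (positive, negative) pairs where the positive
    # instance receives the higher score (ties count as half).
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

y_true = [1, 0, 1, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1]
scores = [0.9, 0.2, 0.4, 0.8, 0.1, 0.6]
print(precision_recall_f1(y_true, y_pred))  # precision, recall, F1
print(auc(y_true, scores))
```

On an imbalanced dataset, accuracy can look high while the pairwise ranking quality measured by AUC, or the precision/recall trade-off summarized by F1, tells a very different story.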
Detailed Explanations
- F1-score balances precision and recall and is better than accuracy on imbalanced data.
- ROC curve helps visualize classifier performance across thresholds.
- AUC measures how well the classifier ranks positive instances above negatives.
- L1 encourages sparsity, effectively removing unimportant features.
- L2 reduces model complexity without zeroing out coefficients completely.
- scikit-learn uses multi_class='ovr' or multi_class='multinomial' for multi-class classification.
- Manual encoding (e.g., one-hot) is needed before applying logistic regression.
- Multicollinearity inflates standard errors of coefficients, making them unstable.
- In scikit-learn, C is the inverse of regularization strength: a smaller C means stronger regularization.
- Increasing regularization helps simplify the model and reduce overfitting.