Confusion Matrix Terminology

[Interactive crossword grid with cells numbered 1–10]
Across
  2. Measures the proportion of true positive predictions among all positive predictions made by the model.
  6. Measures the proportion of actual negatives that are correctly identified as such by the model.
  8. Instances correctly classified as positive by the model.
  9. Instances incorrectly classified as negative when they are actually positive.
  10. Instances correctly identified as negative by the model.
Down
  1. Also known as recall, it measures the proportion of actual positives that are correctly identified by the model.
  3. A prediction mistake where the model wrongly identifies a negative instance as positive.
  4. The proportion of misclassified instances in the dataset.
  5. The ratio of correctly predicted instances to the total instances in the dataset.
  7. The harmonic mean of precision and recall, balancing the two metrics.
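If you want to check your answers numerically, the minimal sketch below computes each metric described in the clues from the four cells of a binary confusion matrix. The variable names `tp`, `fp`, `fn`, `tn` and the example counts are illustrative assumptions, not part of the puzzle.

```python
# Hypothetical counts from a binary classifier's confusion matrix (assumed values).
tp, fp, fn, tn = 40, 10, 5, 45

precision = tp / (tp + fp)                  # true positives among all positive predictions
recall = tp / (tp + fn)                     # a.k.a. sensitivity: actual positives correctly found
specificity = tn / (tn + fp)                # actual negatives correctly identified
accuracy = (tp + tn) / (tp + fp + fn + tn)  # correct predictions over all instances
error_rate = 1 - accuracy                   # misclassified instances over all instances
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall

print(f"precision={precision:.2f}  recall={recall:.2f}  specificity={specificity:.2f}")
print(f"accuracy={accuracy:.2f}  error_rate={error_rate:.2f}  f1={f1:.2f}")
```

With the assumed counts above, this prints precision 0.80, recall 0.89, specificity 0.82, accuracy 0.85, error rate 0.15, and F1 0.84.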