ML Classification Confusion Matrix and Kappa
Cohen's Kappa: What it is, when to use it, and how to avoid its pitfalls | by Rosaria Silipo | Towards Data Science
Below is the Confusion Matrix and Statistics | Chegg.com
Metrics to evaluate classification models with R code: Confusion Matrix, Sensitivity, Specificity, Cohen's Kappa, McNemar's Test - Data Science Vidhya
regression - How to calculate information included in R's confusion matrix - Cross Validated
Kappa for Predictive Model - Cross Validated
GitHub - habernal/confusion-matrix: Minimalistic Java implementation of a confusion matrix for evaluating learning algorithms, including accuracy, macro F-measure, Cohen's Kappa, and probabilistic confusion matrix
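The resources above all revolve around the same few quantities: a binary confusion matrix (TP/FP/FN/TN), the sensitivity and specificity derived from it, and Cohen's kappa as a chance-corrected agreement score. As a minimal illustration of how those numbers relate, here is a plain-Python sketch; the function and variable names (`confusion_counts`, `cohens_kappa`, `positive`) are illustrative choices, not taken from any of the linked implementations.

```python
from collections import Counter


def confusion_counts(y_true, y_pred, positive=1):
    """Tally TP, FP, FN, TN for a binary problem (hypothetical helper)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    return tp, fp, fn, tn


def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), works for any label set."""
    n = len(y_true)
    # p_o: observed agreement (plain accuracy)
    p_o = sum(t == p for t, p in zip(y_true, y_pred)) / n
    # p_e: agreement expected by chance, from the marginal label frequencies
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    labels = set(y_true) | set(y_pred)
    p_e = sum(true_counts[lab] * pred_counts[lab] for lab in labels) / (n * n)
    return (p_o - p_e) / (1 - p_e)


if __name__ == "__main__":
    y_true = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]
    tp, fp, fn, tn = confusion_counts(y_true, y_pred)
    sensitivity = tp / (tp + fn)   # true positive rate (recall)
    specificity = tn / (tn + fp)   # true negative rate
    print("TP FP FN TN:", tp, fp, fn, tn)
    print("sensitivity:", sensitivity)
    print("specificity:", specificity)
    print("kappa:", cohens_kappa(y_true, y_pred))
```

Note the key point Silipo's article stresses: kappa discounts the agreement you would get by guessing from the label marginals alone, so on imbalanced data it can be low even when raw accuracy looks high.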