The mismatch is caused by the discretization of the values in pred_array prior to calling roc_auc_score. ROC-AUC is meant to be evaluated on class probabilities (or scores), not on discretized labels. The following patch fixes the problem:
> cat matbench-scores-bug.patch
--- data_ops.py 2022-08-29 09:51:34.565746826 +0200
+++ data_ops.py.new 2022-09-08 08:52:52.994181877 +0200
@@ -108,18 +108,22 @@
     for metric in metrics:
         mfunc = METRIC_MAP[metric]
+        true_array_ = true_array
+        pred_array_ = pred_array
+
         if metric == "rocauc":
             # Both arrays must be in probability form
             # if pred. array is given in probabilities
             if isinstance(pred_array[0], float):
-                true_array = homogenize_clf_array(true_array, to_probs=True)
+                true_array_ = homogenize_clf_array(true_array, to_probs=True)
         # Other clf metrics always be converted to labels
         elif metric in CLF_METRICS:
             if isinstance(pred_array[0], float):
-                pred_array = homogenize_clf_array(pred_array, to_labels=True)
-        computed[metric] = mfunc(true_array, pred_array)
+                pred_array_ = homogenize_clf_array(pred_array, to_labels=True)
+
+        computed[metric] = mfunc(true_array_, pred_array_)
     return computed
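For context, here is a minimal standalone sketch of why the discretization matters (plain scikit-learn with made-up arrays, not matbench code): thresholding probabilities to hard 0/1 labels before calling roc_auc_score yields a different AUC than scoring the probabilities themselves.

# Standalone illustration of the underlying issue (not matbench code).
from sklearn.metrics import roc_auc_score

y_true = [0, 0, 1, 1, 1, 0]
y_prob = [0.1, 0.4, 0.35, 0.8, 0.7, 0.2]

# AUC on the raw probabilities (the intended behavior):
print(roc_auc_score(y_true, y_prob))    # 0.888...

# AUC after discretizing to hard labels (the behavior the patch removes):
y_label = [int(p >= 0.5) for p in y_prob]
print(roc_auc_score(y_true, y_label))   # 0.833... -- hence the score mismatch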
I am not sure if this discretization behavior is intended in the first place.
Matbench version: 70c79fb