It would be great to be able to calculate the confusion matrix from the predictions/ground truth of a given estimator. Computing the confusion matrix from prediction/ground-truth arrays of ~3 million elements can take at least 2.5 seconds on the CPU with sklearn.

Following the mention in the text of #242 and #608, filing an issue to make tracking explicit.
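For reference, a minimal NumPy sketch of the same 2-class computation on the CPU (not code from this thread; `numpy_conf_mat` is a hypothetical name). It uses the same layout as sklearn's `confusion_matrix`: row `i` is the true class, column `j` is the predicted class.

```python
import numpy as np

def numpy_conf_mat(y, y_pred):
    # Hypothetical CPU baseline, assuming labels are exactly 0 or 1.
    # Encode each (true, pred) pair as a single integer 0..3, then count
    # occurrences and reshape into the 2x2 matrix [[tn, fp], [fn, tp]].
    idx = 2 * y.astype(np.intp) + y_pred.astype(np.intp)
    return np.bincount(idx, minlength=4).reshape(2, 2)

y      = np.array([0, 0, 1, 1, 1])
y_pred = np.array([0, 1, 1, 1, 0])
print(numpy_conf_mat(y, y_pred))
# [[1 1]
#  [1 2]]
```

The single `bincount` pass avoids building boolean masks per cell, but it still runs on the host; the GPU version below is where the large speedup comes from.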
Related to #242, the following would be a basic 2-class confusion matrix with CuPy:
```python
import cupy as cp

def cupy_conf_mat(y, y_pred):
    """Simple, fast confusion matrix for two-class models, designed to match
    sklearn. Assumes the classes are one of [0, 1]. It will fail on edge
    cases, which are fairly numerous. Could be expanded to multi-class by
    following similar logic to the precision implementation, but there is no
    need for now. This is about 300x faster than sklearn for 3M float64
    predictions.
    """
    nclasses = len(cp.unique(y))
    assert nclasses == 2
    res = cp.zeros((2, 2))
    pos_pred_ix = cp.where(y_pred == 1)
    neg_pred_ix = cp.where(y_pred != 1)
    tn_sum = (y[neg_pred_ix] == 0).sum()
    fn_sum = (y[neg_pred_ix] == 1).sum()
    tp_sum = (y[pos_pred_ix] == 1).sum()
    fp_sum = (y[pos_pred_ix] == 0).sum()
    res[0, 0] = tn_sum
    res[1, 0] = fn_sum
    res[0, 1] = fp_sum
    res[1, 1] = tp_sum
    return res
```