Obtaining Confidence Intervals for AUC Plots

Hi all,

I've recently been using the 'iai' package (Interpretable AI) for interpretable machine learning.

However, after working through their user guides to plot a ROC curve, I wondered whether it is possible to construct a bootstrapped confidence interval around the AUC.

I've since tried the standard commands in the pROC package, such as ci(roc) and ci.auc(roc), but with no success. I suspect these expect a roc object created by pROC::roc() rather than an iai learner.

So all I have is my AUC score (a single value), computed as follows, where the object 'grid' is my decision tree learner:

iai::score(grid, train_X, train_y, criterion = "auc")
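
One idea I've been toying with is to bootstrap manually: resample the test rows with replacement and recompute the score each time. Below is a minimal sketch, assuming iai::score() accepts any data frame and label vector with the same structure as the originals (n_boot and the seed are arbitrary choices of mine, not from the iai docs):

# Hypothetical bootstrap sketch: resample the test set with
# replacement and recompute the AUC on each resample
set.seed(123)                     # arbitrary seed for reproducibility
n_boot <- 1000                    # number of bootstrap resamples
boot_aucs <- replicate(n_boot, {
  idx <- sample(nrow(test_X), replace = TRUE)
  iai::score(grid, test_X[idx, ], test_y[idx], criterion = "auc")
})
quantile(boot_aucs, c(0.025, 0.975))  # percentile 95% CI

That would give a simple percentile interval, though whether resampling the test set like this is statistically appropriate for my setting is part of what I'm unsure about.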

Further, the ROC curve is plotted using the following call:

iai::roc_curve(grid, test_X, test_y, positive_label = 1)
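
Another thought: if I can get the raw class probabilities out of the learner (the iai package appears to provide predict_proba(), though I'm treating that as an assumption), I could hand them to pROC and use its built-in bootstrap CI. Something like:

library(pROC)
# Assumption: iai::predict_proba() returns one probability column per
# class label, and the "1" column matches positive_label = 1 above
probs <- iai::predict_proba(grid, test_X)
roc_obj <- roc(response = test_y, predictor = probs[["1"]])
ci.auc(roc_obj, method = "bootstrap", boot.n = 2000)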

Does either of these approaches seem sensible, or can anyone think of a better way to construct a confidence interval for the AUC with these limited arguments?

Any help would be appreciated.

https://docs.interpretable.ai/dev/IAI-R/quickstart/ot_classification/
