Help Interpreting Model Local Explanations for Classification

In the book TMWR, in section 18.2 (Local Explanations), it states (2nd paragraph) -

There are multiple possible approaches to understanding why a model predicts a given price for this duplex. One is a break-down explanation, implemented with the DALEX function predict_parts(); it computes how contributions attributed to individual features change the mean model's prediction for a particular observation...

How does this explanation change for a classification problem? Can I say -

it computes how contributions attributed to individual features change the model's predicted probability for a particular observation...
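Broadly, yes: for a classifier, the break-down starts from the mean predicted probability over the data and attributes the difference between that mean and the observation's predicted probability to individual features. `predict_parts()` is an R function from the DALEX package, but the underlying idea is language-neutral; below is a minimal, hand-rolled Python sketch (the toy dataset and logistic model are entirely hypothetical, not from TMWR) showing how sequentially fixing an observation's feature values moves the average probability toward that observation's own prediction:

```python
import math
import statistics

# Hypothetical toy data: rows of (x1, x2)
data = [(1.0, 0.0), (2.0, 1.0), (3.0, 0.0), (4.0, 1.0)]

def predict_proba(x1, x2):
    # Hypothetical logistic classifier returning P(class = 1)
    z = -3.0 + 0.8 * x1 + 1.5 * x2
    return 1 / (1 + math.exp(-z))

obs = (3.0, 1.0)  # the observation to explain

# The break-down "intercept": mean predicted probability over the data
mean_pred = statistics.mean(predict_proba(x1, x2) for x1, x2 in data)

# Fix x1 at the observation's value, average over the data's x2 values;
# the change in the mean probability is x1's contribution
pred_x1_fixed = statistics.mean(predict_proba(obs[0], x2) for _, x2 in data)
contrib_x1 = pred_x1_fixed - mean_pred

# Fixing the remaining feature gives the observation's own prediction;
# the final change is x2's contribution
pred_both = predict_proba(*obs)
contrib_x2 = pred_both - pred_x1_fixed

# Contributions telescope from the mean probability to the
# observation's predicted probability
total = mean_pred + contrib_x1 + contrib_x2
print(mean_pred, contrib_x1, contrib_x2, pred_both)
```

Note that, as in the regression case, the contributions depend on the order in which features are fixed; DALEX's break-down handles this ordering for you.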
