After some R package updates, the predicted probabilities from my xgboost model are all very close to 0.5.
I don't know why.
My 15 predictors are numeric, and my outcome is a factor with levels 0 and 1.
My results look like this:
.pred_0    .pred_1    .pred_class
0.5000001  0.4999999  0
0.5000002  0.4999998  0
0.4999996  0.5000004  1
...
The model was selected for best accuracy via hyperparameter tuning:

xgb_spec1 <- boost_tree(
  mode = "classification",
  mtry = tune(),
  trees = 1000,
  min_n = tune(),
  tree_depth = tune(),
  learn_rate = tune()
) %>%
  set_engine("xgboost")
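For context, my tuning setup looked roughly like this (a sketch, not my exact code: the data object `train_data`, the outcome name, the number of folds, and the grid size below are placeholders):

```r
library(tidymodels)

# Placeholder data/formula; in my real code these differ.
set.seed(123)
folds <- vfold_cv(train_data, v = 5)

xgb_wf <- workflow() %>%
  add_formula(outcome ~ .) %>%
  add_model(xgb_spec1)

# Tune over a space-filling grid, optimizing accuracy
xgb_res <- tune_grid(
  xgb_wf,
  resamples = folds,
  grid = 20,
  metrics = metric_set(accuracy)
)

# Keep the best combination and finalize the workflow
best_params <- select_best(xgb_res, metric = "accuracy")
final_wf <- finalize_workflow(xgb_wf, best_params)
```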
Is there a parameter I missed that I now need to specify?
Thank you in advance for your help!