xgboost scores with tidymodels

After some recent R package updates, the predicted probabilities from my xgboost model are all very close to 0.5, and I don't know why.

My 15 predictors are numeric, and my outcome is a factor with levels 0 and 1.
My results look like this:

.pred_0      .pred_1      .pred_class
0.5000001    0.4999999    0
0.5000002    0.4999998    0
0.4999996    0.5000004    1
...

The model was selected for best accuracy during hyperparameter tuning:

xgb_spec1 <- boost_tree(mode = "classification", 
                        mtry = tune(), 
                        trees = 1000, 
                        min_n = tune(),
                        tree_depth = tune(),
                        learn_rate = tune()) %>% set_engine("xgboost")
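For context, here is a minimal sketch of how a spec like this is typically tuned and then finalized before predicting. The data, formula, and object names (`train_data`, `test_data`, `outcome`, the fold count, and grid size) are illustrative placeholders, not taken from the original post:

```r
library(tidymodels)

# Bundle the tunable spec with a model formula (placeholder names)
xgb_wf <- workflow() %>%
  add_model(xgb_spec1) %>%
  add_formula(outcome ~ .)

set.seed(123)
folds <- vfold_cv(train_data, v = 5)

# Tune over a space-filling grid, tracking accuracy
xgb_res <- tune_grid(
  xgb_wf,
  resamples = folds,
  grid      = 20,
  metrics   = metric_set(accuracy)
)

# Plug the best hyperparameters back in, then fit on the full training set
best_params <- select_best(xgb_res, metric = "accuracy")
final_fit <- xgb_wf %>%
  finalize_workflow(best_params) %>%
  fit(data = train_data)

predict(final_fit, new_data = test_data, type = "prob")
```

If the spec was tuned but never finalized with `finalize_workflow()` before fitting, predictions can come from a model that was not actually trained with the selected hyperparameters, so it is worth double-checking that step.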

Is there a parameter to specify that I missed?
Thank you in advance for your help!

It could very well be that you have no informative predictors.
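One quick way to test that hypothesis is to compare the xgboost model against a no-information baseline: if it can't beat a null model that ignores every predictor and just predicts the base rate, the predictors carry little usable signal. A sketch, again with placeholder data/column names:

```r
library(tidymodels)

# A null model predicts the majority class / base rate, ignoring all predictors
null_fit <- null_model(mode = "classification") %>%
  set_engine("parsnip") %>%
  fit(outcome ~ ., data = train_data)

null_preds <- predict(null_fit, test_data) %>%
  bind_cols(test_data["outcome"])

# Compare this accuracy with the tuned xgboost model's accuracy
accuracy(null_preds, truth = outcome, estimate = .pred_class)
```

If the tuned model's accuracy is no better than this baseline, probabilities hovering around 0.5 are exactly what you'd expect.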

You might want to post this in the modeling and machine learning group to get a faster response.

Can you please provide a minimal reprex (reproducible example)? The goal of a reprex is to make it as easy as possible for me to recreate your problem so that I can fix it: please help me help you!

If you've never heard of a reprex before, start by reading "What is a reprex", and follow the advice further down that page.

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.

If you have a query related to it or one of the replies, start a new topic and refer back with a link.