GPU acceleration with tidymodels

I am working in a Kaggle notebook and was wondering whether GPU acceleration can be used with any of the tidymodels engines, since it can cut training times dramatically (up to 100x compared to CPU). I know it is possible for the 'xgboost' package (link), but I can't get it to work through the tidymodels interface if, for instance, I try:

library(tidymodels)

xg_model <-
  boost_tree(
    # trees = 1000,
    # tree_depth = tune(), min_n = tune(),
    # loss_reduction = tune(),             ## first three: model complexity
    # sample_size = tune(), mtry = tune(), ## randomness
    # learn_rate = tune(),                 ## step size
  ) %>%
  set_engine("xgboost", tree_method = "gpu_hist") %>%
  set_mode("classification")
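For what it's worth, engine-specific arguments passed to `set_engine()` are forwarded to `xgboost::xgb.train()`, so the pattern above should be the right one. One thing to check: in xgboost 2.0 and later, `tree_method = "gpu_hist"` is deprecated in favour of `device = "cuda"` combined with `tree_method = "hist"`. A minimal sketch of that variant, assuming a CUDA-enabled xgboost build is installed in the session:

```r
library(parsnip)

# Engine arguments given to set_engine() are passed through to
# xgboost::xgb.train(). With xgboost >= 2.0 the GPU is selected via
# device = "cuda" plus tree_method = "hist" (assumption: the Kaggle
# image ships a CUDA-enabled xgboost build).
xg_model <-
  boost_tree(trees = 1000) %>%
  set_engine("xgboost", tree_method = "hist", device = "cuda") %>%
  set_mode("classification")

xg_model
```

Printing the spec shows the engine arguments, so you can confirm they were registered before fitting.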
