Clearing memory mid-render of HTML output in an Rmd file

I have an Rmd file which I'm rendering with `rmarkdown::render_site()` to generate HTML.

The file has 3 sections: a, b, and c. I am able to run sections a through b with no problem, and on its own I can run section c. However, when I try to render the entire document (a, b, and c) I receive an out-of-memory error.

So, after completing section b I added the line `rm(list = ls())`. However, this doesn't solve the issue: I still hit an out-of-memory error while attempting to render the document.
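For reference, the cleanup chunk at the end of section b is roughly this (a minimal sketch, chunk name made up; the explicit `gc()` call is an addition here just to illustrate forcing garbage collection):

```{r cleanup_after_b, echo=F}
# drop everything created in sections a and b
rm(list = ls())

# explicitly trigger garbage collection so freed memory is reclaimed sooner
gc()
```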

(Preferred) Is there a prescribed solution to this issue?

(Plan B) Is there some kind of workaround where I could compile 2 separate Rmd files and then somehow merge them into the same HTML?
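By "merge" I mean something along the lines of knitr child documents, i.e. a thin parent Rmd that knits the two files into one HTML page (a sketch; the file names are placeholders):

```{r child_sections, child = c("sections_a_b.Rmd", "section_c.Rmd")}
```

(Though I suspect the children would still share one R session, so the memory problem might just follow me there.)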

I would try to find the cause of the "out of memory" error to understand what is going on.
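For example, right before the chunk that fails you could print which objects are holding the memory, something like this (a rough sketch using base R only):

```r
# sizes of all objects in the global environment, in MB, largest first
obj_sizes <- sapply(ls(envir = globalenv()),
                    function(x) as.numeric(object.size(get(x, envir = globalenv()))))
sort(obj_sizes, decreasing = TRUE) / 1024^2

# current and peak memory use reported by the garbage collector
gc()
```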

> (Plan B) Is there some kind of workaround where I could compile 2 separate Rmd files and then somehow merge them into the same HTML?

If you only have section c inside the document, does it work?

Hi.

It's the code block below. Note that I ran this very code block further up in my script, and it runs fine there. I should have made it a function, but I was in a rush after being held up so much fighting with R's RAM limitations.

What I'm doing here is predicting on my training data so I can make some visualizations with the predictions later on. To do that, I had to transform the training data into the same format as the data passed to `train()` earlier:

```{r model_predictions, echo=F, message=F, warning=F}

# (Me desperately trying to get things to run with samples of data) Transform original data to right format to predict
# model_img_data <- churn_data %>% sample_n(500000)
# model_img_data <- churn_data %>% sample_n(10000)

# clear memory for remaining steps
rm(list = setdiff(ls(), c("model_img_data", "quadratic_model")))

## make month just a categorical not ordered factor
model_img_data <- model_img_data %>% mutate(month = as.character(month)) %>% 
  mutate_at(vars(matches("v_")), funs(as.numeric)) %>% 
  mutate(auto_renewal_flag = as.numeric(auto_renewal_flag))

## make dummy vars
dummy <- caret::dummyVars(~ month + product_pnl_line_name + shopper_region_1_name + hosting_provider, 
                          data = model_img_data, fullRank = F, sep = ".")

dummy_df <- predict(dummy, model_img_data) %>% as.data.frame()

model_img_data <- c(model_img_data, dummy_df) %>% data.frame() %>% 
  mutate(count_domains_x_hosting_provider = v_count_domains * hosting_provider.Top.Ten.Hosting.Competitor.on.Venture) %>% 
  mutate(quadratic_term_tenure = tenure_months^2)

# predict onto original data for visualizing probability
preds <- predict(quadratic_model, 
                 newdata = model_img_data,
                 type = "prob")
modeling_data <- model_img_data %>% mutate(Predicted_Probability_Churn = preds$X1)
rm(list = setdiff(ls(), "modeling_data"))
```

So you're able to fit the model but not predict?

Yes, that is correct.

I think that pasting my code block above might be a diversion from the issue. On its own, section c (which includes the block above) is not particularly big or challenging for R to work with. It's only when I try to run all the sections together that I hit the dreaded cannot-allocate-memory error, and that is even after clearing memory with `rm()`.
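One thing I can still try is printing the garbage collector's report right after each cleanup, so the knitted output shows how much memory is actually still held at that point (a sketch; chunk name made up):

```{r memory_check, echo=F}
# keep only what the remaining sections need
rm(list = setdiff(ls(), c("model_img_data", "quadratic_model")))

# gc() both forces a collection and reports current / maximum memory used
gc()
```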
