Measuring the amount of memory that code uses? (profvis or any other package)


I am currently running some ML models on an RStudio Server with 64 GB of RAM.

My ML models run relatively quickly, in about the time one would normally expect given their sparse matrix size.

The methods I have been using are logistic regression and XGBoost.

However, I now want to "profile" the actual model-fitting stage and see the memory being used. I have tried profvis, but it does not seem to work on my matrix of 760 variables by 228,000 rows on the RStudio Server: it never loads the profvis viewer and uses up all 64 GB of RAM!
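One lighter-weight option than a full profvis trace is to measure peak memory growth around the fitting call with base R's `gc()`, which tracks a "max used" high-water mark. A minimal sketch, assuming a small `glm()` fit on `iris` as a stand-in for the real logistic regression or XGBoost call:

```r
# Sketch: peak memory around a model fit, using base R gc() counters.
# glm() on iris is a placeholder; substitute your own fitting call.
gc(reset = TRUE)   # reset the "max used" high-water marks

fit <- glm(I(Species == "setosa") ~ Sepal.Length + Sepal.Width,
           data = iris, family = binomial)

mem <- gc()                 # matrix of memory statistics
peak_mb <- sum(mem[, 6])    # column 6: "max used" in Mb (Ncells + Vcells)
peak_mb
```

This only reports R's own heap usage since the last reset, so it misses memory allocated outside R (e.g. inside XGBoost's C++ code), but it costs almost nothing to run.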

Is there any way around this (aside from shrinking the data)?

That is, are there other packages besides profvis that allow you to profile code and see how much memory is being used at any given moment?
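For allocation-level detail without the profvis viewer, base R's `utils::Rprofmem()` writes every allocation above a size threshold to a plain text file, which you can inspect afterwards. A hedged sketch (it requires an R build with memory profiling enabled, which is the default for CRAN binaries; the `glm()` call again stands in for the real model fit):

```r
# Sketch: log allocations during a fit with utils::Rprofmem().
# Output is a plain text file, one line per allocation, with the call stack.
tmp <- tempfile()
Rprofmem(tmp, threshold = 1024)   # record allocations of 1 KB or more

fit <- glm(I(Species == "setosa") ~ Sepal.Length + Sepal.Width,
           data = iris, family = binomial)

Rprofmem(NULL)                    # stop logging
log_lines <- readLines(tmp)
head(log_lines)
```

The `profmem` package wraps this same mechanism in a friendlier interface, and since the log is just a file rather than an interactive viewer, it avoids the cost of rendering a huge profile in the browser.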


This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.
