Memory Usage Issues

Hello all,

I'm having issues importing CSV datasets because of a memory limit. I'm using RStudio Desktop, and when I open the application, without running anything, the memory usage report already reads 77% full. I attached an image to this post showing the memory usage report right after opening the application. How can I free up some space?

[Screenshot: RStudio memory usage report showing 77% in use]

Thanks!

We need more context. What OS are you on? How much RAM does your system have? How big is the CSV file you want to read in? What format is it stored in, exactly? How much memory would you like to free? Is this usage only after reading in the CSV and nothing else?

I run Windows 10 Pro, a 64-bit operating system. I have 4 GB of RAM, of which 3.73 GB is usable. I want to load 12 CSV files; they are all Excel documents saved as CSV files, each around 100-150 MB.

The memory usage report I posted is from right after I open RStudio Desktop, without running anything. The memory used by the system is very high, and I would like to reduce it so that I can read in the CSV files.

You might run Task Manager in Windows to see what other programs are taking up so much memory and stop the ones you don't need.

The other thing that might help is to delete objects in R as soon as you don't need them.
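For example, a minimal sketch (the object name `big_df` is just a placeholder for whatever large object you have in your workspace):

```r
# Some large object you have finished working with
big_df <- data.frame(x = rnorm(1e6))

# ... work with big_df ...

rm(big_df)  # remove the object from the workspace
gc()        # trigger garbage collection so R can release the freed memory,
            # and print a summary of current memory use
```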

As far as I am aware, that is below the recommended RAM for Windows 10. You'd be better off setting up another OS that requires fewer resources to run. As far as possible, try to close or end non-essential system processes.

The other option would be to look at something like https://rstudio.cloud/ so you can run your analysis in the cloud and perform the merging or cleaning you require.
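If you stay on your current machine, one way to keep the footprint down is to read the files one at a time and keep only the columns you actually need. Here is a rough sketch using the data.table package (not mentioned above, but a common low-memory option); the `data/` folder and the column names `id` and `value` are made-up placeholders, and it assumes all 12 files share the same column layout:

```r
library(data.table)

# List the CSV files in the (assumed) "data" folder
files <- list.files("data", pattern = "\\.csv$", full.names = TRUE)

# Read one file at a time, keeping only the columns you need
read_one <- function(f) {
  fread(f, select = c("id", "value"))
}

# Stack the individual tables into a single combined table
combined <- rbindlist(lapply(files, read_one))

# Release any memory left over from intermediate objects
gc()
```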
