Lately RStudio is really slow. I installed the latest version of RStudio just yesterday. My R version is 4.2.2 (and please don't suggest updating to R 4.3 — I don't have the time right now to reinstall all the packages in my library).
I am working with large scRNA-seq datasets (objects ranging from 20,000 to 150,000 cells), and I do multiple forms of analysis, so I can't really afford to keep only a few objects in the workspace at any given time. Right now the workspace has around 300 objects in total (I have already saved the big Seurat objects as RDS files, so no, those are not in the workspace). The computer has 64 GB of memory, so I expect it should have no problem. However, RStudio is slow even when making a small 10x2 data frame, the "?" help lookup for functions doesn't work, etc. I am tired of restarting R and RStudio, and frankly I cannot spend a third of my working hours doing that. 64 GB of RAM should be more than plenty to work normally. Am I missing something?
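One way to check whether those ~300 workspace objects are actually small is to measure them directly. This is a minimal sketch, assuming the objects live in the global environment; the object name `big_obj` is hypothetical, just to illustrate removing a large object:

```r
# Size (in bytes) of every object in the global environment,
# ten largest first — quick check for anything unexpectedly big
obj_sizes <- sapply(ls(envir = .GlobalEnv),
                    function(x) object.size(get(x, envir = .GlobalEnv)))
head(sort(obj_sizes, decreasing = TRUE), 10)

# Human-readable size of a single object (name is hypothetical)
# format(object.size(big_obj), units = "GB")

# After removing something large, run gc() so R returns memory to the OS
# rm(big_obj); gc()
```

Even with 64 GB of RAM, a handful of large intermediate objects (or the environments captured by saved models and functions) can push R into swapping, which would match the symptoms described.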
I don't know if this applies to you... do you have VS Code open?
In my case — and I don't know why it happens — if both are running, the system gets very sluggish.
Are you on a network drive? Have you got libraries loading from the network drive?
No, I don't have VS Code open. I don't know if it is so imperative to update to R 4.3.
No, I am using my PC's local drive, no network. I don't know if it is so imperative to update to R 4.3.
The only thing I can think of, then, is that your swappiness setting is too high.
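For reference, on Linux you can check the current swappiness value without leaving R (this is a sketch assuming a Linux system; the `/proc` path does not exist on Windows or macOS):

```r
# Linux only: read the kernel's swappiness setting (default is often 60).
# Higher values make the kernel swap memory to disk more aggressively,
# which can make a large R session feel sluggish even with free RAM.
as.integer(readLines("/proc/sys/vm/swappiness"))

# Lowering it (e.g. to 10) is done outside R and needs root:
#   sudo sysctl vm.swappiness=10
```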