I am trying to read a 5.0 GB file (uploaded to the cloud, in my home directory) using RStudio Server. My data frame has 450,000 rows × 920 columns. Even on a dedicated Google Cloud server (Linux, 625 GB RAM, 96 vCPUs), I seem to be hitting a memory or CPU bottleneck, because processing takes a long time. Running the same script on my personal computer (more than 10 times less powerful) seems to take about the same amount of time.
Apparently I'm using less than 5% of the Google Cloud machine's processing capacity.
How can I make RStudio Server use more CPU and memory to improve performance (reduce processing time)?
library(mice)  # provides md.pattern()
library(VIM)   # provides aggr() and matrixplot()

Data1 <- read.table("ArrayFiltrado3.txt", header = TRUE, sep = "\t", stringsAsFactors = FALSE)
mean(!complete.cases(Data1))
md.pattern(Data1)
aggr(Data1, prop = FALSE, numbers = TRUE)
matrixplot(Data1)
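For context on why more hardware doesn't help here: base R's read.table() runs on a single core, so a 96-core machine reads the file no faster than a laptop. A common alternative (a sketch, assuming your file is tab-separated as in the read.table() call above) is data.table::fread, which is multi-threaded and typically much faster on files of this size:

library(data.table)  # install.packages("data.table") if needed

# fread auto-detects the separator and header, and uses
# multiple threads (getDTthreads() shows how many).
Data1 <- fread("ArrayFiltrado3.txt", sep = "\t", header = TRUE,
               stringsAsFactors = FALSE, data.table = FALSE)

Setting data.table = FALSE returns a plain data.frame, so the rest of the script (complete.cases, md.pattern, etc.) should work unchanged. Note that the downstream analysis and plotting functions are still single-threaded, so only the file-reading step will speed up.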
Any help would be appreciated.