The previous R session was abnormally terminated due to an unexpected crash. You may have lost workspace data as a result of this crash.

Hello. I'm a student who has just started learning RStudio.
However, I have a problem.
While solving my homework in RStudio Cloud,
I keep getting this error message:

'The previous R session was abnormally terminated due to an unexpected crash. You may have lost workspace data as a result of this crash.'

and along with this message, all of the variables saved in my environment are deleted.

The code I run is like this:

library(tidyverse)  # needed for read_delim(), %>%, and group_by()

cols = c('chrom', 'source', 'feature_type', 'start', 'end', 'score', 'strand', 'phase', 'info')
d = read_delim('gencode.v31.basic.annotation.gtf.gz', delim = '\t', skip = 5, progress = FALSE, col_names = cols)

d1 <- d %>% group_by(source)

head(d1)

table(d1$source)

d2 <- d %>% group_by(source,chrom)
d2

Because I am a beginner, my knowledge is limited, but please help me.

How big is the file you are trying to read? Keep in mind that RStudio Cloud has a 1 GB limit on RAM.

It is a 25 MB file...

I suppose that is the compressed file size. Just to be sure, what is the size of the uncompressed object in memory? You can get it this way:

cols = c('chrom', 'source', 'feature_type', 'start', 'end', 'score', 'strand', 'phase', 'info')
d = read_delim('gencode.v31.basic.annotation.gtf.gz',
               delim='\t',
               skip = 5,
               progress = F,
               col_names = cols)

format(object.size(x = d), units = "MB")

Oh, I'm sorry.

Running this code, I get the following result:
[1] "521 Mb"

OK, that confirms my suspicions. I don't completely remember the memory allocation rules in R, but I believe your code keeps three copies of the same data frame in memory, and by doing so you are surpassing the 1 GB memory limit, which causes your R session to crash.
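You can check each object's footprint separately, for example (a rough check; object.size() may overcount, since the three data frames can share memory internally):

format(object.size(d),  units = "MB")   # original data frame
format(object.size(d1), units = "MB")   # grouped by source
format(object.size(d2), units = "MB")   # grouped by source and chrom

Adding the three numbers gives an upper bound on your combined memory use; if the total is near or above 1024 MB, that would explain the crash.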


Oh, so you mean that I make d, d1, and d2, and together these take more than 1 GB. Am I right?

Yes, that is my suspicion, although I'm not completely sure since I haven't tested it and, as I said, I'm not sure I remember the memory allocation rules correctly (some commands just make a pointer while others make a copy; I think dplyr::group_by() does the latter).
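If you want to verify how much memory is actually shared, the lobstr package can help (a sketch, not something I've tested on your data):

library(lobstr)

obj_size(d)        # size of d alone
obj_size(d, d1)    # combined size, counting shared vectors only once

If obj_size(d, d1) is close to obj_size(d), then d1 is mostly a reference to d's data; if it is nearly double, group_by() made a real copy.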

Okay, I will try this later and add my result. Thank you very much!

I used filter() to choose only the data I need, which made the data smaller. After that, I don't get the error anymore. I really appreciate your help. Have a good day!
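For anyone with the same problem, my fix looked roughly like this (the feature_type == 'gene' condition is just an example; filter for whatever rows your own task needs):

library(tidyverse)

cols <- c('chrom', 'source', 'feature_type', 'start', 'end',
          'score', 'strand', 'phase', 'info')

d <- read_delim('gencode.v31.basic.annotation.gtf.gz',
                delim = '\t', skip = 5, progress = FALSE,
                col_names = cols) %>%
  filter(feature_type == 'gene')   # keep only the rows needed, so much less RAM is used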


I'm glad to be of service. If your question's been answered (even by you!), would you mind choosing a solution? It helps other people see which questions still need help, or find solutions if they have similar problems. Here’s how to do it:

Sure, I will do that!
