I'm hitting an unpredictable error that ends in a core dump, one that's very hard to turn into a minimal working example (MWE). In the meantime, I'm wondering if I can fix the error reliably by making sure I'm not storing too many things in memory:
foo = function() {
  runif(10000)
}
foo()
If I run this as a script, does the 10,000-element vector get garbage collected? Or do I have to wrap calls to foo() in a function that returns nothing in order for the memory to be freed?
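For what it's worth, the answer appears to be yes: once nothing references the result, it is eligible for collection, and no wrapper function is needed. A minimal sketch using base gc() (the sizes and the reported numbers are machine-dependent):

```r
foo = function() {
  invisible(runif(1e7))  # ~80 MB of doubles created inside the call
}

gc(reset = TRUE)  # reset the "max used" counters
foo()             # the result is never assigned to a name
usage = gc()      # triggers a full collection; the vector can now be reclaimed
usage             # current usage is back near baseline
```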
Sorry, just to confirm, I have a function that looks like this
inner_fun = function(x) {
  # writes to a file
  t = runif(1000000) * x
  return(t)
}
outer_fun = function() {
  inner_fun(1)
  inner_fun(2)
  inner_fun(3)
}
I'm getting a crash that I think might be related to running out of memory. My guess is that the allocations made by the inner_fun calls are not being garbage collected. Is this a valid hypothesis?
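A quick way to pressure-test that hypothesis (a sketch; the sizes are arbitrary) is to repeat the discarded calls many times. If the intermediate vectors were never collected, this would need roughly 800 MB; gc()'s "max used" column should show far less:

```r
inner_fun = function(x) {
  t = runif(1e6) * x  # ~8 MB per call
  t
}

outer_fun = function() {
  for (i in 1:100) inner_fun(i)  # 100 discarded results, ~800 MB if retained
  invisible(NULL)
}

gc(reset = TRUE)  # reset the "max used" counters
outer_fun()
usage = gc()      # peak usage stays well below what retention would require
```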
I would seriously doubt it ...
My reason for scepticism is that if you try to allocate a larger-than-possible vector, you get a meaningful error message, not a core dump:
inner_fun = function(x) {
  # writes to a file
  t = runif(10^10) * x
  return(t)
}
inner_fun(1)
#Error: cannot allocate vector of size 74.5 Gb
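On the error-versus-crash distinction: an R-level allocation failure is an ordinary condition, so you can even catch it with tryCatch(); a segfault in compiled code cannot be caught this way. A sketch (the requested length is deliberately beyond R's maximum vector length, so it is refused up front, and the exact message may differ across R versions):

```r
msg = tryCatch(
  numeric(2^60),  # far beyond R's maximum vector length; refused immediately
  error = function(e) conditionMessage(e)
)
msg  # an ordinary, catchable error message, not a core dump
```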
Sorry for causing an XY problem here with my less-than-complete description. I'm not used to worrying about performance or memory allocation in R code.
My actual problem is with a package called fixest, which I believe allocates memory in C++, so maybe something is fishy there. I will file an issue there if I see this problem again.
fixest has a GitHub repository, which is the best place to report errors; try to focus on making a minimal reproducible example if you can.
An example of an active GitHub issue being raised and fixed: