"Killed" error during installation for many packages

I've seen this issue mentioned in a number of other threads, but when I try to install the Bioconductor package iSEE, a lot of packages fail to install with a "Killed" message.

For instance, after several rounds of installation:

> BiocManager::install("iSEE")
Bioconductor version 3.8 (BiocManager 1.30.4), R 3.5.2 (2018-12-20)
Installing package(s) 'iSEE'
trying URL 'https://bioconductor.org/packages/3.8/bioc/src/contrib/iSEE_1.2.1.tar.gz'
Content type 'application/x-gzip' length 7591401 bytes (7.2 MB)
==================================================
downloaded 7.2 MB

* installing *source* package ‘iSEE’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded
Killed

The downloaded source packages are in
	‘/tmp/RtmpzaBUiq/downloaded_packages’

I haven't kept the output of the earlier installation rounds, but a number of other packages triggered this "Killed" message without interrupting the overall installation process. There may be some broken or half-installed packages now, but I have no clue how to identify the source of the issue from here.
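To flag any half-installed packages left behind, one option (a base-R sketch, not something from the original installation log) is to try loading every installed package and collect the failures:

```r
# Sketch: loop over the installed packages and record any whose
# namespace fails to load -- a symptom of a half-finished install
# left behind by a "Killed" error.
pkgs <- rownames(installed.packages())
loads_ok <- vapply(
  pkgs,
  function(p) requireNamespace(p, quietly = TRUE),
  logical(1)
)
broken <- pkgs[!loads_ok]
print(broken)  # character(0) means every package loads cleanly
```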

Tracking down the issue, I've cleared the library (note that the glob has to sit outside the quotes, or the shell won't expand it):

rm -rf "/home/rstudio-user/R/x86_64-pc-linux-gnu-library/3.5/"*

and relaunched the installation from scratch.

It seems that in total, 3 packages trigger the "Killed" error.

The downloaded source packages are in
	‘/tmp/Rtmp4d7qhL/downloaded_packages’
installation path not writeable, unable to update packages: class, codetools, mgcv
Warning messages:
1: In install.packages(pkgs = doing, lib = lib, repos = repos, ...) :
  installation of package ‘SummarizedExperiment’ had non-zero exit status
2: In install.packages(pkgs = doing, lib = lib, repos = repos, ...) :
  installation of package ‘SingleCellExperiment’ had non-zero exit status
3: In install.packages(pkgs = doing, lib = lib, repos = repos, ...) :
  installation of package ‘iSEE’ had non-zero exit status

In particular, the SummarizedExperiment failure is described in this thread, which didn't get a reply: Bioconductor Package Installation Fails With Writeability Error

This is in addition to the warning message:

installation path not writeable, unable to update packages: class, codetools, mgcv

However, I know how to work around that one using:

install.packages(c("class", "codetools", "mgcv"), lib=.libPaths()[1])

This suggests to me that loading the package exceeds the 1 GB memory limit that currently exists on rstudio.cloud. However, the package does end up installed, and can be loaded with library(iSEE).

What does the package iSEE do when it is loaded?

The message

installation path not writeable, unable to update packages: class, codetools, mgcv

indicates that BiocManager is trying to update R system-library packages, which are not, in fact, updatable by users. It appears to be an informative warning rather than an outright error, though.
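For what it's worth, you can check from R itself which library paths are actually writable; a minimal base-R sketch:

```r
# Sketch: list each library path and whether the current user can write
# to it. On rstudio.cloud the system library (which holds class,
# codetools and mgcv) is read-only, so updates must target the user
# library, .libPaths()[1].
paths <- .libPaths()
writable <- file.access(paths, mode = 2) == 0  # mode 2 tests write access
print(data.frame(path = paths, writable = writable))
```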

Hi Josh,

Thanks for the fast reply.
Actually, it turns out that despite the error messages during installation, all 3 packages that trigger the "Killed" error (SummarizedExperiment, SingleCellExperiment, iSEE) can be loaded and used... under some circumstances.

Regarding iSEE, it loads fine, and I have a working example here where I make up a dummy object for visualization in the app:

Now, my only issue is that my co-developers and I were hoping to use data (or at least a subset) from a real experiment, as attempted here: https://rstudio.cloud/project/164719
But that approach keeps failing, possibly because of memory issues, even though we try to take a subset of the data. Strangely enough, loading the original data set works, but the session crashes as soon as I try to trim down the object. Specifically, I don't make a second (smaller) copy of the data set; rather, I set some assay matrices to NULL in an attempt to reduce memory usage. But the session crashes as soon as I attempt ... <- NULL

When I open that project and run gc() I see:

> gc()
           used  (Mb) gc trigger  (Mb) max used  (Mb)
Ncells  5475504 292.5    8828510 471.5  8828510 471.5
Vcells 16484956 125.8   23798222 181.6 19711902 150.4

I am not certain how R handles that mutation, but if it performs a full copy of the object in memory before applying it, then you could very easily run out of memory here.
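R's copy-on-modify behaviour can be observed directly with tracemem(); a minimal base-R sketch (the matrix size is illustrative, and tracemem() needs a build with memory profiling enabled, which is the case for the standard CRAN binaries):

```r
# Base-R sketch of copy-on-modify. After `y <- x`, both names point at
# the same memory; the first write through y forces a duplicate of the
# whole matrix, which is how an apparently small edit can briefly
# double memory use.
x <- matrix(0, nrow = 1000, ncol = 1000)      # ~8 MB of doubles
y <- x                                        # no copy yet
if (capabilities("profmem")) tracemem(x)      # reports the duplication
y[1, 1] <- 1   # copy-on-modify: the full matrix is duplicated here
if (capabilities("profmem")) untracemem(x)
```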


Ouch. Well spotted. That would make sense. Each assay matrix (there are 4 of them) is ~60 MB, not counting some fairly thorough metadata for each column... I'll look into other ways of getting that data set in there, or perhaps simply an even smaller data set.
Thanks!

Thanks again for investigating.

I've preprocessed (i.e., subsetted) the data set locally on my machine and then transferred the subset into the cloud session, so that the session never sees the full data set.
It all works nicely and smoothly in the Shiny app instance linked above.
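For reference, that workflow can be sketched with a plain matrix standing in for the real SummarizedExperiment (all object and file names here are illustrative, not from the actual project):

```r
# Sketch of the subset-then-transfer workflow: subset locally, serialize
# a small .rds, and read only that file in the cloud session.
big <- matrix(rnorm(1e6), nrow = 1000)  # pretend this is the full assay
small <- big[1:100, 1:100]              # subset locally, on your own machine
f <- tempfile(fileext = ".rds")
saveRDS(small, f)                       # this small .rds is what you upload
small2 <- readRDS(f)                    # in the rstudio.cloud session
stopifnot(identical(small, small2))
```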

Thanks a ton!

