Installation of TensorFlow terminated on RStudio Cloud

Similar to the posts here and here, I am running into trouble when I try to install TensorFlow in a new RStudio Cloud project. I know I need to set up both Miniconda and a virtual environment locally in /cloud/project/ so the Python dependencies stay with copies of the cloud project. Previous versions of the following setup script worked.

install.packages(c("keras", "rstudioapi", "tensorflow"))
lines <- c(
  paste0("RETICULATE_CONDA=", file.path(getwd(), "miniconda", "bin", "conda")),
  paste0("RETICULATE_PYTHON=", file.path(getwd(), "miniconda", "bin", "python")),
  paste0("WORKON_HOME=", file.path(getwd(), "virtualenvs"))
)
writeLines(lines, ".Renviron")
rstudioapi::restartSession()  # restart so reticulate picks up the new .Renviron
reticulate::install_miniconda(file.path(getwd(), "miniconda"))
reticulate::virtualenv_create(
  envname = "r-tensorflow",
  python = Sys.getenv("RETICULATE_PYTHON")
)
keras::install_keras(
  method = "virtualenv",
  conda = Sys.getenv("RETICULATE_CONDA"),
  envname = "r-tensorflow"
)

But I get an error on Cloud when I try to install Python's TensorFlow and Keras:

> keras::install_keras(
+   method = "virtualenv",
+   conda = Sys.getenv("RETICULATE_CONDA"),
+   envname = "r-tensorflow"
+ )
Using virtual environment 'r-tensorflow' ...
Collecting tensorflow==2.2.0
  Downloading tensorflow-2.2.0-cp38-cp38-manylinux2010_x86_64.whl (516.3 MB)
Error: Error installing package(s): 'tensorflow==2.2.0', 'keras', 'tensorflow-hub', 'h5py', 'pyyaml==3.12', 'requests', 'Pillow', 'scipy'

The same script on my local Ubuntu machine appears to succeed, but it ignores my local virtual environment even though I set WORKON_HOME.

> tensorflow::tf_config()
Installation of TensorFlow not found.

Python environments searched for 'tensorflow' package:

You can install TensorFlow using the install_tensorflow() function.
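In case it helps anyone hitting the WORKON_HOME problem: reticulate only reads .Renviron at session start and binds to a single Python interpreter the first time it initializes, so one thing worth checking is whether the right environment is being selected before TensorFlow is first loaded. A sketch (assuming the "r-tensorflow" environment exists under the WORKON_HOME set above, and a fresh R session):

```r
# Confirm reticulate can see the project virtualenv, then bind to it
# explicitly before anything initializes Python.
reticulate::virtualenv_list()   # should include "r-tensorflow"
reticulate::use_virtualenv("r-tensorflow", required = TRUE)

# Verify which interpreter reticulate actually bound to, then retry.
reticulate::py_config()
tensorflow::tf_config()
```

If `py_config()` shows a different interpreter, something initialized Python earlier in the session and the `use_virtualenv()` call came too late.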

Setting up teaching materials for Cloud would be much easier if the default global Python installation from install_miniconda(); install_keras() automatically shipped with copies of the project. Installing locally instead requires advanced knowledge of reticulate environment variables, and it adds to the Python installation woes that already come up regularly.

The local installation error turned out to be an instance of a known issue, which I solved by installing the development version of reticulate. But the original error on Cloud remains.


If you can share your project id with me, I can take a look and see if our logs shed any light on what is going on.



Thanks, Sam! I posted a minimal version at


I've tried reproducing the problem you are seeing, with mixed results. Yesterday, I was able to reproduce it on a new project, which I then upgraded to 2 GB to test whether more memory would solve the issue, and the script finished successfully. This morning, I tried again on a default 1 GB project and, each time I tried, everything installed successfully. I'm not sure what exactly this means. If you create a new project and run the same script you posted above, do you still see the same error?


I retried just now using a fresh project and got the same error. For what it's worth, I have a free account.



I was able to reproduce what you are seeing reliably, and it looks to be a memory issue. I upgraded my project to use 2 GB RAM, and the steps you provided ran successfully. We have announced plans that allow you to increase the RAM used on a project, if upgrading is a possibility for you:

I will be taking this issue back to the Cloud team to see what improvements can be made, but that will take a while.


Thanks for confirming. rstan hits a similar memory issue, by the way: Stan on RStudio Cloud not working.

I tried to reproduce your code on my Cloud project (with 3 GB RAM and 1.5 CPUs). It returns:

ERROR: Could not install packages due to an EnvironmentError: [Errno 28] No space left on device

Error: Error installing package(s): 'tensorflow==2.2.0', 'keras', 'tensorflow-hub', 'h5py', 'pyyaml==3.12', 'requests', 'Pillow', 'scipy'
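If the failure is disk space rather than RAM, one possible workaround (an untested sketch, assuming pip honours the TMPDIR environment variable when unpacking wheels and that the project volume has more room than /tmp) is to point pip's temporary directory into the project before retrying:

```r
# Sketch: redirect pip's temp directory into the project volume, which may
# avoid "No space left on device" when the 516 MB TensorFlow wheel is
# unpacked. Assumes pip reads TMPDIR from the environment.
tmp <- file.path(getwd(), "tmp")
dir.create(tmp, showWarnings = FALSE)
Sys.setenv(TMPDIR = tmp)

keras::install_keras(
  method = "virtualenv",
  conda = Sys.getenv("RETICULATE_CONDA"),
  envname = "r-tensorflow"
)
```

The temporary directory can be deleted once the installation finishes.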

As a workaround, could I access the Dockerfile used by Cloud? If I compiled locally, where there is more memory, and shipped the resulting binaries to the Cloud project over the network, they should work against Cloud's toolchain and hopefully consume less memory than compiling there would.

Unfortunately, the Dockerfile used by Cloud is not accessible.

Is that because RStudio considers the Dockerfile proprietary, or is it an implementation issue?

I'd say it's a little of both. There are technical issues we would need to address in order to make the file publicly available. We've also never considered making it available, so we haven't audited its contents to identify whether there is anything proprietary that shouldn't be included.

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.

If you have a query related to it or one of the replies, start a new topic and refer back with a link.