I'm trying to use the Docker BuildKit approach to caching packages to speed up adding packages to Docker images. For Python and apt-get I am able to get this to work, but I can't get it to work for R packages. In a Dockerfile for Python I'm able to change:
RUN pip install -r requirements.txt
to this (the comment-looking line at the top of the Dockerfile is required):
# syntax=docker/dockerfile:experimental
RUN --mount=type=cache,target=/root/.cache/pip pip install -r requirements.txt
And then when I add a package to the requirements.txt file, rather than re-downloading and building all the packages, pip is able to reuse all the work it has already done. So BuildKit cache mounts add a level of caching beyond Docker's image layers. It's a massive timesaver. Check out the instructions for both Python and apt-get packages, and this useful answer on caching Python packages. I'm hoping to set up something similar for R packages.
Here is what I've tried; it works for apt-get but not for R packages. I've also tried with the
# syntax=docker/dockerfile:experimental
FROM rocker/tidyverse

RUN rm -f /etc/apt/apt.conf.d/docker-clean; \
    echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache

RUN --mount=type=cache,target=/var/cache/apt --mount=type=cache,target=/var/lib/apt \
    apt update && apt install -y gcc \
    zsh \
    vim

COPY ./requirements.R .

RUN --mount=type=cache,target=/usr/local/lib/R/site-library Rscript ./requirements.R
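One direction I've been considering, in case it helps frame an answer (untested sketch; the pak cache location under /root/.cache/R is my assumption): cache the package download/build cache rather than the library path itself, since anything written into a cache mount over /usr/local/lib/R/site-library presumably wouldn't be committed to the final image.

```dockerfile
# syntax=docker/dockerfile:experimental
FROM rocker/tidyverse

# Assumption: pak keeps its download/build cache under /root/.cache/R.
# Caching that directory would let repeated builds skip re-downloading and
# re-compiling packages, while the installed packages still land in
# /usr/local/lib/R/site-library and get committed to an image layer as usual.
RUN R -e 'install.packages("pak")'

COPY ./requirements.R .
RUN --mount=type=cache,target=/root/.cache/R Rscript ./requirements.R
```

I don't know whether install.packages() (as opposed to pak) keeps any reusable cache at all, which may be why my site-library attempt above buys nothing.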
Anyone have this working?