Multiple manifest.json files in one directory

One scenario that has been coming up fairly frequently is that our users want to have multiple "app files" in one folder.

For example

  • they might want to have both an app.R and a plumber.R in a folder
  • or they might want to have multiple R Markdown (Rmd) reports in a folder

In both of these situations the authors generally have subfolders containing functions shared between their apps/reports.

In this situation, does anyone have any ideas about how to engineer multiple manifest.json files so that multiple pieces of RStudio Connect content can be git deployed from the same folder?

Unfortunately, this kind of use case isn't currently supported in the way you describe. RStudio Connect expects a single manifest file for a given directory, and multiple manifest files in one directory is not a supported pattern.

What we typically recommend is that related pieces of content (for example, a Shiny app and a Plumber API) live in separate subdirectories of a project. Sharing code/functions between pieces of content is best achieved by writing an R package. That package can then either be installed manually on the RStudio Connect server and identified as an external package, or distributed via an internal repository such as RStudio Package Manager, so that it is available to deployed content.
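To make the subdirectory pattern concrete, here is a minimal sketch (all directory and package names are illustrative, not a prescribed layout). Each subdirectory carries its own manifest.json, generated with `rsconnect::writeManifest()`, so each can be a separate git-backed content item on Connect:

```r
# Illustrative project layout (names are assumptions):
#   my-project/
#   |- app/     # Shiny app (app.R) + its own manifest.json
#   |- api/     # Plumber API (plumber.R) + its own manifest.json
#   |- mypkg/   # shared functions, packaged and served via RSPM

# Generate one manifest per subdirectory, run from the project root:
rsconnect::writeManifest(appDir = "app")
rsconnect::writeManifest(appDir = "api")
```

Both content items then point at the same repository in Connect's git-backed deployment settings, each with its own subdirectory as the content root.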

An example of this pattern can be seen here: https://github.com/sol-eng/bike_predict

Thanks

That's a really useful link for our developers, but I'm not sure it helps our more data-driven users as much - the package route feels a little "heavy" for them, especially when they just want to iterate locally over different code.

I'll keep looking at it and digging around - we can't be the only people looking for a lightweight way of sharing code between R Markdown reports (although, if we are, then maybe that suggests we are doing some other things wrong too 🙂)


Hi there, thanks for this information.

Is there any plan to add support for multiple manifest.json files to Connect in the future?

My organization tried this by installing our internal packages manually, but found it a bit cumbersome (we would have to log into our production server, build the R package, and then place it in the appropriate location on the server - at which point push-button deployments are easier to manage).

We now use RSPM and structure all of our deployed content (Shiny apps, R Markdown reports, Plumber APIs) as R packages, but we still have some awkwardness with git-backed deployments. For example:

  1. We develop an R package that includes a Shiny app (using {golem}) and a Plumber API in one git repo on GitHub.
  2. This package is synced with RSPM, so when a new version is tagged on GitHub, RSPM will fetch it shortly afterwards (according to RSPM's polling frequency).
  3. The Shiny app and Plumber API are also set up with git-backed deployment, so they auto-deploy to Connect shortly after changes are merged into the main branch (according to Connect's polling frequency).

Issue: if I merge code changes into main and tag the release, then depending on the timing of Connect's and RSPM's polling, Connect may try to deploy the Plumber API and Shiny app before the new version of the package is available on RSPM.
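One way we've considered reducing that race (a sketch only - the repository URL, package name, and timings below are all assumptions) is to gate any deployment step on the new package version actually being visible in the RSPM repo, using base R's `available.packages()`:

```r
# Sketch: block until the tagged version of a package is visible in the
# internal repo. The repo URL and package name are hypothetical.
repo <- "https://rspm.example.com/internal/latest"

wait_for_version <- function(pkg, version, timeout_secs = 600) {
  deadline <- Sys.time() + timeout_secs
  repeat {
    av <- available.packages(repos = repo)
    if (pkg %in% rownames(av) &&
        package_version(av[pkg, "Version"]) >= package_version(version)) {
      return(invisible(TRUE))
    }
    if (Sys.time() > deadline) {
      stop("Timed out waiting for ", pkg, " ", version, " on ", repo)
    }
    Sys.sleep(30)  # re-check every 30 seconds until RSPM has synced
  }
}
```

This only helps if the deployment is triggered from a script (rather than Connect's own git polling), which is essentially the approach described in the reply below about replacing polling with an explicit build chain.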

One solution would be to have two GitHub repos: one for the R package and another for the deployments. But this requires additional management and manual effort - to reflect any code change in our deployed applications we need to push updates to two repositories (again, at which point push-button deployment is potentially easier to manage).

Appreciate any thoughts here or if you have any other suggestions for how to structure projects that share code but have multiple deployment targets - thanks in advance!

Not an official answer - just a note on what we've done (which in turn was influenced by what another RStudio customer did).

We gave up on git polling and instead added steps to our build chain to:

  • auto-increment each package's DESCRIPTION version field
  • build each package
  • push packages to our CRAN-like server (we use a simple S3-based HTTPS file store, based on principles from miniCRAN)
  • generate manifest files (using R and the rsconnect package)
  • push our apps and reports to the Connect server (using R and the connectapi package)
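The steps above could be sketched roughly as follows (a hedged sketch, not the poster's actual scripts - the use of {desc} and {pkgbuild}, and all paths, environment variables, and GUIDs are assumptions):

```r
# 1. Auto-increment the Version: field in DESCRIPTION
desc::desc_bump_version("patch", file = "mypkg/DESCRIPTION")

# 2. Build the package tarball
tarball <- pkgbuild::build("mypkg")

# 3. Push the tarball into the internal CRAN-like repo
#    (S3-based in our case, e.g. via aws.s3::put_object() - details omitted)

# 4. Regenerate the manifest for each piece of content
rsconnect::writeManifest(appDir = "app")
rsconnect::writeManifest(appDir = "api")

# 5. Deploy to Connect with connectapi, authenticating via env vars
client <- connectapi::connect(
  server  = Sys.getenv("CONNECT_SERVER"),
  api_key = Sys.getenv("CONNECT_API_KEY")
)
bundle <- connectapi::bundle_dir("app")
connectapi::deploy(client, bundle, guid = "<existing-content-guid>")
```

Because the chain builds and publishes the package before it deploys the content, the polling race described earlier in the thread doesn't arise.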

This seems to be working well for us, and we are now looking at extending it to support some small Python deployments too.