Best practices for global/external variables used in a module



I wanted to know what others think is the best approach when writing a Shiny module that uses functions/variables which, in a non-modular app, would belong in global.R.

Do you still keep them in global.R? To me that seems to break the isolation and independence of modules, since a module should be explicit about all its inputs and outputs.

As a concrete example: in the following app, the module UI and module server use a variable defined outside of it. Here’s one way to write it:

library(shiny)

foodata <- mtcars

# placeholder body; the real calculation is not important for the question
complex_model_calculation <- function(data) {
  mean(data)
}

fooUI <- function(id) {
  ns <- NS(id)
  tagList(
    selectInput(ns("var"), "Variable", names(foodata)),
    "Result:", textOutput(ns("result"))
  )
}

foo <- function(input, output, session) {
  output$result <- renderText({
    complex_model_calculation(foodata[[input$var]])
  })
}

ui <- fluidPage(
  fooUI("foo")
)

server <- function(input, output, session) {
  callModule(foo, "foo")
}

shinyApp(ui, server)

If this were a non-modularized app, I would place the data definition and the calculation function in a global.R file. In a modular app, I can still put the data variable and the function in global.R, or I can put them in a “foo_module_helpers.R” file and source that file from the module code.

In terms of how to get these objects into the module, I can either do what’s currently done (assume the objects are defined and use them in the module, though that goes against the idea that a module should be self-contained), or I can add them as parameters to the module UI and module server.

If I choose to add parameters to the module UI+server, then again I have two options: either make the main app ui/server know about the variable and function and the main app will pass these objects to the module, or I can use default parameters in the module.
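A minimal sketch of the default-parameter option, reusing the names from the example above (`fooUI`/`foo`/`foodata`/`complex_model_calculation`; shiny is assumed to be attached by the app, as before):

```r
# Module UI and server take the data and the calculation function as
# parameters, with module-level defaults the main app can override.
fooUI <- function(id, data = foodata) {
  ns <- NS(id)
  tagList(
    selectInput(ns("var"), "Variable", names(data)),
    "Result:", textOutput(ns("result"))
  )
}

foo <- function(input, output, session, data = foodata,
                calc = complex_model_calculation) {
  output$result <- renderText({
    calc(data[[input$var]])
  })
}

# The main app can rely on the defaults ...
# callModule(foo, "foo")
# ... or be explicit and pass in its own objects:
# callModule(foo, "foo", data = mtcars, calc = mean)
```

With defaults, the main app stays ignorant of the module’s internals; with explicit arguments, the dependency is visible at the call site. Both go through the same signature, so switching between them later is cheap.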

I hope I explained the different situations clearly enough; I would love to hear some opinions on which approach is best.



I have a similar setup. I have a large Shiny app with quite a few modules, all of which use global setup information (such as database connection info and other settings). I have put all of this setup information into a separate module (along with UI and server logic for modifying it). The values are stored as reactive values inside the module, and the module returns a list of reactives for accessing them. That return value is then passed as a parameter to each “client” module that needs the data.

In the following minimal example I probably use more reactives than necessary (but I always hit my head on a missing-reactive-context error, so the more the better :-)), and of course all the global data could be put into one structure, but I hope you get the idea.


# in app.R

library(shiny)

server <- function(input, output, session) {
  GlobalData = callModule(GlobalModule, "globals")
  callModule(Module1, "mod1", GlobalData)
}

# in globaldata.R

GlobalModule <- function(input, output, session) {
  stash = reactiveValues()
  stash$GlobalVal1 = ...
  return(list(GetGlobalVal1 = reactive(stash$GlobalVal1)))
}

# in module1.R

Module1 <- function(input, output, session, globals) {
  # somewhere in a reactive context
  val1 = globals$GetGlobalVal1()
}


Thanks for adding to the discussion; creating a “globals module” is an interesting solution.

What about the case where there are no shared variables across different modules - each module just has its own set of functions/variables that it needs, and I don’t want to hardcode them inside the module. Would you still say that creating a globals module that defines specific variables for each module makes sense?


If the functions/variables are truly local to the module (as in your example), I would put them in the module’s source R file along with the UI function(s) and the module (server) function. That way the source file is self-contained. The module function probably won’t work without the UI function anyway, so those two have to go together, and I don’t see an issue with also putting the helper functions and variables they depend on into the same source file. If it’s a question of having too much code in one file, I would put them in a foo_module_helpers.R file and source that from the module file.
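As a sketch of that layout (file and object names follow the earlier example and are just illustrative):

```r
# foo_module.R -- one self-contained file per module

# helpers the module depends on
# (or: source("foo_module_helpers.R") if this section grows too large)
foodata <- mtcars
complex_model_calculation <- function(data) mean(data)

# module UI (assumes shiny is attached by the app)
fooUI <- function(id) {
  ns <- NS(id)
  tagList(
    selectInput(ns("var"), "Variable", names(foodata)),
    textOutput(ns("result"))
  )
}

# module server
foo <- function(input, output, session) {
  output$result <- renderText({
    complex_model_calculation(foodata[[input$var]])
  })
}
```

The main app then only needs `source("foo_module.R")` plus the usual `fooUI("foo")` / `callModule(foo, "foo")` calls.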

Just my $0.02 :slight_smile:


So that still leaves another question unanswered: would you add them as parameters to the module UI + server so that the module is truly self-contained, or would you just assume in the UI + server that these variables exist (which kind of goes against the idea that a module should not touch objects outside of it)? And if you do use parameters, would you pass them in explicitly or use default arguments?

These are subtle differences, but I come across this often enough that I wonder what others think is cleanest


Well, the server function (in the module) and the UI function depend on each other anyway, so they both have external dependencies. I don’t see the difference between those two needing each other and a third object being needed by both. I would put them all in the same source file, so the source file is self-contained. I think passing them as parameters is over-engineering :slight_smile:

If you use your module in different projects or contexts, and if the external objects vary between these projects/contexts, it may make sense to pass them as parameters.

So I guess it depends on how volatile the external objects are?


When you’re writing a regular function and you have some information you need to pass to it, do you 1) define a global variable that the function will go look at, or 2) pass the information in as a parameter? However you answer that question, the same (more or less) applies to modules. I hope that most of the time you choose number 2, especially if the function (or module function) is defined in a different file from the one it’s called from.
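To make the analogy concrete in plain R (hypothetical names; the global-variable style works, but hides the dependency):

```r
# Style 1: the function silently reads a global the caller must have defined.
threshold <- 0.5
flag_high_global <- function(x) {
  x > threshold  # hidden dependency on the global `threshold`
}

# Style 2: the same information is passed in as a parameter,
# so the dependency is visible in the signature.
flag_high_param <- function(x, threshold = 0.5) {
  x > threshold
}

flag_high_global(0.7)                  # TRUE, but only because the global exists
flag_high_param(0.7, threshold = 0.6)  # TRUE, and the dependency is explicit
```

The second style is what passing objects into a module’s UI/server functions amounts to, and the default argument plays the same role as a module-level default.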