I have an R script that pulls COVID data from a public health department. One step converts the date, but while the script works fine inside RStudio, it fails when we run it via a batch file.
This is on a Windows Server (2019) with R 3.5.2.
This is the relevant bit of the script:
library(boxr)
library(lubridate)
library(curl)
library(jsonlite)
library(tidyverse)

# Set up cURL link to APH data
travis.counts <- curl('https://services.arcgis.com/0L95CJ0VTaxqcmED/arcgis/rest/services/Austin_Travis_County_COVID19_Daily_Counts_(Public_View)/FeatureServer/0/query?f=json&where=Record_Status_1%3C%3E%27Staging%27&returnGeometry=false&spatialRel=esriSpatialRelIntersects&outFields=*&orderByFields=Last_Update%20asc&resultOffset=0&resultRecordCount=32000&resultType=standard&cacheHint=true')

# Extract JSON and convert to a data frame
travis.counts.json <- readLines(travis.counts, warn = FALSE)
travis.counts.parsed <- fromJSON(travis.counts.json, simplifyDataFrame = TRUE)
travis.counts.nesteddf <- as.data.frame(travis.counts.parsed)
travis.counts.df <- as.data.frame(flatten(travis.counts.nesteddf))

# Remove "attributes." from column names, and add an indicator for Daily Counts as the source table
names(travis.counts.df) <- gsub(x = names(travis.counts.df), pattern = "attributes.", replacement = "DC.")

# Convert 'Last_Update' from Epoch time (milliseconds) to DateTime
travis.counts.df$DC.Last_Update <- as.POSIXct((travis.counts.df$DC.Last_Update/1000), origin = "1970-01-01")
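Since the error only appears non-interactively, it may help to confirm that the column actually exists at that point in the batch-run session (if the API returns an error payload instead of features, the flattened data frame would have no `attributes.Last_Update` column, and the conversion step would fail with exactly this "object not found" error). A diagnostic sketch; the helper name and messages here are my own, not part of the original script:

```r
# Hypothetical diagnostic helper: verify the expected column survived the
# fetch/flatten/rename steps before attempting the date conversion.
check_update_col <- function(df) {
  if (!"DC.Last_Update" %in% names(df)) {
    stop("DC.Last_Update missing; columns present: ",
         paste(names(df), collapse = ", "))
  }
  invisible(TRUE)
}

# Example with a stand-in data frame:
ok <- check_update_col(data.frame(DC.Last_Update = 1609459200000))
```

Calling this right before the `as.POSIXct()` line (and logging the batch run's output) would show whether the non-interactive session is really getting the same data frame the RStudio session gets.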
This is where it fails (when run from a batch file).
I have also tried:
travis.counts.df <- travis.counts.df %>% mutate(DC.Last_Update = as_datetime(DC.Last_Update/1000, origin="1970-01-01"))
(Among like 10 other similar variations.)
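For what it's worth, the millisecond-epoch conversion itself can be checked in isolation, independent of the data pull (the sample timestamp below is an illustrative assumption):

```r
# Minimal, self-contained check of the epoch-millisecond conversion.
ms <- 1609459200000  # 2021-01-01 00:00:00 UTC, in milliseconds
as.POSIXct(ms / 1000, origin = "1970-01-01", tz = "UTC")
# [1] "2021-01-01 UTC"
```

If this runs cleanly from the batch file too, the conversion logic is not the problem, which points back at the column itself being absent in the non-interactive run.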
Like I said, this runs fine within RStudio.
Through the batch file, I get an error like this:
Error in as_datetime(DC.Last_Update/1000, origin = "1970-01-01") :
object 'DC.Last_Update' not found
Calls: %>% ... mutate -> mutate.tbl_df -> mutate_impl -> as_datetime
Again, the column is actually there. It runs fine from RStudio.
Any help is greatly appreciated. We run this data pull twice a day, so while it's great that we're able to run it manually, getting it automated would be a big help.
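In case it's useful, this is roughly how I'd capture the script's console output when the batch file runs it, so the non-interactive session's startup messages and any fetch errors are logged (a hypothetical wrapper; the Rscript path and script name are assumptions, not from our actual batch file):

```shell
:: Hypothetical batch wrapper -- Rscript path and script name are assumptions.
:: Redirecting stdout and stderr to a log captures warnings or fetch errors
:: that never appear when the script is run interactively in RStudio.
"C:\Program Files\R\R-3.5.2\bin\Rscript.exe" --vanilla pull_covid_counts.R > pull_log.txt 2>&1
```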