I have tables in an Oracle database spread across a number of different schemas, and I would like them to be searchable and indexable from R. I think the best approach would be, after establishing the DB connection, to have a function that goes through all the schemas, lists every table available to me, and "preloads" each one as a lazy tibble via the tbl function. These tbls would then be saved in a named list of lists, where the upper level is the schema name and the lower level the table name.
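To make the target shape concrete, here is a sketch of the structure I have in mind (the schema and table names HR, EMPLOYEES, etc. are just placeholders):

```r
# Hypothetical target structure, using dbplyr::in_schema to qualify names:
# tbls <- list(
#   HR    = list(EMPLOYEES = tbl(con, in_schema("HR", "EMPLOYEES"))),
#   SALES = list(ORDERS    = tbl(con, in_schema("SALES", "ORDERS")))
# )
# so that tbls$HR$EMPLOYEES is a lazy tbl ready for dplyr verbs.
```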
This is as far as I got with the problem.
library(DBI)
library(dplyr)
library(tidyr)

con <- dbConnect(my_db_info)  # note that the SQL calls are probably Oracle-specific

schema_tables <- dbGetQuery(con, "select username as schema_name from sys.all_users order by username") %>%  # list all the schemas
  as_tibble() %>%
  transmute(schema = SCHEMA_NAME) %>%  # simple name change
  mutate(qstring = paste0("select table_name from all_tables where owner = ",
                          sQuote(schema, q = FALSE))) %>%  # SQL string to list the tables of each schema
  rowwise() %>%
  mutate(tables = list(dbGetQuery(con, qstring))) %>%  # query the database to get the table names
  filter(nrow(tables) > 0) %>%  # drop schemas with 0 accessible tables
  ungroup() %>%
  unnest(tables)
After this step, I would like to simply iterate over all the rows and save the tbls in nested lists as described above, but every approach I try breaks with a different error. Am I facing a dead end, or is there already a better solution that I have simply missed?
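For reference, the iteration step I have been attempting looks roughly like this sketch. It assumes the pipeline above produced a tibble named schema_tables with columns schema and TABLE_NAME (Oracle returns column names in upper case), and that dbplyr is available for in_schema:

```r
library(dbplyr)

# Split by schema, then build one named list of lazy tbls per schema.
tbls <- schema_tables %>%
  split(.$schema) %>%
  lapply(function(df) {
    per_schema <- lapply(
      df$TABLE_NAME,
      function(tb) tbl(con, in_schema(df$schema[[1]], tb))  # lazy reference, no data pulled
    )
    setNames(per_schema, df$TABLE_NAME)
  })

# tbls[["SOME_SCHEMA"]][["SOME_TABLE"]] would then be a lazy tbl
# ready for filter()/select()/collect().
```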