DBI::dbWriteTable is slow for writing to Azure

dbplyr

#1

I have read a few issues on GitHub relating to write speed, but I have not found a workaround for this problem.

I currently have a 2.5 MB file I would like to write to an Azure SQL database. Below is an example of the code I am using.

The problem is that the write time is close to 40 minutes! What can I do to speed up the write?

  # connect to Azure
  con_azure <- DBI::dbConnect(
    odbc::odbc(),
    Driver   = "SQL Server Native Client 11.0",
    Server   = "name-server.database.windows.net",
    Database = "work_dw",
    UID      = "user",
    PWD      = "pass"
  )

  # write the data to a table
  my_file  # a data frame, roughly 2.5 MB
  tictoc::tic()
  DBI::dbWriteTable(con_azure, DBI::Id(schema = "my_dev", name = "new_table"), my_file)
  tictoc::toc()
  # 2305.56 sec elapsed
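
The only R-side tweak I have come across so far is writing in larger batches. Below is a minimal, untested sketch; it assumes odbc 1.2.0 or newer (which added a batch_rows argument to dbWriteTable()), and the batch size of 5000 is arbitrary.

  # untested sketch: send more rows per batch than the default
  # (assumes odbc >= 1.2.0, which added batch_rows; 5000 is arbitrary)
  tictoc::tic()
  DBI::dbWriteTable(
    con_azure,
    DBI::Id(schema = "my_dev", name = "new_table"),
    my_file,
    overwrite  = TRUE,   # replace the table left over from the earlier attempt
    batch_rows = 5000
  )
  tictoc::toc()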

#2

Is this question better posted in this community or as an issue on GitHub?


#3

Hi @alex628, there are several factors that may influence the speed. The first thing I'd try is to compare this against how long the same write takes with a tool outside of R, such as SQL Server Management Studio.


#4

Thanks for the suggestion @edgararuiz.

I just gave this a try. The alternative tool I ended up using was the SQL Server Import and Export Wizard. I took the same 2.5 MB file, saved it as a CSV, and then used the Wizard to upload it to the Azure database. The write time was 8.32 seconds, a huge improvement over the ~40 minutes in R.
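
For reference, the CSV export on the R side was just a one-liner along these lines (the file path is illustrative):

  # export the data frame before loading it with the Import and Export Wizard
  write.csv(my_file, "my_file.csv", row.names = FALSE)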

Do you have any suggestions on what I can try to speed up write times from R?


#5

Hi @edgararuiz, I was wondering if you knew of any alternative solutions.