Transferring a data frame from RStudio Cloud to Google Sheets seems prohibitively slow

I have a data frame in RStudio Cloud that is 85 rows by 207 columns (ultimately it may be as wide as 366 columns). The actual data is created using the code in this thread:

Function to generate multiple rows of vectors in R

Now that I have the data frame set up, I'm trying to use the googlesheets library to transfer it to an existing sheet (and I will periodically update it as I get more data). I'm using this code to edit the Google Sheet:

gs_edit_cells(sheet, ws = "Test", input = data, anchor = "A2")

I let it run for a long time (more than an hour); it sat there at "Range affected by the update" and the write never went through. If I slice the data into several segments (say 30, 60, or 90 columns at a time), it does transfer, but it is prohibitively slow. This isn't a huge dataset; is there something I could improve in my code? I have registered the sheet in advance and googlesheets does recognize it.
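
For reference, here is a rough sketch of the chunked approach described above. The sheet title, the chunk size, and the col_letter() helper are placeholders I added for illustration; sheet is the registered sheet and data is the data frame from the linked thread.

library(googlesheets)

# Convert a column index to its A1-style letter (1 -> "A", 27 -> "AA", ...)
col_letter <- function(n) {
  out <- character(0)
  while (n > 0) {
    out <- c(LETTERS[(n - 1) %% 26 + 1], out)
    n <- (n - 1) %/% 26
  }
  paste(out, collapse = "")
}

sheet <- gs_title("My target sheet")   # placeholder title; gs_key() also works
chunk_size <- 30                       # columns sent per request

for (start in seq(1, ncol(data), by = chunk_size)) {
  cols <- start:min(start + chunk_size - 1, ncol(data))
  sheet <- gs_edit_cells(sheet, ws = "Test",
                         input  = data[, cols, drop = FALSE],
                         anchor = paste0(col_letter(start), "2"))
  # column headers of a data frame are written by default, matching the single-call version
}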

Does anyone have any input on this? I'd love a workaround if a direct load isn't possible.

Hi @Litmon!

I'm not sure this problem is specific to RStudio Cloud; it may be a problem with googlesheets and the APIs it relies on. This discussion on the googlesheets GitHub issue tracker might have some things you can try:

This also sounds similar:

But most significant for you are probably these comments from the package author:

jennybc commented on May 30, 2018

Hmm ... interesting re: tibble vs. data frame. In any case, the reboot googlesheets4 is well underway, with reading public Sheets already available for use. Writing will come soon!

What I'm saying: I will not be troubleshooting this any further here and, I hope, it will simply not be a problem in googlesheets4, with the new API.

(see also: Issues with Googlesheets Package · Issue #385 · jennybc/googlesheets · GitHub)
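
For anyone reading this later: once writing support shipped in googlesheets4, the transfer reduces to a single call against the Sheets v4 API. A minimal sketch, assuming the sheet_write()/range_write() interface that googlesheets4 eventually exposed; the Sheet URL is a placeholder and data is the data frame from the question.

library(googlesheets4)

gs4_auth()   # interactive OAuth the first time; the token is cached afterwards

ss <- "https://docs.google.com/spreadsheets/d/<your-sheet-id>"   # placeholder

# Replace the contents of the "Test" worksheet with the whole data frame in one call
sheet_write(data, ss = ss, sheet = "Test")

# Or write into a specific location without clearing the rest of the sheet
range_write(ss, data = data, sheet = "Test", range = "A2")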
