RStudio 1.1.383 very slow to open View() tab on Windows 7

I just updated to the newest RStudio release and am running into this problem, exactly as described in this post on the RStudio support site from a couple of months ago:

View() takes a very long time to open a new tab to view a data frame, upwards of 20 seconds on the wall clock. This happens regardless of the data frame's source (e.g., mtcars, ggplot2::diamonds, or one I've generated myself) or size (mtcars is just 32 rows; diamonds has 53,940). It happens whether I manually type View(mtcars) or click the data frame's view icon in the Environment pane. The prompt reappears immediately (so system.time() doesn't report the inflated time), but the R session is still blocked: running another command waits until the View() tab opens.
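For illustration, the timing looks normal even though the tab itself takes ~20 seconds to appear (the numbers below are made up, not pasted output):

  > system.time(View(mtcars))
     user  system elapsed
     0.00    0.00    0.02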

This seems to be a Windows 7 issue, as I haven't seen any reports from other operating systems and it runs fine on my Mac at home.

Any insight from the RStudio team on this? I love the new features in 1.1, but can't start using it until this bug is resolved. Session info below.

> sessionInfo()
R version 3.4.2 (2017-09-28)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1

Matrix products: default

locale:
[1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United States.1252   
[3] LC_MONETARY=English_United States.1252 LC_NUMERIC=C                          
[5] LC_TIME=English_United States.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

loaded via a namespace (and not attached):
[1] compiler_3.4.2 tools_3.4.2    yaml_2.1.14   

Can you provide any other details on your system setup? From what we have seen, this issue occurs most commonly for users working with networked filesystems on Windows; e.g. where RStudio, or R, or some combination thereof is on a potentially high-latency filesystem. Is that true in your case as well?

If the R session is locked during that period, it's possible it's stuck executing some R code. Can you try running through the following steps? (The commands are collected into a short snippet below the list.)

  1. Execute Rprof("rstudio-trace.Rprof").
  2. Call e.g. View(mtcars) (or whatever other View() statement causes this slowness).
  3. Wait until the data set is rendered.
  4. Execute Rprof(NULL) to stop the profiler once everything is done.
  5. Provide us a link to download the generated trace file.
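Putting those commands together, the snippet is just (the trace filename is only an example):

  Rprof("rstudio-trace.Rprof")   # start profiling R code, writing samples to this file
  View(mtcars)                   # trigger the slow viewer tab
  # ... wait until the data set has rendered ...
  Rprof(NULL)                    # stop the profiler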

In addition, a diagnostics report would be helpful. (You can upload the generated report as e.g. a GitHub gist.)

Hopefully, this will give us some hints as to what is going on.

I'm on a Windows machine that is connected to a network filesystem where I often store data and R scripts, but my R and RStudio installations are on my local machine. Is it possible that RStudio is looking for additional files on the network that aren't installed locally, and that this is what's slowing things down?

Prior to updating to 1.1, everything was running fine with 1.0.153 on the same machine and network.

Here is the trace file I generated using the above steps. And here is my diagnostics report.

Thanks for the quick response and let me know if there is anything else I can do to help diagnose this issue.

Thanks for taking the time to generate a trace -- unfortunately, it looks like the R session wasn't doing much of anything during this time. (It's possible that RStudio was running code separately from the R session that's responsible for this slowness, though.)

There's one other thing I could suggest, although it would be a bit time consuming. We publish daily builds of RStudio here (these are installer-less versions; you can just unzip them on your desktop and then run the file at bin/rstudio.exe within).

For example, you can download RStudio v1.1.340 using this link:

https://s3.amazonaws.com/rstudio-ide-build/desktop/windows/RStudio-1.1.340.zip

and can download other builds by replacing the build number in that link. If you have time, would you be able to test a range of RStudio releases to see which particular build seems to have introduced the slowness you're observing? (You could think of this as a bisect: start at build 170; if the issue reproduces, try build 85; if not, try 255; and repeat until the bad build is found.)
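If it helps to keep track of the bisection, here is a rough sketch in R (the endpoints below are only illustrative; the URL pattern follows the link above):

  # hypothetical helper: download URL for a given 1.1.x daily build
  build_url <- function(build) {
    paste0("https://s3.amazonaws.com/rstudio-ide-build/desktop/windows/RStudio-1.1.", build, ".zip")
  }

  good <- 0; bad <- 340            # last build known to be fine / first known to be slow
  mid  <- floor((good + bad) / 2)  # test this build next
  build_url(mid)
  # if the slowness reproduces: bad <- mid; otherwise: good <- mid
  # repeat until good and bad are adjacent builds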

If you're able to discover the build in which the slowness was introduced (or even narrow it down to a range), that could be very helpful.

I had this same issue and figured out a workaround that has been doing the job. It seems to happen when your HOME is on a network drive (as it often is in an enterprise environment).

My workaround is here: https://gist.github.com/ateucher/f279abc1f7ba12f2763152c18810fb24 - essentially I just set the HOME env var to a location on my C: drive. As far as I can tell, HOME is generally unused (and unset by default) in Windows, so making this change didn't really change anything else on my system, except in the bash shell that comes with Git for Windows.
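To check whether this applies to you, you can look from R at where HOME currently points (R_USER and R_LIBS_USER are typically derived from it on Windows); a UNC/network path here is the situation this workaround targets:

  Sys.getenv(c("HOME", "R_USER", "R_LIBS_USER"))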


I am also experiencing the same problem, with my $HOME directory on a (very slow) network drive. I was able to use the workaround suggested by andyteucher: in cmd under Windows 7, setting HOME to a local directory with setx HOME "%USERPROFILE%". I also used the bash prompt installed with Git to verify it. Thanks for the tip!


I had the same problem and was also able to solve it by setting the default working directory within RStudio to a local drive and setting the %HOME% variable under Windows to a local drive.

Thanks for all of the responses, everyone -- it's good to hear that we at least have a workaround in the interim. We're going to keep investigating and we'll let you know if we're able to learn what the underlying cause is.

Some other questions:

  1. Do you see the same slowness if you invoke View(as.list(mtcars))? (This will view using the object explorer rather than the data viewer.)
  2. What is the output of Sys.getenv() on your system? Can you show the output both with HOME set and with HOME unset? (Some other environment variables might be set based on the value of HOME, so I'm curious to see what else this affects; one way to capture both listings is sketched below.)
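For example, one way to capture the two listings for comparison (the file names are only examples):

  # with HOME in its current state:
  writeLines(capture.output(Sys.getenv()), "sys-getenv-home-unset.txt")
  # then again after setting HOME and restarting R:
  writeLines(capture.output(Sys.getenv()), "sys-getenv-home-set.txt")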

@kevinushey to answer your questions:

  1. Yes, it appears to affect the object viewer as well as the data viewer.

  2. It looks like both R_USER and R_LIBS_USER are set based on the value of HOME. Here is the relevant output:

  • With HOME unset:
    HOME //MY_WORK_NETWORK/PATH/ATEUCHER$
    R_LIBS_USER //MY_WORK_NETWORK/PATH/ATEUCHER$/R/win-library/3.4
    R_USER //MY_WORK_NETWORK/PATH/ATEUCHER$

  • With HOME set:
    HOME C:\Users\ateucher
    R_LIBS_USER C:/Users/ateucher/R/win-library/3.4
    R_USER C:/Users/ateucher

I had also previously thought that the slowness didn't occur when I wasn't in a project, but in my testing today it occurred whether or not I was in a project.

I also tried setting R_USER to C:/Users/ateucher in my .Renviron file, but it made no difference...

I have a similar issue with 1.1.383 on Ubuntu 17.04. I have identified datasets that open near-instantaneously and scroll smoothly in other applications, but open slowly and are unusable in the viewer, e.g. when calling View().

Based on some ad hoc testing, it seems like View() becomes exponentially slower with the width of the dataset. Could it be something to do with computing the layout of columns, widths, etc.?

I have created a few test datasets you can use to verify this behaviour. They all contain around 16K cells but vary in dimensions, and they range in size from 300-600 KB. Google Drive link: https://drive.google.com/open?id=0B7688WPR38x2ZURwTTJWNzgyVGM
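If it's easier, a small generator along these lines (not the exact files above) produces frames with roughly 16K cells in different shapes:

  # ~16,000 cells arranged as wide vs. narrow data frames (illustrative only)
  make_test_df <- function(ncol, ncells = 16000) {
    nrow <- ceiling(ncells / ncol)
    as.data.frame(matrix(rnorm(nrow * ncol), nrow = nrow, ncol = ncol))
  }

  wide   <- make_test_df(400)   # 400 columns x 40 rows
  narrow <- make_test_df(10)    #  10 columns x 1600 rows
  # View(wide) vs. View(narrow) should show the width effect if it's real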


I've encountered this problem as well and reported it previously on the RStudio support forum.


@kevinushey I can confirm on my system that changing the HOME variable from a networked drive to a location on my local machine fixed this problem. As with @andyteucher, the slowness occurs with both the object viewer and data viewer.

I've been swamped at work so haven't had time to investigate which daily build introduced the problem, but I hope to be able to test this out in the next week or so. Thanks for the workaround!

I'll second this. View() is way slower than I recall it ever being. I have a relatively small table with 92 columns and 258 rows, so not a large amount of data, especially since it's sparsely populated. Even viewing this small table pushes one of my 8 logical CPUs to 100% usage, grays out RStudio, and renders it unusable for a significant number of seconds (10-20). The issue seems to be triggered by clicking on the vertical scroll bar.
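Something along these lines roughly reproduces the shape I'm describing (made-up values, mostly empty cells):

  m <- matrix(NA_real_, nrow = 258, ncol = 92)   # 258 rows x 92 columns
  m[sample(length(m), 1000)] <- runif(1000)      # sparsely populate ~4% of the cells
  df <- as.data.frame(m)
  View(df)   # then try dragging the vertical scroll bar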

My system is:
OS: Ubuntu 16.04 LTS, 64-bit
CPU: Intel® Core™ i7 CPU 860 @ 2.80GHz × 8
Graphics: GeForce 9800 GTX+/PCIe/SSE2
RStudio: 1.1.383
R:
platform x86_64-pc-linux-gnu
arch x86_64
os linux-gnu
system x86_64, linux-gnu
status
major 3
minor 2.3
year 2015
month 12
day 10
svn rev 69752
language R
version.string R version 3.2.3 (2015-12-10)
nickname Wooden Christmas-Tree
