RStudio suddenly slow after updating to 3.5.3

Hi all,

I am running RStudio 1.1.463 with R 3.5.3 on Windows 10. I am using a Lenovo T470 with 16 GB of RAM and an i5 7th-gen processor.

I updated R from 3.5.2 to 3.5.3 on Friday, and since then I have had trouble running some of my scripts that ran quickly before. The slowness is not lag in typing or scrolling; rather, the console takes a long time to respond to certain (not all) commands.

For example, this part of my script still runs quickly:

#load packages used below
library(dplyr)
library(lubridate)

#load data
test_creeks <- read.csv("~/R/test_creeks/test_creeks.txt")

#create date column from individual components (year, month, day)
creeks_date <- test_creeks %>% 
  select(station, year, month, day, hour, minute, temperature) %>% 
  mutate(date = make_date(year, month, day))

#create column for Julian days
creeks_date$julian <- yday(creeks_date$date)

#set the variables within the dataset by which statistics are grouped
creeks_temps <- group_by(creeks_date, station, year, julian, date)

#treat all groups with temperature variance of 2 or greater as air temp. 1 is air temp, 0 is NOT air temp
creeks_temps_adjusted <- creeks_temps %>%
  mutate(air_temp = ifelse(var(temperature) >= 2, 1, 0))
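(For context, here is a minimal sketch, with made-up station names and temperatures, of what that grouped var() flag does; because of the group_by(), var() is computed once per group and the result is recycled across the group's rows:)

```r
library(dplyr)

# Toy data: two invented stations with invented temperatures
toy <- data.frame(
  station = c("A", "A", "A", "B", "B", "B"),
  temperature = c(10, 10.1, 10.2, 5, 9, 14)
)

# Within each station, flag every row of the group as air temp (1)
# if the group's temperature variance is >= 2, otherwise 0
toy_flagged <- toy %>%
  group_by(station) %>%
  mutate(air_temp = ifelse(var(temperature) >= 2, 1, 0)) %>%
  ungroup()
```

Station A's variance is tiny, so all its rows get 0; station B's variance is well above 2, so all its rows get 1.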

But when I try to do this, it now takes minutes when it used to take under 30 seconds:

#create new df for b3
b3 <- creeks_temps_adjusted %>%
  filter(station == "B3") %>%
  select(station, year, julian, date, hour, temperature)

creeks_temps_adjusted is 1,745,939 rows x 10 columns, while b3 is 119,933 x 7.

I have tried reinstalling both R and RStudio, added exceptions to Windows Defender, and disabled real-time protection, all to no avail.

I am not on a remote/networked system.

If anyone has any ideas, please let me know!

Do you see the same degradation in performance if you try running your code in a separate front-end, e.g. using the bundled RGui, or even trying to run R from a command terminal?

Hi Kevin,

I am seeing the same degradation in performance when I run my code in both RGui and in just the command terminal.

Looking at this with a fresh set of eyes today, I figured out what the problem was. It turns out this is not a problem with R, RStudio, or Windows Defender, but with the new version of dplyr and my code.

The problem was in the step where I grouped my variables to make creeks_temps. If I ungroup()/skip that step, creating b3 takes 0.04 seconds, compared to 6.85 minutes when the data frame was grouped.
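For anyone hitting the same thing, a sketch of the fix, using a small invented stand-in for creeks_temps_adjusted (the real one comes from the code earlier in the thread): dropping the grouping before the filter/select avoids the per-group overhead.

```r
library(dplyr)

# Invented stand-in for creeks_temps_adjusted, grouped as in the
# original script (the real data frame has ~1.7M rows)
creeks_temps_adjusted <- data.frame(
  station = c("B3", "B3", "C1"),
  year = c(2018, 2018, 2018),
  julian = c(100, 101, 100),
  temperature = c(8.2, 8.4, 7.9)
) %>%
  group_by(station, year, julian)

# ungroup() first, so filter() and select() operate on a plain
# (ungrouped) tibble instead of doing grouped work
b3 <- creeks_temps_adjusted %>%
  ungroup() %>%
  filter(station == "B3") %>%
  select(station, year, julian, temperature)
```

If you still need the grouping for later summary steps, you can re-apply group_by() after the subsetting.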


If your question's been answered (even by you!), would you mind choosing a solution? It helps other people see which questions still need help, or find solutions if they have similar problems.

I wonder if this is worth reporting upstream. Unless there was some other intentional change in behavior causing this, that seems like a fairly substantial performance degradation that the dplyr team might want to know about.
