I am running RStudio 1.1.463 with R 3.5.3 on Windows 10, on a Lenovo T470 with 16 GB of RAM and a 7th-gen Intel i5 processor.
I updated from R 3.5.2 to 3.5.3 on Friday, and since then some of my scripts that used to run quickly have become slow. The slowness is not lag while typing or scrolling; rather, the console takes a long time to respond to certain (not all) commands.
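To put numbers on "a long time", I've been wrapping the slow commands in `system.time()`; running the same wrapped expression in plain R (Rgui/Rterm) and in the RStudio console should show whether RStudio itself is the bottleneck. A minimal sketch (the expression inside is just a placeholder workload, not my actual script):

```r
# Wrap any slow expression to measure wall-clock seconds
timing <- system.time({
  x <- sum(sqrt(seq_len(1e7)))  # placeholder workload
})
print(timing["elapsed"])
```

If the elapsed time differs a lot between plain R and RStudio for the same expression, that points at the IDE rather than R 3.5.3.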
For example, this part of my script still runs fine and quickly:
library(tidyverse)
library(lubridate)

# Load data
test_creeks <- read.csv("~/R/test_creeks/test_creeks.txt")

# Create date column from individual components (year, month, day)
creeks_date <- test_creeks %>%
  select(station, year, month, day, hour, minute, temperature) %>%
  mutate(date = make_date(year, month, day))

# Create column for Julian days
creeks_date$julian <- yday(creeks_date$date)

# Set the variables within the dataset by which statistics are grouped
creeks_temps <- group_by(creeks_date, station, year, julian, date)

# Treat all temps in groups with variance >= 2 as air temp: 1 is air temp, 0 is NOT air temp
creeks_temps_adjusted <- creeks_temps %>%
  mutate(air_temp = ifelse(var(temperature) >= 2, 1, 0))
But when I try to do this, it now takes minutes when it used to take under 30 seconds:
# Create new df for B3
b3 <- creeks_temps_adjusted %>%
  filter(station == "B3") %>%
  select(station, year, julian, date, hour, temperature)
For scale: creeks_temps_adjusted is 1,745,939 x 10, and the resulting b3 is 119,933 x 7.
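One thing I noticed while poking at this: `creeks_temps_adjusted` is still grouped (by station, year, julian, date) when I filter it, and I've read that filtering a grouped data frame can be much slower than filtering an ungrouped one in some dplyr versions. A rough timing sketch on synthetic data (column names match mine, but the values here are made up, and the row count is smaller than my real 1.7M-row table):

```r
library(dplyr)

# Synthetic stand-in for creeks_temps_adjusted
set.seed(1)
n <- 100000
fake <- data.frame(
  station     = sample(c("B1", "B2", "B3"), n, replace = TRUE),
  year        = sample(2015:2018, n, replace = TRUE),
  julian      = sample(1:365, n, replace = TRUE),
  hour        = sample(0:23, n, replace = TRUE),
  temperature = runif(n, 0, 30)
)
grouped <- fake %>% group_by(station, year, julian)

# Time the same filter on the grouped vs ungrouped version
t_grouped   <- system.time(b3_g <- grouped %>% filter(station == "B3"))
t_ungrouped <- system.time(b3_u <- grouped %>% ungroup() %>% filter(station == "B3"))
print(t_grouped)
print(t_ungrouped)
```

If ungrouping first brings the time back down on my real data, that would at least narrow the problem to dplyr's grouped-filter path rather than R 3.5.3 or Windows Defender.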
I have tried uninstalling and reinstalling both R and RStudio, added exceptions to Windows Defender, and tried disabling real-time protection as well, all to no avail.
I am not on a remote/networked system.
If anyone has any ideas, please let me know!