Thank you all in advance for being so supportive as I learn this language from the ground up.
So I've got 12 CSVs, one for each month, and together they add up to over a GB, which is too much for my machine to handle.
I tried using RStudio Cloud and a plain R console, as well as different variations of the code.
I could load them individually, or work with only half as much data, since I'm on a 7-year-old Mac, but if there is a way to work with this much data at once, that'd be preferred.
This is the code I use before the last command falters:
install.packages("dplyr")
install.packages("tidyverse")
install.packages("data.table")  # needed for fread() and rbindlist()
library(tidyverse)
library(dplyr)
library(data.table)
setwd("file_path")
files <- list.files(pattern = "\\.csv$")  # anchored pattern so only .csv files match
temp <- lapply(files, fread, sep = ",")
data <- rbindlist(temp)
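For context, here's a trimmed-down variant I also tried, leaning on fread's select argument to read only a few columns, since loading fewer columns should cut memory use. The column names below are just placeholders, not the real headers in my files:

library(data.table)

# Read only the columns I actually need; "col_a" and "col_b" are
# placeholder names, not the real headers in my files.
files <- list.files(pattern = "\\.csv$")
temp <- lapply(files, fread, sep = ",", select = c("col_a", "col_b"))

# use.names/fill make rbindlist tolerant of columns appearing in a
# different order (or missing) across the monthly files.
data <- rbindlist(temp, use.names = TRUE, fill = TRUE)

Not sure if that's the idiomatic way to do it, so corrections welcome.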
The error I get at the end says:
"2023-02-09 02:47:11.953 R[7686:174592] IMKClient Stall detected, please Report your user scenario attaching a spindump (or sysdiagnose) that captures the problem - (imkxpc_selectedRangeWithReply:) block performed very slowly (4.37 secs).
2023-02-09 02:47:11.953 R[7686:174592] IMKClient Stall detected, please Report your user scenario attaching a spindump (or sysdiagnose) that captures the problem - (imkxpc_deadKeyStateWithReply:) block performed very slowly (4.30 secs)"
So what would you do? Just work with less data, or is there a package for more efficient CSV reading, or something like that?
Thanks again! -Jon