The following R code reads a small JSON file without trouble, but when I apply it to huge JSON data (3 GB, 551,367 records, 341 features), the reading process runs on and never finishes. My JSON data file is in the proper format required by the stream_in() function. Here is my R code:
library(jsonlite)

# Read line by line; pagesize breaks the records into chunks
main_sample = jsonlite::stream_in(file("sample.json"), pagesize = 100000)

# Flatten nested data frames into regular columns
data = jsonlite::flatten(main_sample)

# Columns of class list are converted into character
i <- sapply(data, is.list)
data[i] <- lapply(data[i], as.character)

# Write the JSON data out in CSV format
write.table(data, file = "data.csv", sep = ",", row.names = FALSE, col.names = TRUE)
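One variant I have been considering (a sketch only, not yet verified on the full 3 GB file; the file names are the same placeholders as above): stream_in() accepts a handler callback that is called once per chunk of pagesize records, so each chunk can be flattened and appended to the CSV immediately instead of all 551,367 records being accumulated in memory first:

library(jsonlite)

first <- TRUE  # tracks whether the CSV header still has to be written

jsonlite::stream_in(file("sample.json"), pagesize = 10000,
                    handler = function(chunk) {
  chunk <- jsonlite::flatten(chunk)   # flatten nested data frames
  i <- sapply(chunk, is.list)         # list columns -> character
  chunk[i] <- lapply(chunk[i], as.character)
  # Append each processed chunk to the CSV; only the first chunk
  # writes the header, later chunks append rows without one.
  write.table(chunk, file = "data.csv", sep = ",", row.names = FALSE,
              col.names = first, append = !first)
  first <<- FALSE
})

This sketch assumes every record flattens to the same set of columns; if the 341 features vary between records, the appended chunks would not line up under one header.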
I am looking for a solution that reads this JSON file in a reasonable time. Even with pagesize = 100000 the process never ends. I have 8 GB of RAM. How can I read such a huge JSON file in R?
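One assumption worth double-checking (I have only spot-checked the file myself): stream_in() expects newline-delimited JSON, one complete object per line. If the 3 GB file were instead a single JSON array on one enormous line, the streaming reader would effectively have to pull the whole file in at once, which would look exactly like a read that never ends. A quick way to peek at the start of the file without loading it:

# Show the first 200 characters of the file. Valid NDJSON starts with
# "{" and has one object per line; a leading "[" means a single JSON
# array, which stream_in() cannot consume line by line.
cat(readChar("sample.json", 200))

If it does turn out to be one big array, a shell tool such as jq can rewrite it as NDJSON first, e.g. jq -c '.[]' sample.json > sample_ndjson.json (assuming jq is installed).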