Cannot import large Excel file into RStudio

I am working on my Google Data Analytics Certificate, and this is part of my capstone project.

With a few online searches, I was able to figure out how to massage the data.
I tested it using a small Excel file and it gave me what I needed.
Now, when I try to import the actual Excel file (131 MB), my local RStudio just crashes. I tried the free RStudio Cloud, but the RAM usage just explodes. Any suggestions?

From my test with the small data set, I have figured out a solution, but I cannot get the actual data (Excel) into it.

Thanks, everyone, for your help.

Try exporting the Excel file to a TSV file, then import the TSV using read_tsv().
If that fixes the problem, then we can think about how to handle the Excel file.
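A minimal sketch of that import, assuming the export is saved as capstone.tsv in your working directory (the file name is just a placeholder):

```r
# readr ships with the tidyverse; install.packages("readr") if needed
library(readr)

# Read the tab-separated export; read_tsv() is generally faster and
# lighter on memory than pulling the same data out of an .xlsx file
capstone <- read_tsv("capstone.tsv")

# Quick sanity checks: did the dimensions and column types come through?
dim(capstone)
spec(capstone)
```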
EDIT: Sometimes the problem is caused by a large number of blank rows in the Excel file (maybe tens of thousands). Check that this is not the case.

I do have a large number of blank cells, but I cannot delete those rows because the other columns and the data within them are important for the analysis.

The potential issue that was hinted at (I believe) is one where, due to some stray data being present (possibly invisibly) in a faraway cell, the sheet is considered much bigger than you actually intend it to be. Simply having missing data in your table is not a problem per se.

You can try copying and pasting the data (only the cells you want; don't just Ctrl+A, as that might carry the issue over too) into a new Excel file to see if it works.
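If the copy/paste is fiddly, you can also tell readxl to read only an explicit cell range, so any stray data far away on the sheet is ignored. A rough sketch (the file name, sheet, and range are placeholders; adjust them to your data):

```r
library(readxl)

# Reading an explicit range means readxl never touches cells outside it,
# even if a stray value has inflated the sheet's "used range"
df <- read_excel("capstone.xlsx", sheet = 1, range = "A1:H50000")
```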

The other route you were given was to try to isolate whether the issue is with reading Excel files in general or with your data specifically; hence the suggestion to convert your data to a different file type (TSV, or CSV is fine too; both can be produced by exporting from Excel).
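For reference, the CSV route is just as simple once the file is exported (again, the file name is a placeholder):

```r
library(readr)

# Blank Excel cells come through as empty fields and are parsed as NA,
# so missing data in the table is preserved rather than breaking the read
df <- read_csv("capstone.csv")
summary(df)
```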

I converted it to CSV and it worked fine... still a rookie :blush:
