Million rows from R to SQL

I have previously used sqlSave() to write data from R into a SQL table; however, this time I have 3 million rows, and sqlSave() is taking ages to transfer the data. On the other hand, CSV files only allow about 1 million rows per file (the Excel row limit).

Is there any method to rapidly transfer data from R into SQL?

The data is in tabular form; let's say df has 20 columns and 3 million rows.

What DBMS do you use?

This might help you:
performance - How to load data quickly into R? - Stack Overflow

I am using SQL Server. I was able to connect to the database using dbConnect(), but I still cannot write millions of rows. sqlSave() is slow even on a 200 Mbps connection.
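
A commonly suggested faster route than RODBC::sqlSave() is DBI::dbWriteTable() through the odbc package, which sends rows as batched, parameterized inserts. A minimal sketch, assuming a SQL Server ODBC driver is installed; the driver string, server, database, and table names below are placeholders to adjust:

```r
library(DBI)
library(odbc)

# Connect with the odbc package; driver/server/database are placeholders.
con <- dbConnect(
  odbc::odbc(),
  Driver   = "ODBC Driver 17 for SQL Server",  # adjust to your installed driver
  Server   = "my-server",                      # placeholder
  Database = "my_database",                    # placeholder
  Trusted_Connection = "Yes"                   # or use UID/PWD for SQL authentication
)

# dbWriteTable() writes the whole data frame in batches, which is
# typically much faster than sqlSave()'s row-by-row inserts.
dbWriteTable(con, "df_target", df, overwrite = TRUE)

dbDisconnect(con)
```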

Try either of these (a rough sketch of both is below the source link):

1. Export the large data to a CSV file. Just be careful about how you organize your columns.

2. Work with the exported file from SQL.

Source:
https://www.stat.auckland.ac.nz/~yee/784/files13/ReadingInBigData
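
A hedged sketch of both options, assuming the data.table package for fast CSV writing and SQL Server's BULK INSERT as one way to do the SQL-side load. The file paths and table name are placeholders, `con` is an open DBI connection (see the sketch earlier in the thread), and the CSV path must be readable by the SQL Server machine:

```r
library(data.table)
library(DBI)

chunk_size <- 1e6
n <- nrow(df)
starts <- seq(1, n, by = chunk_size)

# Option 1: export the data in ~1-million-row CSV chunks with fwrite(),
# which is far faster than write.csv().
files <- character(length(starts))
for (i in seq_along(starts)) {
  rows     <- starts[i]:min(starts[i] + chunk_size - 1, n)
  files[i] <- sprintf("C:\\temp\\df_part_%02d.csv", i)  # placeholder path
  fwrite(df[rows, ], files[i])
}

# Option 2: load each file server-side with BULK INSERT, issued
# through DBI::dbExecute(). FIRSTROW = 2 skips the header row.
for (f in files) {
  dbExecute(con, sprintf(
    "BULK INSERT dbo.df_target FROM '%s'
     WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2)", f
  ))
}
```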

Please let me know if it helps.
