How to handle high-frequency data volume

I wrote a web application for data collection, but it can only hold a few days of data; after that, thousands of rows are missing.

I am using saveRDS() and invalidate() to save a forex price scraped from a website every second. Is there a more efficient way to handle real-time, high-frequency trading data?

A database is probably the solution you are after. Depending on the amount of data, SQLite may be able to accommodate this; if not, then PostgreSQL.
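A minimal sketch of appending each scraped price into SQLite from R, using the DBI and RSQLite packages; the table name, column names and the example values are just placeholders for illustration:

```r
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), "prices.sqlite")

# create the live table once if it does not exist yet
dbExecute(con, "
  CREATE TABLE IF NOT EXISTS ticks_live (
    ts   TEXT,
    pair TEXT,
    bid  REAL,
    ask  REAL
  )")

# inside the scraping loop / reactive: append one row per second
tick <- data.frame(ts = format(Sys.time()), pair = "EURUSD",
                   bid = 1.0843, ask = 1.0845)
dbAppendTable(con, "ticks_live", tick)

dbDisconnect(con)
```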
Given one write every second, this works out to roughly 31.5M rows per year. How do you then want to use the data? Do you want to read it all into the R session to analyze? If so, R's copy-on-write semantics may be a showstopper, and you may have to reach for C++/Fortran with pointer semantics instead.
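Rather than pulling all of those rows into R, you can let the database do the filtering and only fetch the slice you need. A sketch, reusing the assumed table and columns from above:

```r
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), "prices.sqlite")

# only one day's worth of ticks comes back as a data frame
day <- dbGetQuery(con, "
  SELECT ts, pair, bid, ask
  FROM   ticks_live
  WHERE  ts >= '2024-01-02' AND ts < '2024-01-03'")

dbDisconnect(con)
```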
Just thinking quickly: you will need an index on the table in the DB, which will slow down writes but speed up reads significantly.
Maybe use two tables in the DB: one to be read from, indexed but only updated every night, and a second live, non-indexed table where you write the scraped data (see the sketch below).
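A rough sketch of that two-table idea; the table and index names and the nightly move are assumptions, not a tested implementation:

```r
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), "prices.sqlite")

# indexed table that readers query; same layout as ticks_live
dbExecute(con, "
  CREATE TABLE IF NOT EXISTS ticks_hist (
    ts   TEXT,
    pair TEXT,
    bid  REAL,
    ask  REAL
  )")
dbExecute(con, "CREATE INDEX IF NOT EXISTS idx_hist_ts ON ticks_hist (ts)")

# nightly job: move everything accumulated so far into the indexed table,
# then empty the live table so writes stay fast
dbWithTransaction(con, {
  dbExecute(con, "INSERT INTO ticks_hist SELECT * FROM ticks_live")
  dbExecute(con, "DELETE FROM ticks_live")
})

dbDisconnect(con)
```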
SQLite: portable, easy to set up, limited functionality; PostgreSQL: the opposite of SQLite.

rgds,
Peter