As @amarchin has pointed out, the testthat, covr and packrat packages are very helpful and the way to go for getting R working in a production environment.
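For instance, a minimal unit test with testthat might look like the sketch below (it assumes testthat is installed; `normalize` is a hypothetical helper, not from the original answer):

```r
library(testthat)  # assumes the testthat package is installed

# Hypothetical helper function we want to keep under test
normalize <- function(x) (x - min(x)) / (max(x) - min(x))

test_that("normalize maps values into [0, 1]", {
  v <- normalize(c(2, 4, 6))
  expect_equal(min(v), 0)
  expect_equal(max(v), 1)
})
```

Running such tests on every change (e.g. via `testthat::test_dir()` in a CI job) is what makes deployments to the production server far less nerve-wracking.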
However, I have previously faced some problems when trying to configure R on an external server. On older versions of certain operating systems, such as CentOS, I found that:
- The latest available pre-compiled R version was 2.15. Since these systems also ship older C compilers, compiling and installing a newer R version from source can be a nightmare.
- This also means that installing packages that must be compiled depends on the available C compiler; if it is not up to date, many newer packages fail to install.
I guess this is a highly unlikely situation nowadays, but it's the one I faced some 4 years ago. Just be patient and everything will be working fine within a few days.
On the other hand, my work primarily concerns processing neuroimaging (MRI, PET, etc.) studies. That is, the entities I analyze are 3D arrays of size 256x256x128. Keeping many of these arrays around as intermediate results in our pipeline has forced us to provision large amounts of RAM and to use task schedulers to prioritize some jobs over others, avoiding memory conflicts.
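To give a rough idea of the numbers involved: a single 256x256x128 volume stored as doubles already takes about 64 MB, so a pipeline holding dozens of intermediate volumes quickly reaches many gigabytes. A quick base-R check (the dimensions are the ones above; no extra packages needed):

```r
# One neuroimaging volume as a 3D array of doubles (8 bytes per element)
vol <- array(0, dim = c(256, 256, 128))

# 256 * 256 * 128 elements * 8 bytes ~= 64 MB per volume
print(object.size(vol), units = "MB")
```

Multiplying that by the number of intermediate arrays a pipeline keeps alive is a useful back-of-the-envelope estimate before sizing the server.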
In my experience, although R is seen as a strange programming language by some of my colleagues, mostly because it is not strongly typed, it allows rapid development of analysis solutions, and this is what makes all the work of setting up the production environment worthwhile.