R and pqR together


The Pretty Quick R (pqR) project seems very interesting to me, in that it improves the performance of many functions and enables parallel computing.

To me, there are two areas where I hope to see improvement in R. First, single-core speed is not fast, and multiple cores are not supported natively. Second, out-of-memory data analysis is weak. I guess pqR is tackling the first issue; is it possible for R to borrow some ideas from it?
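(As an aside, and only a sketch: base R can already use multiple cores explicitly via the bundled `parallel` package, though not "naturally" in the sense of automatic parallelism, and the forking behind `mclapply()` is not available on Windows.)

```r
library(parallel)  # ships with base R

# A toy CPU-bound task: the mean of one column of a large matrix
slow_mean <- function(i, m) mean(m[, i])

m <- matrix(rnorm(1e6), ncol = 100)

# Serial version
serial <- lapply(seq_len(ncol(m)), slow_mean, m = m)

# Explicit multicore version; forks the R process, so on Windows
# mc.cores must stay at 1 (use parLapply() with a cluster there instead)
parallel_res <- mclapply(seq_len(ncol(m)), slow_mean, m = m, mc.cores = 2)

stopifnot(identical(unlist(serial), unlist(parallel_res)))
```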


There is a lot of history and philosophy packed into that question, Peter!

Is R an interactive, easily extensible functional programming language, capable of being goosed through Rcpp and kin, whose greatest application is exploratory data analysis and prototyping? (One view of the status quo.)

Alternatively, should it head in the direction of being a predominantly compiled language, with a sort of macro-level user extensibility, destined to be a great engine for Big Data Real-Time Streaming kinds of problems?

In a world that I will probably never see, the functional interpreted R will be translated into the functional compiled Haskell.

Why is this my vision? Imperative/procedural programmers often find implementing an R toolchain something of a PITA. Maybe that's why so many shops turn to Python, which shares its paradigm with the mainline languages.

If the world of data engineers were more richly populated by aficionados of functional programming languages like Haskell, they would adore R. And, in theory at least, R would adore them, because a program that compiles in Haskell is said to be "self-proving" mathematically and logically. (I take this on faith.)

I see a world where R{base} has a Hackage library ready to import, soon followed by the tidyverse and the other must-haves, and eventually the entire CRANsphere. Functional-to-functional.

And maybe I've got a winning PowerBall ticket in my pocket somewhere.


I know a little bit of functional programming, namely OCaml. In my experience, OCaml and ATS are faster than Haskell. I have never been a fan of OOP or of 0-based-indexing languages; they are just so counterintuitive mathematically. However, C/C++ has proven solid performance over time. I hope R can be compiled to C, like what Nim does.
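(To illustrate the indexing point with a trivial sketch: R's 1-based indexing matches the usual mathematical convention of writing a vector as x_1, ..., x_n.)

```r
x <- c(10, 20, 30)

x[1]          # the first element, i.e. the mathematical x_1
x[length(x)]  # the last element, x_n

# Compare 0-based languages, where the first element is x[0]
# and the last is x[n - 1]
stopifnot(x[1] == 10, x[length(x)] == 30)
```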


Peter, you are in the solid company of implementers, and I can't criticize you for wanting something that gives a 1:1 R:C-dialect translation.

I gave my programming virginity, such as it was, to C, right up to pointers, which made my primitive PDP-11-based desktop panic. I have a debt to K&R, 1st edition, that I can never repay.

But with concurrency, parallel processing, and cores galore, how many real-world tasks are still CPU-bound rather than I/O-bound?

Silicon is cheap and quick to manufacture. Wetware is expensive and slow to grow. We may not end up with the best of all possible worlds, but eventually the winner will take most, and I think your wishes are more in line with the probable future than mine.


Have you seen this project: https://tweag.github.io/HaskellR/ ?
Admittedly, it goes the other way and tries to use R from Haskell, but in the world where Rcpp exists, why not RHaskell as well?


Thanks! I'll take a look!

Not at all! See these examples of how to call Haskell from R: https://laustep.github.io/stlahblog/posts/FloatExpansionHaskell.html and https://laustep.github.io/stlahblog/posts/YoungTableaux.html, and an R package using Haskell functions: https://github.com/stla/skewtableaux. It is obviously not as seamless as Rcpp (think of calling C functions from R in pre-Rcpp days), but the method is definitely usable.

(I am a big fan of Haskell, it was a disappointment to learn that https://github.com/hadley/monads was a one-time experiment :wink: )


This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.