Deep Learning in R: Interface Design


Currently, Keras and TensorFlow seem to be the best choice for running deep learning models in R.
However, they do not integrate with R as smoothly as dplyr and sparklyr do.

Using the tf$xxx style to call TensorFlow functions does not fit the R programming philosophy; the tf$ prefix feels redundant.
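To illustrate the point, here is a minimal sketch of the tf$xxx calling style, assuming the tensorflow R package is installed and TensorFlow is available; note how the tf$ prefix is repeated on every call:

```r
library(tensorflow)

# Every TensorFlow operation goes through the tf$ object,
# so the prefix appears on each line of the pipeline:
a <- tf$constant(1)
b <- tf$constant(2)
s <- tf$add(a, b)   # rather than an idiomatic `a + b` style
```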

From my point of view, dplyr's lazy evaluation could be extended to more scenarios that support a DAG mode, or a preview mode via a df %>% head() operation.
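As a concrete sketch of that lazy-evaluation style, with a database-backed table the dplyr verbs only build up a query; nothing is computed until a preview or collect is requested. (Here `con` is a hypothetical DBI connection, used purely for illustration.)

```r
library(dplyr)

# tbl() returns a lazy reference to a remote table; the verbs
# below are translated to SQL but not yet executed:
delayed <- tbl(con, "flights") %>%
  filter(dep_delay > 60) %>%
  select(carrier, dep_delay)

# head() previews a handful of rows (SQL LIMIT) without
# pulling the whole table into memory:
delayed %>% head()

# collect() materializes the full result as a local data frame:
local_df <- delayed %>% collect()
```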

Recently, I found the Gluon API in MXNet, which can convert an imperative (numeric) pipeline into a symbolic one, much as dplyr defers computation until tbl(sc, "some_table_in_database") %>% collect().

Still, dplyr is the greatest interface design I have ever seen, even compared with Gluon's hybridize().

Is anyone interested in this topic?


We've implemented a more native R interface to TensorFlow / deep learning along the lines you describe in the keras package. See also tfestimators.
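For readers unfamiliar with it, a minimal sketch of the keras R package's pipe-friendly style (assuming keras and a TensorFlow backend are installed) looks like this:

```r
library(keras)

# Layers are composed with the pipe, so no tf$ prefix is needed;
# the model is modified in place as it flows through the pipeline:
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10, activation = "softmax")

# Configuration follows the same piped style:
model %>% compile(
  optimizer = "rmsprop",
  loss = "categorical_crossentropy",
  metrics = c("accuracy")
)
```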