sparklyr::sdf_copy_to Java heap error to Databricks

I'm currently running into an issue with sparklyr. I am connecting from my local Mac to a Databricks cluster.

When I call `sparklyr::sdf_copy_to` I get a Java heap error, which I believe is related to my driver node. I am able to copy a small subset of the data frame over to Databricks, but when I copy the full amount (only around 100,000 rows) I get the Java heap error.
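For reference, here is a minimal sketch of what I'm doing. The connection object `sc` and local data frame `df` are stand-ins for my actual setup:

```r
library(sparklyr)

# assumes an existing Databricks connection; details of my real connection omitted
sc <- spark_connect(method = "databricks")

# copying a small subset succeeds...
small_tbl <- sdf_copy_to(sc, head(df, 1000), name = "df_small", overwrite = TRUE)

# ...but copying the full ~100,000 rows fails with the Java heap error
# full_tbl <- sdf_copy_to(sc, df, name = "df_full", overwrite = TRUE)
```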

I tried reconfiguring my Spark config to increase the driver memory to 4g and then 8g, but I end up getting an "initiate Hive session" error.
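This is roughly the config change I attempted, using the standard sparklyr pattern for setting driver memory (the exact effect of this setting when connecting to Databricks may differ, so treat this as a sketch of my attempt rather than a known-good configuration):

```r
library(sparklyr)

# attempt: raise driver memory via spark_config() before connecting
conf <- spark_config()
conf["sparklyr.shell.driver-memory"] <- "8g"  # also tried "4g"

# connecting with this config is what produces the Hive session error for me
sc <- spark_connect(method = "databricks", config = conf)
```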

Any ideas on how to copy data frames over seamlessly?

Thanks,

Chris
