Hello,

I want to use R code as part of a Spark application (the same way I would with
Scala/Python). I want to be able to run an R function as a map over a big
Spark DataFrame loaded from a Parquet file.
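
For concreteness, this is roughly the kind of thing I am hoping to write. I
sketched it with SparkR's dapply, which as I understand it applies an R
function to each partition of a Spark DataFrame, but I am not sure that is
the right tool; the Parquet path, column name, and output schema below are
just placeholders:

library(SparkR)
sparkR.session()

# Load the big DataFrame from Parquet (path is a placeholder)
df <- read.df("/data/events.parquet", source = "parquet")

# Apply an R function to each partition; the function receives
# and returns an R data.frame (column name/schema are made up)
out_schema <- structType(structField("value", "double"))
result <- dapply(df, function(part) {
  data.frame(value = part$value * 2)
}, out_schema)

head(result)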
Is this even possible, or is the only way to use R through RStudio's
orchestration of our Spark cluster?

Thanks for the help!

Gilad
