A small correction: when I typed it earlier, it is not RDDBackend, it is RBackend. Sorry.
I appreciate your reply.
Yes, you are right about putting the data in Parquet (or similar) and reading it from another app. I would rather use spark-jobserver or the IBM kernel to achieve the same when it is not SparkR, as they give more flexibility/scalability. Anyway, I have found a way to run R for my PoC from my existing app.
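For reference, a minimal sketch of what the job-server route looks like over its REST API, using Python's requests. The host, jar path, app name, and job class below are placeholders for illustration, not from a real deployment:

    import requests

    JOBSERVER = "http://localhost:8090"  # assumed job-server host/port

    # Upload the application jar under an app name (here "example").
    with open("target/example-job.jar", "rb") as jar:
        requests.post(JOBSERVER + "/jars/example", data=jar).raise_for_status()

    # Submit a job synchronously; the POST body is a Typesafe-config string
    # carrying the job's input.
    resp = requests.post(
        JOBSERVER + "/jobs",
        params={"appName": "example",
                "classPath": "com.example.WordCount",  # hypothetical job class
                "sync": "true"},
        data="input.string = a b c a b")
    print(resp.json())  # job result (or an error report) as JSON

The same flow works from any HTTP client, which is what makes job-server convenient for sharing one long-lived SparkContext across applications.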
Hi, hari,
I don't think job-server can work with SparkR (or PySpark). It seems technically possible, but it needs support from both job-server and SparkR (or PySpark), which doesn't exist yet.
But there may be some indirect ways of sharing RDDs between SparkR and another application. For example, SparkR could persist the data to a shared format such as Parquet, and the other application could read it back from the same path.
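A minimal sketch of that hand-off, assuming the SparkR side has already written the data with something like write.df(df, "/tmp/shared_df", source = "parquet") (the path is a placeholder), and a separate PySpark application loads it:

    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext(appName="read-sparkr-output")
    sqlContext = SQLContext(sc)

    # Parquet carries its own schema, so the DataFrame written by SparkR
    # comes back with the same columns and types.
    df = sqlContext.read.parquet("/tmp/shared_df")  # placeholder path
    df.show()

    sc.stop()

Note this shares the data, not the live RDD itself: both sides pay a write and a read, which is the trade-off versus something like job-server keeping a single shared SparkContext.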