Thanks.

On Tuesday, July 14, 2015, Shivaram Venkataraman wrote:
> Both SparkR and the PySpark API call into the JVM Spark API (e.g.
> JavaSparkContext, JavaRDD, etc.). They use different methods (Py4J vs. the
> R-Java bridge) to call into the JVM based on the libraries available /
> features supported in each language. So for Haskell, one would need to see
> what is the best way to call into the JVM.
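For concreteness, the Py4J side of this looks roughly like the sketch below. It is not Spark-specific; it assumes a JVM with a Py4J GatewayServer already listening on the default port (PySpark arranges this automatically when it launches the driver JVM):

```python
from py4j.java_gateway import JavaGateway

# Connect to a JVM that is already running a py4j GatewayServer
# (PySpark starts one for the driver JVM behind the scenes).
gateway = JavaGateway()

# Once connected, JVM classes can be instantiated and called almost
# as if they were Python objects:
random = gateway.jvm.java.util.Random()
print(random.nextInt(100))  # runs java.util.Random.nextInt on the JVM
```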
Hello,

So far I think there are at least two ways (maybe more) to interact
with the Spark Core from various programming languages: the PySpark
API and the R API. From reading the code, it seems that the PySpark
approach and the R approach are quite disparate, with the latter using
the R-Java bridge. Vis-a-vis/regarding Haskell: which of these
approaches would be the better starting point?
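To see the first of those two ways in action, PySpark keeps its Py4J handle to the driver JVM in the private `_jvm` attribute of `SparkContext`. A minimal sketch (`_jvm` is an internal, unsupported attribute, used here only to illustrate the bridge):

```python
from pyspark import SparkContext

sc = SparkContext("local", "py4j-bridge-demo")

# sc._jvm is the py4j view of the driver JVM that PySpark itself uses
# to reach JavaSparkContext, JavaRDD and friends.
jlist = sc._jvm.java.util.ArrayList()
jlist.add("constructed on the JVM, called from Python")
print(jlist.toString())

sc.stop()
```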