Thanks.
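
For anyone following along, here is a minimal sketch of the Py4J pattern
Shivaram describes below. It assumes a py4j.GatewayServer is already
running on the JVM side (PySpark's launcher normally sets this up
internally); the Random object is just Py4J's stock example, not a Spark
internal:

    # Connect to a running JVM gateway and call into it from Python.
    from py4j.java_gateway import JavaGateway

    gateway = JavaGateway()               # connects to the default gateway port
    rng = gateway.jvm.java.util.Random()  # instantiate a plain JVM object
    print(rng.nextInt(10))                # call a JVM method; the result is copied back

A Haskell binding would need an analogous bridge to reach the same Java
API.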

On Tuesday, July 14, 2015, Shivaram Venkataraman <shiva...@eecs.berkeley.edu>
wrote:

> Both SparkR and the PySpark API call into the JVM Spark API (i.e.
> JavaSparkContext, JavaRDD, etc.). They use different mechanisms (Py4J vs.
> the R-Java bridge) to call into the JVM, based on the libraries available
> and the features supported in each language. So for Haskell, one would
> need to work out the best way to call the underlying Java API functions
> from Haskell and get results back.
>
> Thanks
> Shivaram
>
> On Mon, Jul 13, 2015 at 8:51 PM, Vasili I. Galchin <vigalc...@gmail.com>
> wrote:
>
>> Hello,
>>
>>      So far I think there are at least two ways (maybe more) to
>> interact with the Spark Core from other programming languages: the
>> PySpark API and the R API. From reading the code, the PySpark and R
>> approaches seem very disparate, with the latter using the R-Java
>> bridge. I am trying to decide which way to go for Haskell. I realize
>> that, as in any open software effort, the approaches have varied based
>> on history. Is there an intent to adopt one approach as the standard?
>> (Not trying to start a war :-) :-(.
>>
>> Vasili
>>
>> BTW, I guess the Java and Scala APIs are straightforward given that
>> both languages run natively on the JVM?
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> For additional commands, e-mail: dev-h...@spark.apache.org
>>
>>
>
