> (http://apache-spark-developers-list.1001551.n3.nabble.com/PySpark-Driver-from-Jython-td7142.html),
> but it seems that no one has really done this with Spark.
> It looks like the performance gain from using Jython can be huge - you
> wouldn't need to spawn PythonWorkers; all the code would be executed
> inside the SparkExecutor JVM, using Python code compiled to Java
> bytecode. Do you think that's possible to achieve? Do you see any
> obvious obstacles? Of course, Jython doesn't have C extensions, but if
> one doesn't need them, then it should fit here nicely.
>
> I'm willing to try to marry Spark with Jython and see how it goes.
>
> What do you think about this?
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/How-to-speed-PySpark-
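For context on the overhead the proposal above wants to avoid: CPython-based PySpark pickles closures and data back and forth between the JVM and external Python worker processes. A minimal sketch of that serialization round-trip, using plain `pickle` only (Spark's real protocol adds framing, batching, and CloudPickle for closures):

```python
import pickle

# PySpark ships rows and functions between the JVM and Python workers as
# pickled bytes. This sketch only illustrates the conceptual round-trip;
# it is not Spark's actual wire protocol.
def square(x):
    return x * x

rows = list(range(5))
payload = pickle.dumps(rows)                       # JVM -> Python worker
result = [square(x) for x in pickle.loads(payload)]
back = pickle.dumps(result)                        # Python worker -> JVM
print(pickle.loads(back))                          # [0, 1, 4, 9, 16]
```

Running the mapping inside the JVM via Jython would remove both the worker processes and this serialization step, which is where the hoped-for speedup comes from.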