The problem is that, as far as I can see, these two processes do not share a
SparkContext, so there is no way to share the object between them, let alone
across languages.

You can, of course, write the data out from Java and read it back from Python.

In some hosted Spark products, you can access the same session from two
languages: register the DataFrame as a temp view in Java, then query it from
PySpark.


On Fri, Mar 26, 2021 at 8:14 AM Aditya Singh <aditya.singh9...@gmail.com>
wrote:

> Hi All,
>
> I am a newbie to Spark and am trying to pass a Java DataFrame to PySpark.
> The following link has details about what I am trying to do:
>
>
> https://stackoverflow.com/questions/66797382/creating-pysparks-spark-context-py4j-java-gateway-object
>
> Can someone please help me with this?
>
> Thanks,
>
