> ...your actual requirement is, but you will need to
> implement your own custom version of the PySpark API to get all the
> functionality you need and control on the JVM side.
>
>
> On 31/03/2021 06:49, Aditya Singh wrote:
>
> Thanks a lot Khalid for replying.
>
> I have one question though. The appro
> .getOrCreate()
>
> val df = spark
> .read
> .option("header", "True")
> .csv(myFile.toString)
> .collect()
>
> }
> object Py4JServerApp extends App {
>
>
> val server = new GatewayServer(SparkApp)
> server.start()
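For context, the Python side of such a gateway setup could look roughly like the sketch below: it connects to the GatewayServer started by Py4JServerApp and rebuilds PySpark's wrappers around the existing JVM objects. The getSparkSession() accessor on the entry point is an assumption; the Scala SparkApp object would need to expose its SparkSession in some such way.

    from py4j.java_gateway import JavaGateway, GatewayParameters
    from pyspark import SparkConf, SparkContext
    from pyspark.sql import SparkSession

    # Connect to the GatewayServer started by Py4JServerApp (Py4J's default
    # port is 25333; adjust if the Scala side uses a different one).
    gateway = JavaGateway(
        gateway_parameters=GatewayParameters(port=25333, auto_convert=True))

    # Assumed accessor: SparkApp would need something like
    #   def getSparkSession(): SparkSession = spark
    jspark = gateway.entry_point.getSparkSession()
    jsc = jspark.sparkContext()

    # Rebuild the Python wrappers around the existing JVM objects instead of
    # starting a new SparkContext.
    conf = SparkConf(_jvm=gateway.jvm, _jconf=jsc.getConf())
    sc = SparkContext(
        gateway=gateway,
        jsc=gateway.jvm.org.apache.spark.api.java.JavaSparkContext(jsc),
        conf=conf)
    spark = SparkSession(sc, jspark)

    spark.sql("SELECT 1 AS probe").show()

Whether this is enough depends on the Spark version and on how much of the PySpark API is needed afterwards, which is the caveat raised at the top of this thread.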
> ...register the DataFrame as a temp view in Java, then access it
> in Pyspark.
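A short sketch of that suggestion, assuming the Python spark object wraps the same JVM SparkSession as the Java side (for example via a gateway as sketched above); the view name trades is made up:

    # Java/Scala side, shown as a comment for context:
    #   df.createOrReplaceTempView("trades");

    # PySpark side: spark must wrap the same JVM SparkSession, because a
    # regular temp view is scoped to the session that registered it.
    py_df = spark.sql("SELECT * FROM trades")
    py_df.show()

    # Alternatively, a global temp view (createGlobalTempView on the Java
    # side) is visible to every session in the same application under the
    # global_temp database:
    spark.sql("SELECT * FROM global_temp.trades").show()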
>
>
> On Fri, Mar 26, 2021 at 8:14 AM Aditya Singh
> wrote:
>
>> Hi All,
>>
>> I am a newbie to Spark and trying to pass a Java DataFrame to PySpark.
>> The following link has details
Hi All,
I am a newbie to Spark and trying to pass a Java DataFrame to PySpark.
The following link has details about what I am trying to do:
https://stackoverflow.com/questions/66797382/creating-pysparks-spark-context-py4j-java-gateway-object
Can someone please help me with this?
Thanks,