Hello

 I want to work with a single Spark context from both Python and Scala. Is
that possible?

For example, could a context be shared between a running ./bin/pyspark and
./bin/spark-shell?


Cheers,

Leonid
