Hi gurus,
I have knowledge of Java and Scala, and a good working knowledge of Spark, Spark SQL, 
and functional programming in Spark with Scala.
I have recently started using Python with Spark (PySpark).
I am wondering how much Python proficiency is needed to be proficient in PySpark. 
I know the answer may be "very good knowledge", but in practice, how much is good 
enough? I can write and run Python programs in an IDE like PyCharm, much the same 
way I work with Scala. Is expert knowledge of Python a prerequisite for PySpark? 
I also know Pandas and am familiar with plotting libraries like matplotlib. 
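For context on my current level: most of the Python I find myself writing for PySpark boils down to a small set of constructs (lambdas, comprehensions, small named functions), as in this hypothetical snippet on plain lists; in PySpark the same patterns would sit inside filter/map/UDF calls:

```python
# Plain-Python versions of patterns commonly passed to PySpark APIs.
# The data here is made up purely for illustration.

rows = [("alice", 34), ("bob", 45), ("carol", 29)]

# A lambda, as one might pass to rdd.filter(...)
adults = list(filter(lambda r: r[1] >= 30, rows))

# A list comprehension, as used when building up expressions
names = [name.upper() for name, _ in rows]

# A small named function, the kind one might wrap in a UDF
def age_bucket(age: int) -> str:
    return "30+" if age >= 30 else "under 30"

buckets = [age_bucket(age) for _, age in rows]
print(adults, names, buckets)
```

Is being comfortable at roughly this level enough, or does idiomatic PySpark demand much deeper Python?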
Warmest
Ashok