Did you install and configure the proper Spark kernel (SparkMagic) on
your JupyterLab or JupyterHub? See
https://github.com/jupyter/jupyter/wiki/Jupyter-kernels for more info.
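Alternatively, if no kernel injects `sc` for you, the context can be built by hand at the top of the notebook. A minimal sketch, assuming PySpark is importable from the notebook's Python environment and that a local master is acceptable for testing (swap in your cluster URL otherwise):

```python
# Minimal sketch: create a SparkSession/SparkContext manually in a notebook
# cell when no kernel (e.g. SparkMagic) provides `sc` automatically.
# Assumes pyspark is installed in the notebook's Python environment.
try:
    from pyspark.sql import SparkSession
except ImportError:          # pyspark not on this interpreter's path
    SparkSession = None

if SparkSession is not None:
    spark = (SparkSession.builder
             .master("local[*]")   # local mode; replace with your cluster URL
             .appName("notebook")
             .getOrCreate())
    sc = spark.sparkContext        # the `sc` the notebook expected
    print(sc.master)
```

After this cell runs, `sc.master` returns the configured master URL instead of raising a NameError.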
On 1/5/22 4:01 AM, 流年以东 wrote:
When using PySpark, there is no Spark context when Jupyter opens, and
entering sc.master shows that sc is not defined. We want to initialize
the Spark context with a script. This is the error we see.
Hope to receive your reply.
------------------------------------------------------------------------
Sent from my iPhone
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org