Hello Team!
Hope you are doing well.

I have downloaded Apache Spark (spark-3.1.1-bin-hadoop2.7), and I have also
downloaded the winutils file from GitHub.
Python version: Python 3.9.4
Java version: java version "1.8.0_291"
Java(TM) SE Runtime Environment (build 1.8.0_291-b10)
Java HotSpot(TM) 64-Bit Server VM (build 25.291-b10, mixed mode)
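
For reference, the version details above are the output of running these
commands in cmd:

    java -version
    python --version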

When I enter the command spark-shell in cmd, it gives me this error:
"'spark-shell' is not recognized as an internal or external command, operable
program or batch file."
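
If it helps with diagnosis, I can also run basic checks from the same cmd
window and share the output (where is the standard Windows command for
locating an executable on the PATH):

    echo %SPARK_HOME%
    where spark-shell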
I am sharing screenshots of my environment variables. Please help me; I am
stuck.
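
In case the screenshots do not come through, this is roughly how the
variables are set on my machine (the install paths below are only examples,
not my exact paths):

    set SPARK_HOME=C:\spark\spark-3.1.1-bin-hadoop2.7
    set HADOOP_HOME=C:\hadoop
    rem winutils.exe is placed in %HADOOP_HOME%\bin
    set PATH=%PATH%;%SPARK_HOME%\bin;%HADOOP_HOME%\bin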

I am looking forward to hearing from you.
Thanks & Regards