path to under your user directory.

Running Spark on Windows should work :)
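A minimal sketch of that idea, assuming the path in question is Spark's spark.sql.warehouse.dir setting (the exact setting isn't named above) and using a placeholder folder under the current user's home directory:

:: assumes the suggestion refers to spark.sql.warehouse.dir; the target folder is just an example
cd C:\spark\spark-2.1.1-bin-hadoop2.7
bin\pyspark --conf "spark.sql.warehouse.dir=file:///C:/Users/%USERNAME%/spark-warehouse"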
From: Curtis Burkhalter
Sent: Wednesday, June 7, 2017 7:46:56 AM
To: Doc Dwarf
Cc: user@spark.apache.org
Subject: Re: problem initiating spark context with pyspark
Thanks Doc I saw this on another board yesterday so I've tried this by first going to the directory where I've stored the winutils.exe and then as an admin running the command that you suggested and I get this exception when checking the permissions:

C:\winutils\bin>winutils.exe ls -F C:\tmp\hive
Ha... it's a one-off. I run Spark on Ubuntu and Docker on Windows... I don't think Spark and Windows are best friends. 😀
On Jun 10, 2017 6:36 PM, "Gourav Sengupta" wrote:
seeing for the very first time someone try SPARK on Windows :)
On Thu, Jun 8, 2017 at 8:38 PM, Marco Mistroni wrote:
try this link
http://letstalkspark.blogspot.co.uk/2016/02/getting-started-with-spark-on-window-64.html
it helped me when i had similar problems with windows...
hth
On Wed, Jun 7, 2017 at 3:46 PM, Curtis Burkhalter <
curtisburkhal...@gmail.com> wrote:
Thanks Doc I saw this on another board yesterday so I've tried this by first going to the directory where I've stored the winutils.exe and then as an admin running the command that you suggested and I get this exception when checking the permissions:

C:\winutils\bin>winutils.exe ls -F C:\tmp\hive
Hi Curtis,

I believe on Windows the following command needs to be executed (you will need winutils installed):

D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
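For anyone following along, that chmod line usually sits inside a slightly longer setup; a sketch using the same D:\ layout as above (the HADOOP_HOME and mkdir steps are the customary extras, not something stated in this message):

:: adjust the drive and folder to wherever winutils.exe actually lives
set HADOOP_HOME=D:\winutils
set PATH=%PATH%;%HADOOP_HOME%\bin
mkdir D:\tmp\hive
D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive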
On 6 June 2017 at 09:45, Curtis Burkhalter wrote:
Hello all,
I'm new to Spark and I'm trying to interact with it using Pyspark. I'm
using the prebuilt version of spark v. 2.1.1 and when I go to the command
line and use the command 'bin\pyspark' I have initialization problems and
get the following message:
C:\spark\spark-2.1.1-bin-hadoop2.7> bin\pyspark
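Whatever the exact message turns out to be, a generic first check on Windows (a sketch, not specific to this setup) is to confirm that Java and winutils are reachable from the same prompt before retrying:

:: quick environment sanity check before launching pyspark again
where java
echo %HADOOP_HOME%
bin\pyspark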