Hi,
what do you get running just 'sudo netstat'?
Also, what's the output of 'jps -mlv' when running your Spark application?
Can you post the contents of the files in $SPARK_HOME/conf?
Are there any special firewall rules in place, forbidding connections
on localhost?
Regarding the IP address chan
Hi Jakob, sorry for my late reply
I tried to run the below; it came back with "netstat: lunt: unknown or
uninstrumented protocol".
I also tried uninstalling version 1.6.0 and installing version 1.5.2 with Java 7
and Scala version 2.10.6; I got the same error messages.
Do you think it would be worth me
Regarding my previous message, I forgot to mention that netstat should be run
as root (sudo netstat -plunt).
sorry for the noise
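(Side note for OS X readers: -plunt is Linux netstat syntax, and the BSD
netstat that ships with OS X rejects those flags, which is what the
"unknown or uninstrumented protocol" message above is about. A rough macOS
equivalent for listing listening TCP ports, assuming lsof is available as it
is by default, would be:

sudo lsof -nP -iTCP -sTCP:LISTEN          # all listening TCP ports, with PIDs
sudo lsof -nP -iTCP:4040 -sTCP:LISTEN     # the port Spark's web UI defaults to
)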
On Fri, Mar 11, 2016 at 12:29 AM, Jakob Odersky wrote:
> Some more diagnostics/suggestions:
>
> 1) are other services listening to ports in the 4000 range (run
> "netstat -plunt")?
Some more diagnostics/suggestions:
1) are other services listening to ports in the 4000 range (run
"netstat -plunt")? Maybe there is an issue with the error message
itself.
2) are you sure the correct java version is used? java -version
3) can you revert all installation attempts you have done s
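A quick way to answer point 2 on a Mac (the java_home tool ships with OS X;
the output shown is only indicative):

java -version                 # the version actually on the PATH
echo $JAVA_HOME               # empty unless you set it yourself
/usr/libexec/java_home -V     # lists every installed JDK the system knows about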
If you type ‘whoami’ in the terminal and it responds with ‘root’, then you’re
the superuser.
However, as mentioned below, I don’t think it’s a relevant factor.
> On Mar 10, 2016, at 12:02 PM, Aida Tefera wrote:
>
> Hi Tristan,
>
> I'm afraid I wouldn't know whether I'm running it as super user
Hi Gaini, thanks for your response
Please see the below contents of the files in the conf directory:
1. docker.properties.template
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additiona
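For reference: a freshly unpacked 1.6.0 package should contain only *.template
files under conf/, which means every setting is still at its default. A stock
download typically lists something like this (exact contents may vary slightly
between packages):

ls conf/
docker.properties.template    log4j.properties.template      slaves.template
fairscheduler.xml.template    metrics.properties.template    spark-defaults.conf.template
spark-env.sh.template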
Hi Tristan,
I'm afraid I wouldn't know whether I'm running it as super user.
I have Java version 1.8.0_73 and Scala version 2.11.7
Sent from my iPhone
> On 9 Mar 2016, at 21:58, Tristan Nixon wrote:
>
> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a
> fresh 1.6.0
It really shouldn’t; if anything, running as superuser should ALLOW you to bind
to ports 0, 1, etc.
It seems very strange that it should even be trying to bind to these ports -
maybe a JVM issue?
I wonder if the old Apple JVM implementations could have used some different
native libraries for cor
It should just work with these steps. You don't need to configure much. As
mentioned, some settings on your machine are overriding default spark
settings.
Even running as super-user should not be a problem. It works just fine as
super-user as well.
Can you tell us what version of Java you are usi
That’s very strange. I just un-set my SPARK_HOME env param, downloaded a fresh
1.6.0 tarball,
unzipped it to a local dir (~/Downloads), and it ran just fine - the driver port
is some randomly generated large number.
So SPARK_HOME is definitely not needed to run this.
Aida, you are not running thi
Hi Jakob,
Tried running the command env|grep SPARK; nothing comes back
Tried env|grep Spark; Spark is the directory I created once I downloaded the
tgz file; it comes back with PWD=/Users/aidatefera/Spark
Tried running ./bin/spark-shell ; comes back with same error as below; i.e
could
Sorry had a typo in my previous message:
> try running just "/bin/spark-shell"
please remove the leading slash (/)
On Wed, Mar 9, 2016 at 1:39 PM, Aida Tefera wrote:
> Hi there, tried echo $SPARK_HOME but nothing comes back so I guess I need to
> set it. How would I do that?
>
> Thanks
>
> Sent
As Tristan mentioned, it looks as though Spark is trying to bind on
port 0 and then 1 (which is not allowed). Could it be that some
environment variables from your previous installation attempts are
polluting your configuration?
What does running "env | grep SPARK" show you?
Also, try running just
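A sketch of that cleanup in a bash shell (the variable names below are the
usual suspects, not an exhaustive list):

env | grep -i spark                                # anything left over from earlier attempts?
unset SPARK_HOME SPARK_LOCAL_IP SPARK_MASTER_IP    # clear them for this session
grep -in spark ~/.bash_profile ~/.bashrc 2>/dev/null   # then remove any exports found here and open a new shell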
Hi Tristan, my apologies, I meant to write Spark and not Scala
I feel a bit lost at the moment...
Perhaps I have missed steps that are implicit to more experienced people
Apart from downloading Spark and then following Jakob's steps:
1. curl http://apache.arvixe.com/spark/spark-1.6.0/spark-1.6.
SPARK_HOME and SCALA_HOME are different. I was just wondering whether spark is
looking in a different dir for the config files than where you’re running it.
If you have not set SPARK_HOME, it should look in the current directory for the
/conf dir.
The defaults should be relatively safe, I’ve be
I don't think I set the SCALA_HOME environment variable
Also, I'm unsure whether or not I should launch the scripts, or whether it
defaults to a single machine (localhost)
Sent from my iPhone
> On 9 Mar 2016, at 19:59, Tristan Nixon wrote:
>
> Also, do you have the SPARK_HOME environment variable set in you
Also, do you have the SPARK_HOME environment variable set in your shell, and if
so what is it set to?
> On Mar 9, 2016, at 1:53 PM, Tristan Nixon wrote:
>
> There should be a /conf sub-directory wherever you installed spark, which
> contains several configuration files.
> I believe that the tw
There should be a /conf sub-directory wherever you installed spark, which
contains several configuration files.
I believe that the two that you should look at are
spark-defaults.conf
spark-env.sh
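In a stock download both of those exist only as .template files and are
ignored until copied into place. A minimal sketch of activating them (the
example values are purely illustrative):

cd conf/
cp spark-defaults.conf.template spark-defaults.conf
cp spark-env.sh.template spark-env.sh
# spark-defaults.conf takes "key value" lines, e.g.:
#   spark.master   local[2]
# spark-env.sh is sourced as a shell script, e.g.:
#   SPARK_LOCAL_IP=127.0.0.1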
> On Mar 9, 2016, at 1:45 PM, Aida Tefera wrote:
>
> Hi Tristan, thanks for your message
>
> When
Hi Tristan, thanks for your message
When I look at spark-defaults.conf.template, it shows an example
(spark://master:7077) where the port is 7077
When you say look to the conf scripts, what do you mean?
Sent from my iPhone
> On 9 Mar 2016, at 19:32, Tristan Nixon wrote:
>
> Yeah, accor
Yeah, according to the standalone documentation
http://spark.apache.org/docs/latest/spark-standalone.html
the default port should be 7077, which means that something must be overriding
this on your installation - look to the conf scripts!
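As a concrete sketch of those standalone defaults (the hostname is whatever
your machine reports; nothing here is specific to this installation):

./sbin/start-master.sh                                # master binds to port 7077 by default
./bin/spark-shell --master spark://$(hostname):7077   # point the shell at it
./sbin/stop-master.sh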
> On Mar 9, 2016, at 1:26 PM, Tristan Nixon wrote:
>
>
Looks like it’s trying to bind on port 0, then 1.
Often the low-numbered ports are restricted to system processes and
“established” servers (web, ssh, etc.) and
so user programs are prevented from binding on them. The default should be to
run on a high-numbered port like 8080 or such.
What do yo
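If it really is the driver port failing to bind, one crude way to test that
theory is to pin it to a known high port and see whether the error changes
(7777 below is an arbitrary choice):

./bin/spark-shell --master local[2] --conf spark.driver.port=7777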
Hi Jakob,
Thanks for your suggestion. I downloaded a pre-built version with Hadoop and
followed your steps
I posted the result on the forum thread, not sure if you can see it?
I was just wondering whether this means it has been successfully installed as
there are a number of warning/error mess
Hi everyone, thanks for all your support
I went with your suggestion Cody/Jakob and downloaded a pre-built version
with Hadoop this time and I think I am finally making some progress :)
ukdrfs01:spark-1.6.0-bin-hadoop2.6 aidatefera$ ./bin/spark-shell --master
local[2]
log4j:WARN No appenders cou
> On 8 Mar 2016, at 18:06, Aida wrote:
>
> Detected Maven Version: 3.0.3 is not in the allowed range 3.3.3.
I'd look at that error message and fix it
I've had some issues myself with the user-provided-Hadoop version.
If you just want to get started, I would recommend downloading
Spark (pre-built, with any of the hadoop versions) as Cody suggested.
A simple step-by-step guide:
1. curl http://apache.arvixe.com/spark/spark-1.6.0/spark-1.6.
http://spark.apache.org/downloads.html
Make sure you select a "Choose a package type:" option that says pre-built
In my case, spark-1.6.0-bin-hadoop2.4.tgz
bash-3.2$ cd ~/Downloads/
bash-3.2$ tar -xzvf spark-1.6.0-bin-hadoop2.4.tgz
bash-3.2$ cd spark-1.6.0-bin-hadoop2.4/
bash-3.2$ ./bin/spa
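Once the shell comes up, a quick smoke test is to run one of the bundled
examples (SparkPi ships with every pre-built package, so nothing extra is
assumed here):

bash-3.2$ ./bin/run-example SparkPi 10
# should print a line like "Pi is roughly 3.14..." amid the INFO logging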
Ok, once I downloaded the pre-built version, I created a directory for it and
named it Spark
When I try ./bin/start-all.sh
It comes back with: no such file or directory
When I try ./bin/spark-shell --master local[2]
I get: no such file or directory
Failed to find spark assembly, you need to bu
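Two notes on the messages above, as a guess at what is going on: the start
scripts live under sbin/, not bin/, so it would be ./sbin/start-all.sh; and
"Failed to find spark assembly" usually means the unpacked directory is the
source release rather than a pre-built one. With a pre-built package the
commands would look roughly like this (the directory name is whatever the
tarball unpacks to):

cd ~/Spark/spark-1.6.0-bin-hadoop2.6   # path guessed from the earlier messages
./bin/spark-shell --master local[2]    # works directly from the unpacked dir
./sbin/start-all.sh                    # note sbin/; only needed for a standalone cluster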
That's what I'm saying: there is no "installing" necessary for
pre-built packages. Just unpack it and change directory into it.
What happens when you do
./bin/spark-shell --master local[2]
or
./bin/start-all.sh
On Tue, Mar 8, 2016 at 3:45 PM, Aida Tefera wrote:
> Hi Cody, thanks for your r
Hi Cody, thanks for your reply
I tried "sbt/sbt clean assembly" in the Terminal; somehow I still end up with
errors.
I have looked at the below links; they don't give much detail on how to install
it before executing "./sbin/start-master.sh"
Thanks,
Aida
Sent from my iPhone
> On 8 Mar 2016, at
You said you downloaded a prebuilt version.
You shouldn't have to mess with Maven or building Spark at all. All
you need is a JVM, which it looks like you already have installed.
You should be able to follow the instructions at
http://spark.apache.org/docs/latest/
and
http://spark.apache.org/
Hi Aida,
The installation detected Maven version 3.0.3. Update to 3.3.3 and
try again.
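On a Mac, one way to get a Maven in that range (assuming Homebrew is already
installed; versions will differ over time) is:

brew install maven
mvn -version     # should now report 3.3.x or newer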
On 08/Mar/2016 14:06, "Aida" wrote:
> Hi all,
>
> Thanks everyone for your responses; really appreciate it.
>
> Eduardo - I tried your suggestions but ran into some issues, please see
> below:
>
> uk
Tried sbt/sbt package; it seemed to run fine until it didn't. I was wondering
whether the below error has to do with my JVM version. Any thoughts? Thanks
ukdrfs01:~ aidatefera$ cd Spark
ukdrfs01:Spark aidatefera$ cd spark-1.6.0
ukdrfs01:spark-1.6.0 aidatefera$ sbt/sbt package
NOTE: The sbt/sbt script h
Hi all,
Thanks everyone for your responses; really appreciate it.
Eduardo - I tried your suggestions but ran into some issues, please see
below:
ukdrfs01:Spark aidatefera$ cd spark-1.6.0
ukdrfs01:spark-1.6.0 aidatefera$ build/mvn -DskipTests clean package
Using `mvn` from path: /usr/bin/mvn
Java
Installing Spark on a Mac is similar to how you install it on Linux.
I use a Mac and have written a blog post on how to install Spark; here is the
link: http://vishnuviswanath.com/spark_start.html
Hope this helps.
On Fri, Mar 4, 2016 at 2:29 PM, Simon Hafner wrote:
> I'd try `brew install spark` or `ap
Hi Aida
Run only "build/mvn -DskipTests clean package"
BR
Eduardo Costa Alfaia
Ph.D. Student in Telecommunications Engineering
Università degli Studi di Brescia
Tel: +39 3209333018
On 3/4/16, 16:18, "Aida" wrote:
>Hi all,
>
>I am a complete novice and was wondering whether anyone would
I'd try `brew install spark` or `apache-spark` and see where that gets
you. https://github.com/Homebrew/homebrew
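For what it's worth, the Homebrew formula is apache-spark; a sketch of that
route (what the formula installs can change between releases):

brew install apache-spark
spark-shell --master local[2]   # Homebrew links spark-shell onto the PATH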
2016-03-04 21:18 GMT+01:00 Aida :
> Hi all,
>
> I am a complete novice and was wondering whether anyone would be willing to
> provide me with a step by step guide on how to install Spar