Regarding my previous message, I forgot to mention that netstat needs
to be run as root (sudo netstat -plunt).
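
For example, to look for listeners in the 4000 range (a sketch: -plunt
is the Linux netstat syntax; on OS X, where BSD netstat lacks those
flags, lsof is the closest equivalent I know of):

    sudo netstat -plunt | grep ':40'     # Linux: PID/program per listening port
    sudo lsof -nP -iTCP -sTCP:LISTEN     # OS X: list all listening TCP sockets
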
Sorry for the noise.

On Fri, Mar 11, 2016 at 12:29 AM, Jakob Odersky <ja...@odersky.com> wrote:
> Some more diagnostics/suggestions:
>
> 1) are other services listening to ports in the 4000 range (run
> "netstat -plunt")? Maybe there is an issue with the error message
> itself.
>
> 2) are you sure the correct java version is used? java -version
>
> 3) can you revert all installation attempts you have done so far,
> including files installed by brew/macports or maven and try again?
>
> 4) are there any special firewall rules in place, forbidding
> connections on localhost? (see the example commands sketched below)
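>
> A rough sketch of these checks on OS X (pfctl and socketfilterfw are
> the stock firewall tools there, but whether either is in play on your
> machine is an assumption, and the brew line only applies if you
> installed Spark via brew):
>
>     sudo lsof -nP -iTCP -sTCP:LISTEN   # 1) who is listening on which TCP port
>     java -version                      # 2) should report 1.8.x
>     brew list | grep -i spark          # 3) any leftover brew-installed Spark?
>     sudo pfctl -s rules                # 4) pf firewall rules, if pf is enabled
>     /usr/libexec/ApplicationFirewall/socketfilterfw --getglobalstate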
>
> This is very weird behavior you're seeing. Spark is supposed to work
> out-of-the-box with ZERO configuration necessary for running a local
> shell. Again, my prime suspect is a previous, failed Spark
> installation messing up your config.
>
> On Thu, Mar 10, 2016 at 12:24 PM, Tristan Nixon <st...@memeticlabs.org> wrote:
>> If you type ‘whoami’ in the terminal and it responds with ‘root’, then
>> you’re the superuser.
>> However, as mentioned below, I don’t think it’s a relevant factor.
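>>
>> For example:
>>
>>     whoami    # prints your login name; ‘root’ means superuser
>>     id -u     # prints your numeric user id; 0 means superuser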
>>
>>> On Mar 10, 2016, at 12:02 PM, Aida Tefera <aida1.tef...@gmail.com> wrote:
>>>
>>> Hi Tristan,
>>>
>>> I'm afraid I wouldn't know whether I'm running it as super user.
>>>
>>> I have Java version 1.8.0_73 and Scala version 2.11.7.
>>>
>>> Sent from my iPhone
>>>
>>>> On 9 Mar 2016, at 21:58, Tristan Nixon <st...@memeticlabs.org> wrote:
>>>>
>>>> That’s very strange. I just unset my SPARK_HOME env param, downloaded
>>>> a fresh 1.6.0 tarball, unzipped it to a local dir (~/Downloads), and it
>>>> ran just fine - the driver port is some randomly generated large number.
>>>> So SPARK_HOME is definitely not needed to run this.
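>>>>
>>>> For reference, roughly what I did (the tarball name assumes the
>>>> standard 1.6.0 / Hadoop 2.6 binary package; adjust to whichever
>>>> one you downloaded):
>>>>
>>>>     unset SPARK_HOME
>>>>     cd ~/Downloads
>>>>     tar -xzf spark-1.6.0-bin-hadoop2.6.tgz
>>>>     cd spark-1.6.0-bin-hadoop2.6
>>>>     ./bin/spark-shell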
>>>>
>>>> Aida, you are not running this as the super-user, are you?  What versions 
>>>> of Java & Scala do you have installed?
>>>>
>>>>> On Mar 9, 2016, at 3:53 PM, Aida Tefera <aida1.tef...@gmail.com> wrote:
>>>>>
>>>>> Hi Jakob,
>>>>>
>>>>> Tried running the command env | grep SPARK; nothing comes back.
>>>>>
>>>>> Tried env | grep Spark; it comes back with PWD=/Users/aidatefera/Spark,
>>>>> which is the directory I created for Spark once I downloaded the tgz file.
>>>>>
>>>>> Tried running ./bin/spark-shell; comes back with the same error as below,
>>>>> i.e. could not bind to port 0 etc.
>>>>>
>>>>> Sent from my iPhone
>>>>>
>>>>>> On 9 Mar 2016, at 21:42, Jakob Odersky <ja...@odersky.com> wrote:
>>>>>>
>>>>>> As Tristan mentioned, it looks as though Spark is trying to bind on
>>>>>> port 0 and then 1 (which is not allowed). Could it be that some
>>>>>> environment variables from your previous installation attempts are
>>>>>> polluting your configuration?
>>>>>> What does running "env | grep SPARK" show you?
>>>>>>
>>>>>> Also, try running just "./bin/spark-shell" (without the --master
>>>>>> argument); maybe your shell is doing some funky stuff with the
>>>>>> brackets.
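>>>>>>
>>>>>> For example (the variable names below are just the usual suspects,
>>>>>> not a definitive list):
>>>>>>
>>>>>>     env | grep -i spark                 # any SPARK_* vars set at all?
>>>>>>     unset SPARK_HOME SPARK_LOCAL_IP SPARK_MASTER_IP
>>>>>>     ./bin/spark-shell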
>>>>>
>>>>
>>>
>>>
>>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
