Can you try renaming the ~/.ivy2 directory to ~/.ivy2_backup, then building
Spark 1.4.0 again and running it?
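
For example, something along these lines (the build profile flags and the
source path are assumptions; adjust them to your checkout and Hadoop version):

  mv ~/.ivy2 ~/.ivy2_backup
  cd /path/to/spark-1.4.0
  build/sbt -Phadoop-2.4 -Dhadoop.version=2.5.2 assembly
  ./sbin/start-master.sh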

Thanks
Best Regards

On Tue, Jul 7, 2015 at 6:27 PM, Max Demoulin <maxdemou...@gmail.com> wrote:

> Yes, I do set $SPARK_MASTER_IP. I suspect a more "internal" issue, maybe
> due to multiple Spark/HDFS instances having run successively on the same
> machine?
>
> --
> Henri Maxime Demoulin
>
> 2015-07-07 4:10 GMT-04:00 Akhil Das <ak...@sigmoidanalytics.com>:
>
>> Strange. What do you have in $SPARK_MASTER_IP? It may be that the master
>> is not able to bind to the given IP, but again, that should show up in the
>> logs.
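>>
>> For example, assuming it is set in conf/spark-env.sh and using the address
>> from your spark-class attempt below:
>>
>>   # conf/spark-env.sh
>>   export SPARK_MASTER_IP=155.99.144.31
>>
>>   # check that this address is actually assigned to a local interface
>>   ip addr | grep 155.99.144.31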
>>
>> Thanks
>> Best Regards
>>
>> On Tue, Jul 7, 2015 at 12:54 AM, maxdml <maxdemou...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I've been compiling Spark 1.4.0 with SBT, from the source tarball
>>> available on the official website. I cannot run Spark's master, even
>>> though I have built and run several other instances of Spark on the same
>>> machine (Spark 1.3, master branch, pre-built 1.4, ...).
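>>>
>>> For reference, the build command was roughly the following (the exact
>>> profile flags are an assumption based on the Hadoop 2.5.2 target
>>> mentioned below):
>>>
>>>   build/sbt -Phadoop-2.4 -Dhadoop.version=2.5.2 assembly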
>>>
>>> starting org.apache.spark.deploy.master.Master, logging to
>>> /mnt/spark-1.4.0/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-xx.out
>>> failed to launch org.apache.spark.deploy.master.Master:
>>> full log in
>>> /mnt/spark-1.4.0/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-xx.out
>>>
>>> But the log file is empty.
>>>
>>> After digging down into ./bin/spark-class, and finally trying to start
>>> the master directly with:
>>>
>>> ./bin/spark-class org.apache.spark.deploy.master.Master --host
>>> 155.99.144.31
>>>
>>> I still have the same result. Here is the strace output for this command:
>>>
>>> http://pastebin.com/bkJVncBm
>>>
>>> I'm using a 64-bit Xeon, CentOS 6.5, and Spark 1.4.0 compiled against
>>> Hadoop 2.5.2.
>>>
>>> Any idea? :-)
>>>
>>>
>>>
>>>
>>
>
