Thanks,
I tried that, and the result was the same.
I can still start a master from the spark-1.4.0-bin-hadoop2.4 pre-built
version, though.
I don't really know what else to show beyond the strace output I already
linked, so any hint would be welcome.
--
Henri Maxime Demoulin
2015-07-07 9:53 GMT
Can you try renaming the ~/.ivy2 directory to ~/.ivy2_backup, then building
spark 1.4.0 again and running it?
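In other words, something along these lines (the source-tree path is an assumption; `build/sbt` is the launcher bundled with the Spark source release):

```shell
# Move the local Ivy cache out of the way, in case a corrupted
# artifact is confusing the build (restore it later if needed).
mv ~/.ivy2 ~/.ivy2_backup

# Rebuild Spark 1.4.0 from the source tree (path assumed).
cd ~/spark-1.4.0
build/sbt clean assembly

# Try starting the standalone master again.
./sbin/start-master.sh
```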
Thanks
Best Regards
On Tue, Jul 7, 2015 at 6:27 PM, Max Demoulin wrote:
Yes, I do set $SPARK_MASTER_IP. I suspect a more "internal" issue, maybe
due to multiple Spark/HDFS instances having run successively on the same
machine?
--
Henri Maxime Demoulin
2015-07-07 4:10 GMT-04:00 Akhil Das :
Strange. What do you have in $SPARK_MASTER_IP? It may be that the master
cannot bind to the given IP, but again, that should show up in the logs.
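A quick way to check both things at once might look like this. The log file glob follows the standalone master's usual naming under `$SPARK_HOME/logs/`, but adjust it to your layout:

```shell
# What address is the master going to try to bind to?
echo "SPARK_MASTER_IP=${SPARK_MASTER_IP:-<unset>}"

# Start the master and scan its log for startup or bind errors.
./sbin/start-master.sh
grep -iE "starting spark master|bindexception|cannot assign" \
  logs/spark-*-org.apache.spark.deploy.master.Master-*.out
```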
Thanks
Best Regards
On Tue, Jul 7, 2015 at 12:54 AM, maxdml wrote:
Hi,
I've been compiling spark 1.4.0 with SBT, from the source tarball available
on the official website. I cannot run spark's master, even though I have
built and run several other instances of spark on the same machine (spark
1.3, master branch, pre-built 1.4, ...)
/starting org.apache.spark.deploy.
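For reference, the build described above can be reproduced roughly like this; the Hadoop profile and version flags are assumptions here, so match them to your cluster (`build/sbt` ships with the Spark 1.4.0 source tarball):

```shell
# Unpack the 1.4.0 source release.
tar xzf spark-1.4.0.tgz
cd spark-1.4.0

# Build with the bundled SBT launcher. The Hadoop profile below is
# an assumption; pick the one matching your environment.
build/sbt -Phadoop-2.4 -Dhadoop.version=2.4.0 assembly

# Start the standalone master; it writes its log under logs/ by default.
./sbin/start-master.sh
```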