I will try a fresh setup very soon.
Actually, I tried to compile Spark myself against Hadoop 2.5.2, but I
ran into the issue that I mentioned in this thread:
http://apache-spark-user-list.1001560.n3.nabble.com/Master-doesn-t-start-no-logs-td23651.html
I was wondering if maybe serialization/deserialization could be involved.
Yes,
Thank you.
--
Henri Maxime Demoulin
2015-07-12 2:53 GMT-04:00 Akhil Das :
> Did you try setting the HADOOP_CONF_DIR?
>
> Thanks
> Best Regards
>
> On Sat, Jul 11, 2015 at 3:17 AM, maxdml wrote:
>
>> Also, it's worth noting that I'm using the prebuilt version for hadoop 2.4
>> and higher f
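For reference, a minimal sketch of what setting HADOOP_CONF_DIR looks like before launching Spark; the /etc/hadoop/conf path is an assumption, use wherever your core-site.xml and hdfs-site.xml actually live:

```shell
# Point Spark at the Hadoop client configuration directory
# (assumed location; adjust to your installation).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Either export this in conf/spark-env.sh so the daemons pick it up,
# or set it in the shell before starting them:
./sbin/start-master.sh
./sbin/start-slaves.sh
```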
9:53 GMT-04:00 Akhil Das :
> Can you try renaming the ~/.ivy2 directory to ~/.ivy2_backup, then
> building Spark 1.4.0 again and running it?
>
> Thanks
> Best Regards
>
> On Tue, Jul 7, 2015 at 6:27 PM, Max Demoulin
> wrote:
>
>> Yes, I do set $SPARK_MASTER_IP. I suspect a more "internal" issue, maybe
>> due to multiple spark/hdfs instances having successively run on the same
>> machine?
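The suggestion above amounts to moving the Ivy dependency cache aside so the build re-resolves everything from scratch; a sketch, with the Spark source path as a placeholder:

```shell
# Move the Ivy cache out of the way so dependency resolution starts clean.
mv ~/.ivy2 ~/.ivy2_backup

# Rebuild Spark 1.4.0 from its source tree (placeholder path; the sbt
# launcher shipped in build/ downloads what it needs).
cd /path/to/spark-1.4.0
./build/sbt -Dhadoop.version=2.5.2 assembly
```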
Yes, I do set $SPARK_MASTER_IP. I suspect a more "internal" issue, maybe
due to multiple Spark/HDFS instances having successively run on the same
machine?
--
Henri Maxime Demoulin
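For context, SPARK_MASTER_IP is typically set in conf/spark-env.sh, which the start scripts source on each node; a sketch with a placeholder address:

```shell
# conf/spark-env.sh -- sourced by the start scripts on each node.
# Bind the master to a specific address (placeholder IP and default port).
export SPARK_MASTER_IP=192.168.1.10
export SPARK_MASTER_PORT=7077
```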
2015-07-07 4:10 GMT-04:00 Akhil Das :
> Strange. What do you have in $SPARK_MASTER_IP? It may happen that it is
>
> On 30 Jun 2015 01:50, "Max Demoulin" wrote:
>
>> The underlying issue is a filesystem corruption on the workers.
>>
>> In the case where I use HDFS, with a sufficient number of replicas, would
>> Spark try to launch a task on another node where the block replica is
>> present?
The underlying issue is a filesystem corruption on the workers.
In the case where I use HDFS, with a sufficient number of replicas, would
Spark try to launch a task on another node where a block replica is
present?
Thanks :-)
--
Henri Maxime Demoulin
2015-06-29 9:10 GMT-04:00 ayan guha :
> No
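For diagnosing the replica question above, one way to see where the block replicas of a file actually live (and therefore which nodes Spark could schedule on) is hdfs fsck; the file path is a placeholder:

```shell
# List each block of the file and the datanodes holding its replicas
# (placeholder path).
hdfs fsck /user/maxime/input.txt -files -blocks -locations
```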
Can I actually include another version of Guava in the classpath when
launching the example through spark-submit?
--
Henri Maxime Demoulin
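On shipping a different Guava version: a hedged sketch using spark-submit with the user-classpath-first settings (these configuration keys exist as experimental options in the Spark 1.3/1.4 line; the jar paths and class name are placeholders):

```shell
# Ship an alternative Guava and ask Spark to prefer the user classpath
# over its own (placeholder jar paths and application class).
./bin/spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars /path/to/guava-14.0.1.jar \
  --class org.example.MyApp \
  myapp.jar
```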
2015-06-25 10:57 GMT-04:00 Max Demoulin :
> I see, thank you!
>
> --
> Henri Maxime Demoulin
>
> 2015-06-25 5:54 GMT-04:00 Steve Loughran :
I see, thank you!
--
Henri Maxime Demoulin
2015-06-25 5:54 GMT-04:00 Steve Loughran :
> You are using a Guava version on the classpath which your version of
> Hadoop can't handle. Try a version < 15, or build Spark against Hadoop 2.7.0.
>
> > On 24 Jun 2015, at 19:03, maxdml wrote:
> >
> >Exc
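A sketch of building Spark against a specific Hadoop version, per the advice above; in the 1.x-era Maven build the usual combination was the hadoop-2.6 profile with an explicit hadoop.version override (adjust to your source tree):

```shell
# Build Spark against Hadoop 2.7.0 using the bundled Maven launcher.
./build/mvn -Phadoop-2.6 -Dhadoop.version=2.7.0 -DskipTests clean package
```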