ok, i see
i imported the wrong jar files, which only work with the default hadoop version
2014-06-05
bluejoe2008
From: prabeesh k
Date: 2014-06-05 16:14
To: user
Subject: Re: Re: mismatched hdfs protocol
If you do not set the Hadoop version, Spark is built against the default
Hadoop version
> *From:* prabeesh k
> *Date:* 2014-06-05 13:23
> *To:* user
> *Subject:* Re: mismatched hdfs protocol
> For building Spark against a particular version of Hadoop,
> refer to
> http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html
>
>
> On Thu, Jun 5, 2014 at 8:14 AM
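The build step suggested above can be sketched as follows, assuming Spark 1.0.x's sbt and Maven builds as described in the linked docs (the profile name and environment variable vary between Spark releases, so check the docs for your release):

```shell
# sbt build, pinning the Hadoop version via an environment variable
# (recognized by Spark 1.0.x's sbt/sbt script):
SPARK_HADOOP_VERSION=2.4.0 sbt/sbt assembly

# Equivalent Maven build, selecting a Hadoop 2.4 profile:
mvn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
```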
thank you!
i am developing a java project in Eclipse IDE on Windows
in which spark 1.0.0 libraries are imported
and now i want to open HDFS files as input
the hadoop version of HDFS is 2.4.0
2014-06-05
bluejoe2008
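For an Eclipse setup like the one described above, one way to keep client jars consistent is to manage them with Maven instead of importing jars by hand. A sketch, assuming a Maven project on Spark 1.0.0 (the explicit hadoop-client override is the assumption here; Spark's default transitive hadoop-client is a 1.x version):

```xml
<!-- Hypothetical pom.xml fragment: pin hadoop-client to the cluster's version -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.0.0</version>
  </dependency>
  <!-- Override the Hadoop 1.x client Spark pulls in by default -->
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.4.0</version>
  </dependency>
</dependencies>
```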
From: prabeesh k
Date: 2014-06-05 13:23
To: user
Subject: Re: mismatched hdfs protocol
For building Spark against a particular version of Hadoop,
refer to
http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html
On Thu, Jun 5, 2014 at 8:14 AM, Koert Kuipers wrote:
> you have to build spark against the version of hadoop you are using
>
>
> On Wed, Jun 4, 2014 at 10:25 PM,
you have to build spark against the version of hadoop you are using
On Wed, Jun 4, 2014 at 10:25 PM, bluejoe2008 wrote:
> hi, all
> when my spark program accessed hdfs files
> an error happened:
>
> Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC
> version 9 cannot communicate with client version 4
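The "Server IPC version 9" message means the cluster speaks the Hadoop 2.x RPC protocol while the client jars on the classpath were built against Hadoop 1.x. A quick way to compare the two sides, assuming a Maven-managed project (both are standard `hadoop` and `mvn` invocations):

```shell
# On the cluster: report the server-side Hadoop version
hadoop version

# In the project: list which Hadoop artifacts the build actually pulls in
mvn dependency:tree -Dincludes=org.apache.hadoop
```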