> at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
> at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
> at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
> at java.lang.Thread.run(Thread.java:744)
> Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9
> cannot communicate with client version 4
Of course, VERSION is supposed to be replaced by a real Hadoop version!
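For example, with the Hadoop 2.2 cluster mentioned elsewhere in the thread, the invocation would presumably look something like the line below; the hadoop-2.2 profile and the 2.2.0 version are assumptions based on the versions reported later and on the building-spark guide:

mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package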
On Wed, Mar 25, 2015 at 12:04 PM, sandeep vura wrote:
> Build failed with the following errors.
>
> I executed the following command.
>
> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package
>
>
Where do I export MAVEN_OPTS: in spark-env.sh or in hadoop-env.sh?
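(For context: MAVEN_OPTS is read by Maven itself, so it is exported in the shell session, or in a shell profile such as ~/.bashrc, before running mvn, rather than set in spark-env.sh or hadoop-env.sh. The building-spark guide suggests something along the lines of export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"; the exact values are only the guide's suggestion and may need adjusting.)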
>>>>>
>>>>> I am running the below command in the spark/yarn directory, where the
>>>>> pom.xml file is available:
>>>>>
>>>>> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=VERSION -DskipTests clean package
>>>>
>>>>
>>>> On Wed, Mar 25, 2015 at 12:55 PM, Saisai Shao wrote:
>>>>
>>>>> Looks like you have to build Spark against the corresponding Hadoop version,
>>>>> otherwise you will hit the exception you mentioned. You could follow this doc:
>>>>> http://spark.apache.org/docs/latest/building-spark.html
>>>>>
>>>>> 2015-03-25 15:22 GMT+08:00 sandeep vura :
>>>>>
>>>>>> Hi Sparkers,
>>>>>>
>>>>>> I am trying to load data into Spark with the following command:
>>>>>>
>>>>>> *sqlContext.sql("LOAD DATA LOCAL INPATH
>>>>>> '/home/spark12/sandeep/sandeep.txt ' INTO TABLE src");*
>>>>>>
>>>>>> *I am getting the exception below:*
>>>>>>
>>>>>> *Server IPC version 9 cannot communicate with client version 4*
>>>>>>
>>>>>> Note: I am using Hadoop 2.2, Spark 1.2 and Hive 0.13.
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
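For context on the quoted LOAD DATA call: the statement is Hive DML, so it goes through Spark SQL's Hive support, and the actual file movement goes through whatever Hadoop client Spark was compiled against. That is where "Server IPC version 9 cannot communicate with client version 4" comes from: IPC version 9 is the Hadoop 2.x server side and version 4 is a Hadoop 1.x client, so the Spark build is carrying a Hadoop 1.x client while the cluster runs Hadoop 2.2. A minimal sketch of the setup for a standalone application, with the path and table name taken from the thread and everything else assumed:

// Sketch only: assumes a Spark 1.2 build compiled against the cluster's Hadoop 2.2
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("LoadIntoSrc"))
val sqlContext = new HiveContext(sc)  // LOAD DATA needs Hive support, not the plain SQLContext

// Same statement as in the thread; it fails with the IPC version error when the
// bundled Hadoop client is older than the cluster's NameNode.
sqlContext.sql("LOAD DATA LOCAL INPATH '/home/spark12/sandeep/sandeep.txt' INTO TABLE src")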
On ... 2014 at 10:47 AM, Hafiz Mujadid wrote:
> I am accessing HDFS with Spark's .textFile method, and I receive the error:
>
> Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC
> version 9 cannot communicate with client version 4
>
> Here are my dependencies:
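The dependency list itself is not shown above. As background, this error typically means the application was built against a Hadoop 1.x client (the published spark-core 1.x artifacts default to one) while the cluster speaks the Hadoop 2.x IPC protocol. A purely illustrative build.sbt fragment that pins the client to the cluster's version might look like the sketch below; the coordinates are real Maven artifacts, but the version numbers are assumptions that have to be matched to the actual cluster and Spark release:

// Illustrative only: align these versions with the real cluster and Spark release
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"    % "1.2.0",
  // Override the default Hadoop 1.x client with one matching the Hadoop 2.x cluster
  "org.apache.hadoop"  % "hadoop-client" % "2.2.0"
)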