Hi, all  

I’m trying out the JDBC server, so the cluster is running a version compiled from
branch-0.1-jdbc.

Unfortunately (and as expected), it cannot run programs compiled against the
Spark 1.0 dependency (i.e. the one downloaded from Maven).

1. The first error I hit was a mismatched serialVersionUID in
ExecuterStatus.

I resolved it by explicitly declaring a serialVersionUID in
ExecuterStatus.scala and recompiling branch-0.1-jdbc.
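For anyone curious why an explicit UID fixes this: Java serialization compares the serialVersionUID on both sides, and when the field is not declared the JVM derives it from the class shape, so any recompile can silently change it. A minimal sketch of the idea (the class below is hypothetical, not Spark's own; in Scala the equivalent is the @SerialVersionUID annotation):

```java
import java.io.*;

// Hypothetical stand-in for a Spark message class.
class ExecutorStatusLike implements Serializable {
    // Declaring the UID pins the serialization version, so two
    // differently compiled builds of this class stay wire-compatible.
    private static final long serialVersionUID = 1L;
    final String state;
    ExecutorStatusLike(String state) { this.state = state; }
}

public class SerialUidDemo {
    public static void main(String[] args) throws Exception {
        // Round-trip an instance through Java serialization.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bos);
        out.writeObject(new ExecutorStatusLike("RUNNING"));
        out.flush();
        ObjectInputStream in =
            new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
        ExecutorStatusLike back = (ExecutorStatusLike) in.readObject();
        System.out.println(back.state);
    }
}
```

Without the declared UID, deserializing bytes written by a differently compiled build throws java.io.InvalidClassException with the "serialVersionUID" mismatch message.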

2. Then I started a program compiled against Spark 1.0, and what I got was:

14/07/13 05:08:11 WARN AppClient$ClientActor: Could not connect to 
akka.tcp://sparkMaster@172.31.*.*:*: java.util.NoSuchElementException: key not 
found: 6  
14/07/13 05:08:11 WARN AppClient$ClientActor: Connection to 
akka.tcp://sparkMaster@172.31.*.*:* failed; waiting for master to reconnect...

I don’t understand where "key not found: 6" comes from.

I also tried to start the JDBC server against a spark-1.0 cluster. After
resolving the serialVersionUID mismatch, when I use beeline to run
“show tables;”, some executors get lost and tasks fail for unknown
reasons.

Can anyone give some suggestions on how to make a spark-1.0 cluster work with
the JDBC server?

(Maybe I need to set up an internal Maven repo and point all Spark dependencies
to it?)
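If publishing the custom build is the way to go, a lighter-weight first step might be publishing to the local Ivy/Maven cache rather than standing up an internal repo. A rough sketch, assuming an sbt-based application build (the version string below is a placeholder for whatever branch-0.1-jdbc reports):

```
# Inside the branch-0.1-jdbc checkout: publish the locally built
# artifacts to the local cache so applications compile against the
# exact same jars the cluster is running.
sbt/sbt publish-local

# Then, in the application's build.sbt, depend on the locally
# published version instead of the Maven Central 1.0 artifact, e.g.:
#   libraryDependencies += "org.apache.spark" %% "spark-core" % "<local-version>"
```

An internal repo would only be needed once other machines or teammates have to resolve the same artifacts.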

Best,

--  
Nan Zhu
