org.apache.spark.shuffle.FetchFailedException :: Migration from Spark 1.2 to 1.3

2015-05-18 Thread zia_kayani
Hi, I'm getting this exception after migrating my code from Spark 1.2 to Spark 1.3: 15/05/18 18:22:39 WARN TaskSetManager: Lost task 0.0 in stage 1.6 (TID 84, cloud8-server): FetchFailed(BlockManagerId(1, cloud4-server, 7337), shuffleId=0, mapId=9, reduceId=1, message= org.apache.spark.shuffle.Fetch…
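Port 7337 in the failing BlockManagerId is the default port of Spark's external shuffle service, so a FetchFailed there after a 1.2-to-1.3 migration often means the executors expect an external shuffle service that isn't running, or one built against the older Spark version. A minimal configuration sketch, assuming the problem is the shuffle-service setup (the property keys are real Spark config keys; the choice between the two options depends on the deployment):

```properties
# spark-defaults.conf sketch (assumption: shuffle-service mismatch).
# Option A: disable the external shuffle service so each executor
# serves its own shuffle files directly.
spark.shuffle.service.enabled   false

# Option B: keep it enabled, but make sure the service deployed on the
# cluster (e.g. in the YARN NodeManagers) matches the Spark 1.3 jars
# and listens on the expected port (7337 is the default).
spark.shuffle.service.port      7337
```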

Re: Building Spark : Adding new DataType in Catalyst

2015-04-23 Thread zia_kayani
I've already tried UDTs in Spark 1.2 and 1.3, but I encountered a Kryo serialization exception on joins, as tracked here. I've talked to Michael Armbrust about the exception, and he said "…

Building Spark : Building just one module.

2015-04-22 Thread zia_kayani
Hi, I have to add custom code to the Spark SQL and Catalyst modules, but every time I change a line of code I have to compile the whole of Spark. If I compile only the sql/core and sql/catalyst modules, those changes aren't visible when I run a job on that Spark build. What am I missing? Any other way to…
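A likely explanation, sketched here under the assumption of a standard Maven-built Spark 1.x checkout: spark-submit ships classes from the assembly jar, so rebuilding only sql/catalyst and sql/core installs new module jars but the running job never sees them until the assembly is repackaged. The module paths and Maven flags below are the real ones from the Spark 1.x source tree; skipping tests keeps the rebuild cycle short:

```shell
# Rebuild and locally install only the changed modules.
build/mvn -pl sql/catalyst,sql/core -DskipTests install

# Repackage the assembly jar that spark-submit actually deploys;
# without this step, module-only rebuilds appear to have no effect.
build/mvn -pl assembly -DskipTests package
```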

Building Spark : Adding new DataType in Catalyst

2015-04-22 Thread zia_kayani
Hi, I am working on adding Geometry, i.e. a new DataType, to Spark Catalyst, so that a Row can hold that object as well. I've made progress, but it is time-consuming because I have to compile the whole Spark project, otherwise the changes aren't visible. I've tried to build just the Spark SQL and Catalyst modules, but…

Java and Kryo Serialization, Java.io.OptionalDataException

2015-03-30 Thread zia_kayani
I have set the Kryo serializer as the default serializer in SparkConf, and the Spark UI confirms it, but in the Spark logs I'm getting this exception: java.io.OptionalDataException at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1370) at java.io.ObjectInputStream.readObject(Obj…
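An OptionalDataException comes from java.io.ObjectInputStream, which means plain Java serialization is still being used on that code path: in Spark 1.x, spark.serializer only controls data serialization (shuffle blocks, cached RDDs), while task closures always go through Java serialization, so java.io.* frames can appear even with Kryo correctly configured. A config sketch with the real Spark key, for reference:

```properties
# spark-defaults.conf: make Kryo the data serializer (real key).
spark.serializer    org.apache.spark.serializer.KryoSerializer

# Note: task closures are still serialized with Java serialization in
# Spark 1.x regardless of this setting, so the ObjectInputStream error
# likely originates from a closure or an internal Java-serialized path.
```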

Spark SQL UDT Kryo serialization, Unable to find class

2015-03-17 Thread zia_kayani
Hi, I want to introduce a custom type for SchemaRDD; I'm following this example, but I'm running into Kryo serialization issues. Here is the stack trace: org.apache.spark.SparkExceptio…
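A Kryo "unable to find class" failure commonly means the custom type's jar isn't on the executors' classpath, or the class was never registered with Kryo on the executor side. A hedged sketch of the relevant settings (the config keys are real Spark keys; `com.example.GeometryRegistrator` and the jar path are hypothetical placeholders for a KryoRegistrator that registers the custom UDT class):

```properties
# spark-defaults.conf sketch (class name and path are illustrative).
spark.kryo.registrator    com.example.GeometryRegistrator

# Ship the jar containing the custom type to the executors; a jar
# present only on the driver is a common cause of this exception.
spark.jars                /path/to/custom-types.jar
```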