Hi All,

 

I have tested my code without problems on EMR YARN (Spark 1.3.0) with the
default (Java) serializer.

But when I switch to org.apache.spark.serializer.KryoSerializer, the
broadcast value doesn't give me the right result (it actually comes back as
an empty custom class on the inner object).

 

Basically, I broadcast a builder object that carries an array of
PropertiesUtils objects. The code should not have any logical issue, because
it works with the default Java serializer. But when I switch to
org.apache.spark.serializer.KryoSerializer, it looks like the array doesn't
get initialized properly: propertiesList reports the right size, but every
element in the array is just a plain, empty PropertiesUtils.
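
Roughly, the usage looks like the sketch below (sc is the JavaSparkContext;
the input path, the getPropertiesList() getter, and the property key are
illustrative placeholders, not my exact code):

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;

// Inside the driver's main method (simplified sketch).
ModelSessionBuilder builder = new ModelSessionBuilder();
// ... populate the builder's propertiesList with PropertiesUtils instances ...

final Broadcast<ModelSessionBuilder> builderBc = sc.broadcast(builder);

JavaRDD<String> input = sc.textFile("hdfs:///path/to/input"); // placeholder path
JavaRDD<String> result = input.map(line -> {
    PropertiesUtils[] props = builderBc.value().getPropertiesList(); // placeholder getter
    // With the default Java serializer this array is fully populated;
    // with Kryo, props.length is correct but every element is an empty PropertiesUtils.
    return props[0].getProperty("some.key"); // placeholder key
});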

 

Am I missing anything when I use KryoSerializer? I just added the two lines
below. Do I need to implement any special code to enable KryoSerializer? I've
searched everywhere but can't find any place that mentions it.

 

sparkConf.set("spark.serializer",
"org.apache.spark.serializer.KryoSerializer");

sparkConf.registerKryoClasses(new Class[]{ModelSessionBuilder.class,
Constants.class, PropertiesUtils.class, ModelSession.class});
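
For context, those two lines sit in an otherwise ordinary SparkConf setup,
roughly like this (the app name is illustrative):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf sparkConf = new SparkConf().setAppName("ModelSessionJob"); // illustrative name
sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
sparkConf.registerKryoClasses(new Class[]{ModelSessionBuilder.class,
        Constants.class, PropertiesUtils.class, ModelSession.class});
JavaSparkContext sc = new JavaSparkContext(sparkConf);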

 

public class ModelSessionBuilder implements Serializable {

    // ... rest of the class omitted ...

    private PropertiesUtils[] propertiesList;

    private static final long serialVersionUID = -8139500301736028670L;
}

 

public class PropertiesUtils extends Properties {

    private static final long serialVersionUID = -3684043338580885551L;

    public PropertiesUtils(Properties prop) {
        super(prop);
    }

    public PropertiesUtils() {
    }
}

                

 

Regards,

 

Shuai
