Hi, is this the right way to register a Kryo class and recreate the SparkContext with a new configuration?
```
spark

print(com.tr.dss.version)

import org.apache.spark.{SparkContext, SparkConf}

// Stop the existing context, then build a new one that uses Kryo serialization
// and has the Avro-generated class registered.
sc.stop()

val conf = new SparkConf()
  .set("spark.executor.memory", "4g")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
conf.registerKryoClasses(Array(classOf[dss.schema.logrecord]))

val sc = new SparkContext(conf)

print(sc.version)

```

Note: dss.schema.logrecord is just an Avro-generated class I am working with.

Further down in the code I am getting

java.lang.IllegalStateException: SparkContext has been shutdown

even though print(sc.version) works right after the new context is created.
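
For reference, here is the same configuration written as a self-contained standalone app rather than a shell session (a minimal sketch; the object and app names are made up, and dss.schema.logrecord is the Avro class mentioned above):

```
// Minimal standalone sketch of the same pattern (assumes a plain Scala app,
// not a managed shell/notebook session); object and app names are hypothetical.
import org.apache.spark.{SparkContext, SparkConf}

object KryoContextExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("kryo-context-example")
      .set("spark.executor.memory", "4g")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    // Register the Avro-generated class with Kryo before the context is created.
    conf.registerKryoClasses(Array(classOf[dss.schema.logrecord]))

    val sc = new SparkContext(conf)
    println(sc.version)

    // ... jobs that use sc go here ...

    sc.stop()
  }
}
```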


 
