Hi, I have a use case where I need to pass the SparkContext into a map function.
reRDD.map(row => method1(row, sc)).saveAsTextFile(outputDir)

method1 needs the SparkContext to query Cassandra, but I get the error below:

java.io.NotSerializableException: org.apache.spark.SparkContext

Is there a way to fix this?

Thanks
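For reference, a minimal self-contained sketch of the pattern above (method1's body, the input path, and the app name are placeholders, since they are not shown in the post). Capturing sc in the map closure is what triggers the task-serialization failure:

import org.apache.spark.{SparkConf, SparkContext}

object SparkContextInMap {

  // Hypothetical stand-in for method1: the real version queries Cassandra,
  // but its body is not shown in the post.
  def method1(row: String, sc: SparkContext): String = {
    // ... Cassandra lookup using sc would go here ...
    row
  }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("example"))
    val reRDD = sc.textFile("input")   // placeholder input path

    // The closure passed to map() is serialized and shipped to the executors.
    // Because it captures `sc`, and org.apache.spark.SparkContext is not
    // serializable, Spark fails with "Task not serializable" /
    // java.io.NotSerializableException before any task runs.
    reRDD.map(row => method1(row, sc)).saveAsTextFile("outputDir")

    sc.stop()
  }
}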