I'm really stuck at the point where I'm using reify in Clojure to 
implement the Java interface "org.apache.spark.sql.api.java.UDF2" and 
define the "call" method of this interface.
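For context, here is a minimal sketch of what I mean (names and types here are made up for illustration; my actual UDF and namespace differ):

```clojure
;; Sketch: implementing Spark's UDF2 interface via reify.
;; Note: reify compiles to an anonymous class named like
;; my.ns$fn$reify__NNNN, which must be visible on the executor classpath.
(import '(org.apache.spark.sql.api.java UDF2)
        '(org.apache.spark.sql.types DataTypes))

(def add-udf
  (reify UDF2
    ;; first arg is `this`, then the two UDF inputs
    (call [_ a b]
      (+ a b))))

;; Registration, assuming a SparkSession is bound to `spark`:
;; (.register (.udf spark) "add" add-udf DataTypes/LongType)
```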

I've been able to implement it with reify, and it works fine when I do a 
"lein run": I can see the UDF being applied to my dataframe. I've also 
written test cases to check that the UDF is applied to the dataframe, 
and they pass as expected. However, when I do a "lein cloverage" to 
check my code coverage, this test case fails with a 
ClassNotFoundException.

Partial error stack trace (running the app on a virtual desktop, so I 
could not copy and paste): 
org.apache.spark.SparkException: Job aborted due to stage failure: 
ClassNotFoundException: ab.cd.ef.ef$fn$reify__2440


*On Hadoop:*
I get the same kind of error when I run this application in cluster mode 
using spark-submit; I also tried hadoop jar, and it gives the same error 
as above. *But*, when I run the same jar on a single node, it works fine 
without any issues. 

Using Clojure 1.9.0, Cloverage 1.0.13, Spark 2.1.3, Java 1.8.0_141


Has anyone faced this kind of issue? If so, was there a solution?

-- 
You received this message because you are subscribed to the Google
Groups "Clojure" group.
To post to this group, send email to clojure@googlegroups.com
Note that posts from new members are moderated - please be patient with your 
first post.
To unsubscribe from this group, send email to
clojure+unsubscr...@googlegroups.com
For more options, visit this group at
http://groups.google.com/group/clojure?hl=en