A few lines of my error logs look like:
2014-07-29 16:32:16,326 ERROR [ActorSystemImpl] Uncaught fatal error from
thread [spark-akka.actor.default-dispatcher-6] shutting down ActorSystem
[spark]
java.lang.VerifyError: (class:
org/jboss/netty/channel/socket/nio/NioWorkerPool, method: createWorker
si
Is there any example out there for unit testing a Spark application in Java?
Even a trivial application like word count would be very helpful. I am very
new to this, and I am struggling to understand how I can use JavaSparkContext
with JUnit.
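No example appears in the thread itself, but one common answer is to factor the word-count logic into a pure method and test that with plain assertions; in a real project you would then wire the same logic into Spark transformations (flatMap, mapToPair, reduceByKey) using a JavaSparkContext created with master "local" in a JUnit setup method. The sketch below is an assumption-laden illustration of that approach, deliberately free of Spark and JUnit dependencies so it runs standalone:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal word-count "unit test" without a cluster. The counting
// logic is a pure method, so it can be verified with plain Java
// assertions; the same logic would later be passed to Spark's
// flatMap/mapToPair/reduceByKey on a local-mode JavaSparkContext.
public class WordCountTest {

    // The splitting/counting logic you would eventually hand to Spark.
    static Map<String, Integer> wordCount(String text) {
        Map<String, Integer> counts = new HashMap<>();
        for (String word : text.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) {
                counts.merge(word, 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = wordCount("to be or not to be");
        // Run with java -ea to enable assertions.
        assert counts.get("to") == 2;
        assert counts.get("be") == 2;
        assert counts.get("or") == 1;
        assert counts.get("not") == 1;
        System.out.println("word-count test passed");
    }
}
```

Keeping the logic separate from the SparkContext is what makes the test fast and independent of cluster setup; the JavaSparkContext itself only needs to be exercised in a heavier integration test.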
How can I transform the mapper's key in the reducer output? The functions I
have encountered, such as combineByKey and reduceByKey, operate on the values
and not on the key. For example, below is what I want to achieve, but it
seems I can only have K1 and not K2:
Mapper->(K1,V1)->Reducer
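reduceByKey and combineByKey indeed only combine values, but nothing stops you from chaining a second transformation that rewrites the key afterwards; in Spark's Java API that would be something like `rdd.reduceByKey(Integer::sum).mapToPair(t -> new Tuple2<>(newKey(t._1), t._2))`. A minimal sketch of the same two-step pattern in plain Java (no Spark on the classpath here; K1 = word and K2 = word length is an invented example, not from the question):

```java
import java.util.HashMap;
import java.util.Map;

// Reduce-by-key followed by a key rewrite: first aggregate values
// under the original key K1, then re-map each (K1, V) pair to (K2, V),
// merging any values that collide under the new key. This mirrors
// rdd.reduceByKey(...).mapToPair(...) in Spark.
public class KeyRewrite {

    static Map<Integer, Integer> countsByLength(String[] words) {
        // Step 1: reduce by the original key K1 (the word itself).
        Map<String, Integer> byWord = new HashMap<>();
        for (String w : words) {
            byWord.merge(w, 1, Integer::sum);
        }
        // Step 2: rewrite the key, K1 -> K2 (here: word -> word length).
        Map<Integer, Integer> byLength = new HashMap<>();
        for (Map.Entry<String, Integer> e : byWord.entrySet()) {
            byLength.merge(e.getKey().length(), e.getValue(), Integer::sum);
        }
        return byLength;
    }

    public static void main(String[] args) {
        Map<Integer, Integer> result =
            countsByLength(new String[] {"hi", "to", "cat"});
        System.out.println(result.get(2)); // two words of length 2
        System.out.println(result.get(3)); // one word of length 3
    }
}
```

Note that in Spark, changing the key after a reduce can require another shuffle if downstream operations need the data partitioned by the new key, so it is usually cheapest to do the key rewrite as the last step.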