I have the following code written in Scala on Spark (inactiveIDs is an RDD[(Int, Seq[String])], persons is a Broadcast[RDD[(Int, Seq[Event])]], and Event is a class that I have created):
val test = persons.value
  .map { tuple =>
    (tuple._1, tuple._2.filter { event =>
      inactiveIDs.filter(event2 => event2._1 == tuple._1).count() != 0
    })
  }

and I get the following error:

java.lang.NullPointerException

Any ideas?
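In case it helps, here is a minimal standalone version of what I am running; the Event fields, the sample data, and the local master setting are placeholders I made up for this example, not my real job:

import org.apache.spark.{SparkConf, SparkContext}

// Placeholder event class; my real Event has different fields
case class Event(name: String, timestamp: Long)

object FilterRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("FilterRepro").setMaster("local[*]"))

    // Same shapes as in my job: RDD[(Int, Seq[String])] and Broadcast[RDD[(Int, Seq[Event])]]
    val inactiveIDs = sc.parallelize(Seq((1, Seq("a")), (3, Seq("c"))))
    val persons = sc.broadcast(sc.parallelize(Seq(
      (1, Seq(Event("login", 100L))),
      (2, Seq(Event("logout", 200L)))
    )))

    // The expression from my question
    val test = persons.value.map { tuple =>
      (tuple._1, tuple._2.filter { event =>
        inactiveIDs.filter(event2 => event2._1 == tuple._1).count() != 0
      })
    }

    // An action to force evaluation of the lazy transformation
    test.collect().foreach(println)

    sc.stop()
  }
}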