Hi folks, I'm running into the following error when trying to perform a join in my code:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.sql.catalyst.types.LongType$

I see similar errors for StringType$, and also:

scala.reflect.runtime.ReflectError: value apache is not a package

Strangely, if I just work with a single table, everything is fine. I can iterate through the records in both tables and print them out without a problem. Furthermore, this code worked without an exception in Spark 1.0.0 (though the join caused some field corruption, possibly related to https://issues.apache.org/jira/browse/SPARK-1994).

The data is coming from a custom protocol-buffer-based format on HDFS that is being mapped into the individual record types without a problem. The immediate cause seems to be a task trying to deserialize one or more SQL case classes before loading the Spark uber jar, but I have no idea why this is happening, or why it only happens when I do a join. Ideas?

Keith

P.S. If it's relevant, we're using the Kryo serializer.
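For context, here is a minimal sketch of the shape of code that triggers this for me. The case classes, table names, and data are made up for illustration (our real record types come from the protobuf format on HDFS); this uses the 1.1-era SchemaRDD API, where case class RDDs are registered as temp tables and joined via SQL:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical record types standing in for our protobuf-derived case classes
case class User(id: Long, name: String)
case class Event(userId: Long, action: String)

object JoinRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("join-repro"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD  // implicit conversion: RDD[case class] -> SchemaRDD

    val users  = sc.parallelize(Seq(User(1L, "alice"), User(2L, "bob")))
    val events = sc.parallelize(Seq(Event(1L, "click")))

    users.registerTempTable("users")
    events.registerTempTable("events")

    // Scanning either table alone works fine; the NoClassDefFoundError
    // for LongType$/StringType$ only surfaces once the join runs.
    sqlContext
      .sql("SELECT u.name, e.action FROM users u JOIN events e ON u.id = e.userId")
      .collect()
      .foreach(println)

    sc.stop()
  }
}
```

In case it matters, we submit this as an uber jar via spark-submit, with spark.serializer set to org.apache.spark.serializer.KryoSerializer in the job config.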