That is a bug in generated code.  It would be great if you could post a
reproduction.
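
For reference, a minimal, self-contained reproduction usually looks
something like the sketch below. The schema, data, and query are
placeholders (the original query isn't shown in this thread), and the
repartition on a nullable column is just one hypothetical way to force
the Exchange that applies the generated UnsafeProjection from the
stack trace:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object CodegenNpeRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("codegen-npe-repro"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Placeholder data: swap in the smallest DataFrame and query that
    // still trigger the NPE in SpecificUnsafeProjection.apply.
    val df = sc.parallelize(Seq((1, "a"), (2, null: String)))
      .toDF("id", "value")

    // Partitioning by an expression forces an Exchange, which applies
    // the generated partition-key projection seen in the stack trace.
    df.repartition(df("value")).count()

    sc.stop()
  }
}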

On Tue, Jan 26, 2016 at 9:15 AM, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> Does this say anything to anyone? :) It's with Spark 2.0.0-SNAPSHOT
> built today. Is this something I could fix myself in my code, or is
> it a bug in Spark SQL?
>
> Caused by: java.lang.NullPointerException
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
> at org.apache.spark.sql.execution.Exchange$$anonfun$org$apache$spark$sql$execution$Exchange$$getPartitionKeyExtractor$1$2.apply(Exchange.scala:184)
> at org.apache.spark.sql.execution.Exchange$$anonfun$org$apache$spark$sql$execution$Exchange$$getPartitionKeyExtractor$1$2.apply(Exchange.scala:184)
> at org.apache.spark.sql.execution.Exchange$$anonfun$3$$anonfun$apply$4.apply(Exchange.scala:198)
> at org.apache.spark.sql.execution.Exchange$$anonfun$3$$anonfun$apply$4.apply(Exchange.scala:198)
> at scala.collection.Iterator$$anon$11.next(Iterator.scala:370)
> at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:148)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> Regards,
> Jacek
>
> Jacek Laskowski | https://medium.com/@jaceklaskowski/
> Mastering Apache Spark
> ==> https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
> Follow me at https://twitter.com/jaceklaskowski
>
