Hello,

as we continue to test Spark 2.0.0-SNAPSHOT in-house, we ran into the
following while trying to port an existing application from Spark 1.6.1
to Spark 2.0.0-SNAPSHOT.

Given this code (run in the shell, where sc and sqlc are the usual
SparkContext and SQLContext):

import org.apache.spark.sql.Row
import org.apache.spark.sql.types._

case class Test(a: Int, b: String)

// the rows hold case class instances, but the schema declares a struct
val rdd = sc.parallelize(List(Row(List(Test(5, "ha"), Test(6, "ba")))))
val schema = StructType(Seq(
  StructField("x", ArrayType(
    StructType(Seq(
      StructField("a", IntegerType, false),
      StructField("b", StringType, true)
    )),
    true),
  true)
))
val df = sqlc.createDataFrame(rdd, schema)
df.show
df.show

this works fine in Spark 1.6.1 and gives:

+----------------+
|               x|
+----------------+
|[[5,ha], [6,ba]]|
+----------------+

but in Spark 2.0.0-SNAPSHOT I get:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage
0.0 (TID 0, localhost): java.lang.RuntimeException: Error while encoding:
java.lang.ClassCastException: Test cannot be cast to
org.apache.spark.sql.Row
[info] getexternalrowfield(input[0, org.apache.spark.sql.Row, false], 0, x,
IntegerType) AS x#0
[info] +- getexternalrowfield(input[0, org.apache.spark.sql.Row, false], 0,
x, IntegerType)
[info]    +- input[0, org.apache.spark.sql.Row, false]
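For what it's worth, one change that avoids the ClassCastException on our
side (an assumption about the new row-encoding path, not something we have
confirmed against the Spark source) is to build the nested elements as
Rows instead of case class instances, so the external data matches the
declared struct schema exactly:

```scala
import org.apache.spark.sql.Row

// Same schema as above, but nested values are Rows rather than Test
// instances, so the encoder never has to cast Test to Row.
val rdd2 = sc.parallelize(List(Row(List(Row(5, "ha"), Row(6, "ba")))))
val df2 = sqlc.createDataFrame(rdd2, schema)
df2.show
```

That said, the original code was accepted in 1.6.1, so we'd like to know
whether the new behavior is intentional or a regression.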
