Hi,
Any thoughts?
Thanks,
On Sun, Feb 1, 2015 at 12:26 PM, Manoj Samel wrote:
> Spark 1.2
>
> SchemaRDD has schema with decimal columns created like
>
> x1 = new StructField("a", DecimalType(14,4), true)
>
> x2 = new StructField("b", DecimalType(14,4), true)
>
> Registering as SQL Temp table and doing SQL queries on these columns,
> including SUM etc., works fine.
I think I found the issue causing it.
I was calling schemaRDD.coalesce(n).saveAsParquetFile to reduce the number
of partitions in the Parquet file - that is when the stack trace happens.
If I coalesce the partitions before creating the schemaRDD, then the
schemaRDD.saveAsParquetFile call works for decimal columns.
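For reference, the failing versus working call order described above can be sketched as follows (Spark 1.2 Scala API; `sc`, `rowRDD`, `schema`, and `n` are illustrative names, not from the original message):

```scala
import org.apache.spark.sql.SQLContext

// Assumes an existing SparkContext `sc`, an RDD[Row] `rowRDD`, and the
// StructType `schema` with the decimal columns described above.
val sqlContext = new SQLContext(sc)

// Fails in Spark 1.2: coalescing the SchemaRDD itself before writing
// triggers the stack trace on decimal columns.
val schemaRDD = sqlContext.applySchema(rowRDD, schema)
// schemaRDD.coalesce(n).saveAsParquetFile("out.parquet")   // <- stack trace

// Works: coalesce the underlying RDD first, then apply the schema and write.
val coalesced = sqlContext.applySchema(rowRDD.coalesce(n), schema)
coalesced.saveAsParquetFile("out.parquet")
```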
Spark 1.2
SchemaRDD has schema with decimal columns created like
x1 = new StructField("a", DecimalType(14,4), true)
x2 = new StructField("b", DecimalType(14,4), true)
Registering as SQL Temp table and doing SQL queries on these columns,
including SUM etc., works fine, so the schema Decimal does work.
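A minimal sketch of the setup described above (Spark 1.2 Scala API; the import paths and the names `sc`, `rowRDD`, and `t` are assumptions for illustration):

```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.catalyst.types.{DecimalType, StructField, StructType}

val sqlContext = new SQLContext(sc)  // assumes an existing SparkContext `sc`

// Schema with two fixed-precision decimal columns, as in the message above.
val schema = StructType(Seq(
  StructField("a", DecimalType(14, 4), true),
  StructField("b", DecimalType(14, 4), true)))

// Apply the schema to an existing RDD[Row] and register it as a temp table.
val schemaRDD = sqlContext.applySchema(rowRDD, schema)
schemaRDD.registerTempTable("t")

// SQL queries over the decimal columns, including aggregates, work fine:
sqlContext.sql("SELECT SUM(a), SUM(b) FROM t").collect()
```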