Oops, my mistake:
sqlContext.setConf("spark.sql.parquet.binaryAsString", "true")
This solved the issue. Important for everyone hitting the same error.
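For reference, the whole fix in a Spark shell looks something like the sketch below. The Parquet path is a placeholder, not from the thread; the flag tells Spark SQL to interpret Parquet BINARY columns as Strings rather than raw byte arrays, which is what older Parquet writers produce.

```scala
// Treat Parquet BINARY columns as Strings instead of Array[Byte].
sqlContext.setConf("spark.sql.parquet.binaryAsString", "true")

// Re-read the data after setting the flag; string-typed columns now come
// back as String. ("/path/to/data.parquet" is an illustrative path.)
val data = sqlContext.parquetFile("/path/to/data.parquet")
data.first()
```

Note the flag must be set before the Parquet file is loaded; data already read into an RDD keeps the old binary representation.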
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Reading-nested-JSON-data-with-Spark-SQL-tp19310p20936.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Also, it looks like when I store Strings in Parquet and try to fetch
them using Spark code, I get a ClassCastException.
Below is how my array of strings is saved: each character's ASCII value is
present in an array of ints:
res25: Array[Seq[String]] = Array(ArrayBuffer(Array(104, 116, 116, 112, 58
Hi,
I am having a similar problem and tried your solution with a Spark 1.2 build
with Hadoop.
I am saving objects to Parquet files where some fields are of type Array.
When I fetch them as below, I get:
java.lang.ClassCastException: [B cannot be cast to java.lang.CharSequence
def fetchTags(rows
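For what it's worth, a workaround sketch for the exception above: with `binaryAsString` unset, Parquet binary columns come back as `Array[Byte]` (the `[B` in the stack trace), so the bytes can be decoded manually. The field index 12 mirrors the `getAs[...](12)` call quoted later in the thread; the method name `fetchTags` and the return shape are assumptions, so adjust both to your schema.

```scala
import org.apache.spark.sql.Row

// Hypothetical sketch: decode each binary element of the array column at
// position 12 into a UTF-8 String instead of casting it to CharSequence.
def fetchTags(rows: Seq[Row]): Seq[Seq[String]] =
  rows.map { row =>
    row.getAs[Seq[Array[Byte]]](12).map(bytes => new String(bytes, "UTF-8"))
  }
```

Setting spark.sql.parquet.binaryAsString to "true" before reading, as suggested above, avoids the manual decoding entirely.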
This works great, thank you!
Simone Franzini, PhD
http://www.linkedin.com/in/simonefranzini
On Wed, Nov 19, 2014 at 3:40 PM, Michael Armbrust wrote:
You can extract the nested fields in sql: SELECT field.nestedField ...
If you don't do that then nested fields are represented as rows within rows
and can be retrieved as follows:
t.getAs[Row](0).getInt(0)
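A minimal sketch of both approaches described above. The table name `people` and the nested column `address.city` are illustrative, not from the thread; `sqlContext` is assumed to be in scope as in a Spark 1.x shell.

```scala
import org.apache.spark.sql.Row

// Approach 1: flatten the nested field directly in SQL.
val cities = sqlContext.sql("SELECT address.city FROM people")

// Approach 2: select the struct whole and navigate the nested Row by position.
val firstCity = sqlContext.sql("SELECT address FROM people")
  .first()
  .getAs[Row](0) // the nested struct comes back as a Row within the outer Row
  .getString(0)  // first field inside the struct
```

The SQL form is usually preferable since it keeps the schema visible and avoids positional indexing.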
Also, I would write t.getAs[Buffer[CharSequence]](12) as
t.getAs[Seq[String]](12) since we