Can you use Varchar or String instead? Currently, Spark SQL converts varchar into the string type internally (without a max-length limitation). However, the "char" type is not supported yet.
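A minimal sketch of that workaround (plain Scala, no Spark dependency needed to compile; the single-character value is simply stored as a String, which Catalyst maps to StringType):

```scala
// Same case class as below, with the unsupported Char field
// declared as String so Spark SQL's schemaFor can handle it.
case class PrimitiveData(
    charField: String, // was Char; hold the character as a one-char String
    intField: Int,
    longField: Long,
    doubleField: Double,
    floatField: Float,
    shortField: Short,
    byteField: Byte,
    booleanField: Boolean)
```

With this definition, schemaFor[PrimitiveData] should no longer hit the MatchError, since every field type has a Catalyst mapping.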

-----Original Message-----
From: A.M.Chan [mailto:kaka_1...@163.com] 
Sent: Friday, March 20, 2015 9:56 AM
To: spark-dev
Subject: Add Char support in SQL dataTypes

case class PrimitiveData(
    charField: Char, // Can't get the char schema info
    intField: Int,
    longField: Long,
    doubleField: Double,
    floatField: Float,
    shortField: Short,
    byteField: Byte,
    booleanField: Boolean)
I can't get the schema from the case class PrimitiveData.
An error occurs when I use schemaFor[PrimitiveData]:

scala.MatchError: Char (of class scala.reflect.internal.Types$TypeRef$$anon$6)
    at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:112)

--

kaka1992

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org