Thanks Ryan
On Tue, Feb 5, 2019 at 10:28 PM Ryan Blue wrote:
> Shubham,
>
> DataSourceV2 passes Spark's internal representation to your source and
> expects Spark's internal representation back from the source. That's why
> you consume and produce InternalRow: "internal" indicates that Spark
> doesn't need to convert the values.

Shubham,
DataSourceV2 passes Spark's internal representation to your source and
expects Spark's internal representation back from the source. That's why
you consume and produce InternalRow: "internal" indicates that Spark
doesn't need to convert the values.

Spark's internal representation for a date is an Int: the number of days
since the Unix epoch, 1970-01-01.
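
For example, here is a minimal sketch of how a date value moves through
InternalRow (the GenericInternalRow below is just for illustration):

import java.time.LocalDate
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.expressions.GenericInternalRow

// A DateType value travels as an Int: days since 1970-01-01.
val days: Int = LocalDate.parse("2019-02-05").toEpochDay.toInt

// What Spark hands to (or expects back from) a V2 source for one date column.
val row: InternalRow = new GenericInternalRow(Array[Any](days))

// Reading it back needs no conversion, just the ordinal lookup.
val readBack: LocalDate = LocalDate.ofEpochDay(row.getInt(0).toLong)
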
Hi All,

I am using a custom DataSourceV2 implementation (*Spark version 2.3.2*).
Here is how I am trying to pass in a *date type* from the spark shell:

scala> val df = sc.parallelize(Seq("2019-02-05")).toDF("datetype").withColumn("datetype", col("datetype").cast("date"))
scala> df.write.format("com