> ...): DataSourceV2Relation = {
>
>   val schema =
>     userSpecifiedSchema.getOrElse(
>       source.createReader(options, userSpecifiedSchema).readSchema())
>
>   val ident = tableIdent.orElse(tableFromOptions(options))
>
>   DataSourceV2Relation(
>     source, schema.toAttributes, options, ident, userSpecifiedSchema)
> }
>
> Correct this?
>
> Or even create a new create overload which simply takes the schema as
> non-optional?
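The fallback proposed above can be sketched in isolation. This is a minimal, self-contained model, not Spark's actual classes: `StructType`, `ReadSupport`, `WriterOnlySource`, and `resolveSchema` below are hypothetical stand-ins. The point it demonstrates is that `Option.getOrElse` takes its default by-name, so `createReader` is only invoked when no user-specified schema was supplied, and a writer-only source never has to produce a reader on the write path.

```scala
// Hypothetical stand-ins for Spark's schema and reader types (not the real API).
case class StructType(fields: Seq[String])

trait ReadSupport {
  def readSchema(): StructType
}

// A writer-only source: asking it for a reader fails, as in the reported bug.
class WriterOnlySource {
  def createReader(options: Map[String, String]): ReadSupport =
    throw new UnsupportedOperationException("Data source is not readable")
}

// The proposed fallback: prefer the user-specified schema; only build a
// reader when no schema was given. getOrElse evaluates its argument lazily,
// so createReader is never called when a schema is present.
def resolveSchema(
    source: WriterOnlySource,
    options: Map[String, String],
    userSpecifiedSchema: Option[StructType]): StructType =
  userSpecifiedSchema.getOrElse(source.createReader(options).readSchema())

val schema = resolveSchema(
  new WriterOnlySource, Map.empty, Some(StructType(Seq("a", "b"))))
println(schema.fields.mkString(","))  // a,b
```

With `None` instead of `Some(...)`, the same call still throws, which matches the intent: read paths keep their current behavior, only write paths with an explicit schema are spared.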
> Thanks,
> Assaf
From: Hyukjin Kwon [mailto:gurwls...@gmail.com]
Sent: Thursday, October 11, 2018 10:24 AM
To: Mendelson, Assaf; Wenchen Fan
Cc: dev
Subject: Re: Possible bug in DatasourceV2
See https://github.com/apache/spark/pull/22688
+Wenchen, this looks like the problem being raised. It might have to be
considered a blocker ...
On Thu, 11 Oct 2018, 2:48 pm assaf.mendelson wrote:
Hi,
I created a datasource writer WITHOUT a reader. When I do, I get an
exception: org.apache.spark.sql.AnalysisException: Data source is not
readable: DefaultSource
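The failure mode can be modeled with a toy sketch. None of these are Spark's real classes: `DataSourceV2`, `WriteSupport`, `ReadSupport`, `DefaultSource`, `createRelation`, and `save` below are illustrative stand-ins. What it shows is the shape of the bug: the write path resolves the existing data as a relation first, and building that relation unconditionally demands read support, so a writer-only source fails before any writing happens.

```scala
import scala.util.{Try, Failure}

// Hypothetical stand-ins, not Spark's API.
trait DataSourceV2
trait WriteSupport extends DataSourceV2
// Read support is a separate, optional capability.
trait ReadSupport extends DataSourceV2 { def readSchema(): Seq[String] }

// A writer-only source, mirroring the reported setup: no ReadSupport.
class DefaultSource extends WriteSupport

case class AnalysisException(msg: String) extends Exception(msg)

// Sketch of the relation factory: it requires a reader even on a write.
def createRelation(source: DataSourceV2): Seq[String] = source match {
  case r: ReadSupport => r.readSchema()
  case _ => throw AnalysisException(
    s"Data source is not readable: ${source.getClass.getSimpleName}")
}

// Sketch of save(): it resolves the existing data as a relation first,
// which is where a writer-only source blows up.
def save(source: DataSourceV2): Unit = createRelation(source)

Try(save(new DefaultSource)) match {
  case Failure(e) => println(e.getMessage)
  case _          => println("saved")
}
// prints: Data source is not readable: DefaultSource
```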
The reason for this is that when save is called, inside the match on the
source for WriteSupport we have the following code:
val source =