Thanks Peter and Ryan for the info.
Since identifier fields need to be "required", how can I alter an optional
column to be required in Spark SQL?
Thanks,
Manu
On Fri, Jan 5, 2024 at 12:50 AM Ryan Blue wrote:
> You can set the primary key fields in Spark using `ALTER TABLE`:
>
> `ALTER TABLE t SE
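(The quoted command above is cut off in the digest. For context, the Iceberg Spark SQL extensions document DDL for managing identifier fields; a sketch, where the table name `t` and column `id` are placeholders:)

```sql
-- Requires the Iceberg Spark SQL extensions to be enabled, e.g.
-- spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions

-- Set the identifier (primary key) fields for table t:
ALTER TABLE t SET IDENTIFIER FIELDS id;

-- Identifier fields can later be removed:
ALTER TABLE t DROP IDENTIFIER FIELDS id;
```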
I think, as far as the spec is concerned, we can treat those tags similarly
to how doc strings are treated. Currently, the spec's statement on doc
strings is the following:
"Fields may have an optional comment or doc string."
I agree that the APIs should be designed in such a way that prevents
eng
JB,
I would draw a distinction between the catalog and this proposed feature:
the catalog is not actually part of the spec, so it is entirely up to
the engine and is optional.
When it comes to the table spec, "optional" does not mean that it does not
have to be implemented or supported. Any engi
Hi Dan,
I agree: it will depend on the engine's capabilities. That said, it's
similar to the catalog: each catalog might have different
approaches, features, and capabilities, so engines might have different
capabilities as well.
If it's an optional feature in the spec, and each engine might or
might not impl