I added more details just before you sent your last message :) Please
let me know if that answers your question.

On Tue, Nov 15, 2022 at 3:21 PM Marc Laforet <mlafor...@gmail.com> wrote:

> Hey guys,
>
> Thanks for the responses.
>
> Ryan - Thanks for confirming the behaviour. I'm wondering if you'd have
> any recommendations on how to approach this, short of maintaining our own
> Spark fork?
>
> Walaa - I tried creating the view using Spark SQL's standard `create view
> as select` statement (both with the fully qualified table name and with
> first setting the catalog & namespace). Our Iceberg tables are backed by
> an HMS, so the view definition would presumably be stored there?
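>
> To be concrete, the two variants looked roughly like this (identifiers
> are placeholders, not our real names):
>
>   -- fully qualified
>   CREATE VIEW iceberg_catalog.db.spark_my_table AS
>   SELECT * FROM iceberg_catalog.db.my_table;
>
>   -- setting the catalog & namespace first
>   USE iceberg_catalog.db;
>   CREATE VIEW spark_my_table AS
>   SELECT * FROM my_table;
>
> Both attempts hit the same "does not support views" error.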
>
> Thanks again for your responses!
>
> On Tue, Nov 15, 2022 at 5:38 PM Walaa Eldin Moustafa <
> wa.moust...@gmail.com> wrote:
>
>> Hi Marc,
>>
>> Could you clarify where you store the view definitions in this case, and
>> what the syntax looks like?
>>
>> Thanks,
>> Walaa.
>>
>>
>> On Tue, Nov 15, 2022 at 2:34 PM Ryan Blue <b...@tabular.io> wrote:
>>
>>> Hi Marc,
>>>
>>> This is expected. Although the ViewCatalog SPIP was approved by the
>>> Spark community, the implementation hasn't made it in yet for v2.
>>>
>>> Ryan
>>>
>>> On Tue, Nov 15, 2022 at 11:38 AM Marc Laforet <mlafor...@gmail.com>
>>> wrote:
>>>
>>>> Hi Iceberg folks,
>>>>
>>>> I'm working on a project where we're migrating tables from Hive to
>>>> Iceberg. In parallel, we are revamping our ingestion pipeline from batch
>>>> to streaming. Originally, our plan was to have two separate tables, a
>>>> backfill table and a live table, which would be stitched together via a
>>>> view for downstream consumers. This is proving rather difficult. In the
>>>> absence of engine-agnostic views we were going to prefix the view names
>>>> with the engine type (i.e. trino_my_table and spark_my_table), but I
>>>> receive an org.apache.spark.sql.AnalysisException: Catalog
>>>> iceberg_catalog does not support views error when trying to create the
>>>> Spark view. Given the ongoing work towards engine-agnostic views, is
>>>> this limitation expected, or can it be worked around with some config
>>>> or Spark change?
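>>>>
>>>> For reference, a minimal sketch of the view we're after (table and
>>>> catalog names are illustrative, not our real ones):
>>>>
>>>>   CREATE VIEW iceberg_catalog.db.spark_my_table AS
>>>>   SELECT * FROM iceberg_catalog.db.my_table_backfill
>>>>   UNION ALL
>>>>   SELECT * FROM iceberg_catalog.db.my_table_live;
>>>>
>>>> This is the statement that raises the AnalysisException above.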
>>>>
>>>> Thank you for your time,
>>>>
>>>> Marc
>>>>
>>>
>>>
>>> --
>>> Ryan Blue
>>> Tabular
>>>
>>
