OK, fine.


LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw





*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Tue, 3 Nov 2020 at 14:51, Bartek Kotwica <bkotw...@gmail.com> wrote:

> I understand, but it seems strange that the same query works without the
> "create table" clause. I am using the workaround for now, but I think Hive
> should behave more predictably, so I have created a JIRA for the issue.
>
> https://issues.apache.org/jira/browse/HIVE-24352
>
> Regards,
> Bartosz Kotwica
>
> On Tue, 3 Nov 2020 at 11:57, Mich Talebzadeh <mich.talebza...@gmail.com>
> wrote:
>
>> Well, you have to be pragmatic. That may well be a Hive bug, especially
>> as the error says "Also check for circular dependencies".
>>
>> You can raise a JIRA, but I am not sure about its priority since you
>> have a workaround.
>>
>> HTH
>>
>> On Tue, 3 Nov 2020 at 10:46, Bartek Kotwica <bkotw...@gmail.com> wrote:
>>
>>> Hi Mich,
>>> Thank you for the reply! Creating a staging table works well; the problem
>>> comes up when a CTE or a subquery in the FROM clause is used.
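>>>
>>> For reference, the subquery-in-FROM form that also fails might look like
>>> this (a sketch rewritten from the reproduction quoted below; the table
>>> name tab_error2 is just a placeholder):
>>>
>>> create table tab_error2 as
>>> select
>>>     -- outer window function over x, which is itself a windowed column
>>>     lead(x) over (partition by id order by id) = 1
>>> from (
>>>     select
>>>         id,
>>>         lead(id) over (partition by id order by id) as x
>>>     from
>>>         (select 1 id) a
>>> ) tab2;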
>>>
>>> On Tue, 3 Nov 2020 at 10:45, Mich Talebzadeh <mich.talebza...@gmail.com>
>>> wrote:
>>>
>>>> Hm,
>>>>
>>>> Hi Bartosz,
>>>>
>>>> Can you create a temporary table from your sub-query and see if it works?
>>>>
>>>> create temporary table tab2 as ...
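>>>>
>>>> For illustration, the suggestion applied to the reproduction quoted below
>>>> might look like this (a sketch; tab_ok is just a placeholder name):
>>>>
>>>> -- materialize the CTE first as a temporary table
>>>> create temporary table tab2 as
>>>> select
>>>>     id,
>>>>     lead(id) over (partition by id order by id) as x
>>>> from
>>>>     (select 1 id) a;
>>>>
>>>> -- then run the CTAS against the materialized table
>>>> create table tab_ok as
>>>> select
>>>>     lead(x) over (partition by id order by id) = 1
>>>> from
>>>>     tab2;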
>>>>
>>>> HTH
>>>>
>>>> On Tue, 3 Nov 2020 at 09:28, Bartek Kotwica <bkotw...@gmail.com> wrote:
>>>>
>>>>> Hi!
>>>>> I use Hive 3.1.0 and beeline.
>>>>> I have encountered a compilation error when issuing a CTAS query from
>>>>> beeline, but without the "create table" clause the query works as
>>>>> expected. Narrowed query to reproduce:
>>>>>
>>>>> create table tab_error as
>>>>> with tab2 as (
>>>>>     select
>>>>>         id,
>>>>>         lead(id) over (partition by id order by id) as x
>>>>>     from
>>>>>         (select 1 id) a
>>>>> )
>>>>> select
>>>>>   lead(x) over (partition by id order by id) = 1
>>>>> from
>>>>>     tab2
>>>>> ;
>>>>>
>>>>> ERROR:
>>>>> Error: Error while compiling statement: FAILED: SemanticException
>>>>> Failed to breakup Windowing invocations into Groups. At least 1 group must
>>>>> only depend on input columns. Also check for circular dependencies.
>>>>> Underlying error: org.apache.hadoop.hive.ql.parse.SemanticException:
>>>>> Line 0:-1 Invalid column reference '1': (possible column names are: )
>>>>> (state=42000,code=40000)
>>>>>
>>>>> Please confirm the issue, and I will create a JIRA ticket.
>>>>>
>>>>> Kind regards,
>>>>> Bartosz Kotwica
>>>>>
>>>>
