I don't think that can be right; distinct aggregates are not supported in window functions.
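
If your Spark version supports collect_set as a window function, one workaround is to take the size of a running set of b instead of count(distinct b). A minimal, untested sketch against the same query (the distinct_b alias is mine):

    -- size() of a running collect_set gives a running distinct count
    select a,
           size(collect_set(b) over (order by a
                rows between unbounded preceding and current row)) as distinct_b
    from table
    limit 10

Note this buffers all distinct values of b for each row, so it may be expensive over wide windows.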

> On Jan 22, 2016, at 4:53 PM, 汪洋 <tiandiwo...@icloud.com> wrote:
> 
> Hi,
> 
> Do we support distinct count in the over clause in Spark SQL?
> 
> I ran a SQL query like this:
> 
> select a, count(distinct b) over (order by a rows between
> unbounded preceding and current row) from table limit 10
> 
> Currently, it returns an error saying: expression 'a' is neither present in
> the group by, nor is it an aggregate function. Add to group by or wrap in
> first() if you don't care which value you get.;
> 
> Yang
> 
