Ah, I see.  

Thanks, Yin  

--  
Nan Zhu


On Monday, July 21, 2014 at 5:00 PM, Yin Huai wrote:

> Hi Nan,
>  
> It is basically a log entry because your table does not exist. It is not a 
> real exception.  
>  
> Thanks,
>  
> Yin
>  
>  
> On Mon, Jul 21, 2014 at 7:10 AM, Nan Zhu <zhunanmcg...@gmail.com> wrote:
> > a related JIRA: https://issues.apache.org/jira/browse/SPARK-2605  
> >  
> > --  
> > Nan Zhu
> >  
> >  
> > On Monday, July 21, 2014 at 10:10 AM, Nan Zhu wrote:
> >  
> > > Hi, all  
> > >  
> > > When I try hiveContext.hql("drop table if exists abc"), where abc is a 
> > > non-existent table,  
> > >  
> > > I still receive an exception about the non-existent table even though "if 
> > > exists" is there.
> > >  
> > > The same statement runs fine in the Hive shell.  
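> > >  
> > > For reference, a minimal repro from the Spark shell (a sketch, assuming a 
> > > Spark 1.0.x build with Hive support; "abc" is just a placeholder table name 
> > > that does not exist):
> > >  
> > >   import org.apache.spark.sql.hive.HiveContext
> > >   // sc is the SparkContext provided by the shell
> > >   val hiveContext = new HiveContext(sc)
> > >   // abc does not exist; "if exists" should make this a no-op, but the
> > >   // metastore lookup still shows up as an exception in the log output
> > >   hiveContext.hql("drop table if exists abc")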
> > >  
> > > Some feedback from the Hive community is here: 
> > > https://issues.apache.org/jira/browse/HIVE-7458  
> > >  
> > > “You are doing hiveContext.hql("DROP TABLE IF EXISTS hivetesting") in the 
> > > Scala shell of the Spark project.  
> > >  
> > > What is this shell doing? It queries the remote metastore for a non-existing 
> > > table (see your provided stack).
> > > The remote metastore throws 
> > > NoSuchObjectException(message:default.hivetesting table not found) because 
> > > Spark code calls 
> > > HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:854) on a 
> > > non-existing table. That is the right behavior.
> > > You should check in the Spark code why a query is made on a non-existing table.
> > >  
> > >  
> > > I think Spark does not handle the IF EXISTS part of this query well. 
> > > Maybe you could file a ticket in the Spark JIRA.
> > >  
> > > BUT, it's not a bug in HIVE IMHO.”
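> > >  
> > > To illustrate the IF EXISTS point, a rough sketch (not Spark's or Hive's 
> > > actual code; the helper name is hypothetical, only the HiveMetaStoreClient 
> > > calls are real) of how an IF EXISTS drop typically treats the missing-table 
> > > case, i.e. the NoSuchObjectException from the metastore is expected and 
> > > swallowed rather than surfaced:
> > >  
> > >   import org.apache.hadoop.hive.metastore.HiveMetaStoreClient
> > >   import org.apache.hadoop.hive.metastore.api.NoSuchObjectException
> > >  
> > >   // hypothetical helper, not Spark's implementation
> > >   def dropTableIfExists(client: HiveMetaStoreClient, db: String, table: String): Unit = {
> > >     try {
> > >       client.getTable(db, table)   // throws NoSuchObjectException when the table is absent
> > >       client.dropTable(db, table)  // only reached when the table exists
> > >     } catch {
> > >       case _: NoSuchObjectException => () // IF EXISTS: a missing table is not an error
> > >     }
> > >   }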
> > >  
> > > My question is: the DDL is executed by Hive itself, isn't it?
> > >  
> > > Best,  
> > >  
> > > --  
> > > Nan Zhu
> > >  
> >  
>  
