Hi,

I am trying to upgrade from Spark 1.3.1 to Spark 1.5.1, with Hadoop 2.7.1 and 
Hive 1.2.1.

But with the above spark-assembly, the "USE DEFAULT" functionality seems to be 
broken, failing with:

    Cannot recognize input near 'default' '<EOF>' '<EOF>' in switch database 
    statement; line 1 pos 4

    NoViableAltException(81@[])
        at org.apache.hadoop.hive.ql.parse.HiveParser_IdentifiersParser.identifier(HiveParser_IdentifiersParser.java:11577)
        at org.apache.hadoop.hive.ql.parse.HiveParser.identifier(HiveParser.java:46055)


From my analysis, this may be caused by the removal of the "KW_DEFAULT" keyword 
from the Hive parser grammar as part of HIVE-6617:
https://issues.apache.org/jira/browse/HIVE-6617
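
If the cause is that DEFAULT is now treated as a reserved word, then (as an 
untested guess on my side) backtick-quoting the identifier, or relaxing the 
reserved-keyword check, might work around it:

```sql
-- Fails on spark-1.5.1 with Hive 1.2.1 (switch database statement):
USE DEFAULT;

-- Possible workarounds (untested guesses):
USE `default`;                                   -- quote the identifier
SET hive.support.sql11.reserved.keywords=false;  -- relax reserved-keyword handling
```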


I tried adding the KW_DEFAULT keyword back to HiveParser.g and HiveLexer.g, 
but I guess I am missing something.
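
For reference, the change I attempted looked roughly like the following (a 
sketch from memory; the exact rule names and their locations in the 1.2.1 
grammar files may differ):

```antlr
// HiveLexer.g -- re-introduce the DEFAULT keyword token (sketch)
KW_DEFAULT : 'DEFAULT';

// IdentifiersParser.g -- allow it as a non-reserved identifier again,
// alongside the existing nonReserved alternatives (sketch; elided rules
// stand for the other alternatives already present)
nonReserved
    : ... | KW_DEFAULT | ...
    ;
```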


Can someone confirm whether this is indeed an issue, and whether a fix is 
planned for an upcoming release?


Thanks

Kashish Jain
