Hi, 

I am using Hive 0.11 with Hadoop 2.1.0-beta1, and when running an INSERT 
OVERWRITE (with my own SerDe) I get the exception below in many of my 
mappers. Any ideas? I even set hive.exec.parallel=false, with no luck.

Caused by:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException):
No lease on
/tmp/hive-hdfs/hive_2013-08-27_22-03-00_305_6489408868004948109/_task_tmp.-ext-10002/_tmp.000001_0:
File does not exist. Holder
DFSClient_attempt_1377640952593_0001_m_000001_0_-1310282481_1 does not
have any open files

Punnoose, Roshan
rashan.punnr...@merck.com

