Actually, the directory contains nothing but roughly 1 million empty sub-directories. I am sure the NameNode
won't like it.


> hadoop fs -du -s /tmp/hive/hive
0  /tmp/hive/hive
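
For what it's worth, -du -s only reports byte size; hadoop fs -count shows the directory count, which is what trips dfs.namenode.fs-limits.max-directory-items (the first output column is DIR_COUNT):

> hadoop fs -count /tmp/hive/hive
(columns: DIR_COUNT  FILE_COUNT  CONTENT_SIZE  PATHNAME)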

From: Frank Luo
Sent: Monday, February 22, 2016 12:26 PM
To: user@hive.apache.org
Subject: /tmp/hive/hive is exceeded: limit=1048576 items=1048576

Is there a setting somewhere to automatically remove old temp files from 
/tmp/hive/hive?

Otherwise, every Hive user will face this problem, and everyone will have to
develop something of their own to fix it, right?
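
What I have in mind as a stop-gap (just a sketch; it assumes GNU date and that no long-running query still owns a scratch dir older than the cutoff) is a cron job along these lines:

> CUTOFF=$(date -d "7 days ago" +%Y-%m-%d)
> hadoop fs -ls /tmp/hive/hive | awk -v c="$CUTOFF" 'NF>=8 && $6 < c {print $8}' | xargs -r -n 100 hadoop fs -rm -r -skipTrash

The awk filter keys off the modification-date column of hadoop fs -ls, and -skipTrash avoids piling the removed dirs up under .Trash. I believe newer Hive releases also ship a cleardanglingscratchdir service (hive --service cleardanglingscratchdir) for this, though I have not checked whether it exists in our version.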

