Re: how to limit mappers for a hive job

2013-04-24 Thread Edward Capriolo
Also make sure Hive is using CombineHiveInputFormat (not just HiveInputFormat); Combine is the default in newer versions. On Wed, Apr 24, 2013 at 10:51 AM, Sanjay Subramanian < sanjay.subraman...@wizecommerce.com> wrote: > I use the following > > To specify the Mapper Input Split Size (134217728 is in bytes)
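A hedged sketch of the advice above, for a Hive CLI session (the input-format class name is the standard Hive one; the 256 MB value is an illustrative choice, not from the thread):

```sql
-- Combine many small files/blocks into fewer map tasks
SET hive.input.format=org.apache.hadoop.hive.ql.io.CombineHiveInputFormat;
-- Upper bound on each combined split, in bytes (256 MB here, illustrative)
SET mapreduce.input.fileinputformat.split.maxsize=256000000;
```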

Re: Table present in HDFS but 'show table' Returns Empty

2013-04-24 Thread Xun TANG
That's exactly why! Thank you so much. Alice On Mon, Apr 22, 2013 at 11:12 PM, Ramki Palle wrote: > Maybe you are using Derby as your metastore. It creates the metastore in > the current directory from where you started your hive session. You may > have started your hive session from a different directory
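A sketch of how to confirm this diagnosis (the property name is standard Hive configuration; the path in the note below is an illustrative assumption):

```sql
-- In the Hive CLI, print where the metastore connection points.
-- With embedded Derby the default URL uses a relative path, so each working
-- directory gets its own metastore_db and therefore its own set of tables.
SET javax.jdo.option.ConnectionURL;
```

Pointing `javax.jdo.option.ConnectionURL` in hive-site.xml at an absolute location, e.g. `jdbc:derby:;databaseName=/var/hive/metastore_db;create=true`, makes `show tables` return the same result no matter which directory the session is started from.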

Re: how to limit mappers for a hive job

2013-04-24 Thread Sanjay Subramanian
I use the following to specify the Mapper Input Split Size (134217728 is in bytes): SET mapreduce.input.fileinputformat.split.maxsize=134217728;
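Expanding the one-liner above into a fuller sketch (the min-size property is the standard MapReduce counterpart; setting it as well is an assumption beyond what the thread shows):

```sql
-- 134217728 bytes = 128 MB, i.e. roughly one default HDFS block per mapper
SET mapreduce.input.fileinputformat.split.maxsize=134217728;
-- Optionally raise the lower bound too, so splits are never smaller than one block
SET mapreduce.input.fileinputformat.split.minsize=134217728;
```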

how to limit mappers for a hive job

2013-04-24 Thread Frank Luo
I am trying to query a huge file with 370 blocks, but it errors out with the message "number of mappers exceeds limit", and my cluster has "mapred.tasktracker.map.tasks.maximum" set to 50. I have tried to set parameters such as hive.exec.mappers.max / mapred.tasktracker.tasks / mapred.tasktracker
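A back-of-the-envelope sketch of how the replies above resolve this (the 128 MB block size is an assumption; the 370 blocks and 50-task limit come from the question):

```sql
-- 370 blocks / 50 mappers ≈ 8 blocks per split, so with 128 MB blocks the
-- combined split must be allowed to reach 8 * 128 MB = 1 GB.
SET hive.input.format=org.apache.hadoop.hive.ql.io.CombineHiveInputFormat;
SET mapreduce.input.fileinputformat.split.maxsize=1073741824;  -- 1 GB in bytes
```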