Re: Intermittent BindException during long MR jobs

2015-03-25 Thread Krishna Rao
the IP address availability and run the job. Thanks, S.RagavendraGanesh, ViSolve Hadoop Support Team, ViSolve Inc. | San Jose, California. Website: www.visolve.com, email: servi...@visolve.c

Intermittent BindException during long MR jobs

2015-02-26 Thread Krishna Rao
Hi, we occasionally run into a BindException that causes long-running jobs to fail. The stack trace is below. Any ideas what could be causing this? Cheers, Krishna. Stack trace: 379969 [Thread-980] ERROR org.apache.hadoop.hive.ql.exec.Task - Job Submission failed with exception 'jav

Reduce the amount of logging going into /var/log/hive/userlogs

2014-06-13 Thread Krishna Rao
Last time I looked there wasn't much info available on how to reduce the size of the logs written here (the only suggestion being to delete them after a day). Is there anything I can do now to reduce what's logged there in the first place? Cheers, Krishna
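
One way to cut the volume, sketched under the assumption that the bulk of those files are MapReduce task logs and that the cluster honours per-job log levels (the query is a placeholder):

  # raise the task-log threshold for a single session or script
  hive -e "
    set mapreduce.map.log.level=WARN;
    set mapreduce.reduce.log.level=WARN;
    SELECT COUNT(*) FROM some_table;
  "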

Hive query parser bug resulting in "FAILED: NullPointerException null"

2014-02-27 Thread Krishna Rao
Hi all, we've experienced a bug which seems to be caused by having a query constraint involving partitioned columns. The following query results in "FAILED: NullPointerException null" being returned nearly instantly: EXPLAIN SELECT col1 FROM tbl1 WHERE (part_col1 = 2014 AND part_col2 >= 2) OR
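
For reference, a minimal reproduction sketch of the query shape described above, with hypothetical table and partition column names (the real query is truncated in the archive, so the second branch of the OR is invented):

  # hypothetical repro: OR combining predicates on partition columns
  hive -e "
    EXPLAIN SELECT col1 FROM tbl1
    WHERE (part_col1 = 2014 AND part_col2 >= 2)
       OR (part_col1 > 2014);
  "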

Failed to report status for x minutes

2013-11-29 Thread Krishna Rao
Hi all, We've been running into this problem a lot recently on a particular reduce task. I'm aware that I can work around it by upping "mapred.task.timeout". However, I would like to know what the underlying problem is. How can I find this out? Alternatively, can I force a generated hive task
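
The workaround mentioned, sketched as a per-session setting; the value is an arbitrary example (milliseconds) and the property name is the pre-YARN one used in the post:

  # raise the task timeout for this session only (example: 20 minutes)
  hive -e "
    set mapred.task.timeout=1200000;
    -- the slow reduce-heavy query goes here
    SELECT COUNT(*) FROM some_table;
  "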

Re: Add external jars automatically

2013-03-13 Thread Krishna Rao
uxlib dir. There always is the HIVE_AUX_JARS_PATH environment variable (but this introduces a dependency on the environment). On Wed, Mar 13, 2013 at 10:26 AM, Krishna Rao wrote: Hi all, I'm using the hive json serde and need to run: &qu
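
A sketch of the two alternatives mentioned in this reply; the jar path is the one from the question below, and the auxlib location is an assumption that depends on the Hive install layout:

  # option 1: point HIVE_AUX_JARS_PATH at the jar (introduces an env dependency)
  export HIVE_AUX_JARS_PATH=/usr/lib/hive/lib/hive-json-serde-0.2.jar
  hive

  # option 2: drop the jar into Hive's auxlib directory (path is an assumption)
  cp /usr/lib/hive/lib/hive-json-serde-0.2.jar /usr/lib/hive/auxlib/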

Add external jars automatically

2013-03-13 Thread Krishna Rao
Hi all, I'm using the hive json serde and need to run: "ADD JAR /usr/lib/hive/lib/hive-json-serde-0.2.jar;", before I can use tables that require it. Is it possible to have this jar available automatically? I could do it via adding the statement to a .hiverc file, but I was wondering if there is
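
The .hiverc route mentioned here, as a minimal per-user sketch; where (and whether) a shared .hiverc is picked up varies by Hive version, so see the sketch further down under the log-level thread for a global placement:

  # statements in ~/.hiverc run automatically at the start of every CLI session
  echo 'ADD JAR /usr/lib/hive/lib/hive-json-serde-0.2.jar;' >> ~/.hiverc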

Re: hive commands from a file

2013-03-04 Thread Krishna Rao
Hi Sai, just use the "-f" arg together with the file name. For details see: https://cwiki.apache.org/Hive/languagemanual-cli.html Krishna. On 4 March 2013 10:24, Sai Sai wrote: Just wondering if it is possible to run a bunch of hive commands from a file rather than one at a time. For ex:
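
A minimal sketch of the "-f" usage being described; the script name and its contents are placeholders:

  # put the statements in a file, one per line, then run them in one go
  printf 'SHOW TABLES;\nSELECT COUNT(*) FROM some_table;\n' > queries.hql
  hive -f queries.hql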

NoClassDefFoundError: org/apache/hadoop/mapreduce/util/HostUtil

2013-02-07 Thread Krishna Rao
Hi all, I'm occasionally getting the following error, usually after running an expensive Hive query (creating 20 or so MR jobs): *** Error during job, obtaining debugging information... Examining task ID: task_201301291405_1640_r_01 (and more) from job job_201301291405_1640 Exception in threa
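
One hedged way to sidestep the failing code path, assuming the error is thrown from Hive's post-failure task-log fetching (the step that prints "obtaining debugging information"): disable that step so the underlying job error is reported directly.

  # skip the debug-info gathering that appears to trigger the missing HostUtil class
  hive -e "
    set hive.exec.show.job.failure.debug.info=false;
    -- the expensive multi-job query goes here
    SELECT COUNT(*) FROM some_table;
  "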

Re: Find out what's causing an InvalidOperationException

2013-01-09 Thread Krishna Rao
ive table definition of both the tables? Are both columns of the same type? On Wed, Jan 9, 2013 at 5:15 AM, Krishna Rao wrote: Hi all, On running a statement of the form "INSERT INTO TABLE tbl1 PARTITION(p1) SELECT x1 FROM tb

Find out what's causing an InvalidOperationException

2013-01-09 Thread Krishna Rao
Hi all, On running a statement of the form "INSERT INTO TABLE tbl1 PARTITION(p1) SELECT x1 FROM tbl2", I get the following error: "Failed with exception java.lang.ClassCastException: org.apache.hadoop.hive.metastore.api.InvalidOperationException cannot be cast to java.lang.RuntimeException" How
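
One way to surface the underlying metastore error instead of the ClassCastException, assuming the CLI is in use; the statement is a placeholder mirroring the one above:

  # rerun the statement with verbose client-side logging to see the original error
  hive --hiveconf hive.root.logger=DEBUG,console \
       -e "INSERT INTO TABLE tbl1 PARTITION(p1) SELECT x1 FROM tbl2;"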

Re: Job counters limit exceeded exception

2013-01-04 Thread Krishna Rao
f thumb for Hive: count of operators * 4 + n (n for file ops and other stuff). Cheers, Alex. On Jan 2, 2013, at 10:35 AM, Krishna Rao wrote: A particular query that I run fails with the following error: ***
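
A hedged sketch of raising the limit: on MRv2 the per-job property below often works, whereas on MRv1 the equivalent mapred.job.counters.limit generally has to be raised cluster-wide, so treat the property name and value as assumptions for your version:

  # try raising the per-job counter cap before the failing query (MRv2-style name)
  hive -e "
    set mapreduce.job.counters.max=512;
    -- the operator-heavy query goes here
    SELECT COUNT(*) FROM some_table;
  "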

Re: Possible to set map/reduce log level in configuration file?

2012-12-18 Thread Krishna Rao
On 18 December 2012 02:05, Mark Grover wrote: "I usually put it in my home directory and that works. Did you try that?" I need it to work for all users, so the cleanest non-duplicating solution seems to be the hive bin directory (and then the conf dir, when I upgrade Hive).

Re: Possible to set map/reduce log level in configuration file?

2012-12-17 Thread Krishna Rao
alternatively, you can create a .hiverc in your home directory and set the parameters you want; these will be included in each session. On Fri, Dec 14, 2012 at 4:05 PM, Krishna Rao wrote: Hi all, is

Possible to set map/reduce log level in configuration file?

2012-12-14 Thread Krishna Rao
Hi all, is it possible to set mapreduce.map.log.level & mapreduce.reduce.log.level within some config file? At the moment I have to remember to set these at the start of a hive session or script. Cheers, Krishna
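
A sketch of persisting these settings, based on the placement discussed in the replies above; whether a .hiverc next to the hive launcher is honoured depends on the Hive version, so the global path is an assumption:

  # per-user: ~/.hiverc is read automatically at session start
  printf 'set mapreduce.map.log.level=WARN;\nset mapreduce.reduce.log.level=WARN;\n' >> ~/.hiverc

  # for all users: the same file next to the hive launcher (path is an assumption)
  sudo cp ~/.hiverc /usr/lib/hive/bin/.hiverc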

Re: Problems Sqoop importing columns with NULLs

2012-12-04 Thread Krishna Rao
The currently suggested workaround is to use a JDBC-based import by dropping the "--direct" argument. Links: 1: https://issues.apache.org/jira/browse/SQOOP-654 On Tue, Dec 04, 2012 at 05:04:56PM +, Krishna Rao wrote: Hi all,

Problems Sqoop importing columns with NULLs

2012-12-04 Thread Krishna Rao
Hi all, I'm having trouble transferring NULLs in a VARCHAR column in a table in PostgreSQL into Hive. A null value ends up as an empty value in Hive, rather than NULL. I'm running the following command: sqoop import --username -P --hive-import --hive-overwrite --null-string='\\N' --null-non-stri
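
A sketch of the command with the workaround from the reply above applied (no "--direct", so the JDBC code path handles the null substitution); the connection details and table name are placeholders since the original command is truncated:

  # JDBC-based import; \N is Hive's default on-disk representation for NULL
  sqoop import \
    --connect jdbc:postgresql://dbhost/dbname \
    --username someuser -P \
    --table some_table \
    --hive-import --hive-overwrite \
    --null-string '\\N' --null-non-string '\\N'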

Re: Hive compression with external table

2012-11-06 Thread Krishna Rao
are in block format are always splittable regardless of what compression for the block is chosen. The Programming Hive book has an entire section dedicated to the permutations of compression options. Edward. On Mon, Nov 5, 2012 at 10:57 AM, Krishna Rao wrote:

Hive compression with external table

2012-11-05 Thread Krishna Rao
Hi all, I'm looking into finding a suitable format to store data in HDFS, so that it's available for processing by Hive. Ideally I would like to satisfy the following: 1. store the data in a format that is readable by multiple Hadoop projects (eg. Pig, Mahout, etc.), not just Hive 2. work with a
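
A minimal sketch of the block-compressed SequenceFile approach discussed in the reply above; the table names, location and codec are placeholders, and whether Snappy is installed depends on the cluster:

  # write block-compressed SequenceFiles, which stay splittable for Hive and other MR tools
  hive -e "
    set hive.exec.compress.output=true;
    set mapred.output.compression.type=BLOCK;
    set mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec;
    CREATE EXTERNAL TABLE events_seq (line STRING)
      STORED AS SEQUENCEFILE
      LOCATION '/data/events_seq';
    INSERT OVERWRITE TABLE events_seq SELECT line FROM events_text;
  "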