I am trying to do log analytics on the logs created by Flume. Hive queries
are failing with the error below, yet "hadoop fs -cat" works on all of these
open files. Is there a way to read these open files? My requirement is to
read the data from the open files too. I am using Tez as the execution engine.
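If it helps to see which files are still being written, HDFS can list them directly. A minimal sketch, assuming the Flume sink writes under /flume/events (the path is a guess; substitute your sink's HDFS directory):

```shell
# List files still open for write under the (assumed) Flume sink directory.
# Files whose last block is still unfinalized may behave differently for
# Hive/Tez split generation than for "hadoop fs -cat".
hdfs fsck /flume/events -files -openforwrite
```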
Hi Nitin, how do I check this? Do you mean I should check hive-site.xml?
Please let me know how to check it.
From: Nitin Pawar
To: "user@hive.apache.org"
Date: 07/09/2015 07:35 PM
Subject: Re: Hive Query Error
Can you check your config?
The host appears twice: 01hw357381.tcsgegdc.com: 01hw357381.tcsgegdc.com
It should be hostname:port.
Also, once you correct this, do an nslookup on the host to make sure it's
identified by the Hive client.
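For reference, a correct hostname:port pair in hive-site.xml looks like the fragment below. Which property holds the duplicated host is a guess here (hive.metastore.uris is a common candidate), and 9083 is the default metastore port:

```xml
<!-- Hedged example: property name is an assumption; the point is that the
     value must be hostname:port, not hostname:hostname. -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://01hw357381.tcsgegdc.com:9083</value>
</property>
```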
On Thu, Jul 9, 2015 at 7:19 PM, Ajeet O wrote:
Hi All, I have installed Hadoop 2.0 and Hive 0.12 on CentOS 7.
When I run the query select count(*) from u_data; in Hive, it gives the
following errors. However, I can run select * from u_data;. Please help.
hive> select count(*) from u_data;
Total MapReduce jobs = 1
Launching Job 1 out o
file this one under RTFM.
On Wed, Feb 5, 2014 at 9:11 AM, Nitin Pawar wrote:
it's: create table xyz stored as sequencefile as select blah from table
On Wed, Feb 5, 2014 at 10:37 PM, Raj Hadoop wrote:
I am trying to create a Hive sequence file from another table by running the
following -
Your query has the following error(s):
OK
FAILED: ParseException line 5:0 cannot recognize input near 'STORED' 'STORED'
'AS' in constant click the Error Log tab above for details
CREATE TABLE temp_xyz as
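Per Nitin's reply in this thread, the STORED AS clause goes between the table name and AS SELECT, which is what the parser is complaining about. A sketch using the table name from the thread (the SELECT list and source table are placeholders, since the original query is truncated above):

```sql
-- STORED AS must precede AS SELECT in a CTAS statement.
CREATE TABLE temp_xyz
STORED AS SEQUENCEFILE
AS SELECT * FROM source_table;  -- source_table is illustrative only
```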
Thanks Bing, I found it.
2013/8/22 Bing Li
By default, hive.log should exist in /tmp/.
Also, it could be set in $HIVE_HOME/conf/hive-log4j.properties and
hive-exec-log4j.properties
- hive.log.dir
- hive.log.file
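For reference, the relevant lines in $HIVE_HOME/conf/hive-log4j.properties look roughly like this; the values shown are the usual defaults, not taken from the thread:

```properties
# Default log location; java.io.tmpdir is typically /tmp, which is why
# hive.log usually ends up under /tmp/<username>/.
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log
```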
2013/8/22 闫昆
Hi all,
when I execute a Hive query it throws the exception below.
I don't know where the error log is; I found that $HIVE_HOME/ logs do not exist.
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks not specified. Estimated from input data size: 3
In order to change the average load for a reducer (in
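The hints Hive prints at this point (truncated above) refer to settings like the following; the property names are the Hive 0.x-era ones and the values are examples only, not recommendations:

```sql
-- Change the average load per reducer (bytes; example value)
SET hive.exec.reducers.bytes.per.reducer=256000000;
-- Or set a constant number of reducers directly (example value)
SET mapred.reduce.tasks=3;
```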
Hi all,
I executed a query but got an error. Does anyone know what happened? BTW, I am
using the YARN framework.
2013-08-22 09:47:09,893 Stage-1 map = 28%, reduce = 1%, Cumulative CPU
4140.64 sec
2013-08-22 09:47:10,952 Stage-1 map = 28%, reduce = 1%, Cumulative CPU
4140.72 sec
2013-08-22 09:47:12,008 Stage-1 map = 28