Try downloading the jar files and putting them in the libraries folder.
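Another option, if the jar is reachable from the client, is to add it per-session; a minimal sketch, with a hypothetical path, is to run ADD JAR as its own statement over the same JDBC connection before the query:

```sql
-- Hypothetical jar path; execute this through the same JDBC Statement
-- before the query so the MapReduce job can locate the SerDe.
ADD JAR /path/to/your-serde.jar;
```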
On Tue, Sep 23, 2014 at 10:58 AM, Shiang Luong
wrote:
> Hi All,
>
> I'm new to Hive. I'm having some problems querying a Hive table with
> JDBC. It fails when it tries to run a MapReduce job. It can't seem
> to find the s
Hi All,
I'm new to Hive. I'm having some problems querying a Hive table with
JDBC. It fails when it tries to run a MapReduce job. It can't seem
to find the SerDe jar file. When I query it through the command line it
works fine. Does anyone have any hints on how I can get it working with JDBC?
Hi Dhaval,
1. You can add a new partition column based on the range of the column
you are interested in.
2. Try using dynamic partitioning:
https://cwiki.apache.org/confluence/display/Hive/DynamicPartitions
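The dynamic-partition route can be sketched like this (table and column names are made up for illustration):

```sql
-- Enable dynamic partitioning (nonstrict allows all partitions to be dynamic)
set hive.exec.dynamic.partition = true;
set hive.exec.dynamic.partition.mode = nonstrict;

-- Derive a range bucket from the column of interest and partition on it
INSERT OVERWRITE TABLE target_table PARTITION (range_bucket)
SELECT col_a, col_b,
       CASE WHEN val < 100  THEN 'low'
            WHEN val < 1000 THEN 'mid'
            ELSE 'high' END AS range_bucket
FROM source_table;
```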
On Tue, Sep 23, 2014 at 10:49 AM, Dhaval Patel wrote:
> -- Forwarded message --
Hive doesn't know it needs to skip your square brackets, so your values
are really "[1", "2", and "3]". "[1" and "3]" cannot be parsed as
numbers, so they become NULL.
I think you could instead read the second column as a single string. Then
you can remove the brackets, and use a UDF (write your own if the
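The approach described (read the column as a plain string, strip the brackets, then convert) can also be sketched with built-ins rather than a custom UDF; the table and column names here are assumptions:

```sql
-- hist_str is the bracketed field read as a plain string, e.g. '[1,2,3]'
SELECT split(regexp_replace(hist_str, '\\[|\\]', ''), ',') AS hist_array
FROM h_histo_raw;
-- Each element is still a string; cast individual elements as needed,
-- e.g. cast(hist_array[0] as int).
```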
Hi,
The /tmp folder is already there with 777 permissions.
Thanks,
KG
On Sun, Sep 21, 2014 at 11:42 PM, Abirami V
wrote:
> Do you have the hdfs://hostname:9000/tmp directory in HDFS? If not,
> create /tmp in HDFS and make it writable by giving it 777 permissions.
>
> Thanks,
> Abirami
>
> On Sun, Sep
-- Forwarded message --
From: "Dhaval Patel"
Date: 22/09/2014 7:33 pm
Subject: Queries : partitioning
To:
Cc:
Hey folks,
1) A Hive table can be partitioned by column; is there any way to
partition by range?
2) While loading data into a Hive table, we specify the partition column
value
Hi,
I have a '|'-delimited file where arrays are serialized with square
brackets. I am trying to create a Hive table to parse this file.
Example:
first|[1,2,3]|100
second|[11,12,13]|200
Create External Table H_histoTest(dim1 string, hist ARRAY, measure1
bigint)
ROW FORMAT DELIMITED FIELDS
TERMIN
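The statement above is cut off; a complete form consistent with the sample rows might look like the following (the array element type, delimiters, and location are assumptions):

```sql
CREATE EXTERNAL TABLE H_histoTest (
  dim1     string,
  hist     ARRAY<int>,
  measure1 bigint)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '|'
  COLLECTION ITEMS TERMINATED BY ','
LOCATION '/path/to/data';  -- hypothetical HDFS location
```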
Done. Thanks for contributing to the wiki, Naveen!
-- Lefty
On Mon, Sep 22, 2014 at 8:32 PM, Naveen Gangam wrote:
> Could the wiki admin(s) please grant me access to edit wiki pages. My
> confluence user id is "ngangam". I need to update a wiki page as part of a
> fix I made recently.
>
> --
>
Could the wiki admin(s) please grant me access to edit wiki pages. My
confluence user id is "ngangam". I need to update a wiki page as part of a
fix I made recently.
--
Thanks,
Naveen :)
So I found out more detail about this issue.
If, in:
select cast('2999-12-31 23:59:59' as timestamp) from table;
the table holds ORC data, you are using Hive 0.13, and you have set
hive.vectorized.execution.enabled = true;
then this issue occurs. It may be related to HIVE-6656; I'm not certain
of that.
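A minimal reproduction of the conditions described, with a made-up table name:

```sql
-- orc_table is any table stored as ORC, queried on Hive 0.13
set hive.vectorized.execution.enabled = true;
select cast('2999-12-31 23:59:59' as timestamp) from orc_table;
-- Per the report above, the issue appears only with vectorization on;
-- disabling it is one way to check:
set hive.vectorized.execution.enabled = false;
```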
Dear Hive users:
A quick question about converting Hive LongWritable to long.
I have a generic UDF called protect_column, which works well as follows:
select protect_column(auction_id_64, 'auction_id_64', vp_bitmap) from table;
It also works well when I run
select * from ( select protect_column(
Hi All,
I am using Oracle as the Hive metastore. I could see the schema created in
Oracle after logging into Hive and executing the create database command
successfully.
When I try to create a table in Hive, it throws an error; please help.
hive> create table test ( name string,age int) row format de
Hi Siva,
If I were to address your problem, I would create a table in the Hive
metastore DB and load the Hive query text plus timing into it, by writing
my own custom shell script.
I would need to do this on all Hive client machines (not just one),
maybe by suitably adding $HOSTNAME as another variab
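The custom-script idea can be sketched as a small POSIX-shell wrapper; all names here, including the log format and the HIVE_QUERY_LOG variable, are made up for illustration:

```shell
# Hypothetical wrapper: runs a query through the hive CLI when it is
# installed, and appends "timestamp|hostname|query|seconds" to a log
# file that a later job could load into a metastore-side table.
log_hive_query() {
    query="$1"
    logfile="${HIVE_QUERY_LOG:-/tmp/hive_query_log.txt}"
    start=$(date +%s)
    if command -v hive >/dev/null 2>&1; then
        hive -e "$query"      # skipped on machines without the hive CLI
    fi
    end=$(date +%s)
    printf '%s|%s|%s|%ss\n' "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
        "$(hostname)" "$query" "$((end - start))" >> "$logfile"
}
```

Running it on every client machine and pointing HIVE_QUERY_LOG at a collected location would give the per-host query history described above.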
Hi Dev,
When I run a Hive query in the Hive shell, the query is stored in the
Hive history file at the default location. Since my metastore is SQL for
Hive, whenever I create a table with some columns in the Hive shell, it
gets populated in the SQL metastore database. Similarly, all the queries
should be pop
Shushant -
What I believe Stephen is sarcastically trying to say is that some
organizational education may be in order here. Hive itself is not even at
version 1.0; those of us who use Hive in production know this, and have
to accept that there will be bugs like the one you are trying to addr