o:nanoh...@gmail.com>> wrote:
Thanks for clarifying!
Best Wishes,
Chiming HUANG
On Sat, May 16, 2020 at 11:40 PM Shawn Weeks
mailto:swe...@weeksconsulting.us>> wrote:
Starting in Hive 3.x all internal tables are transactional by default. Unless
you need the buckets you should be able to just say create table and drop
everything after partitioned by.
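For illustration, a minimal sketch of the DDL being described, with hypothetical table and column names:

create table web_events (
  event_id bigint,
  event_ts timestamp,
  payload string
)
partitioned by (event_date string)
stored as orc;

-- On Hive 3.x a managed table created like this is transactional (ACID) by
-- default; no clustered by / bucket clause is required.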
Thanks
Shawn
From: Huang Chiming
Reply-To: "user@hive.apache.org"
Date: Saturday, May 16, 2020 at 10:32 AM
T
Depending on what version of Hive, you're looking for TimestampWritable or one
of its related classes.
Thanks
Shawn
On 1/22/20, 6:51 AM, "Nicolas Paris" wrote:
Hi
I cannot find the way to implement hive UDF dealing with timestamp type.
I tried both java.sql.Timestamp and imp
Another note to go with this: you have to run the plsql shell to use
procedures. It's not integrated into HiveServer2 nor accessible from JDBC, so
it's still fairly limited.
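For context, a rough sketch of what that looks like, assuming HPL/SQL syntax (the procedure itself is hypothetical); it runs through the standalone hplsql command-line tool, e.g. hplsql -f greet.sql, not through beeline:

CREATE PROCEDURE greet(name STRING)
BEGIN
  PRINT 'Hello, ' || name;
END;

CALL greet('world');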
Thanks
Shawn
From: Suresh Kumar Sethuramaswamy
Reply-To: "user@hive.apache.org"
Date: Monday, January 13, 2020 at 9:06
That looks like you've encountered a file with no delimiter, as that's near the
max size for an array or string. Also, I don't think you can terminate fields
with a line feed, as that's the hard-coded row delimiter.
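A hedged sketch of the kind of DDL this implies, with hypothetical names and paths: give fields a delimiter other than the line feed, since '\n' is already the row terminator.

create external table raw_logs (
  line_id bigint,
  message string
)
row format delimited
  fields terminated by '\t'
  lines terminated by '\n'
stored as textfile
location '/data/raw_logs';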
Thanks
Shawn
From: xuanhuang <18351886...@163.com>
Reply-To: "user@hive.apache.org
I'm not sure about Hive 1.3 specifically, but in other versions the data is
written to a temp location and then, at the end of the query, the previous data
is deleted and the new data is renamed/moved. Something to watch out for: if
the query returns no rows, then the old data isn't removed.
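A minimal sketch of that pattern, with hypothetical table and column names:

insert overwrite table target_table
select id, payload
from staging_table
where load_date = '2020-01-01';

-- Per the caveat above, if this select returns no rows the existing data in
-- target_table may be left in place rather than cleared.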
Thanks
Shaw
ables and views I've created. Is
DatabaseMetaData broken in Hive 3.x?
Thanks
Shawn Weeks
I've been using DBeaver or the DBeaver plugin for Eclipse for most of my SQL
requirements. It supports virtually any JDBC connection. I've also used
JetBrains DataGrip, a commercial solution that also seems to work pretty well.
Thanks
Shawn
From: Jon Morisi
Sent
I think they misunderstood; you're talking about this:
https://cwiki.apache.org/confluence/display/Hive/AdminManual+Metastore+Administration.
As far as I know, the only authentication it supports is Kerberos.
Thanks
Shawn
From: Odon Copon
Sent: Wednesday, April 17, 2019 3:16 AM
To: user@hive.apach
In my company the Windows servers aren't part of the same domain as the Hadoop
servers, so we've been using Apache Knox to enable username/password auth to a
Kerberos-enabled Hive instance. This has been tested with the Hortonworks HDP
2.6.5 distribution of Hive and Tableau.
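For reference, a hedged example of the kind of JDBC URL used when going through a Knox gateway; the host, port, and httpPath are hypothetical and depend on the Knox topology:

jdbc:hive2://knox.example.com:8443/;ssl=true;transportMode=http;httpPath=gateway/default/hive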
Thanks
Shawn
From:
What proxy are you using, and can you share the proxy config and the beeline
URL you're trying?
Thanks
Shawn
Sent from my iPhone
> On Apr 3, 2019, at 2:39 AM, Andy Srine wrote:
>
> Team,
>
> Any ideas on how to connect to HiveServer2 from Beeline via a proxy server.
>
> I have tried everything I
Something like this should work on 1.2.1 and onward. Or if you're accessing
with JDBC, you can always bind in an array as well. We do this all over the
place.
with
x as (select explode(split('1,2,3,4,5',',')) as y)
select *
from x;
From: Mainak Ghosh
Sent: Friday, March 29, 2019 11:02 AM
To
You'll need to create Ranger HDFS Policies to allow the specific user access to
the external table directory.
Thanks
Shawn Weeks
-Original Message-
From: Kristopher Kane
Sent: Friday, March 29, 2019 9:11 AM
To: user@hive.apache.org
Subject: External table data and Ranger Sec
ntainer running in the
default queue. Looking at the logs, it's also being started by llap. I was
wondering what it was for, since any queries I run go to the other tez
container in the llap queue.
Thanks
Shawn Weeks
Thanks Rajkumar for HIVE-21499, I was hoping this was a bug and not something
related to the class path bugs I'm working on.
From: Shawn Weeks
Sent: Sunday, March 24, 2019 3:42 PM
To: user@hive.apache.org
Subject: Custom UDF Disappears After AlreadyExistsException
Looking to see if the
exists error.
Attempting to create the function again raises an AlreadyExistsException and
dropping and attempting to recreate does the same thing.
Thanks
Shawn Weeks
Does anyone know where it's documented that "dec" is now a reserved keyword in
Hive 3.1?
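As a workaround either way, an identifier named dec can still be referenced by quoting it with backticks (table name hypothetical):

select `dec` from some_table;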
Thanks
8. delete jars; - For some reason this has to be the first command run
9. Execute select using xml_explode – Class path print shows 1 instance of
the jar (see the sketch after this list).
10. It works
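A hedged sketch of the command sequence those steps describe; the jar path, column, and table names are hypothetical, and it assumes the xml_explode function is already registered:

delete jars;
add jar file:///tmp/udfs/xml-udfs.jar;
list jars;   -- class path check: the jar should appear exactly once
select xml_explode(xml_col) from some_table;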
Thanks
Shawn
From: Shawn Weeks
Sent: Tuesday, March 5, 2019 4:04 PM
To: user@hive.apache.org
Subject: RE: FW: Custom UDF
to add custom UDFs. Not sure what
the current state is.
Kevin Risden
On Tue, Mar 5, 2019 at 4:46 PM Shawn Weeks
mailto:swe...@weeksconsulting.us>> wrote:
Didn't hear anything on the dev mailing list; has anyone here seen this
scenario? Custom Hive Function starts throwing clas
as
these classes in there.
I'm on Hive 1.2.1 with HDP 2.6.5 patches
Thanks
Shawn Weeks
2019-03-04 19:13:31,175 WARN [HiveServer2-HttpHandler-Pool: Thread-70490]:
servlet.ServletHandler (ServletHandler.java:doHandle(571)) - Error for
/cliservice
java.lang.NoClassDefFoundError: net/sf
frame)
- org.apache.hadoop.util.RunJar.run(java.lang.String[]) @bci=450, line=233
(Interpreted frame)
- org.apache.hadoop.util.RunJar.main(java.lang.String[]) @bci=8, line=148
(Interpreted frame)
From: Shawn Weeks
Sent: Tuesday, February 19, 2019 6:19 PM
To
e. I'm attaching a
jstack trace as well. Not sure if this is a bug or a limitation.
Thanks
Shawn Weeks
Here is the debug logs from Beeline
0: jdbc:hive2:///> add jar
file:///home/1454256952/Projects/hive/lib/hive-custom-io-1.0.jar;
19/02/19 18:09:26 [main]: INFO session.HiveSessionImpl:
It looks like when you call getFunctions on DatabaseMetaData, you get a
semantic exception if all functions aren't whitelisted. Is there a way around
this, or a specific version it's fixed in? I either wouldn't expect
introspection calls to get blocked or I'd expect restricted functions not to
get r
Looking closer, this looks like something DataGrip is breaking, not Hive.
Thanks
Shawn
From: Shawn Weeks
Sent: Thursday, October 18, 2018 8:00 AM
To: user@hive.apache.org
Subject: Hive 1.2.1 - Error getting functions
I'm working on a small project to get embedded Hive instances running in D
(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
If anyone is curious, the project is here:
https://github.com/shawnweeks/hive_docker.
Thanks
Shawn Weeks
gions. The query is a very
simple create temporary table as select * from hbase_table; but the HBase
table has a lot of records.
Thanks
Shawn Weeks
Let me rephrase that: I've set compactor.mapreduce.map.memory.mb to 212992,
which is the largest container size the cluster can support.
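A hedged sketch of one way to pass that setting for a single compaction run, using ALTER TABLE ... COMPACT with per-compaction properties (table name hypothetical):

alter table big_acid_table compact 'major'
with overwrite tblproperties ("compactor.mapreduce.map.memory.mb"="212992");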
Thanks
Shawn
From: Shawn Weeks
Sent: Monday, September 17, 2018 3:44:44 PM
To: user@hive.apache.org
Subject: Re:
I've already tried giving the compactor 256+ gigabytes of memory. All that
changes is how long it takes for it to run out of memory.
Thanks
Shawn Weeks
From: Owen O'Malley
Sent: Monday, September 17, 2018 3:37:09 PM
To: user@hive.apache.org
Subject: Re: Hive
Tried the Binary thing, but since Hive Streaming in HDP 2.6 doesn't support
Binary column types, that's not going to work. See HIVE-18613.
Thanks
Shawn Weeks
____
From: Shawn Weeks
Sent: Monday, September 17, 2018 12:28:25 PM
To: user@hive.apache.org
S
binary type that should help avoid this issue.
Thanks
Prasanth
On Mon, Sep 17, 2018 at 9:10 AM -0700, "Shawn Weeks"
mailto:swe...@weeksconsulting.us>> wrote:
Let me start off by saying I've backed myself into a corner and would rather
not reprocess the data if possible. I
working on changing how
the data gets loaded. But I've got this table with so many deltas that the
Hive Compaction runs out of memory, and any queries on the table run out of
memory. Any ideas on how I might get the data out of the table and split it
into more buckets or something?
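A rough sketch of the mechanics of that re-bucketing, with hypothetical table names, columns, and bucket count:

create table events_rebucketed (
  event_id bigint,
  payload string
)
clustered by (event_id) into 64 buckets
stored as orc
tblproperties ('transactional'='true');

insert into events_rebucketed
select event_id, payload
from events_with_many_deltas;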
Thanks
Shawn Weeks
It doesn't help if you need concurrent threads writing to a table but we are
just using the row_number analytic and a max value subquery to generate
sequences on our star schema warehouse. It has worked pretty well so far. To
provide true sequence support would require changes on the hive meta d
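A hedged sketch of the row_number plus max-value pattern described above, with hypothetical table and column names:

insert into table dim_customer
select mx.max_key + row_number() over (order by s.customer_code) as customer_key,
       s.customer_code,
       s.customer_name
from staging_customer s
cross join (select coalesce(max(customer_key), 0) as max_key from dim_customer) mx;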
How long is it taking to run the actual query if you create a temp table or
something with the result? How many rows are returned? Need to narrow down if
it's the fetch taking a while or the actual query.
Thanks
Shawn
From: Sowjanya Kakarala
Sent: Monday, July 23, 2018 10:01 AM
To: user@hive.ap
Trying to figure out if the following is a bug or expected behavior. When LLAP
Execution Mode is set to 'only', you can't have a macro and a window function
in the same select statement. However, you can have a macro or a window
function on its own without an issue. Below is a test case.
use default;
create
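The original test case is cut off above; a hedged sketch of the kind of statement described, combining a macro and a window function in one select, with hypothetical names:

create temporary macro double_it(x int) x * 2;
create table llap_macro_test (val int) stored as orc;

set hive.llap.execution.mode=only;

select double_it(val),
       row_number() over (order by val)
from llap_macro_test;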