://apidog.com/blog/deepwiki/
Tks.
lisoda
r from the HIVE community about this research direction.
That's all.
Tks.
Lisoda.
At 2025-04-14 20:07:05, "Shohei Okumiya" wrote:
>Hi,
>
>I'm thrilled to see various opinions in this thread! I respect Ayush
>for initiating the discussion with the brave prop
Hello Team.
The documentation has been committed to the hive-site repository and is now
available on the Apache Hive: Manual Installation page.
Thank you to every member of the community who participated in the
collaboration.
Tks!
-lisoda
At 2024-12-30 16:05:11, "lisoda" wrote:
Hel
r commit operations are on the same metadata.json file. In this case, it
would be very easy to support multi-table transactions.
I would like to know how the HIVE community views this issue. Looking forward
to your reply!
-Lisoda
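To illustrate the idea in the (truncated) message above: if several tables' commits all go through one shared metadata.json, then a single atomic swap of that document commits changes to all of them together. The toy Java sketch below shows only that concept; the class and field names are invented for illustration, and this is not Iceberg's or Hive's actual commit API.

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicReference;

// Toy model of a catalog whose tables all share one metadata document.
// A commit is a single compare-and-swap of that document, so updates to
// several tables become visible together or not at all.
public class SharedMetadataCommit {

    // Immutable snapshot: table name -> current snapshot id.
    static final class Metadata {
        final Map<String, Long> tableSnapshots;
        Metadata(Map<String, Long> tableSnapshots) {
            this.tableSnapshots = Map.copyOf(tableSnapshots);
        }
    }

    // Stands in for the pointer to the shared metadata.json; a real catalog
    // would swap a file or object-store pointer atomically instead.
    private final AtomicReference<Metadata> current =
            new AtomicReference<>(new Metadata(Map.of()));

    // Commit new snapshot ids for several tables in one atomic step.
    boolean commit(Map<String, Long> updates) {
        Metadata base = current.get();
        Map<String, Long> next = new HashMap<>(base.tableSnapshots);
        next.putAll(updates);
        // Either every table's update becomes visible, or none does;
        // a concurrent commit makes this CAS fail and the caller retries.
        return current.compareAndSet(base, new Metadata(next));
    }

    public static void main(String[] args) {
        SharedMetadataCommit catalog = new SharedMetadataCommit();
        boolean ok = catalog.commit(Map.of("orders", 1L, "order_items", 1L));
        System.out.println("multi-table commit succeeded: " + ok);
    }
}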
Hello sir.
If you have time, please check this: [HIVE-28683] Add doc for install hive4 with old
version hadoop - ASF JIRA
Tks!
-lisoda
At 2024-11-12 16:10:02, "Stamatis Zampetakis" wrote:
Hi Lisoda,
I just gave you permissions to modify the wiki. Please check that everything works
f
see again
At 2024-11-12 09:55:32, "lisoda" wrote:
Hello Sir.
I've checked my permissions, and I still don't have write access to the HIVE space.
If you have time, could you help me get the wiki permission?
Thank you.
Lisoda.
At 2024-10-24 21:25:28, "Ayush Saxena" wrote:
I agree with Butao that since the community has previously promised that JDK 17
will be supported in 4.1.0, it would be best if we kept that promise.
But maybe we could go ahead and release a 4.0.2 version, and then wait for
HIVE-28665 to be merged before releasing 4.1.0?
At
Hello Sir.
I have applied for a wiki account.
Account name: lisoda
Email: lis...@yeah.net
Please help me activate access to the HIVE wiki.
Thank you.
lisoda
At 2024-10-24 21:25:28, "Ayush Saxena" wrote:
You can submit a request for an account here:
https://selfserve.apache.org/
de the relevant
information?
Best
Lisoda
At 2024-10-22 13:01:41, "Ayush Saxena" wrote:
Sorry for coming back late. I don’t think there should be any problem with this
approach if things are working fine.
I think it doesn’t require any code changes. Do you plan to contribute the
st
g table on HDFS; then we'll discuss
reading and writing to the S3 file next.
If you also can't read or write to the Iceberg table stored on HDFS, we need to
analyse the problem further.
Tks.
LiSoDa.
At 2024-09-20 17:01:54, "Awasthi, Somesh" wrote:
Hi Raghav,
How to
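A quick way to run the HDFS check suggested above is a small Hive JDBC client that creates, writes, and reads an Iceberg table whose location is on HDFS. This is only a sketch: the HiveServer2 URL, credentials, table name, and HDFS path are placeholder assumptions, not values taken from this thread.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Smoke test: create an Iceberg table located on HDFS through HiveServer2,
// write one row, and read it back.
public class IcebergHdfsSmokeTest {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://hs2-host:10000/default";  // placeholder HiveServer2 endpoint
        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement()) {
            // Hive 4 DDL: STORED BY ICEBERG creates an Iceberg table; the
            // LOCATION below is an example HDFS path.
            stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS ice_smoke (id INT, msg STRING) "
                    + "STORED BY ICEBERG "
                    + "LOCATION 'hdfs://namenode:8020/warehouse/ice_smoke'");
            stmt.execute("INSERT INTO ice_smoke VALUES (1, 'hdfs write ok')");
            try (ResultSet rs = stmt.executeQuery("SELECT id, msg FROM ice_smoke")) {
                while (rs.next()) {
                    System.out.println(rs.getInt(1) + " " + rs.getString(2));
                }
            }
        }
    }
}

If this succeeds, the HDFS path is fine and the investigation can move on to the S3 configuration, as the message above suggests.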
tend it to all
Hive users?
Tks.
LiSoDa.
At 2024-10-12 14:27:32, "Ayush Saxena" wrote:
If you already have a solution in place, feel free to create a Jira & PR with
it. However, third-party dependencies present significant challenges. Different
versions of Hadoop bring
way.
However, my idea may not be mature enough, so I would like to know what others
think. It would be great if someone could join this topic and discuss it.
TKS.
LISODA.
The main problem is that Hive 4 changed the variable names in HiveConf, which
caused some problems.
At 2024-10-07 23:32:46, "Denys Kuzmenko" wrote:
>Hi @lisoda
>
>Thanks for bringing up the Ranger issue. It would be very helpful if you could
>share the JIRA
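The HiveConf renames mentioned above typically break downstream integrations (such as the Ranger plugin referenced in this thread) that look up HiveConf.ConfVars constants by their enum names. One hedged mitigation sketch, assuming the caller only needs the property value, is to try a list of candidate enum names and fall back to the raw property key; the candidate names are supplied by the caller and are not a verified list of the actual renames.

import org.apache.hadoop.hive.conf.HiveConf;

// Defensive lookup of a Hive configuration value when the HiveConf.ConfVars
// enum constant may have been renamed between Hive releases. The final
// fallback uses the stable property key itself, since HiveConf extends
// Hadoop's Configuration.
public final class ConfVarCompat {
    private ConfVarCompat() {}

    public static String get(HiveConf conf, String propertyKey, String defaultValue,
                             String... candidateEnumNames) {
        for (String name : candidateEnumNames) {
            try {
                // Throws IllegalArgumentException if the constant was renamed or removed.
                HiveConf.ConfVars cv = HiveConf.ConfVars.valueOf(name);
                return conf.get(cv.varname, defaultValue);
            } catch (IllegalArgumentException renamedOrRemoved) {
                // Try the next candidate name.
            }
        }
        // Raw property keys are stable even when the enum constants change.
        return conf.get(propertyKey, defaultValue);
    }
}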
projects so that they
can be supported with Hive 4.
At 2024-10-07 22:06:53, "lisoda" wrote:
As a long-time user of Hive, I would like to join this discussion and share
some of my opinions.
Due to the bad situation of Hive 3, users are divided into two broad types:
1. Still use Hive 1.x/2.x, and use Spark/Trino + HMS etc. to partially or
completely replace HiveQL. This pa