That sounds strange. When my Hadoop 3.3.4 uses Java 8, M/R jobs don't 
work, so Hive throws an exception.
After I changed Hadoop to Java 11, they work now.
That is to say, I am using Java 11 for Hadoop 3.3.4 and Java 8 for Hive 
3.1, on the same node.
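For reference, the split can be expressed as a per-component JAVA_HOME in each project's env script. A minimal sketch, assuming OpenJDK builds installed under /usr/lib/jvm (the exact paths are illustrative and vary by distribution):

```shell
# In $HADOOP_HOME/etc/hadoop/hadoop-env.sh:
# Hadoop daemons and M/R jobs run on JDK 11
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk

# In $HIVE_HOME/conf/hive-env.sh:
# Hive (HiveServer2, metastore, CLI) runs on JDK 8
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
```

Because each service reads its own env script at startup, the two JDKs coexist on one node without either seeing the other's setting.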
 
Regards
 
 
 
 
-----Original Message-----
Subject: Re: Make Hive 3.1 and Hadoop 3.3 work in the same node
Date: 2022-12-24T15:54:32+0100
From: "Ayush Saxena" <ayush...@gmail.com>
To: "user@hive.apache.org" <user@hive.apache.org>
 
 
 
  While most other Hadoop ecosystem components, including Hadoop 3.3, Spark
  3.3, and Kafka 3.3, seem to require Java 11.
 
The Hadoop 3.3.x line only adds Java 11 runtime support; the default 
supported JDK is still 8, at least for Hadoop. If it wasn't working with 
JDK 8, there is either an issue with your Hadoop installation or a bug in 
the Hadoop code, which you need to fix or raise on the Hadoop ML/Jira.
 
-Ayush

On Sat, 24 Dec 2022 at 11:26, yp...@t-online.de <yp...@t-online.de> wrote:


  Hello,

  Hive 3.1 requires Java 8 to work,
  while most other Hadoop ecosystem components, including Hadoop 3.3, Spark
  3.3, and Kafka 3.3, seem to require Java 11.
  I have all of them installed on the same system to work together.
  This is how I did it:

  <https://blog.crypt.pw/How-to-run-Hadoop-3.3-and-Hive-3.1-in-same-node>


  Hope it helps you too.

  Thanks.

