Steve Chong created HADOOP-18411:
------------------------------------

             Summary: Unable to Find LoginModule Class using IBM Java openJ9 version 8.0.332.0
                 Key: HADOOP-18411
                 URL: https://issues.apache.org/jira/browse/HADOOP-18411
             Project: Hadoop Common
          Issue Type: Bug
    Affects Versions: 3.3.4, 3.3.3, 3.3.2
         Environment: Spark 3.3.0, IBM Semeru (OpenJ9) Java 8.0.332.0
            Reporter: Steve Chong


Hi,

I am using Spark v3.3.0 and IBM Semeru (OpenJ9) Java version 8.0.332.0.

When I run my Spark job, I get the following exception:

org.apache.hadoop.security.KerberosAuthException: failure to login: javax.security.auth.login.LoginException: unable to find LoginModule class: com.ibm.security.auth.module.JAASLoginModule
    at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1986)
    at org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:719)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:579)
    at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2561)
    at scala.Option.getOrElse(Option.scala:138)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2561)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:316)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
    at scala.Option.getOrElse(Option.scala:138)

This looks similar to a previously reported error that has been fixed: 
https://issues.apache.org/jira/browse/HADOOP-17971

N.B. The exception I am getting does not contain 'org.apache.hadoop.shaded' in the package name, whereas in HADOOP-17971 it does (org.apache.hadoop.shaded.com.ibm.security.auth.module.JAASLoginModule).
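
For what it's worth, a quick way to confirm whether the class is present in a given JVM at all (CheckLoginModule below is just an illustrative scratch program of mine, not anything from Hadoop; I'd run it with the same JVM that launches the job):

public class CheckLoginModule {
    public static void main(String[] args) {
        // Print the JVM identity so runs on 8.0.312.0 vs 8.0.332.0 can be compared.
        System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
        System.out.println("java.version = " + System.getProperty("java.version"));
        String name = "com.ibm.security.auth.module.JAASLoginModule";
        try {
            Class.forName(name); // throws if the JDK does not ship this class
            System.out.println(name + " -> found");
        } catch (ClassNotFoundException e) {
            System.out.println(name + " -> NOT found");
        }
    }
}

Running that under 8.0.312.0 and 8.0.332.0 should show whether the module was dropped from the newer Semeru build or is merely not visible to my job's classpath.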

 

The Spark spark-core_2.12 library brings in two Hadoop dependencies:

hadoop-client-api:jar:3.3.2:compile

hadoop-client-runtime:jar:3.3.2:compile

After getting the exception, I tried excluding those two artifacts from the Spark dependency in my pom.xml and declaring them explicitly as dependencies, as sketched below. I tried versions 3.3.4 and 3.3.3, but I still get the same error.
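
Roughly, the pom.xml change looked like this (a sketch; exact versions and scopes as described above, and the element layout may differ in a real build):

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.3.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client-api</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client-runtime</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Explicit Hadoop client artifacts; I tried both 3.3.3 and 3.3.4 here. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client-api</artifactId>
  <version>3.3.4</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client-runtime</artifactId>
  <version>3.3.4</version>
</dependency>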

 

N.B. I don't get this exception with IBM Semeru Java version 8.0.312.0, so whatever changed appears to have been introduced between those two Semeru builds.

I can move this to a Spark issue if this isn't the correct place to post it.

Thanks,

Steve

 


