Kerberos (GSSAPI) and PLAIN were the SASL mechanisms accepted in server.properties as of 2017:
https://docs.confluent.io/3.0.0/kafka/sasl.html

Since you're not getting SCRAM-SHA-256 or SCRAM-SHA-512 to authenticate, a number of things have to be checked:

Did you install the SCRAM LoginModule?

What are your SCRAM callback handlers (hint: a callback handler is usually an interface)?
Are the SCRAM callback handlers installed (hint: the interface must be implemented by a class on the classpath)?

Is ScramSaslServer installed?

Are the credentials stored in ZooKeeper under /config/users/<user>, and is that path accessible to the Kafka SCRAM server?
Does ZK /config/users/<user> look like this:
// SCRAM credentials for user alice: ZooKeeper persistence path /config/users/alice
{
  "version": 1,
  "config": {
    "SCRAM-SHA-512": "salt=djR5dXdtZGNqamVpeml6NGhiZmMwY3hrbg==,stored_key=sb5jkqStV9RwPVTGxG1ZJHxF89bqjsD1jT4S...==,server_key=...,iterations=4096",
    "SCRAM-SHA-256": "salt=10ibs0z7xzlu6w5ns0n188sis5,stored_key=+Acl/wi1vLZ95Uqj8rRHVcSp6qrdfQIwZbaZBwM0yvo=,server_key=nN+fZauE6vG0hmFAEj/49+2yk0803y67WSXMYkgh77k=,iterations=4096"
  }
}
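
One quick way to eyeball that znode is the zookeeper-shell tool that ships with Kafka; a rough sketch (host is a placeholder, and if your zookeeper.connect uses a chroot such as /kafka, prepend it to the path):

# read-only look at the stored SCRAM credentials for alice
bin/zookeeper-shell.sh localhost:2181 get /config/users/alice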

Is the jaas.conf accessible to the SCRAM LoginModule?
Is the jaas.conf accessible to ScramSaslServer?
Do the contents of jaas.conf look like this:
KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="alice"
    password="alice-secret";
};

KafkaServer {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="admin"
    password="admin-secret";
};
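
As an aside, clients (0.10.2 and later) can skip a separate jaas.conf entirely and put the login module inline in their properties via sasl.jaas.config; a minimal sketch reusing the example credentials above:

security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="alice" password="alice-secret";
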
Does your kafka-configs.sh support --add-config 'SCRAM-SHA-512=[password=alice-secret]'?
Does kafka-configs.sh generate the salted server key (server_key)?
Does kafka-configs.sh generate the salted client key (stored_key)?
Did the new salt and keys land in ZK under /config/users/<user>? (see the check below)
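
A quick way to verify that last point is to describe the user entity and confirm the salt, stored_key, server_key and iterations come back:

bin/kafka-configs.sh --zookeeper localhost:2181 --describe --entity-type users --entity-name alice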

https://cwiki.apache.org/confluence/display/KAFKA/KIP-84%3A+Support+SASL+SCRAM+mechanisms#KIP-84:SupportSASLSCRAMmechanisms-JAASconfiguration

As the SCRAM-SHA-512 KIP functionality has been in development and test for about two years, it is reasonable to expect SCRAM-SHA-512 to work and to be well tested.
Please file an issue in the Kafka JIRA at
https://issues.apache.org/jira/projects/KAFKA/issues/KAFKA-8372?filter=allopenissues

Many of us would like to reproduce this auth failure on our own test SCRAM servers; where did you obtain your distro?

Good luck
________________________________
From: Kieran JOYEUX <kjoy...@splio.com>
Sent: Thursday, May 16, 2019 5:17 AM
To: users@kafka.apache.org
Subject: Re: SASL + SSL : authentication error in broker-to-broker communication

Hello fellas,

I tried to simplify my configuration by removing the SSL configuration to ease debugging. It didn't change anything regarding SCRAM, but SASL with the PLAIN mechanism worked out of the box. I find that solution good enough since I'm still using SSL.

If it helps someone, here's my configuration.

# server.properties
auto.create.topics.enable=false
broker.id=1
compression.type=snappy
delete.topic.enable=true
listeners=SASL_SSL://:9093
log.dir=/var/lib/kafka
min.insync.replicas=2
sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN
security.inter.broker.protocol=SASL_SSL
ssl.enabled.protocols=TLSv1.2
ssl.endpoint.identification.algorithm=
ssl.key.password=xxx
ssl.keystore.location=/opt/kafka/ssl/kafka.server.keystore.jks
ssl.keystore.password=xxx
ssl.keystore.type=JKS
ssl.secure.random.implementation=SHA1PRNG
ssl.truststore.location=/opt/kafka/ssl/kafka.server.truststore.jks
ssl.truststore.password=xxx
ssl.truststore.type=JKS
zookeeper.connect=zoo:2181/kafka

# /opt/kafka/config/kafka_server_jaas.conf
KafkaServer {
 org.apache.kafka.common.security.plain.PlainLoginModule required
 username="admin"
 password="adminpass"
 user_admin="adminpass"
 user_app="blabla";
};
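
For completeness, a matching client config would be roughly the following (file name and truststore path are just examples; the "app" user comes from the JAAS file above):

# client.properties
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="app" password="blabla";
ssl.truststore.location=/opt/kafka/ssl/kafka.client.truststore.jks
ssl.truststore.password=xxx

It can then be passed to the console tools with e.g. --producer.config client.properties or --consumer.config client.properties.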


Thanks anyway.

Kieran



________________________________
From: Kieran JOYEUX <kjoy...@splio.com>
Sent: Wednesday, May 15, 2019 5:46 PM
To: users@kafka.apache.org
Subject: Re: SASL + SSL : authentication error in broker-to-broker communication

Hello Martin,

First of all, thanks for your help on this matter.

However, pardon me, but I don't understand the correlation between my Kafka certs and these authentication problems. Could you detail it, please?

Regarding kafka-configs.sh, I did use the same user/password as the one in the JAAS files, which are identical on each broker thanks to Puppet EPP templating. As you can see in my ps output, Kafka is using the JAAS file as the documentation advises:
-Djava.security.auth.login.config=/opt/kafka/config/kafka_server_jaas.conf

I also checked that every ZooKeeper node returns the same salt and server_key for each user described.

Anything else to check ?

Thanks a lot.

Sincerely,

Kieran


________________________________
From: Martin Gainty <mgai...@hotmail.com>
Sent: Wednesday, May 15, 2019 2:28 PM
To: users@kafka.apache.org
Subject: Re: SASL + SSL : authentication error in broker-to-broker communication

Assuming ScramSaslProvider/ScramSaslServer, your credentials are stored in ZK under /config/users/<encoded-user>,
but you cannot see plain-text passwords in ZK, so use the Kafka tool to view them:
kafka-configs.sh --zookeeper localhost:2181 --describe --entity-type users --entity-name <encoded-user>

/* 2019 update for kafka-configs.sh */

For ease of use, kafka-configs.sh will take a password and an optional iteration count and generate a random salt, ServerKey and StoredKey as specified in RFC 5802 <https://tools.ietf.org/html/rfc5802>. For example:

bin/kafka-configs.sh --zookeeper localhost:2181 --alter --add-config 'SCRAM-SHA-256=[iterations=4096,password=alice-secret],SCRAM-SHA-512=[password=alice-secret]' --entity-type users --entity-name alice
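
Running the same tool afterwards with --describe --entity-type users --entity-name alice should echo back the generated salt, stored_key, server_key and iteration count.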

/* once you have verified username/password from the ZK credentials */
You can now export your cert from /opt/kafka/ssl/kafka.server.keystore.jks:
keytool -exportcert -alias admin -keystore /opt/kafka/ssl/kafka.server.keystore.jks -storepass xxxx -file admin.cert

(note: -storepass here is the keystore's password)

If you can view admin.cert with a cert viewer, validate that the username (subject) is consistent with the ZK creds.
If you don't have a cert viewer, you can convert the exported DER cert to PEM:
openssl x509 -inform der -in admin.cert -out admin.pem
Check that the UID/subject in either the cert or the PEM is consistent with ZK.
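
If you don't have openssl handy either, keytool itself can print the exported cert:
keytool -printcert -v -file admin.cert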

Finally, check that the credentials in jaas.conf are consistent with the ZK credentials:

#used by interbroker connections
KafkaServer {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="admin"
    password="xxxx";
};

If the username and password are consistent for all of these entities,
then your Kafka broker(s) *should* authenticate (assuming they all reference the same ZK server!).
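
As an aside, if you would rather not depend on the -Djava.security.auth.login.config system property, recent brokers can also carry the interbroker SCRAM login inline in server.properties via a listener-prefixed sasl.jaas.config (check that your broker version supports it); a rough sketch, with the listener name and password as placeholders:

listener.name.sasl_ssl.scram-sha-512.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="xxxx";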

Good luck
https://cwiki.apache.org/confluence/display/KAFKA/KIP-84%3A+Support+SASL+SCRAM+mechanisms#KIP-84:SupportSASLSCRAMmechanisms-JAASconfiguration




________________________________
From: Kieran JOYEUX <kjoy...@splio.com>
Sent: Wednesday, May 15, 2019 4:42 AM
To: users@kafka.apache.org
Subject: SASL + SSL : authentication error in broker-to-broker communication

Hello,

I'm having trouble activating SASL on my current, working SSL-only cluster. I have read the docs many times and my configuration seems to be good. However, it looks like Kafka cannot authenticate, and broker-to-broker communication is not working at all.

Do you have any ideas ? (Descriptions below)

Thanks a lot.

Kieran

--------------------------------------------

# Versions
Kafka: 2.2.0
Zookeeper: 3.4.9-3+deb9u1

# Error message
[2019-05-15 10:14:00,811] DEBUG Set SASL server state to 
HANDSHAKE_OR_VERSIONS_REQUEST during authentication 
(org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-05-15 10:14:00,811] DEBUG Handling Kafka request API_VERSIONS during 
authentication 
(org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-05-15 10:14:00,811] DEBUG Set SASL server state to HANDSHAKE_REQUEST 
during authentication 
(org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-05-15 10:14:00,812] DEBUG Handling Kafka request SASL_HANDSHAKE during 
authentication 
(org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-05-15 10:14:00,812] DEBUG Using SASL mechanism 'SCRAM-SHA-512' provided 
by client 
(org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-05-15 10:14:00,813] DEBUG Setting SASL/SCRAM_SHA_512 server state to 
RECEIVE_CLIENT_FIRST_MESSAGE 
(org.apache.kafka.common.security.scram.internals.ScramSaslServer)
[2019-05-15 10:14:00,813] DEBUG Set SASL server state to AUTHENTICATE during 
authentication 
(org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-05-15 10:14:00,814] DEBUG Setting SASL/SCRAM_SHA_512 server state to 
FAILED (org.apache.kafka.common.security.scram.internals.ScramSaslServer)
[2019-05-15 10:14:00,814] DEBUG Set SASL server state to FAILED during 
authentication 
(org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)
[2019-05-15 10:14:00,814] INFO [SocketServer brokerId=2] Failed authentication 
with 10.101.60.15 (Authentication failed during authentication due to invalid 
credentials with SASL mechanism SCRAM-SHA-512) 
(org.apache.kafka.common.network.Selector)
[2019-05-15 10:14:00,815] DEBUG [SocketServer brokerId=2] Connection with 
10.101.60.15 disconnected (org.apache.kafka.common.network.Selector)
java.io.EOFException
at 
org.apache.kafka.common.network.SslTransportLayer.read(SslTransportLayer.java:573)
at 
org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:94)
at 
org.apache.kafka.common.security.authenticator.SaslServerAuthenticator.authenticate(SaslServerAuthenticator.java:267)
at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:173)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:536)
at org.apache.kafka.common.network.Selector.poll(Selector.java:472)
at kafka.network.Processor.poll(SocketServer.scala:830)
at kafka.network.Processor.run(SocketServer.scala:730)
at java.lang.Thread.run(Thread.java:748)


# User creation in ZK & output
/opt/kafka/bin/kafka-configs.sh --zookeeper xxxx:2181 --alter --add-config 'SCRAM-SHA-512=[password=xxxx]' --entity-type users --entity-name admin
Configs for user-principal 'admin' are SCRAM-SHA-512=salt=bnBicjI4NWd5dDBweGJoMmJ1bnlzdzFxYQ==,stored_key=xxxxx,server_key=xxxxxx==,iterations=4096


# ps fauxww
kafka     2523  7.1 15.9 5838668 972848 ?      Ssl  mai14  52:46 java -Xmx1G 
-Xms1G -server -XX:+UseG1GC -XX:MaxGCPauseMillis=20 
-XX:InitiatingHeapOccupancyPercent=35 -XX:+ExplicitGCInvokesConcurrent 
-Djava.awt.headless=true -Xloggc:/var/log/kafka/kafkaServer-gc.log -verbose:gc 
-XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintGCTimeStamps 
-XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=10 -XX:GCLogFileSize=100M 
-Dcom.sun.management.jmxremote 
-Dcom.sun.management.jmxremote.authenticate=false 
-Dcom.sun.management.jmxremote.ssl=false 
-Dcom.sun.management.jmxremote.port=9990 
-Djava.security.auth.login.config=/opt/kafka/config/kafka_server_jaas.conf 
-Djava.rmi.server.hostname=xxxxx -Dkafka.logs.dir=/var/log/kafka 
-Dlog4j.configuration=file:/opt/kafka/config/log4j.properties -cp 
/opt/kafka/bin/../libs/activation-1.1.1.jar:/opt/kafka/bin/../libs/aopalliance-repackaged-2.5.0-b42.jar:/opt/kafka/bin/../libs/argparse4j-0.7.0.jar:/opt/kafka/bin/../libs/audience-annotations-0.5.0.jar:/opt/kafka/bin/../libs/commons-lang3-3.8.1.jar:/opt/kafka/bin/../libs/connect-api-2.2.0.jar:/opt/kafka/bin/../libs/connect-basic-auth-extension-2.2.0.jar:/opt/kafka/bin/../libs/connect-file-2.2.0.jar:/opt/kafka/bin/../libs/connect-json-2.2.0.jar:/opt/kafka/bin/../libs/connect-runtime-2.2.0.jar:/opt/kafka/bin/../libs/connect-transforms-2.2.0.jar:/opt/kafka/bin/../libs/guava-20.0.jar:/opt/kafka/bin/../libs/hk2-api-2.5.0-b42.jar:/opt/kafka/bin/../libs/hk2-locator-2.5.0-b42.jar:/opt/kafka/bin/../libs/hk2-utils-2.5.0-b42.jar:/opt/kafka/bin/../libs/jackson-annotations-2.9.8.jar:/opt/kafka/bin/../libs/jackson-core-2.9.8.jar:/opt/kafka/bin/../libs/jackson-databind-2.9.8.jar:/opt/kafka/bin/../libs/jackson-datatype-jdk8-2.9.8.jar:/opt/kafka/bin/../libs/jackson-jaxrs-base-2.9.8.jar:/opt/kafka/bin/../libs/jackson-jaxrs-json-provider-2.9.8.jar:/opt/kafka/bin/../libs/jackson-module-jaxb-annotations-2.9.8.jar:/opt/kafka/bin/../libs/javassist-3.22.0-CR2.jar:/opt/kafka/bin/../libs/javax.annotation-api-1.2.jar:/opt/kafka/bin/../libs/javax.inject-1.jar:/opt/kafka/bin/../libs/javax.inject-2.5.0-b42.jar:/opt/kafka/bin/../libs/javax.servlet-api-3.1.0.jar:/opt/kafka/bin/../libs/javax.ws.rs-api-2.1.1.jar:/opt/kafka/bin/../libs/javax.ws.rs-api-2.1.jar:/opt/kafka/bin/../libs/jaxb-api-2.3.0.jar:/opt/kafka/bin/../libs/jersey-client-2.27.jar:/opt/kafka/bin/../libs/jersey-common-2.27.jar:/opt/kafka/bin/../libs/jersey-container-servlet-2.27.jar:/opt/kafka/bin/../libs/jersey-container-servlet-core-2.27.jar:/opt/kafka/bin/../libs/jersey-hk2-2.27.jar:/opt/kafka/bin/../libs/jersey-media-jaxb-2.27.jar:/opt/kafka/bin/../libs/jersey-server-2.27.jar:/opt/kafka/bin/../libs/jetty-client-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-continuation-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-http-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-io-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-security-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-server-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-servlet-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-servlets-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-util-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jopt-simple-5.0.4.jar:/opt/kafka/bin/../libs/kafka_2.11-2.2.0.jar:/opt/kafka/bin/../libs/kafka_2.11-2.2.0-sources.jar:/opt/kafka/bin/../libs/kafka-clients-2.2.0.jar:/opt/kafka/bin/../libs/kafka-log4j-appender-2.2.0.jar:/opt/kafka/bin/../libs/kafka-streams-2.2.0.jar:/opt/kafka/bin/../libs/kafka-streams-examples-2.2.0.jar:/opt/kafka/bin/../libs/kafka-streams-scala_2.11-2.2.0.jar:/opt/kafka/bin/../libs/kafka-streams-test-utils-2.2.0.jar:/opt/kafka/bin/../libs/kafka-tools-2.2.0.jar:/opt/kafka/bin/../libs/log4j-1.2.17.jar:/opt/kafka/bin/../libs/lz4-java-1.5.0.jar:/opt/kafka/bin/../libs/maven-artifact-3.6.0.jar:/opt/kafka/bin/../libs/metrics-core-2.2.0.jar:/opt/kafka/bin/../libs/osgi-resource-locator-1.0.1.jar:/opt/kafka/bin/../libs/plexus-utils-3.1.0.jar:/opt/kafka/bin/../libs/reflections-0.9.11.jar:/opt/kafka/bin/../libs/rocksdbjni-5.15.10.jar:/opt/kafka/bin/../libs/scala-library-2.11.12.jar:/opt/kafka/bin/../libs/scala-logging_2.11-3.9.0.jar:/opt/kafka/bin/../libs/scala-reflect-2.11.12.jar:/opt/kafka/bin/../libs/slf4j-api-1.7.25.jar:/opt/kafka/bin/../libs/slf4j-log4j12-1.7.25.jar:/opt/kafka/bin/../libs/snappy-java-1.1.7.2.jar:/opt/kafka/bin/../libs/validation-
api-1.1.0.Final.jar:/opt/kafka/bin/../libs/zkclient-0.11.jar:/opt/kafka/bin/../libs/zookeeper-3.4.13.jar:/opt/kafka/bin/../libs/zstd-jni-1.3.8-1.jar
 kafka.Kafka /opt/kafka/config/server.properties


# Broker conf
auto.create.topics.enable=false
broker.id=1
compression.type=snappy
delete.topic.enable=true
listeners=SASL_SSL://:9093
log.dir=/var/lib/kafka
min.insync.replicas=2
sasl.enabled.mechanisms=SCRAM-SHA-512,PLAIN
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512
security.inter.broker.protocol=SASL_SSL
ssl.client.auth=required
ssl.enabled.protocols=TLSv1.2
ssl.endpoint.identification.algorithm=
ssl.key.password=xxx
ssl.keystore.location=/opt/kafka/ssl/kafka.server.keystore.jks
ssl.keystore.password=xxx
ssl.keystore.type=JKS
ssl.secure.random.implementation=SHA1PRNG
ssl.truststore.location=/opt/kafka/ssl/kafka.server.truststore.jks
ssl.truststore.password=xxx
ssl.truststore.type=JKS

# /opt/kafka/config/kafka_server_jaas.conf
KafkaServer {
 org.apache.kafka.common.security.scram.ScramLoginModule required
 username="admin"
 password="adminpass";

 org.apache.kafka.common.security.plain.PlainLoginModule required
 username="admin"
 password="adminpass"
 user_admin="adminpass"
 user_app="blabla";
};
