Kafka set up 2 authentications SASL_PLAINTEXT and SASL_SSL - apache-kafka

I intended to setup 2 authentication modes which are SASL_PLAINTEXT and SASL_SSL. SASL_PLAINTEXT will be used between brokers and zookeeper, and SASL_SSL will be used with external producers and consumers.
I can completely set either one of them, but can't set them both at the same time.
Now Broker can authenticate with Zookeeper, but I can't have Producer to authenticate to Broker via SASL_SSL:9093.
Server.properties
listeners=SASL_PLAINTEXT://172.22.10.21:9092,SASL_SSL://172.22.10.21:9093
advertised.listeners=SASL_PLAINTEXT://172.22.10.21:9092,SASL_SSL://172.22.10.21:9093
ssl.endpoint.identification.algorithm=
ssl.client.auth=required
ssl.truststore.location=/home/aaapi/ssl/kafka.server.truststore.jks
ssl.truststore.password=serversecret
ssl.keystore.location=/home/aaapi/ssl/kafka.server.keystore.jks
ssl.keystore.password=serversecret
ssl.key.password=serversecret
ssl.enabled.protocols=TLSv1.2
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=SCRAM-SHA-256,SCRAM-SHA-512,PLAIN
sasl.mechanism=SCRAM-SHA-512 here
server_jaas.conf
sasl_ssl.KafkaServer {
org.apache.kafka.common.security.scram.ScramLoginModule required
username="adminssl"
password="adminssl-secret";
};
sasl_plaintext.KafkaServer {
org.apache.kafka.common.security.plain.PlainLoginModule required
username="admin"
password="admin-secret"
user_admin="admin-secret"
user_kafkabroker1="kafkabroker1-secret";
};
Client {
org.apache.kafka.common.security.plain.PlainLoginModule required
username="admin"
password="admin-secret";
};
zookeeper_jaas.conf
Server {
org.apache.zookeeper.server.auth.DigestLoginModule required
user_admin="admin-secret";
};
client_ssl.properties
security.protocol=SASL_SSL
#bootstrap.servers=172.22.10.21:9093
sasl.mechanism=SCRAM-SHA-512
ssl.enabled.protocols=TLSv1.2
ssl.endpoint.identification.algorithm=
ssl.truststore.location=/home/aaapi/ssl/kafka.client.truststore.jks
ssl.truststore.password=clientsecret
ssl.keystore.location=/home/aaapi/ssl/kafka.server.keystore.jks
ssl.keystore.password=serversecret
ssl.key.password=serversecret
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
username="adminssl" \
password="adminssl-secret";
Error
/opt/kafka/bin/kafka-console-producer.sh --broker-list 172.22.10.21:9093 --topic test1 --producer.config /home/aaapi/client_config/consumer/client_ssl.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/kafka-3.2.1-src/tools/build/dependant-libs-2.13.6/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/kafka-3.2.1-src/trogdor/build/dependant-libs-2.13.6/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/kafka-3.2.1-src/connect/runtime/build/dependant-libs/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/kafka-3.2.1-src/connect/mirror/build/dependant-libs/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Reload4jLoggerFactory]
>[2022-11-11 19:45:06,787] ERROR [Producer clientId=console-producer] Connection to node -1 (ip-172-22-10-21.ap-southeast-1.compute.internal/172.22.10.21:9093) failed authentication due to: Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512 (org.apache.kafka.clients.NetworkClient)
[2022-11-11 19:45:06,788] WARN [Producer clientId=console-producer] Bootstrap broker 172.22.10.21:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2022-11-11 19:45:07,388] ERROR [Producer clientId=console-producer] Connection to node -1 (ip-172-22-10-21.ap-southeast-1.compute.internal/172.22.10.21:9093) failed authentication due to: Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512 (org.apache.kafka.clients.NetworkClient)
[2022-11-11 19:45:07,388] WARN [Producer clientId=console-producer] Bootstrap broker 172.22.10.21:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2022-11-11 19:45:08,323] ERROR [Producer clientId=console-producer] Connection to node -1 (ip-172-22-10-21.ap-southeast-1.compute.internal/172.22.10.21:9093) failed authentication due to: Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512 (org.apache.kafka.clients.NetworkClient)
[2022-11-11 19:45:08,323] WARN [Producer clientId=console-producer] Bootstrap broker 172.22.10.21:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2022-11-11 19:45:09,724] ERROR [Producer clientId=console-producer] Connection to node -1 (ip-172-22-10-21.ap-southeast-1.compute.internal/172.22.10.21:9093) failed authentication due to: Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512 (org.apache.kafka.clients.NetworkClient)
[2022-11-11 19:45:09,724] WARN [Producer clientId=console-producer] Bootstrap broker 172.22.10.21:9093 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2022-11-11 19:45:11,149] ERROR [Producer clientId=console-producer] Connection to node -1 (ip-172-22-10-21.ap-southeast-1.compute.internal/172.22.10.21:9093) failed authentication due to: Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512 (org.apache.kafka.clients.NetworkClient)

Related

Kafka Producer Connection Issue

I am trying send messages through kafka producer but getting following error. I am using following command to establish connection.
export KAFKA_OPTS="-Djava.security.auth.login.config=$OSI_HOME/jaas.conf -Djava.security.krb5.conf=$OSI_HOME/krb5.conf"$OSI_HOME/Custom/confluent/bin/kafka-console-producer --topic testTopic --bootstrap-server <server list> --producer.config producer.properties
Here's the JAAS config I am using
KafkaClient { com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true storeKey=true keyTab="[Location of keytabfile]" serviceName="kafka" useTicketCache=true principal="svc_account#REALM"; };
I can see producer was able to connect. But when I try to send messages I get following error.
[2022-02-23 12:45:56,562] WARN [Producer clientId=console-producer] Connection to node -7 terminated during authentication. This may happen due to any of the following reasons: (1) Authentication failed due to invalid credentials with brokers older than 1.0.0, (2) Firewall blocking Kafka TLS traffic (eg it may only allow HTTPS traffic), (3) Transient network issue. (org.apache.kafka.clients.NetworkClient)
[2022-02-23 12:45:56,563] WARN [Producer clientId=console-producer] Bootstrap broker (id: -7 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2022-02-23 12:45:56,707] WARN [Producer clientId=console-producer] Connection to node -13 terminated during authentication. This may happen due to any of the following reasons: (1) Authentication failed due to invalid credentials with brokers older than 1.0.0, (2) Firewall blocking Kafka TLS traffic (eg it may only allow HTTPS traffic), (3) Transient network issue. (org.apache.kafka.clients.NetworkClient)
[2022-02-23 12:45:56,707] WARN [Producer clientId=console-producer] Bootstrap broker (id: -13 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2022-02-23 12:45:56,863] WARN [Producer clientId=console-producer] Connection to node -12 terminated during authentication. This may happen due to any of the following reasons: (1) Authentication failed due to invalid credentials with brokers older than 1.0.0, (2) Firewall blocking Kafka TLS traffic (eg it may only allow HTTPS traffic), (3) Transient network issue. (org.apache.kafka.clients.NetworkClient)
[2022-02-23 12:45:56,863] WARN [Producer clientId=console-producer] Bootstrap broker (id: -12 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
Does anyone know how to resolve this? I tried telnet command for broker list and it's connected so it doesn't seem to be a firewall issue. I can also do kinit from the server so doesn't seem to be an authentication issue.
git clone https://gerrit.googlesource.com/gerrit-release-tools && (cd gerrit-release-tools && f=git rev-parse --git-dir/hooks/commit-msg ; mkdir -p $(dirname $f) ; curl -Lo $f https://gerrit-review.googlesource.com/tools/hooks/commit-msg ; chmod +x $f) enter code here

How to fix kafka SCRAM authentication failure

version of confluent platform: 5.4.1
I followed the document and previous question to setup the SCRAM authentication:
https://docs.confluent.io/current/kafka/authentication_sasl/authentication_sasl_scram.html#
kafka SASL/SCRAM Failed authentication
After I modified my configurations, the SASL authentication of zookeeper server is successful but the kafka server is still failed. the below shows the log messages and my related configuration, please help advise on it
zookeeper server output:
[2020-07-18 23:53:42,917] INFO Successfully authenticated client: authenticationID=adminuser; authorizationID=adminuser. (org.apache.zookeeper.server.auth.SaslServerCallbackHandler)
[2020-07-18 23:53:43,143] INFO Setting authorizedID: adminuser (org.apache.zookeeper.server.auth.SaslServerCallbackHandler)
[2020-07-18 23:53:43,143] INFO adding SASL authorization for authorizationID: adminuser (org.apache.zookeeper.server.ZooKeeperServer)
[2020-07-18 23:53:51,162] INFO Successfully authenticated client: authenticationID=adminuser; authorizationID=adminuser. (org.apache.zookeeper.server.auth.SaslServerCallbackHandler)
[2020-07-18 23:53:51,162] INFO Setting authorizedID: adminuser (org.apache.zookeeper.server.auth.SaslServerCallbackHandler)
[2020-07-18 23:53:51,162] INFO adding SASL authorization for authorizationID: adminuser (org.apache.zookeeper.server.ZooKeeperServer)
kafka server error message:
org.apache.kafka.common.errors.DisconnectException: Cancelled fetchMetadata request with correlation id 11 due to node -1 being disconnected
[2020-07-19 00:23:59,921] INFO [SocketServer brokerId=0] Failed authentication with /192.168.20.10 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)
[2020-07-19 00:24:00,095] WARN [Producer clientId=confluent-metrics-reporter] Bootstrap broker 192.168.20.10:9092 (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2020-07-19 00:24:00,403] INFO [SocketServer brokerId=0] Failed authentication with /192.168.20.10 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)
[2020-07-19 00:24:00,597] INFO [SocketServer brokerId=0] Failed authentication with /192.168.20.10 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)
[2020-07-19 00:24:00,805] INFO [SocketServer brokerId=0] Failed authentication with /192.168.20.10 (Unexpected Kafka request of type METADATA during SASL handshake.) (org.apache.kafka.common.network.Selector)
zookeeper_server_jaas.conf:
Server {
org.apache.zookeeper.server.auth.DigestLoginModule required
user_adminuser="adminuserpwd";
};
zookeeper.properties:
server.001=192.168.20.10:2888:3888
authProvider.001=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
requireClientAuthScheme=sasl
zookeeper-server-start:
...
export ZK_AUTH_ARGS=$base_dir/../data/zookeeper_server_jaas.conf
exec $base_dir/kafka-run-class $EXTRA_ARGS -Djava.security.auth.login.config=$ZK_AUTH_ARGS org.apache.zookeeper.server.quorum.QuorumPeerMain "$#"
Added user:
bin/kafka-configs --zookeeper 192.168.20.10:2181 --alter --add-config 'SCRAM-SHA-256=[password=adminuserpwd],SCRAM-SHA-512=[password=adminuserpwd]' --entity-type users --entity-name adminuser
bin/kafka-configs --zookeeper 192.168.20.10:2181 --describe --entity-type users --entity-name adminuser
Configs for user-principal 'adminuser' are SCRAM-SHA-512=salt=MTdxamZocWJlY2F2dDFhZGc0dmluZm5hcmo=,stored_key=o21ptVzTVZoR/hafmOgTSYmr2F1TORPo6xDaZGAph+6OncE1pw/AyLRwduCx0Qx97bKoPWmlYShfXtbug6u8kg==,server_key=1B/1/CzPTpMBO9MpfKZb504JFLZUia0D6LatAllSYkrTa8XWbaISDGQ29Yf4UU+jQmo+iQgK0jX+KaV+fUV6XA==,iterations=4096,SCRAM-SHA-256=salt=MWlrZGs5dHd4dDhiZmdqZGxnN2cwOGpuaGs=,stored_key=vSJ83eDvilj4JyQyehPaGmG3EZISRRfo3j8iY8uiWLU=,server_key=Bu/KfHnv6bSay/n4dO/h55O9WLLaAjiLtJQzfpr4cs0=,iterations=4096
kafka_server_jaas.conf:
KafkaServer {
org.apache.kafka.common.security.scram.ScramLoginModule required
username="adminuser"
password="adminuserpwd";
};
Client {
org.apache.zookeeper.server.auth.DigestLoginModule required
username="adminuser"
password="adminuserpwd";
};
kafka server.properties:
...
listeners=SASL_PLAINTEXT://192.168.20.10:9092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-256
sasl.enabled.mechanisms=SCRAM-SHA-256
advertised.listeners=SASL_PLAINTEXT://192.168.20.10:9092
zookeeper.connect=192.168.20.10:2181
authorizer.class.name=io.confluent.kafka.security.authorizer.ConfluentServerAuthorizer
super.users=User:adminuser
allow.everyone.if.no.acl.found=false
...
kafka-server-start:
...
KAFKA_AUTH_ARGS=$base_dir/../data/kafka_server_jaas.conf
exec $base_dir/kafka-run-class $EXTRA_ARGS -Djava.security.auth.login.config=$KAFKA_AUTH_ARGS io.confluent.support.metrics.SupportedKafka "$#"

Kafka send to azure event hub

I've set up a kafka in my machine and I'm trying to set up Mirror Maker to consume from a local topic and mirror it to an azure event hub, but so far i've been unable to do it and I get the following error:
ERROR Error when sending message to topic dev-eh-kafka-test with key: null, value: 5 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.
After some time I realized that this must be the producer part so I tried to simply use the kafka-console-producer tool directly to event hub and got the same error.
Here is my producer settings file:
bootstrap.servers=dev-we-eh-feed.servicebus.windows.net:9093
compression.type=none
max.block.ms=0
# for event hub
sasl.mechanism=PLAIN
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://dev-we-eh-feed.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=*****”;
Here is the command to spin the producer:
kafka-console-producer.bat --broker-list dev-we-eh-feed.servicebus.windows.net:9093 --topic dev-eh-kafka-test
My event hub namespace has an event hub named dev-eh-kafka-test.
Has anyone been able to do it? Eventually the idea would be to SSL this with a certificate but first I need to be able to do the connection.
I tried using both Apacha Kafka 1.1.1 or the Confluent Kafka 4.1.3 (because this is the version the client is using).
==== UPDATE 1
Someone showed me how to get more logs and this seems to be the detailed version of the error
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initialize connection to node dev-we-eh-feed.servicebus.windows.net:9093 (id: -1 rack: null) for sending metadata request (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initiating connection to node dev-we-eh-feed.servicebus.windows.net:9093 (id: -1 rack: null) (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Created socket with SO_RCVBUF = 32768, SO_SNDBUF = 102400, SO_TIMEOUT = 0 to node -1 (org.apache.kafka.common.network.Selector)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Completed connection to node -1. Fetching API versions. (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initiating API versions fetch from node -1. (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Connection with dev-we-eh-feed.servicebus.windows.net/51.144.238.23 disconnected (org.apache.kafka.common.network.Selector)
java.io.EOFException
at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:124)
at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:93)
at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:235)
at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:196)
at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:559)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:495)
at org.apache.kafka.common.network.Selector.poll(Selector.java:424)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:460)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:239)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:163)
at java.base/java.lang.Thread.run(Thread.java:830)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient)
So here is the configuration that worked (it seems I was missing client.id).
Also, it seems you can not choose the destination topic, it seems it must have the same name as the source...
bootstrap.servers=dev-we-eh-feed.servicebus.windows.net:9093
client.id=mirror_maker_producer
request.timeout.ms=60000
sasl.mechanism=PLAIN
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://dev-we-eh-feed.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=******";

Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)

I have installed Kafka and doing some basic testing. I am able to create topics using scripts provided under Kafka-broker/bin folder.
But when I am trying to produce message getting below WARNing every time I run this. And no message is getting generated. Please advice.
[root#node2 bin]# ./kafka-console-producer.sh --broker-list localhost:9092 --topic test_master
>testmsg1
[2019-05-15 06:25:19,092] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:19,197] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:19,349] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:19,562] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:20,017] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:20,876] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:21,987] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:22,957] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:23,818] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
^Corg.apache.kafka.common.KafkaException: Producer closed while send in progress
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:826)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:803)
at kafka.tools.ConsoleProducer$.send(ConsoleProducer.scala:75)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:57)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: org.apache.kafka.common.KafkaException: Requested metadata update after close
at org.apache.kafka.clients.Metadata.awaitUpdate(Metadata.java:188)
at org.apache.kafka.clients.producer.KafkaProducer.waitOnMetadata(KafkaProducer.java:938)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:823)
... 4 more
Open Server.xml of each broker of your cluster and make following changes
Change the listeners=PLAINTEXT://:9092 to listeners=PLAINTEXT://<our ip address>:9092
Just remove your local host and write the port
EXAMPLE:
{--broker-list 172.0.0.1:9092} will be changed to { --broker-list :9092 }

Kafka-Broker not available after some time of message transfer

Connection to node 1 could not be established after _consumer_offset-49,
I am not able to solve the issue,till _consumer_offset-49 consumer is able to get message but after offset-49 , it show WARNING message i.e Connection to node 1 could not be established,broker not may not be available.
C:\kafka_2.11-2.0.0>.\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic test1
>hi
>hello
>hey
>whatsupp??
>how are u
>[2019-02-25 03:53:14,876] WARN [Producer clientId=console-producer] Connection to node 1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-02-25 03:53:15,982] WARN [Producer clientId=console-producer] Connection to node 1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-02-25 03:53:17,240] WARN [Producer clientId=console-producer] Connection to node 1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
I solved the issue, I just changed the Java version to Java-1.8-181 and it worked.