Client ID authentication no longer works with Kafka API - apache-kafka

Using the Message Hub Kafka interface, I've found that my Bluemix app that was using client.id authentication is getting connections refused by the Kafka brokers. How do I fix this?

The Message Hub service is switching off client.id authentication in mid-January 2016; it is being replaced by SASL authentication. Documentation on authenticating with the service using SASL:
https://www.ng.bluemix.net/docs/services/MessageHub/index.html#messagehub063
https://developer.ibm.com/messaging/2016/01/25/message-hub-beta-plan-ending/
A Java sample using the Message Hub SASL login library can be found at:
https://github.com/ibm-messaging/message-hub-samples
SASL is currently available on both ports 9093 and 9094 for a grace period, after which 9094 will be switched off. The timeline for this change is also available in the Message Hub docs, linked above.

Related

How to enable SASL mechanism in kafka locally

How do I enable the SASL mechanism with JAAS authentication for Kafka, so that consumers/producers have to provide a username and password in order to publish to the broker?
The process of enabling SASL authentication in Kafka is extensively described in the Authentication using SASL section in the documentation. I suggest you follow the official documentation as it contains instructions for all the mechanisms and recommendations for production environments.
To give a bit of background, at a glance you need to:
Create a JAAS file for brokers with a KafkaServer block and the configuration for the specific mechanism.
Add -Djava.security.auth.login.config=<PATH_TO_JAAS_FILE> to your broker JVM command line arguments.
Configure clients to use SASL via the security.protocol, sasl.mechanism and sasl.jaas.config settings.
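As a rough sketch of those steps for the PLAIN mechanism (the usernames, passwords, and file paths below are illustrative only, and in production you would use SASL_SSL rather than SASL_PLAINTEXT):

```
// kafka_server_jaas.conf (broker side)
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="admin-secret"
    user_admin="admin-secret"
    user_alice="alice-secret";
};
```

Start the broker with -Djava.security.auth.login.config=/path/to/kafka_server_jaas.conf, then point clients at the matching settings:

```
# client.properties (client side)
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="alice" \
  password="alice-secret";
```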

Create AWS amazon-MSK JavaScript Client connection

Has anybody successfully established a client connection to an Amazon MSK Kafka cluster using JavaScript? AFAIK there is no YouTube video or online example out there. Attempts to use the KafkaJS npm module are not working for me, because SASL with AWS IAM roles is not supported without installing the IamAWSLogin plugin on the brokers, which you can't SSH into.
Trying to use the plain SASL method doesn't work in KafkaJS either, because AWS doesn't use a username and password.
I am not finding kafka-node useful either.
Any leads?
There is a new feature in development that allows injecting custom mechanisms for authentication with AWS.
https://medium.com/@jm18457_46341/using-amazon-msk-with-iam-access-control-for-apache-kafka-and-node-js-with-kafkajs-71638912fe88
You may need to add a branch dependency to your project, which is a risk for production builds; the good news, however, is that it was reviewed and should be merged soon :)
https://github.com/tulios/kafkajs/pull/1101
We've battled with IAM too, and it seems to be for Java clients only.
We have got it working with username/password. Details for the MSK config are here: https://docs.aws.amazon.com/msk/latest/developerguide/msk-password.html. When you set up MSK, I recommend using a custom security group and configuring appropriate inbound access for the MSK ports.
When the cluster is set up, use the "View client information" button to get the brokers/ports to use.
Then this is your KafkaJS client setup:
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-app',
  brokers: ['something.kafka.us-east-1.amazonaws.com:9096', 'somethingelse.kafka.us-east-1.amazonaws.com:9096'],
  ssl: true,
  sasl: {
    mechanism: 'scram-sha-512',
    username,
    password,
  },
});
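If you load the broker list and credentials from the environment, a small helper keeps the setup in one place. The buildMskConfig function and the environment variable names below are my own sketch, not anything mandated by KafkaJS or MSK:

```javascript
// Sketch: build a KafkaJS config for MSK SASL/SCRAM from environment
// variables. KAFKA_BROKERS is a comma-separated host:port list; the
// variable names are illustrative assumptions.
function buildMskConfig(env) {
  const brokers = (env.KAFKA_BROKERS || '').split(',').filter(Boolean);
  if (brokers.length === 0) {
    throw new Error('KAFKA_BROKERS must list at least one broker:port');
  }
  return {
    clientId: env.KAFKA_CLIENT_ID || 'my-app',
    brokers,
    ssl: true, // MSK's SASL/SCRAM listeners (port 9096) require TLS
    sasl: {
      mechanism: 'scram-sha-512',
      username: env.KAFKA_USERNAME,
      password: env.KAFKA_PASSWORD,
    },
  };
}
```

You would then call new Kafka(buildMskConfig(process.env)), so the same code works across environments with different broker lists.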
I was able to connect and use Amazon MSK Kafka cluster, via kafkajs library.
Initially I followed the instructions in the kafkajs library docs on how to use the aws mechanism for SASL.
Considering that by default an MSK Kafka cluster is not accessible from the internet, I first created a VPN client following this video: https://www.youtube.com/watch?v=Bv70DoHDDCY. I made sure that the client authorized users to access the subnets of my VPC, and after that I simply removed the sasl part from the configuration.
So I used something like:
const { Kafka } = require('kafkajs');

const kafkaClient = new Kafka({
  clientId: 'local-client',
  brokers: [
    'b-2.xxx.xxx.xx.xxx.xx.eu-central-1.amazonaws.com:9094',
    'b-3.xxx.xxx.xx.xxx.xx.eu-central-1.amazonaws.com:9094',
    'b-1.xxx.xxx.xx.xxx.xx.eu-central-1.amazonaws.com:9094'
  ],
  ssl: true,
});
If the sasl: {...} part were there, I would get weird errors like "[BrokerPool] Failed to connect to seed broker, trying another broker from the list: Request is not valid given the current SASL state".
Most probably SASL is not needed anymore because of the VPN connection.

Zookeeper authentication not working when doing all the configurations

I followed the tutorial of the answer of this question:
Kafka SASL zookeeper authentication
And I set zookeeper.set.acl=true in server.properties, but I can still access ZooKeeper on port 2181, and it is available to anyone via: kafka-topics --zookeeper <server-name>:2181 --list
PS: instead of <server-name> I put the DN of my server.
The authentication enforcement feature has only recently been submitted to the ZooKeeper codebase and, AFAIK, no stable version supporting it has been released yet.
When you turn on SASL authentication, it will be available, but clients are still able to connect without it. Hence the recommendation is to use ACLs alongside authentication, to prevent non-authenticated users from accessing sensitive data.
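As a sketch of that setup (the username, password, and paths below are illustrative): the broker authenticates to ZooKeeper via a Client section in its JAAS file, and ZooKeeper enables the SASL auth provider so that znodes created with zookeeper.set.acl=true are protected by ACLs even though unauthenticated connections are still accepted:

```
// broker JAAS file: the Client section is used for the ZooKeeper connection
Client {
    org.apache.zookeeper.server.auth.DigestLoginModule required
    username="kafka"
    password="kafka-secret";
};
```

```
# zookeeper.properties: enable the SASL authentication provider
authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
requireClientAuthScheme=sasl
```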

HTTP error 403 when using Confluent Kafka REST Proxy

I use Confluent Kafka REST Proxy to send messages to Apache Kafka.
I set up basic authentication on the REST Proxy, and whenever I submit an HTTP request to the proxy, I get an HTTP 403 error with the message !role.
The proxy requires Zookeeper, Kafka and Schema Registry to be running. I didn't configure any security on these services.
Without authentication, the proxy works and delivers messages to Kafka successfully.
How do I troubleshoot this problem? I have spent multiple hours on it and I still can't fix it.
Check the following:
Does the firewall allow the service or port?
Is any antivirus software blocking the service or port?
Does the kafka user have rights on the Kafka and Confluent folders and the respective log directories?
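Since the proxy works without authentication, it is also worth checking the basic-auth configuration itself: a 403 with !role typically means the authenticated user's role doesn't match authentication.roles. As a hedged sketch (the realm name, role, paths, and credentials are illustrative; the realm must match the section name in the JAAS file):

```
# kafka-rest.properties
authentication.method=BASIC
authentication.realm=KafkaRest
authentication.roles=thisismyrole
```

```
// JAAS file passed via -Djava.security.auth.login.config=...
KafkaRest {
    org.eclipse.jetty.jaas.spi.PropertyFileLoginModule required
    file="/path/to/password.properties"
    debug="false";
};
```

In password.properties, each line has the form `user: password,role` — the role after the comma must appear in authentication.roles, otherwise the proxy answers 403 !role.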

Solace Spring Cloud Stream Binding

How do you initialize a Solace Binder with Spring Cloud Stream where the connection AUTHENTICATION_SCHEME is AUTHENTICATION_SCHEME_GSS_KRB?
solace:
  java:
    host: tcp://.....
    msgVpn: myvpn
    client-username: username
    apiProperties:
      AUTHENTICATION_SCHEME: AUTHENTICATION_SCHEME_GSS_KRB
      KRB_SERVICE_NAME: HOST
      JaasLoginContext: SolaceGSS
Error Response (403) - No matching configured Authorization Group was found
The error indicates that the Client Authorization is failing. Client Authorization is different from Client Authentication.
Once a client connection to a Message VPN is successfully authenticated, access to the event broker resources and messaging capabilities within that Message VPN must be authorized for the client.
The default authorization method is Internal. It looks like you have set LDAP as the authorization method but there is no matching LDAP group for your client.
You can refer to the Solace documentation for more information on configuring LDAP Authorization.