Does ibm-eventstreams support Kafka ACLs?

I wanted to check whether ibm-eventstreams, which I can deploy on IBM Cloud Private (ICP) 2.1.0.3, supports Kafka SASL authentication and ACLs applied to specific topics.
I was referring to this developerWorks article about Kafka ACLs:
https://developer.ibm.com/opentech/2017/05/31/kafka-acls-in-practice/
I am wondering whether this is available and supported with ibm-eventstreams.
If it is supported, are there any changes/enhancements to the ACL support that I see in the above doc? Is there any further documentation available?

Sorry, no - the current Tech Preview doesn't include any security or auth. (We're thinking hard about what the best way to do this would be though!)
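For context, this is what topic-level ACLs look like on a plain Apache Kafka cluster (the same mechanism the linked developerWorks article describes). A minimal sketch using the Java AdminClient, assuming a broker at localhost:9092 with an authorizer enabled; the principal and topic names are illustrative:

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;
import java.util.Collections;
import java.util.Properties;

public class TopicAclExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // illustrative broker address

        try (AdminClient admin = AdminClient.create(props)) {
            // Allow the principal "User:app1" to read the topic "orders".
            ResourcePattern topic =
                    new ResourcePattern(ResourceType.TOPIC, "orders", PatternType.LITERAL);
            AccessControlEntry allowRead = new AccessControlEntry(
                    "User:app1", "*", AclOperation.READ, AclPermissionType.ALLOW);

            admin.createAcls(Collections.singletonList(new AclBinding(topic, allowRead)))
                 .all()
                 .get();
        }
    }
}
```

The same binding can also be created with the kafka-acls.sh tool shown in the article.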

Related

Any API gateway solution which has a plugin to output data into Kafka

I need to ingest data into my application. There is a Kafka stream already built for data ingestion. One of the client's requirements is an API interface for accepting the data. So I need an API management solution which provides API gateway functionality and is able to produce messages on a Kafka topic.
I have analyzed Kong + the Kong-upstream plugin, but I am looking for other similar solutions.
You can check WSO2 API Manager.
Check this article for WSO2 Kafka integration: https://ei.docs.wso2.com/en/7.2.0/micro-integrator/references/connectors/kafka-connector/kafka-connector-producer-example/
You can also write a custom mediator for WSO2, as sketched below.
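A minimal sketch of such a custom mediator, assuming the Synapse mediation API that WSO2 EI exposes; the broker address, topic name, and class name are illustrative, and a real mediator would extract and transform the payload rather than forward the raw envelope:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.synapse.MessageContext;
import org.apache.synapse.mediators.AbstractMediator;
import java.util.Properties;

// Hypothetical mediator that forwards each mediated message to a Kafka topic.
public class KafkaPublishMediator extends AbstractMediator {
    private final KafkaProducer<String, String> producer;

    public KafkaPublishMediator() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // illustrative broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    @Override
    public boolean mediate(MessageContext context) {
        // Publish the message envelope to an illustrative "ingest" topic.
        producer.send(new ProducerRecord<>("ingest", context.getEnvelope().toString()));
        return true; // continue the mediation sequence
    }
}
```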
Another solution:
If you need something other than an API gateway, you can check Apache NiFi. It is an application in which you build your own data flow, so you can do almost anything you want.
Apache NiFi's documentation
For your problem, check these Apache NiFi processors:
- PublishKafka + ConsumeKafka
- ListenHTTP
Edit after #OneCricketeer's comment:
A Kafka solution without any extra integration:
Kafka REST Proxy
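Producing over HTTP then needs nothing beyond an HTTP client. A minimal sketch against the REST Proxy's v2 API, assuming a proxy at localhost:8082 and an illustrative topic name "ingest":

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestProxyProducer {
    public static void main(String[] args) throws Exception {
        // One record with a JSON value, wrapped in the REST Proxy's envelope format.
        String body = "{\"records\":[{\"value\":{\"event\":\"signup\",\"user\":42}}]}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8082/topics/ingest"))
                .header("Content-Type", "application/vnd.kafka.json.v2+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```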

Connecting to Snowflake from Databricks through SSO

We are currently planning to use Databricks as our compute platform and Snowflake as our DWH system. We plan to use SSO-based login for both, with our corporate ADFS as the IdP; we are still in the planning phase.
I wanted to check whether having SSO enabled on Snowflake will restrict our ability to run jobs on Databricks that interact with Snowflake for reading/writing data. If so, what are our alternatives for better login security?
If this set-up is actually possible, can someone please point me to documentation about connecting to Snowflake from Databricks through SSO? I didn't really find anything on the topic. The document below mentions that MFA, SSO, or any browser-based login won't work with Snowflake's Spark connector; I'm not sure whether that's relevant to this use case.
https://docs.snowflake.com/en/user-guide/spark-connector-use.html#authenticating-through-a-browser-is-not-supported
For the Spark connector, use OAuth for authentication.
It can be configured with Microsoft Azure AD; see here.
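A minimal sketch of what the connector options could look like once an OAuth access token has been obtained from Azure AD. The account URL, user, database, warehouse, and table names are illustrative, and the token is assumed to arrive via an environment variable:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import java.util.HashMap;
import java.util.Map;

public class SnowflakeOAuthRead {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("snowflake-oauth")
                .getOrCreate();

        Map<String, String> options = new HashMap<>();
        options.put("sfURL", "myaccount.snowflakecomputing.com"); // illustrative account
        options.put("sfUser", "SVC_DATABRICKS");                  // illustrative user
        options.put("sfDatabase", "ANALYTICS");
        options.put("sfSchema", "PUBLIC");
        options.put("sfWarehouse", "COMPUTE_WH");
        options.put("sfAuthenticator", "oauth");
        // Token previously fetched from Azure AD, supplied out of band.
        options.put("sfToken", System.getenv("SNOWFLAKE_OAUTH_TOKEN"));

        Dataset<Row> df = spark.read()
                .format("snowflake")
                .options(options)
                .option("dbtable", "ORDERS") // illustrative table
                .load();
        df.show();
    }
}
```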

Is pre-authentication against a second URL supported in kafka-connect-http?

I want to use a ready-made Kafka connector for fetching data from a REST API. I found the kafka-connect-http connector on Confluent Hub, but this connector does not support pre-authentication of the API.
I raised this as an issue in the repository (https://github.com/castorm/kafka-connect-http) and got the response that, unfortunately, this feature is not supported in the existing code of the connector. So if your API can be called without that authentication step, this is a ready-made solution for you; otherwise you can go for a custom solution (Kafka Streams, etc.).
The author has, however, agreed to look into this feature in the near future.

Vault for Kafka distributed connectors

I am using a JBoss-based vault to secure sensitive data such as the database credentials.
I use a Java-based HTTP REST client to create distributed Kafka connectors, but I ran into a security concern: a request for a connector's "config" exposes the sensitive credentials in the response.
I referred to the official documentation but could not get much help in the context of the JBoss vault.
Any pointers or references that directly address this specific problem are very much appreciated.
Any references to alternative open source (and free to use) vault-based solutions would also be of great help.
You'd have to write code that implements the ConfigProvider interface of the Connect API, mentioned there.
You can browse the Kafka source code on GitHub to see the existing file-based one, but that KIP (which references HashiCorp Vault) and the source files are the only such documentation for now.
Connect doesn't use JBoss, either, so you'd have to find a way around that.
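A minimal sketch of such a provider, with the vault lookup left as a stub; the class name, the vault.address property, and the lookupSecret helper are all hypothetical:

```java
import org.apache.kafka.common.config.ConfigData;
import org.apache.kafka.common.config.provider.ConfigProvider;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Hypothetical provider that resolves secrets from an external vault so that
// connector configs can hold placeholders instead of literal credentials.
public class VaultConfigProvider implements ConfigProvider {
    private String vaultAddress;

    @Override
    public void configure(Map<String, ?> configs) {
        // Hypothetical worker-level setting naming the vault endpoint.
        this.vaultAddress = (String) configs.get("vault.address");
    }

    @Override
    public ConfigData get(String path) {
        // Would return every key stored at this vault path; stubbed out here.
        return new ConfigData(new HashMap<>());
    }

    @Override
    public ConfigData get(String path, Set<String> keys) {
        Map<String, String> data = new HashMap<>();
        for (String key : keys) {
            data.put(key, lookupSecret(path, key));
        }
        return new ConfigData(data);
    }

    private String lookupSecret(String path, String key) {
        // Stand-in for the actual vault client call (REST API, SDK, etc.).
        throw new UnsupportedOperationException("vault client call goes here");
    }

    @Override
    public void close() {
    }
}
```

The worker would then register it via config.providers=vault and config.providers.vault.class=..., and connector configs would reference secrets with placeholders like ${vault:path:key} instead of literal values, so the REST API returns the placeholder rather than the plain credentials.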

How to integrate SnappyData with Kerberos

For enterprise usage, we need to integrate Kerberos with SnappyData. Do you have any documentation for doing that?
Thanks
SnappyData Enterprise supports LDAP for authentication, but we are now in the midst of adding support for Kubernetes (coming in GA form soon), which will be our primary mechanism for supporting a wide range of security options, including tickets (Kerberos).
What security provider do you use for Kerberos?