Delete Topic from Kafka within C++ code - apache-kafka

I am using the C++ Kafka client librdkafka (edenhill/librdkafka).
The question is about topic deletion. I'm creating lots of topics named with GUIDs while my program is running, and at the end of the program (in the destructors) I want to clean up all those topics (there is no need for them any more).
How can I do it from within my C++ code?
Thanks in advance.

Remove all the folders that have your topic name from the kafka-logs directory on the Kafka server. You can do that through code in C++.
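An alternative to manipulating the kafka-logs directory on the broker host is to ask the broker itself to delete the topics: recent librdkafka releases expose an Admin API in the C interface, including rd_kafka_DeleteTopics(), which you can call from C++ at shutdown. Below is a minimal sketch of that approach rather than the directory-removal method above; the helper name delete_topics, the broker address localhost:9092 and the 10-second timeout are illustrative assumptions, and the broker must be running with delete.topic.enable=true for the deletion to take effect.

#include <cstdio>
#include <string>
#include <vector>
#include <librdkafka/rdkafka.h>

// Ask the broker to delete a list of topics (e.g. the GUID-named topics
// collected during the run). Returns true if the admin request succeeded.
bool delete_topics(const std::vector<std::string> &topics,
                   const std::string &brokers /* assumption: "localhost:9092" */) {
    char errstr[512];

    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    if (rd_kafka_conf_set(conf, "bootstrap.servers", brokers.c_str(),
                          errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
        fprintf(stderr, "%s\n", errstr);
        return false;
    }

    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
    if (!rk) {
        fprintf(stderr, "%s\n", errstr);
        return false;
    }

    // Build one DeleteTopic request object per topic.
    std::vector<rd_kafka_DeleteTopic_t *> del_topics;
    for (const auto &t : topics)
        del_topics.push_back(rd_kafka_DeleteTopic_new(t.c_str()));

    // Send the admin request and wait for the result event on a private queue.
    rd_kafka_queue_t *queue = rd_kafka_queue_new(rk);
    rd_kafka_DeleteTopics(rk, del_topics.data(), del_topics.size(),
                          NULL /* default admin options */, queue);

    rd_kafka_event_t *event = rd_kafka_queue_poll(queue, 10 * 1000 /* ms, assumed timeout */);
    bool ok = event &&
              rd_kafka_event_type(event) == RD_KAFKA_EVENT_DELETETOPICS_RESULT &&
              !rd_kafka_event_error(event);
    if (event && rd_kafka_event_error(event))
        fprintf(stderr, "DeleteTopics failed: %s\n", rd_kafka_event_error_string(event));

    // Clean up all handles.
    if (event)
        rd_kafka_event_destroy(event);
    rd_kafka_DeleteTopic_destroy_array(del_topics.data(), del_topics.size());
    rd_kafka_queue_destroy(queue);
    rd_kafka_destroy(rk);
    return ok;
}

Called with the list of GUID topic names collected during the run, something like this keeps the cleanup inside the program's own shutdown path instead of touching the broker's data directory.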

Related

How to use a Kafka connector using Java code?

Currently I am using the Kafka SpoolDir connector in standalone mode. After adding the required configurations to the properties file, I start the connector using
kafka/bin/connect-standalone.sh connect-standalone.properties file-source.properties
Is there any way to start the connector (standalone/distributed) using Java code only, the way we can write consumer and producer Java code?
ConnectStandalone is the Java class that this command starts, but the Connect framework is not meant to be run as an embedded service.
You can see the source code here that starts the server and parses the config file.

Sending data before execution of scenarios

I am working on a Scala application. I have some files in the resource folder of my project. They are JSON files. I want to load all of them as strings and send them over to a Kafka topic. I already have Kafka producer code but just don't know how to load all the files and send them. I am using the following code:
Source.fromResource(path_of_file).mkString
But with this I am able to send only the one file whose path I pass; how can I write generic code to load the files and send them one by one? I need to do this in the BeforeAll hook of my Cucumber tests. In short, I just want to send these files before any scenario begins to execute.
Which sbt version are you using? Please note that sbt 1.2.8 has a bug in listing directories. Otherwise the following should do that:
new File(getClass.getResource("/my-dir").getFile).listFiles().foreach(println)

Updating the Kafka dependency in Camus is causing messages not to be read by EtlRecordReader

In my project, Camus has been used for a long time and has never been updated.
The Camus project uses Kafka version 0.8.2.2. I want to find a workaround to use Kafka 1.0.0.
So I cloned the repository and updated the dependency. When I do that, the Message here requires additional parameters here.
As given in the GitHub link above, the code compiles, but the messages are not read from Kafka due to the condition here.
Is it possible to update the Kafka dependency along with the appropriate constructors of kafka.message.Message and make it work?

New download of Kafka already contains the old topic

I was working with a Kafka download pack and was following the Kafka getting-started guide. Thus, I created a sample topic called test.
Then I wanted to try setting some access control lists using the kafka-acls.sh script. For some reason, I did not find that script inside the bin directory of my Kafka pack.
So I downloaded a fresh Kafka pack from the website to check, and this script was available. (I don't know why or how it wasn't there in the earlier pack.)
However, when I started Kafka from my new pack and tried to create the same topic test, I got an error saying that the topic already exists.
I am trying to figure out how this is possible even with a freshly downloaded instance. Does Kafka save topics in some common directory or something?
Shabir
Found the reason. I figured that if the topics persist even across different bundles of Kafka, they must be stored somewhere on disk other than the bundle itself.
A little bit of searching showed that ZooKeeper stores its details in the directory pointed to by dataDir inside the zookeeper.properties file, which is /tmp/zookeeper by default.
Once I deleted this folder and started a fresh Kafka pack, all previous topics were gone and it behaved like a brand-new pack.
Thanks
Shabir

SpringXD/Spring Integration: Using different versions of spring-integration-kafka for producer and consumer

I have the following configuration:
Spring-integration-kafka 1.3.1.RELEASE
I have a custom kafka-sink and a custom kafka-source
The configuration I want to have:
I'd like to keep using Spring-integration-kafka 1.3.1.RELEASE with my custom kafka-sink.
I'm changing my kafka-source logic to use Spring-integration-kafka 2.1.0.RELEASE. I noticed the way to implement a consumer/producer is quite different from prior versions of Spring-integration-kafka.
My question is: could I face some compatibility issues?
I'm using Rabbit.
You should be OK then; it would probably work with the newer Kafka jars in the source's /lib directory, since each module is loaded in its own classloader, so there should be no clashes with the xd/lib jars.
However, you might have to remove the old Kafka jars from the xd/lib directory (which is why I asked about the message bus).