Can we load data into any database using Kafka Streams? - apache-kafka

I'm using Kafka Streams to fetch data from a topic, and now I would like to load this data into Postgres. Is it possible?

Kafka Streams is a 'small footprint' client library meant only to process data from Kafka to Kafka. To copy data into or out of Kafka, you should use Kafka Connect, or build your own Kafka consumer/producer.
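
For the consumer/producer route, here is a minimal sketch of a plain consumer that writes each record into Postgres over JDBC. The topic name, the `events` table, and the connection settings are all placeholder assumptions, and the Postgres JDBC driver is assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class TopicToPostgres {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "postgres-loader");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // Commit offsets manually, only after the rows are safely inserted.
        props.put("enable.auto.commit", "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection db = DriverManager.getConnection(
                     "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {
            consumer.subscribe(List.of("my-topic"));
            PreparedStatement insert = db.prepareStatement(
                    "INSERT INTO events (record_key, record_value) VALUES (?, ?)");
            while (true) {
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    insert.setString(1, record.key());
                    insert.setString(2, record.value());
                    insert.executeUpdate();
                }
                consumer.commitSync();
            }
        }
    }
}
```

Kafka Connect's JDBC sink gives you the same result without maintaining code like this, which is why it is usually the recommended option.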

Related

Kafka connector to read from a topic and write to a topic

I want to build a Kafka connector which needs to read from a Kafka topic, make a call to a gRPC service to get some data, and write the whole result into another Kafka topic.
I have written a Kafka sink connector which reads from a topic and calls a gRPC service, but I'm not sure how to redirect this data into a Kafka topic.
Kafka Streams can read from topics, call external services as necessary, then forward this data to a new topic in the same cluster.
MirrorMaker2 can be used between different clusters, but using Connect transforms is generally not recommended with external services.
Or you could make your gRPC service into a Kafka producer.
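
Here is a minimal sketch of that Kafka Streams approach. The topic names are placeholders, and the `enrich` helper stands in for a call to your generated gRPC stub.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class EnrichTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "grpc-enricher");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG,
                Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG,
                Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("input-topic");
        // Call the external service once per record, then forward the
        // enriched value to the output topic in the same cluster.
        input.mapValues(EnrichTopology::enrich)
             .to("output-topic");

        new KafkaStreams(builder.build(), props).start();
    }

    // Placeholder for the blocking gRPC call; a real implementation would
    // also handle timeouts and failures here.
    private static String enrich(String value) {
        return value + "-enriched";
    }
}
```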

Kafka Streams application integration with the Kafka JDBC sink connector

I am trying to use Kafka Streams for some computation and send the result to a topic which is written to a database by the JDBC sink connector. The result needs to be serialized using Avro with the Confluent Schema Registry. Is there any demo or guide that shows how to handle this scenario?
It's not clear what you mean by "integrate"; Kafka Streams is independent of Kafka Connect, though both can be used from ksqlDB.
The existing Kafka Connect examples should be adequate, pointed at the output topic of your Streams application.
As for Kafka Streams, you'd need to use the Confluent Avro Serdes and add the Schema Registry URL to the StreamsConfig.
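
A sketch of that Streams-side configuration, assuming the Confluent kafka-streams-avro-serde artifact is on the classpath; the application id, bootstrap servers, and registry URL are placeholders.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsConfig;

import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;

public class AvroStreamsConfig {
    public static Properties build() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "avro-computation");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG,
                Serdes.String().getClass());
        // Serialize output values as Avro so the JDBC sink connector can
        // look the schema up in Schema Registry.
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG,
                GenericAvroSerde.class);
        props.put("schema.registry.url", "http://localhost:8081");
        return props;
    }
}
```

The sink connector side then just needs its Avro converter (io.confluent.connect.avro.AvroConverter) pointed at the same registry.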

How to call a Spring Boot service endpoint from Apache Kafka and pass all the messages from a topic?

I am new to Kafka and I have a requirement to fetch all the real-time data from a DB and pass it to a Spring Boot microservice for processing. In my analysis I found that Apache Kafka with a Kafka source connector can pull all real-time data from the DB into Kafka topics.
Can someone tell me whether there is a way to pick this data up from Kafka topics and share it with the microservice by triggering a REST call from Kafka?
The idea is that whenever a new entry is added to the database table, Kafka can pull that data via Kafka Connect, and then Kafka should somehow call the microservice and share this new entry. Is this possible with Kafka?
Database --> Kafka Connect --> Kafka (topic) --> some service that calls the microservice --> microservice
"pick this data up from Kafka topics and share it with the microservice"
Kafka doesn't push. You would add a consumer in your service to pull from Kafka, perhaps using spring-kafka or Spring Cloud Stream.
Alternatively, a Kafka Connect sink could be used with an HTTP POST connector, but then you'd need to somehow deal with not committing offsets for messages whose requests failed.
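
A minimal sketch of the consumer approach with spring-kafka; the topic name, group id, and handler body are placeholder assumptions, and `spring.kafka.bootstrap-servers` would be set in the application properties.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class DbChangeListener {

    // Pulled, not pushed: spring-kafka polls the topic that Kafka Connect
    // fills from the database and hands each new record to this method,
    // so no REST call from Kafka is needed.
    @KafkaListener(topics = "db-changes", groupId = "my-microservice")
    public void onMessage(String message) {
        // Process the new database entry here.
        System.out.println("Received: " + message);
    }
}
```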

Is there any way we can use Kafka Streams for loading a file into a database?

It can be any file. I just want to know whether it's possible using Kafka Streams.
Use Kafka Connect's JDBC Sink connector to stream data from Kafka to a database.
Here's an example of it in use: https://rmoff.dev/kafka-jdbc-video
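
As a sketch, a standalone-mode configuration for the JDBC sink connector might look like the following, assuming Confluent's kafka-connect-jdbc plugin is installed; the topic name, table handling, and connection settings are placeholders.

```properties
name=postgres-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=file-data
connection.url=jdbc:postgresql://localhost:5432/mydb
connection.user=user
connection.password=password
# Create the target table from the record schema if it doesn't exist
auto.create=true
insert.mode=insert
```

Getting the file's contents into the topic is a separate first step, e.g. with a producer or a file source connector.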

Can we use a Kafka consumer to read directly from an AWS Kinesis stream?

Is there a way for a Kafka consumer to read directly from an AWS Kinesis Stream?
Azure Event Hubs provides the option of enabling a Kafka endpoint, so that a Kafka consumer can seamlessly read from Event Hubs. Is there something similar in AWS Kinesis?
There is not. Kafka and Kinesis use different wire protocols and have different representations of their records.
You could use Kafka Connect to source data from Kinesis into Kafka and then consume from that topic, though.
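
As a sketch of that route, a source connector configuration might look roughly like this, assuming Confluent's kafka-connect-kinesis connector. The property names here are from memory and should be verified against the connector's documentation; the stream, region, and topic values are placeholders.

```properties
name=kinesis-source
connector.class=io.confluent.connect.kinesis.KinesisSourceConnector
tasks.max=1
# Destination Kafka topic and source Kinesis stream (placeholder values;
# region format per the connector docs)
kafka.topic=kinesis-events
kinesis.stream=my-kinesis-stream
kinesis.region=US_EAST_1
```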