Use exchange message inside the .to() method in Apache Camel - apache-kafka

I'm new to Camel and would like to change my route dynamically according to some logic performed beforehand:
camelContext.addRoutes(new RouteBuilder() {
    public void configure() {
        PropertiesComponent pc = getContext().getComponent("properties", PropertiesComponent.class);
        pc.setLocation("classpath:application.properties");
        log.info("About to start route: Kafka Server -> Log ");
        from("kafka:{{consumer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
                + "&maxPollRecords={{consumer.maxPollRecords}}"
                + "&consumersCount={{consumer.consumersCount}}"
                + "&seekTo={{consumer.seekTo}}"
                + "&groupId={{consumer.group}}"
                + "&valueDeserializer=" + BytesDeserializer.class.getName())
            .routeId("FromKafka")
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    System.out.println(" message: " + exchange.getIn().getBody());
                    Bytes body = exchange.getIn().getBody(Bytes.class);
                    HashMap data = (HashMap) SerializationUtils.deserialize(body.get());
                    // do some work on data;
                    Map messageBusDetails = new HashMap();
                    messageBusDetails.put("topicName", "someTopic");
                    messageBusDetails.put("producerOption", "bla");
                    exchange.getOut().setHeader("kafka", messageBusDetails);
                    exchange.getOut().setBody(SerializationUtils.serialize(data));
                }
            }).choice()
                .when(header("kafka"))
                    .to("kafka:" + **getHeader("kafka").get("topicName")** )
                .log("${body}");
    }
});
getHeader("kafka").get("topicName")
This is what I'm trying to achieve. But I don't know how to access the header's value (which is a map, since a Kafka producer might have more configuration options) inside the .to().
I understand I might be using it totally wrong... but that's what I've managed to understand so far...
The main goal is to have multiple message buses in the .from(), and multiple message bus options in the .to(), decided via an external source (like a config file). That way the same route can apply to many logic scenarios, and I thought the choice() method was the best answer.
Thanks!

Instead of to(), you can use toD(), which is the "Dynamic To".
See the Camel documentation on Dynamic To for details.
And for the syntax to use to pull in various headers etc., see the Simple expression language page.
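For example, a minimal sketch of how the tail of the route above could look (assuming the processor has stored the map in a header named "kafka"; Simple's map-access syntax ${header.kafka[topicName]} pulls an entry out of a Map header at runtime):

.process(new Processor() {
    @Override
    public void process(Exchange exchange) throws Exception {
        Map<String, Object> messageBusDetails = new HashMap<>();
        messageBusDetails.put("topicName", "someTopic");
        exchange.getIn().setHeader("kafka", messageBusDetails);
    }
})
.choice()
    .when(header("kafka"))
        // toD evaluates the Simple expression per exchange, so the
        // target topic is resolved from the header map at runtime
        .toD("kafka:${header.kafka[topicName]}?brokers={{kafka.host}}:{{kafka.port}}")
    .end()
.log("${body}");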

Related

I want to make a stream of small data by calling it again and again

I have a question. I've got some small CSV data that I'm able to run on Flink with the help of Kafka. My question is: can I call the same data again and again using a window and trigger, or will it read my data only once?
1,35
2,45
3,55
4,65
5,555
This is the data that I want to call again and again. Though I don't think so myself, it's better to get a second opinion as I'm a beginner. Thanks for the help.
Not sure what you mean by calling the data again and again, but you can create a stream of that data in Flink using a SourceFunction. For example, the following source creates a stream from that CSV data and emits it every second.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<String> csvStream = env.addSource(new SourceFunction<String>() {
    private volatile boolean running = true;

    @Override
    public void run(SourceContext<String> sourceContext) throws Exception {
        String data = "1,35\n" +
                "2,45\n" +
                "3,55\n" +
                "4,65\n" +
                "5,555";
        // emit the whole CSV payload once per second
        while (running) {
            sourceContext.collect(data);
            TimeUnit.SECONDS.sleep(1);
        }
    }

    @Override
    public void cancel() {
        // lets Flink stop the source cleanly
        running = false;
    }
});

Send message to Kafka when SessionWindows was started and ended

I want to send a message to a Kafka topic when a new SessionWindow is created and when it ends. I have the following code:
stream
    .filter(user -> user.isAdmin)
    .keyBy(user -> user.username)
    .window(ProcessingTimeSessionWindows.withGap(Time.seconds(10)))
    //what now? Trigger?
Now I want to send a message when a new session starts (with some metadata like web browser and timestamps; this information is available in each element of the stream) and send a message to Kafka when the session ends (in this example, 10 seconds after the last element, I think) with the total number of requests.
Is this possible in Flink? I think I should use some trigger, but I don't know how and I can't find any examples.
If you want to do this when the window is processed, then you can simply use a ProcessWindowFunction; basically what you need to do is add .process(new MyProcessFunction()) to your code. In the ProcessWindowFunction you have access to the whole window, including its first (start) and last (end) element. You can simply use a side output to emit just the beginning and the end of the given window, then create a stream from the side output and sink it to Kafka. More on side outputs can be found in the Flink documentation.
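A rough sketch of that approach (User is the element type from the question's stream; the boundaries tag and kafkaSink are illustrative names, and note the side output only fires when the session window is processed, i.e. at its end):

final OutputTag<String> boundaries = new OutputTag<String>("session-boundaries") {};

SingleOutputStreamOperator<String> summaries = stream
    .filter(user -> user.isAdmin)
    .keyBy(user -> user.username)
    .window(ProcessingTimeSessionWindows.withGap(Time.seconds(10)))
    .process(new ProcessWindowFunction<User, String, String, TimeWindow>() {
        @Override
        public void process(String key, Context ctx, Iterable<User> users, Collector<String> out) throws Exception {
            int requests = 0;
            for (User user : users) requests++;
            // the window's bounds are known once the session window fires
            ctx.output(boundaries, "session start: " + ctx.window().getStart());
            ctx.output(boundaries, "session end: " + ctx.window().getEnd() + ", requests: " + requests);
            out.collect(key + " -> " + requests + " requests");
        }
    });

summaries.getSideOutput(boundaries).addSink(kafkaSink); // kafkaSink: e.g. a FlinkKafkaProducer<String>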
You can write a custom window trigger.
How to tell a new session has started?
You can create a ValueState with a default value of null; if the state value is null, it is a session start.
When does the session end?
Just before TriggerResult.FIRE.
Here is a demo based on Flink's ProcessingTimeTrigger. I only put the question-related logic here; you can check other details in the source code.
public class MyProcessingTimeTrigger extends Trigger<Object, TimeWindow> {
    // a state which keeps the session start.
    private final ValueStateDescriptor<Long> stateDescriptor =
            new ValueStateDescriptor<Long>("session-start", Long.class);

    @Override
    public TriggerResult onElement(Object element, long timestamp, TimeWindow window, TriggerContext ctx) throws Exception {
        ValueState<Long> state = ctx.getPartitionedState(stateDescriptor);
        if (state.value() == null) {
            // if the value is null, it's a session start.
            state.update(window.getStart());
        }
        ctx.registerProcessingTimeTimer(window.maxTimestamp());
        return TriggerResult.CONTINUE;
    }

    @Override
    public TriggerResult onProcessingTime(long time, TimeWindow window, TriggerContext ctx) {
        // here is a session end.
        return TriggerResult.FIRE;
    }

    @Override
    public TriggerResult onEventTime(long time, TimeWindow window, TriggerContext ctx) {
        // this trigger works on processing time only
        return TriggerResult.CONTINUE;
    }

    @Override
    public void clear(TimeWindow window, TriggerContext ctx) throws Exception {
        ctx.getPartitionedState(stateDescriptor).clear();
        ctx.deleteProcessingTimeTimer(window.maxTimestamp());
    }
}

Kafka listener, get all messages

Good day colleagues.
I have a Kafka project using Spring Kafka that listens to a specific topic.
Once a day I need to listen to all the messages, put them into a collection and find a specific message there.
I couldn't understand how to read all messages in one @KafkaListener method.
My class is:
@Component
public class KafkaIntervalListener {

    public CountDownLatch intervalLatch = new CountDownLatch(1);

    private final SCDFRunnerService scdfRunnerService;

    public KafkaIntervalListener(SCDFRunnerService scdfRunnerService) {
        this.scdfRunnerService = scdfRunnerService;
    }

    @KafkaListener(topics = "${kafka.interval-topic}", containerFactory = "intervalEventKafkaListenerContainerFactory")
    public void intervalListener(IntervalEvent event) throws UnsupportedEncodingException, JSONException {
        System.out.println("Recieved interval message: " + event);
        IntervalType type = event.getType();
        Instant instant = event.getInterval();
        List<IntervalEvent> events = new ArrayList<>();
        events.add(event);
        events.size();
        this.intervalLatch.countDown();
    }
}
My events collection always has size = 1.
I tried using different loops, but then my collection got filled with the same message 530,000,000 times.
UPDATE:
I have found a way to do it with factory.setBatchListener(true);. But I need to launch it with @Scheduled(cron = "${kafka.cron}", zone = "Europe/Moscow"); right now the method is always listening. Now I am trying something like this:
@Scheduled(cron = "${kafka.cron}", zone = "Europe/Moscow")
public void run() throws Exception {
    kafkaIntervalListener.intervalLatch.await();
}
It doesn't work; in debug mode my breakpoint is never hit at this spot.
The listener container is, by design, message-driven.
For fetching messages on-demand, it's better to use the Kafka Consumer API directly and fetch messages using the poll() method.
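For example, a minimal sketch of draining a topic from a scheduled method with a plain KafkaConsumer (broker address, group id and topic name are placeholders; polling stops once a poll returns no records):

@Scheduled(cron = "${kafka.cron}", zone = "Europe/Moscow")
public void scanTopic() {
    Properties props = new Properties();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "interval-scanner");        // placeholder
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

    List<String> events = new ArrayList<>();
    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
        consumer.subscribe(Collections.singletonList("interval-topic")); // placeholder
        ConsumerRecords<String, String> records;
        // keep polling until the topic is drained for this consumer group
        while (!(records = consumer.poll(Duration.ofSeconds(5))).isEmpty()) {
            for (ConsumerRecord<String, String> record : records) {
                events.add(record.value());
            }
        }
    }
    // search `events` for the specific message here
}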

List files under a folder within a Camel rest service

I am creating a REST service using the Camel Rest DSL. Within the service I need to list all files under a folder and do some processing on them. Please find the code below:
from("direct:postDocument")
.to("file:/home/s469457/service/content-util/composite?noop=true")
.setBody(constant(null))
.log("Scanning file ${file:name.noext}.${file:name.ext}...");
Please advise.
~ Arunava
I would suggest writing a processor or a bean to list the files in the directory. I think that would be more efficient and much simpler; using Camel's file component you would have to deal with intricacies you might not expect.
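For instance, a minimal sketch of such a bean/processor approach using plain java.io (the folder path is taken from the question; error handling omitted):

from("direct:listFiles")
    .process(exchange -> {
        // list file names directly, bypassing the file component's consumer semantics
        File dir = new File("/home/s469457/service/content-util/composite");
        String[] names = dir.list();
        exchange.getIn().setBody(names == null ? "" : String.join("\n", names));
    });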
Regardless, if you go with the file component, you will need to do a pollEnrich and afterwards aggregate the whole result. I also think you would run into trouble and would not be able to read the files multiple times; to solve that you might need to create an idempotent repository, but when reading files there might also be concurrency/file-locking issues...
Here's some pseudocode to get you started if you want to go that way:
from("direct:listFiles")
.pollEnrich("file:"+getFullPath()+"?noop=true")
.aggregate(new AggregationStrategy {
public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
String filename = newExchange.getIn().getHeader("CamelFileName", String.class)
if (oldExchange == null) {
newExchange.getIn().setBody(new ArrayList<String>(Arrays.asList(filename)));
return newExchange;
} else {
...
}
})
//Camel Rest Api to list files
rest("/my-api/")
    .get()
    .produces("text/plain")
    .to("direct:listFiles");

//Camel Route to list files
List<String> fileList = new ArrayList<String>();

from("direct:listFiles")
    .loopDoWhile(body().isNotNull())
        .pollEnrich("file:/home/s469457/service/content-util/composite?noop=true&recursive=true&idempotent=false&include=.*.csv")
        .choice()
            .when(body().isNotNull())
                .process(new Processor() {
                    @Override
                    public void process(Exchange exchange) throws Exception {
                        File file = exchange.getIn().getBody(File.class);
                        fileList.add(file.getName());
                    }
                })
            .otherwise()
                .process(new Processor() {
                    @Override
                    public void process(Exchange exchange) throws Exception {
                        if (fileList.size() != 0)
                            exchange.getOut().setBody(String.join("\n", fileList));
                        fileList.clear();
                    }
                })
    .end();

Form Instantiation time in Restlet

I am new to the Restlet framework and I have the following timing issue in the POST method of my server resource.
My POST method code:
@Post
public Representation represent(Representation entity) {
    try {
        //Thread.sleep(1000);
        long start = System.currentTimeMillis();
        Form aForm = new Form(getRequestEntity());
        System.err.println("FORM Instantiation TIME: " + (System.currentTimeMillis() - start));
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return new StringRepresentation("hello");
}
On different trials, the output that I am getting is 1900-1999 ms. But if I uncomment the line Thread.sleep(1000), then the time output is 900-999 ms. Can anyone please confirm what is happening when instantiating the Form object, and why the time is always 1900+ ms? Sorting out this timing issue is important for me, as I have to implement token-based authentication to reduce the POST method's processing time.
Sorry for the late reply. The Restlet version I am using is 2.0.7.
Here are the details:
public static void main(String[] args) throws Exception {
    Component component = new Component();
    component.getServers().add(Protocol.HTTP, 8182);
    VirtualHost aHost = component.getDefaultHost();
    aHost.attach("/sample", new MyApplication());
    component.getLogger().setLevel(Level.OFF);
    component.start();
    System.err.println("REST SERVICE STARTED ON PORT NUMBER 8182...");
}
I am running this application locally and not in any web/app server.