How to add custom shutdown hook to Kafka Connect? - apache-kafka

Is there a way to add a custom shutdown hook to Kafka Connect that I can put on the classpath using the plugins.path property?
Use Case:
The Kafka Connect cluster tries to connect to the Kafka cluster.
If it fails, it logs the error and shuts down immediately.
The error logs do not reach the remote log server (e.g. Splunk).
I need to delay the shutdown so that the log collector agent can push the logs to the target log server.
I am using Confluent Platform v6.1

The best way to accomplish what you are looking for is to write the log file to a location that outlives the container. This could be a persistent volume like @OneCricketeer mentioned previously, or a network-based logging service.
If for some reason you can't do that, you can create a JVM shutdown hook using a Java agent. This hook could delay the JVM's shutdown long enough (risky), force a flush of the logging library, or perform any other cleanup. Since the agent is configured as a JVM command-line argument, it should work for your Kafka Connect workers. You just need to modify the command line that runs the workers.
There are good instructions for creating a Java Agent here and an example for setting up a shutdown hook here: Java shutdown hook
Here is a super simple example class that has both the application's main() method and the agent's premain() in the same class:
import java.lang.instrument.Instrumentation;

public class App
{
    public static void main( String[] args ) throws InterruptedException {
        System.out.println( System.currentTimeMillis() + ": Main Started!" );
        Thread.sleep(1000);
        System.out.println( System.currentTimeMillis() + ": Main Ended!" );
    }

    public static void premain(String agentArgs, Instrumentation inst) {
        System.out.println(System.currentTimeMillis() + ": Agent Started");
        Runtime.getRuntime()
               .addShutdownHook(
                   new Thread() {
                       @Override
                       public void run() {
                           System.out.println(System.currentTimeMillis() + ": Shutdown Hook is running!");
                           try {
                               Thread.sleep(5000);
                           } catch (InterruptedException e) {
                               // do nothing
                           }
                           System.out.println(System.currentTimeMillis() + ": Shutdown hook is completed");
                       }
                   });
    }
}
Note that in your case you only need the premain method, as the main method is implemented by the Connect worker.
Running the above class with the following command line:
java -javaagent:<MY_AGENT_FILE>.jar -classpath <MY_APP_FILE>.jar org.example.App
generates the following output:
1630187996652: Agent Started
1630187996672: Main Started!
1630187997672: Main Ended!
1630187997673: Shutdown Hook is running!
1630188002675: Shutdown hook is completed
So you have your delay.
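For the use case in the question, the hook body would flush the worker's logging rather than just sleep. The following is a rough sketch of my own (not taken from a real Connect deployment): it assumes the worker logs through Log4j 1.x, as shipped with Confluent Platform 6.1, and that a short extra delay gives the log collector agent time to ship the flushed file.

import java.lang.instrument.Instrumentation;
import org.apache.log4j.LogManager; // assumes the worker uses Log4j 1.x

public class LogFlushAgent {
    public static void premain(String agentArgs, Instrumentation inst) {
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                // Flush and close all Log4j appenders so buffered messages reach the log file
                LogManager.shutdown();
                try {
                    // Hypothetical grace period for the log collector agent to pick up the file
                    Thread.sleep(30000);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
    }
}

Remember that the agent jar also needs a Premain-Class entry in its manifest, as described in the Java Agent instructions linked above.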

Related

spring batch database connection failure is not returning proper exit status

I have gone through a few posts related to a similar issue. I am calling the Spring Batch application through a shell script and reading the exit status. Everything works fine on successful execution: ExitStatus gets populated as 0. However, if there is any database error (to create a database error I gave the wrong port for the database), then the ExitStatus is returned empty. The code is below.
I have referred to the posts below and implemented the same approach:
Make a spring-batch job exit with non-zero code if an exception is thrown
Spring batch return custom process exit code
Shell Script:
java -jar $JOBDIR/lib/feed*.jar
result=$?
echo $result
Java:
public static void main(String[] args) {
    ConfigurableApplicationContext context = SpringApplication.run(App.class, args);
    int exitCode = SpringApplication.exit(context);
    System.out.print("Exit code is" + exitCode);
    System.exit(exitCode);
}

@Primary
@Bean(destroyMethod = "")
public DataSource dataSource() throws Exception {
    return BatchDataSource.create(url, user, password);
}
In case of a database error it does not even reach the end of the main method (System.exit(exitCode)). Can anyone guide me on what is wrong?
if there is any database error (to create a database error I gave the wrong port for the database) then the ExitStatus is returned empty.
That's because in that case your job is not executed at all. According to your configuration, the dataSource bean creation error prevents the Spring application context from starting correctly, so your job is never even run.
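If you still want the shell script to see a non-zero exit code in that situation, one option is to catch the startup failure in main. This is a minimal sketch of my own, not from the original answer; it relies on SpringApplication.run rethrowing the context-startup failure.

public static void main(String[] args) {
    try {
        ConfigurableApplicationContext context = SpringApplication.run(App.class, args);
        System.exit(SpringApplication.exit(context));
    } catch (Exception e) {
        // The context failed to start (e.g. the DataSource bean could not be created):
        // exit with a non-zero code so the shell script sees the failure.
        System.exit(1);
    }
}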

Vertx Java Client throwing "SMF AD bind response error" while connecting to Solace VMR server

I am trying to connect to the Solace VMR server and deliver messages from a Java client called Vert.x AMQP Bridge.
I am able to connect to the Solace VMR server, but after connecting I am not able to send messages to it.
I am using the sender code below from the Vert.x client.
public class Sender extends AbstractVerticle {

    private int count = 1;

    // Convenience method so you can run it in your IDE
    public static void main(String[] args) {
        Runner.runExample(Sender.class);
    }

    @Override
    public void start() throws Exception {
        AmqpBridge bridge = AmqpBridge.create(vertx);
        // Start the bridge, then use the event loop thread to process things thereafter.
        bridge.start("13.229.207.85", 21196, "UserName", "Password", res -> {
            if (!res.succeeded()) {
                System.out.println("Bridge startup failed: " + res.cause());
                return;
            }
            // Set up a producer using the bridge, send a message with it.
            MessageProducer<JsonObject> producer = bridge.createProducer("T/GettingStarted/pubsub");

            // Schedule sending of a message every second
            System.out.println("Producer created, scheduling sends.");
            vertx.setPeriodic(1000, v -> {
                JsonObject amqpMsgPayload = new JsonObject();
                amqpMsgPayload.put(AmqpConstants.BODY, "myStringContent" + count);
                producer.send(amqpMsgPayload);
                System.out.println("Sent message: " + count++);
            });
        });
    }
}
I am getting the error below:
Bridge startup failed: io.vertx.core.impl.NoStackTraceThrowable:
Error{condition=amqp:not-found, description='SMF AD bind response
error', info={solace.response_code=503, solace.response_text=Unknown
Queue}} Apr 27, 2018 3:07:29 PM io.vertx.proton.impl.ProtonSessionImpl
WARNING: Receiver closed with error
io.vertx.core.impl.NoStackTraceThrowable:
Error{condition=amqp:not-found, description='SMF AD bind response
error', info={solace.response_code=503, solace.response_text=Unknown
Queue}}
I have created the queue and also the topic correctly in Solace VMR but am not able to send/receive messages. Am I missing any configuration on the Solace VMR server side? Is there any code change required in the Vert.x sender code above? I am getting the error trace above when delivering a message. Can someone help with the same?
Vert.x AMQP Bridge Java client: https://vertx.io/docs/vertx-amqp-bridge/java/
There are a few different reasons why you may be encountering this error.
It could be that the client is not authorized to publish guaranteed messages. To fix this, you need to enable "guaranteed endpoint create" in the client-profile on the Solace router side.
It may also be that the application is using Reply Handling. This is not currently supported with the Solace router. Support for this will be added in the 8.11 release of the Solace VMR. A workaround for this would be to set ReplyHandlingSupport to false.
new AmqpBridgeOptions().setReplyHandlingSupport(false);
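For context, here is roughly how that option would be passed when creating the bridge. This is a sketch of my own, not from the original answer, based on the Vert.x AMQP Bridge API:

AmqpBridgeOptions options = new AmqpBridgeOptions().setReplyHandlingSupport(false);
AmqpBridge bridge = AmqpBridge.create(vertx, options);
bridge.start("13.229.207.85", 21196, "UserName", "Password", res -> {
    // same startup handling as in the Sender verticle above
});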
There is also a known issue in the Solace VMR which causes this error when unsubscribing from a durable topic endpoint. A fix for this issue will also be in the 8.11 release of the Solace VMR. A workaround for this is to disconnect the client without first unsubscribing.

Relation between rest camel and mongodb

I am a novice in Camel.
What I have:
- a REST app deployed on Tomcat
- MongoDB
What I want to do:
I want to send a request from the REST app to Camel, have Camel send the request to MongoDB, and then have Camel send the response back to the REST app (request: rest -> camel -> mongodb, response: mongodb -> camel -> rest).
I can't find information about how to do it.
How can I do this?
My REST class:
@Path("/leave")
public class Leave {

    @GET
    @Path("/all")
    @Produces("application/json")
    public String getLeaveRequestList() {
        return "{\"status\":200}";
    }
}
My route:
public class CamelRouteConfig extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        restConfiguration().host("localhost").port(8080);

        rest("/leave")
            .post("/all")
            .consumes("application/json")
            .to("stream:out");
    }
}
It does nothing. Why? I have no idea.
Context method:
CamelRouteConfig routeConfig = new CamelRouteConfig();
CamelContext context = new DefaultCamelContext();
try {
    context.addRoutes(routeConfig);
    context.start();
} finally {
    context.stop();
}
Thanks for your attention!
You are running Camel as standalone. When you call context.start(), the method does not block, which means that right after starting, context.stop() in the finally block is called, Camel shuts down, and the REST service goes down with it.
See the following articles: Running Camel standalone and have it keep running, and running Camel standalone.
Use the class org.apache.camel.main.Main and its run() method.
From the run() javadoc:
Runs this process with the given arguments, and will wait until completed, or the JVM terminates.
Throws:
Exception
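As an illustration, the standalone launcher could look like the sketch below. This is my own example, not from the original answer, and it assumes Camel 2.x, where Main exposes addRouteBuilder:

import org.apache.camel.main.Main;

public class CamelApp {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.addRouteBuilder(new CamelRouteConfig());
        // Blocks until the JVM terminates (e.g. Ctrl+C), keeping the routes running
        main.run(args);
    }
}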

loadbalanced ribbon client initialization against discovery service (eureka)

I have a service which runs some init scripts after application startup (implemented with ApplicationListener<ApplicationReadyEvent>). In these scripts I need to call other services with a RestTemplate which is @LoadBalanced. When the call to the service is invoked, there is no information about the instances of the remote service, because the discovery server was not contacted at that time (I guess).
java.lang.IllegalStateException: No instances available for api-service
at org.springframework.cloud.netflix.ribbon.RibbonLoadBalancerClient.execute(RibbonLoadBalancerClient.java:79)
So is there a way to get the list of available services from the discovery server at application startup, before my init script executes?
Thanks
edit:
The problem is more related to the fact that in the current environment (dev) all services are tied together in one service (api-service). So from within api-service I'm trying to call the @LoadBalanced client for api-service, which doesn't know about itself. Can I register some listener or something similar to know when api-service (itself) becomes available?
Here are the sample applications. I'm mainly interested in how to get this method working.
edit2:
Now there could be a solution: create a EurekaEventListener.
public static class InitializerListener implements EurekaEventListener {

    private EurekaClient eurekaClient;
    private RestOperations restTemplate;

    public InitializerListener(EurekaClient eurekaClient, RestOperations restTemplate) {
        this.eurekaClient = eurekaClient;
        this.restTemplate = restTemplate;
    }

    @Override
    public void onEvent(EurekaEvent event) {
        if (event instanceof StatusChangeEvent) {
            if (((StatusChangeEvent) event).getStatus().equals(InstanceInfo.InstanceStatus.UP)) {
                ResponseEntity<String> helloResponse = restTemplate.getForEntity(
                        "http://api-service/hello-controller/{name}", String.class, "my friend");
                logger.debug("Response from controller is {}", helloResponse.getBody());
                eurekaClient.unregisterEventListener(this);
            }
        }
    }
}
and then register it like this:
EurekaEventListener initializerListener = new InitializerListener(discoveryClient, restTemplate);
discoveryClient.registerEventListener(initializerListener);
However, this is only executed when the application is registered with the discovery service for the first time. The next time I stop api-service and run it again, the event is not published. Is there any other event I can catch?
Currently, in Camden and earlier, applications are required to be registered in Eureka before they can query for other applications. Your call is likely too early in the registration lifecycle. There is an InstanceRegisteredEvent that may help. There are plans to work on this in the Dalston release train.
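For illustration, a listener for that event could look roughly like the sketch below. This is my own sketch, not from the original answer: the class name and wiring are hypothetical, and it assumes Spring Cloud Commons' InstanceRegisteredEvent together with the existing @LoadBalanced RestTemplate.

import org.springframework.cloud.client.discovery.event.InstanceRegisteredEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestOperations;

@Component
public class InitAfterRegistration implements ApplicationListener<InstanceRegisteredEvent<?>> {

    private final RestOperations restTemplate;

    public InitAfterRegistration(RestOperations restTemplate) {
        this.restTemplate = restTemplate;
    }

    @Override
    public void onApplicationEvent(InstanceRegisteredEvent<?> event) {
        // At this point the local instance has been registered with Eureka,
        // so the load-balanced RestTemplate should be able to resolve api-service.
        String hello = restTemplate.getForObject(
                "http://api-service/hello-controller/{name}", String.class, "my friend");
    }
}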

Netbeans Run->Stop Build does not stop the running java processes

I'm using Netbeans 7.1.2 on Mac OS X Lion 10.7.3.
I have a simple Java server (using Netty) project, and after I build and run the project and then try to stop it via Run->Stop Build, it does not terminate the Java server process.
For example, my server app uses port 8080, and even after I stop the run from NetBeans, port 8080 is still in use and the app keeps running. I have to manually kill the Java process from Activity Monitor.
What's the correct procedure to end the running app started by NetBeans?
Greatly appreciate your help. Thanks in advance.
Have a look at the documentation: http://netty.io/docs/stable/guide/html/. Scroll down to section 9. Shutting Down Your Application.
Your main app has to look something like:
package org.jboss.netty.example.time;

public class TimeServer {

    static final ChannelGroup allChannels = new DefaultChannelGroup("time-server");

    public static void main(String[] args) throws Exception {
        ...
        ChannelFactory factory = ...;
        ServerBootstrap bootstrap = ...;
        ...
        Channel channel = bootstrap.bind(...);
        allChannels.add(channel);
        ...
        // Shutdown code
        ChannelGroupFuture future = allChannels.close();
        future.awaitUninterruptibly();
        factory.releaseExternalResources();
    }
}
Your handler needs:
public void channelOpen(ChannelHandlerContext ctx, ChannelStateEvent e) {
    TimeServer.allChannels.add(e.getChannel());
}
You have to:
1. Store all your channels in a ChannelGroup
2. When shutting down, close all channels and release resources.
Hope this helps.
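As a side note, the same cleanup can be registered as a JVM shutdown hook so that it also runs when the JVM terminates normally (it will not help if the process is killed forcefully). This is a sketch of my own, not from the original answer, and it assumes the factory is kept in a field so the hook can reach it:

import java.util.concurrent.Executors;
import org.jboss.netty.channel.ChannelFactory;
import org.jboss.netty.channel.group.ChannelGroup;
import org.jboss.netty.channel.group.DefaultChannelGroup;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;

public class TimeServer {

    static final ChannelGroup allChannels = new DefaultChannelGroup("time-server");
    static ChannelFactory factory; // stored in a field so the hook can release it

    public static void main(String[] args) throws Exception {
        factory = new NioServerSocketChannelFactory(
                Executors.newCachedThreadPool(), Executors.newCachedThreadPool());
        // ... bootstrap setup, bind, and allChannels.add(channel) as in the example above ...

        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                // Close all channels and release Netty's thread pools on JVM shutdown
                allChannels.close().awaitUninterruptibly();
                factory.releaseExternalResources();
            }
        });
    }
}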