Rundeck StreamingLogWriter - Stream Options and Results

How can I send options (variables) and step results from a Job to Logstash using StreamingLogWriter? Is that possible?

I found a solution using NotificationPlugin.

Related

Spring batch integration using OutBoundGateway and ReplyingKafkaTemplate

My Goal
I need to read a file, turn each line into a message, and send the messages to Kafka from a Spring Batch project; another Spring Integration project will receive the messages and process them asynchronously. After processing, I want to return those messages to the batch project and create four different files from them.
Here I am trying to use an outbound gateway and a ReplyingKafkaTemplate, but I am unable to configure them properly. Is there an example or reference guide for this configuration?
I have checked the Spring Batch integration samples GitHub repository; there is no sample for an outbound gateway or ReplyingKafkaTemplate.
Thanks in advance.
For ReplyingKafkaTemplate logic, Spring Integration provides a dedicated KafkaProducerMessageHandler which can be configured with a ReplyingKafkaTemplate.
See more info in docs:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-outbound-gateway
And more about ReplyingKafkaTemplate:
https://docs.spring.io/spring-kafka/reference/html/#replying-template
On the other side, a KafkaInboundGateway probably has to be configured, respectively:
https://docs.spring.io/spring-integration/docs/current/reference/html/kafka.html#kafka-inbound-gateway
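A minimal Java DSL sketch of this wiring might look as follows. This is an assumption-laden illustration, not the asker's configuration: topic names, the group id, channel names, and the bean layout are all placeholders, and a running Kafka broker plus the usual producer/consumer factory beans are required.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.kafka.dsl.Kafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;

@Configuration
public class KafkaGatewayConfig {

    // ReplyingKafkaTemplate needs a listener container watching the reply topic.
    // "replies-topic" and "batch-replies" are placeholder names.
    @Bean
    public ConcurrentMessageListenerContainer<String, String> repliesContainer(
            ConcurrentKafkaListenerContainerFactory<String, String> factory) {
        ConcurrentMessageListenerContainer<String, String> container =
                factory.createContainer("replies-topic");
        container.getContainerProperties().setGroupId("batch-replies");
        return container;
    }

    @Bean
    public ReplyingKafkaTemplate<String, String, String> replyingTemplate(
            ProducerFactory<String, String> pf,
            ConcurrentMessageListenerContainer<String, String> repliesContainer) {
        return new ReplyingKafkaTemplate<>(pf, repliesContainer);
    }

    // The outbound gateway sends each request message to Kafka and waits
    // for the processed reply before passing it downstream.
    @Bean
    public IntegrationFlow outboundGatewayFlow(
            ReplyingKafkaTemplate<String, String, String> template) {
        return IntegrationFlows.from("toKafka")
                .handle(Kafka.outboundGateway(template)
                        .topic("requests-topic"))
                .channel("fromKafka")
                .get();
    }
}
```

The batch project would send line messages to the "toKafka" channel and receive the processed replies on "fromKafka"; the receiving project would pair this with a KafkaInboundGateway as noted above.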

Making a determination whether DB/Server is down before I kick-off a pipeline

I want to check whether the database/server is online before I kick off a pipeline. If the database is down, I want to cancel the pipeline processing. I would also like to log the results in a table.
format (columns) : DBName Status Date
If the DB/Server is down then I want to send an email to concerned team with formatted table showing which DB/Servers are down.
Approach:
Run a query on each of the servers. If there is a result, format the output as shown above. I am using an ADF pipeline to achieve this. My issue is how to combine the various outputs from the different servers.
For e.g.
Server1:
DBName: A Status: ONLINE runDate:xx/xx/xxxx
Server2:
DBName: B Status: ONLINE runDate:xx/xx/xxxx
I would like to combine them as follows:
Server DBName Status runDate
1 A ONLINE xx/xx/xxxx
2 B ONLINE xx/xx/xxxx
Use this to update the logging table as well as in the email if I were to send one out.
Is this possible using the Pipeline activities or do I have to use mapping dataflows?
I did similar work a few weeks ago. We made an API where we put all the server-related settings or URL endpoints that we need to ping.
You don't need to store the username/password of the SQL Server at all. When you ping the SQL Server, the attempt will time out if it isn't online; if it is online, it will give you a password-related error instead. This way you can easily figure out whether it's up and running.
AFAIK, if you are using Azure DevOps you can use your service account to log into the SQL Server. If you have set up AD login for DevOps, this can be done in the build script.
Either way you will be able to tell whether the SQL Server is up and running.
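The ping idea above can be sketched with plain JDBC. Everything here is illustrative: the JDBC URL and probe credentials are placeholders, and the exact SQLState returned for a failed login can vary by driver (28000, "invalid authorization specification", is a common one for SQL Server), so treat the check as a sketch rather than a definitive implementation.

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class DbPing {

    // Attempts a connection with deliberately wrong credentials.
    // A timeout (or connection refusal) means the server is down;
    // an authentication error means the server answered, so it is up.
    static String status(String jdbcUrl) {
        DriverManager.setLoginTimeout(5); // seconds
        try {
            DriverManager.getConnection(jdbcUrl, "probe", "wrong-password").close();
            return "ONLINE"; // unexpected, but the server is clearly up
        } catch (SQLException e) {
            // SQLState 28000 = invalid authorization: the server is reachable.
            return "28000".equals(e.getSQLState()) ? "ONLINE" : "DOWN";
        }
    }
}
```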
You can have all the actions as tasks in a YAML pipeline. You need something like the below:
steps:
  - task: Check database status
    register: result
  - task: Add results to a file
    shell: "echo text >> filename"
  - task: Send e-mail
    when: some condition is met
There are several modules available to achieve what you need; you just need to find the right ones. You can shape the flow of tasks by registering results and using the when clause.
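The "combine various outputs from different servers" step from the question can be sketched in plain Java. The class, the record fields, and the sample rows are hypothetical; the run dates reuse the question's xx/xx/xxxx placeholder.

```java
import java.util.List;

public class StatusTable {

    record Row(int server, String dbName, String status, String runDate) {}

    // Merges per-server status rows into one tab-separated table with the
    // Server / DBName / Status / runDate header shown in the question.
    static String format(List<Row> rows) {
        StringBuilder sb = new StringBuilder("Server\tDBName\tStatus\trunDate\n");
        for (Row r : rows) {
            sb.append(r.server()).append('\t')
              .append(r.dbName()).append('\t')
              .append(r.status()).append('\t')
              .append(r.runDate()).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // One row per server, combined into a single table for logging/e-mail.
        System.out.print(format(List.of(
                new Row(1, "A", "ONLINE", "xx/xx/xxxx"),
                new Row(2, "B", "ONLINE", "xx/xx/xxxx"))));
    }
}
```

The same combined string could feed both the logging table insert and the e-mail body.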

Passing config values to OWASP ZAP rest api script as a file: format?

I want to automate API pentesting.
I referred to this blog:
https://zaproxy.blogspot.in/2017/06/scanning-apis-with-zap.html
Could you direct me to a sample zap-options file, as passed with the -z option to the zap-api-scan.py script, or to documentation on the format in which config values have to be specified in that file? I could not find this in the official ZAP docs.
See this FAQ: https://github.com/zaproxy/zaproxy/wiki/FAQconfigValues - that's the best we've got at the moment, I'm afraid.

UI console to browse topics on Message Hub

I have a Message Hub instance on Bluemix, and am able to produce / consume messages off it. I was looking for a quick, reasonable way to browse topics / messages to see what's going on. Something along the lines of kafka-topics-ui.
I installed kafka-topics-ui locally, but could not get it to connect to Message Hub. I used the kafka-rest-url value from the MessageHub credentials in the kafka-topics-ui configuration file (env.js), but could not figure out where to provide the API key.
Alternatively, in the Bluemix UI, under Kibana, I can see log entries for creating the topic. Unfortunately, I could not see log entries for messages in the topic (perhaps I'm looking in the wrong place or have the wrong filters?).
My guess is I'm missing something basic. Is there a way to either:
configure a tool such as kafka-topics-ui to connect to MessageHub,
or,
browse topic messages easily?
Cheers.
According to Using the Kafka REST API on Bluemix you need an additional header in all API requests:
-H "X-Auth-Token: APIKEY"
A quick solution is to edit the kafka-topics-ui code and include your token in every request. Another solution would be to use a Chrome plugin that can inject the above header. For a more formal solution, I have opened a ticket on GitHub.
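To show what injecting that header looks like from code, here is a minimal Java sketch; the endpoint URL is a placeholder, and no request is actually sent when the connection is created.

```java
import java.net.HttpURLConnection;
import java.net.URI;

public class RestHeaderExample {

    // Opens a connection to the Kafka REST endpoint with the
    // X-Auth-Token header attached; openConnection() does not
    // contact the server yet, it only prepares the request.
    static HttpURLConnection withAuthHeader(String endpoint, String apiKey)
            throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) URI.create(endpoint).toURL().openConnection();
        conn.setRequestProperty("X-Auth-Token", apiKey);
        return conn;
    }

    public static void main(String[] args) throws Exception {
        HttpURLConnection conn =
                withAuthHeader("http://example.com/topics", "APIKEY");
        System.out.println(conn.getRequestProperty("X-Auth-Token"));
    }
}
```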

Mongo MapReduce - Checking the Standard Output

I am trying to execute Mongo map/reduce commands via the Java API (com.mongodb.MapReduceCommand). Previously, when I executed the command via the mongo shell, I was able to view the output on the standard output stream. When I invoke the command via the Java API, how can I check for any progress/error output?
I have verbosity enabled (true by default).
Thanks!
The MapReduceCommand is blocking, so there is no way to check progress while it runs. However, you can use the MapReduceOutput class to capture the result of a map/reduce operation.
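A sketch of capturing the result with the legacy Java driver follows. The database and collection names, connection details, and the map/reduce functions are placeholders, and a running mongod is required, so treat this as an illustration of the MapReduceOutput pattern rather than a drop-in solution.

```java
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.MapReduceCommand;
import com.mongodb.MapReduceOutput;
import com.mongodb.MongoClient;

public class MapReduceExample {
    public static void main(String[] args) {
        try (MongoClient client = new MongoClient("localhost", 27017)) {
            DB db = client.getDB("test");
            DBCollection coll = db.getCollection("events");

            // Placeholder map/reduce: count documents per "type" field.
            String map = "function() { emit(this.type, 1); }";
            String reduce = "function(key, values) { return Array.sum(values); }";

            MapReduceCommand cmd = new MapReduceCommand(
                    coll, map, reduce, null,
                    MapReduceCommand.OutputType.INLINE, null);
            cmd.setVerbose(true);

            // The call blocks until the server finishes; MapReduceOutput
            // then holds the inline results (and server-side statistics).
            MapReduceOutput out = coll.mapReduce(cmd);
            out.results().forEach(System.out::println);
        }
    }
}
```

Since the call is blocking, any error on the server side surfaces as an exception from mapReduce() rather than as streamed progress output.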