Passing input to ECS task from CloudWatch rule

I have a requirement to schedule an ECS task with two Boolean variables as input. I am aware of CloudWatch scheduling through cron expressions, but I can't pass input to the ECS task that way. Any suggestions on whether this is doable without an intermediate Lambda?

I had a similar requirement and ended up naming my CloudWatch rules meaningfully, so I could derive information from the invoking rule. I then parse and extract fields from the rule name and treat them as parameters to configure my task. A sketch of that parsing step is below.
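As a minimal sketch of that idea, assuming a hypothetical naming convention of "<task-name>_<flag>-<bool>_<flag>-<bool>" and that the task has some way to learn the invoking rule's name (the answer doesn't specify how), the parsing step might look like:

```python
# Hypothetical rule-name convention: "<task-name>_<flag>-<bool>_<flag>-<bool>",
# e.g. "nightly-sync_reindex-true_notify-false".
def parse_flags(rule_name: str) -> dict:
    """Extract Boolean parameters encoded in a CloudWatch rule name."""
    flags = {}
    for part in rule_name.split("_")[1:]:  # skip the task-name prefix
        key, _, value = part.partition("-")
        flags[key] = value.lower() == "true"
    return flags

print(parse_flags("nightly-sync_reindex-true_notify-false"))
# {'reindex': True, 'notify': False}
```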

Related

How to get job submission date from IBM workload scheduler (IWS)

I use IWS to submit jobs, but I want to get the job submission date from IWS and assign it to a variable. Then I want to use that variable to pass the date through a batch file.
I don't know how to do it. Can anyone advise?
If you submit a job stream (even one with a single job), the schedTime of the job stream is, by default, the submission time.
In basic UNIX/Windows jobs, on FTAs or dynamic agents, you can retrieve the schedTime with the environment variable UNISON_SCHED_IA (e.g. 202210260856), or, if you only need the date, with UNISON_SCHED_DATE (e.g. 20221026).
If you are using an Executable job type on a dynamic agent, you can get the same value with the ${tws.job.iawstz} variable.
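For the batch-file part of the question, a minimal Windows sketch (the downstream script name is illustrative) could be:

```bat
@echo off
REM On an FTA or dynamic agent, IWS sets UNISON_SCHED_DATE (e.g. 20221026).
set RUN_DATE=%UNISON_SCHED_DATE%
echo Job stream scheduled for %RUN_DATE%
REM Pass the date on to a downstream script (illustrative name).
call process_daily_load.bat %RUN_DATE%
```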

Cron Binding - Does it have a load balancing option?

I ran some tests with the Dapr Cron Binding and noticed that each instance of my application is triggered by that binding.
However, I'm afraid this will cause multiple unnecessary actions as a result (e.g. multiple requests to a third-party application).
I would like to know whether it is possible to load balance the Cron Binding calls, similarly to what happens with service invocation.
I also take the opportunity to ask whether this applies to the other Input Bindings as well; that is, is a trigger sent to every instance or to a single one?
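For context, a minimal cron binding component of the kind being tested might look like this (name and schedule are illustrative); every app instance that loads the component receives the trigger on its POST /scheduler endpoint, which is the behavior described above:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: scheduler
spec:
  type: bindings.cron
  version: v1
  metadata:
  - name: schedule
    value: "@every 15m"   # illustrative interval
```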

Spring Batch - Is it possible for Job_Instance to have multiple Job_execution?

I'm just wondering whether it's possible for a single JobInstance in Spring Batch to have multiple JobExecutions. If so, can anyone please explain the process? And the same for StepExecutions, please!
Yes, a job instance can have multiple job executions. The typical use case is when a job instance has a first execution that fails, and a second (or subsequent) execution that succeeds.
This is explained in detail, with concrete examples, in the Domain Language of Batch section of the reference documentation.
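As a small illustration (not from the documentation), assuming Spring Batch 4.3+ and a configured JobExplorer, you can list every execution behind an instance, and each execution's step executions:

```java
import java.util.List;

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobInstance;
import org.springframework.batch.core.explore.JobExplorer;

public class JobHistoryPrinter {

    private final JobExplorer jobExplorer;

    public JobHistoryPrinter(JobExplorer jobExplorer) {
        this.jobExplorer = jobExplorer;
    }

    // A job instance restarted after a failure will show two or more executions.
    public void printExecutions(String jobName) {
        JobInstance instance = jobExplorer.getLastJobInstance(jobName);
        if (instance == null) {
            return;
        }
        List<JobExecution> executions = jobExplorer.getJobExecutions(instance);
        for (JobExecution execution : executions) {
            System.out.println("execution " + execution.getId()
                    + " -> " + execution.getStatus());
            // Each execution owns its own set of step executions.
            execution.getStepExecutions().forEach(step ->
                    System.out.println("  step " + step.getStepName()
                            + " -> " + step.getStatus()));
        }
    }
}
```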

run azure pipeline task as per dynamic conditions

Can we run Azure Pipeline tasks based on dynamic conditions?
I want to run only some of the listed tasks, and the choice of which tasks to run may differ each time depending on user requirements.
For your issue, I think you can set conditions through the Control Options of each task to control whether the task runs. If this does not meet your needs, you can give a specific example so that I can better understand your request.
Inside the Control Options of each task, you can specify the conditions under which the task will run.
If the built-in conditions don't meet your needs, you can specify custom conditions.
Conditions are written as expressions. The agent evaluates the expression beginning with the innermost function and works its way out. The final result is a Boolean value that determines whether the task should run. See the expressions topic for a full guide to the syntax.
Here is a document that provides some examples; you can refer to it.
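In a YAML pipeline the same idea is expressed with the condition keyword on a step. A minimal sketch, with an illustrative variable driving the choice:

```yaml
variables:
  runDeploy: 'false'   # set to 'true' (e.g. at queue time) to enable Deploy

steps:
- script: echo "building"
  displayName: Build

- script: echo "deploying"
  displayName: Deploy
  # Custom condition: run only on success AND when the variable is set.
  condition: and(succeeded(), eq(variables['runDeploy'], 'true'))
```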

deploying Batch with CloudFormation

I've been able to create a Compute Environment, a Job Queue and about a dozen Job Definitions using CloudFormation. Great!
Unless I'm missing something, there doesn't seem to be an element to actually submit my Job Definitions using CloudFormation. :(
At first, I thought I had it figured out because you can create CloudWatch Events that trigger a Job Submission. However, I notice that the Event Rule in CloudFormation does not have support for Batch like the CLI/SDK does. Lame!
Anyone else deploying Batch with CloudFormation? How are you submitting jobs? I guess I can create a Custom Resource, but that seems harder than it should be.
Does https://docs.aws.amazon.com/batch/latest/userguide/batch-cwe-target.html solve your problem?
AWS Batch jobs are available as CloudWatch Events targets. Using simple rules that you can quickly set up, you can match events and submit AWS Batch jobs in response to them.
When you create a new rule, add the batch job as a target.
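Since the CLI/SDK do support Batch targets, here is a hedged sketch with boto3 (all names and ARNs are illustrative, and the rule plus an Events-to-Batch IAM role are assumed to already exist):

```python
import boto3

events = boto3.client("events")

# Attach a Batch job queue as a target of an existing rule.
events.put_targets(
    Rule="nightly-batch-trigger",  # illustrative rule name
    Targets=[{
        "Id": "submit-nightly-job",
        "Arn": "arn:aws:batch:us-east-1:123456789012:job-queue/my-queue",
        "RoleArn": "arn:aws:iam::123456789012:role/events-batch-submit",
        "BatchParameters": {
            "JobDefinition": "my-job-definition",
            "JobName": "nightly-job",
        },
    }],
)
```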
The easiest way would be to create a Lambda function. You can create it via CloudFormation and capture your submission logic in the function code.
Or, as you mentioned, you can create a custom resource.