How to get the job submission date from IBM Workload Scheduler (IWS)

I use IWS to submit jobs, but I want to get the job submission date from IWS and assign it to a variable. Then I want to pass that date to a batch file as a parameter.
I don't know how to do this. Can anyone suggest an approach?

If you submit a job stream (even one with a single job), the schedTime of the job stream is by default the submission time.
For basic UNIX/Windows jobs, on FTAs or dynamic agents, you can retrieve the schedTime with the environment variable UNISON_SCHED_IA (e.g. 202210260856), or, if you only need the date, with UNISON_SCHED_DATE (e.g. 20221026).
If you are using an Executable job type on a dynamic agent, you can get the same value with the ${tws.job.iawstz} variable.
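For example, a minimal Windows batch sketch of the FTA case (the downstream script path is illustrative, not part of IWS):

    rem UNISON_SCHED_DATE is exported by the agent into the job's environment, e.g. 20221026.
    set SUBMIT_DATE=%UNISON_SCHED_DATE%
    echo Submission date: %SUBMIT_DATE%
    rem Pass the date on to your own batch file (path is illustrative).
    call C:\scripts\process_daily.bat %SUBMIT_DATE%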

Related

Using a value from a macro variable to create a scheduled pipeline job

I wonder if it's possible in Azure DevOps to make the following work:
The user runs a pipeline and enters 3 macro variables: startDate, startTime, duration.
Based on the macro variables, a pipeline job should be scheduled to do something (e.g. a pipeline which puts a webpage into maintenance mode).
Or is the better way to solve this to immediately run a pipeline which creates a scheduled command (e.g. via at in a Linux bash) that will be executed in the future?

Do pipeline variables persist between runs?

I'm building a simple data flow pipeline between 2 Cosmos DBs. The pipeline starts with the data flow, which grabs the pipeline variable "LastPipelineStartTime" and passes that parameter into the data flow's query in order to get all new data where c._ts >= "LastPipelineStartTime". Then, on data flow success, it updates the variable via Set Variable to @pipeline().TriggerTime. Essentially, I'm always grabbing the new data between pipeline runs.
My question: during each debug run, the variable seems to revert to its default value of 0, so the pipeline grabs everything each time. Am I misunderstanding or misusing pipeline variables? Thanks!
As far as I know, a variable set in the Set Variable activity has its own life cycle: the current execution of the pipeline. Changes to a variable do not persist into the next run.
To implement your needs, please refer to the workarounds below:
1. If you execute the ADF pipeline on a schedule, you can just pass the schedule time into it as a parameter to make sure you grab new data.
2. If the frequency is random, persist the trigger time somewhere else (e.g. a simple file in blob storage); before the data flow activity, use a Lookup activity to grab that time from the blob storage file, as sketched below.
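As a rough sketch of workaround 2 (the activity name and field name here are hypothetical), the data flow parameter is fed from the Lookup output, and on success the file is rewritten with the current run's trigger time:

    data flow parameter:  @activity('LookupLastRunTime').output.firstRow.lastRunTime
    written back to file: @pipeline().TriggerTime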

Append date and time to the std_out_file in JIL

I have a job in Autosys. I want a new log file to be created for every run, so I want to append the date and time to the log file name I am giving in std_out_file. Is there any way to do this other than creating a global variable and then updating it every day using another Autosys job?
There is no built-in attribute for appending the date and time directly. You can use this instead:
\\logpath\%auto_job_name%.%autorun%.OUT.txt
This includes the run number, so the logs will not be overwritten even if there are multiple runs in a day. For date/time ordering, you can sort by the file's creation timestamp.
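For example, a hedged JIL sketch (the job name, machine, and command are illustrative):

    insert_job: sample_job   job_type: cmd
    machine: winagent01
    command: C:\scripts\run_batch.bat
    std_out_file: \\logpath\%auto_job_name%.%autorun%.OUT.txt
    std_err_file: \\logpath\%auto_job_name%.%autorun%.ERR.txt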

Quartz capabilities

I am trying to create a Quartz scheduler using Java which will be able to call an API and pass in data.
I am totally new to Quartz, but I now understand the Job concept and how to create one, the Trigger concept and how to trigger one,
and how the scheduler works.
What I am having difficulty with is how to pass in the information that is required by the API. I have an example where an API is called and the data is entered into the DB, but the information passed into the JobDetail has been hard-coded into the class.
I.e. the user passes a message to the system which needs to be sent back to them in 12 hours and not before, so my plan was to create a Job and a Trigger and set the execution time 12 hours out. How do I pass the message into the scheduler? Where should this message be stored? Is what I am trying to do possible? Have I misunderstood what Quartz is capable of doing?
Thank you for your time. Any assistance would be greatly appreciated.
Take a look at JobDataMap. If you are creating a new job for each user action, you can store the message in there and it will be available during execution.
JobDataMap holds state information for Job instances.
JobDataMap instances are stored once when the Job is added to the scheduler. They are also re-persisted after every execution of jobs annotated with @PersistJobDataAfterExecution.
JobDataMap instances can also be stored with a Trigger. This can be useful in the case where you have a Job that is stored in the scheduler for regular/repeated use by multiple Triggers, yet with each independent triggering, you want to supply the Job with different data inputs.
The JobExecutionContext passed to a Job at execution time also contains a convenience JobDataMap that is the result of merging the contents of the trigger's JobDataMap (if any) over the Job's JobDataMap (if any).
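A minimal sketch of the first option, assuming Quartz 2.x (the class names, group names, and message are illustrative):

    import static org.quartz.DateBuilder.futureDate;
    import static org.quartz.JobBuilder.newJob;
    import static org.quartz.TriggerBuilder.newTrigger;

    import org.quartz.DateBuilder;
    import org.quartz.Job;
    import org.quartz.JobDetail;
    import org.quartz.JobExecutionContext;
    import org.quartz.JobExecutionException;
    import org.quartz.Scheduler;
    import org.quartz.SchedulerException;
    import org.quartz.Trigger;
    import org.quartz.impl.StdSchedulerFactory;

    public class DelayedMessageExample {

        // The job reads its input from the merged JobDataMap at execution time.
        public static class SendMessageJob implements Job {
            @Override
            public void execute(JobExecutionContext context) throws JobExecutionException {
                String message = context.getMergedJobDataMap().getString("message");
                // Hypothetical API call; replace with your real client.
                System.out.println("Calling API with: " + message);
            }
        }

        public static void main(String[] args) throws SchedulerException {
            Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
            scheduler.start();

            // One JobDetail per user action; the message travels in its JobDataMap.
            JobDetail job = newJob(SendMessageJob.class)
                    .withIdentity("send-message-1", "messages") // illustrative identity
                    .usingJobData("message", "Hello, see you in 12 hours")
                    .build();

            // Fire once, 12 hours from now.
            Trigger trigger = newTrigger()
                    .withIdentity("send-message-1-trigger", "messages")
                    .startAt(futureDate(12, DateBuilder.IntervalUnit.HOUR))
                    .build();

            scheduler.scheduleJob(job, trigger);
        }
    }

Note that with the default in-memory RAMJobStore, a job scheduled 12 hours out is lost if the JVM restarts; for delays like this you would normally configure a persistent JDBC job store.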
In case you have a single job but create a new trigger for each user action, you can follow the solution given here.
A third option: for each user action, persist the message and the send time to the database, and have a job that runs periodically and scans the database for records whose message is due to be sent.

Can I schedule a workflow in CQ5?

In CQ5, there is an option to schedule page activation on a particular date. I want to be able to do the same with a workflow: initiate/queue it today, but have it start executing its steps only on a specified date.
Is it possible to implement this feature via a custom workflow step, using the Workflow API? Or is there another way this could be done, e.g. using Sling Events/Scheduling?
There's a process step called AbsoluteTimeAutoAdvancer which reads a property named absoluteTime from the WorkflowData's MetaData. This is expected to be a numeric long value representing the activation time in milliseconds since the Epoch.
The trick is to set this value in the metadata. I would suggest reading the section entitled Saving Property Values in Workflow Metadata in the Extending Workflows documentation.
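A minimal sketch against the CQ5 Java Workflow API (the class name and the 24-hour offset are illustrative; check the exact MetaDataMap accessor for your CQ version):

    import com.day.cq.workflow.WorkflowException;
    import com.day.cq.workflow.WorkflowSession;
    import com.day.cq.workflow.exec.WorkItem;
    import com.day.cq.workflow.exec.WorkflowProcess;
    import com.day.cq.workflow.metadata.MetaDataMap;

    // Hypothetical process step placed before the AbsoluteTimeAutoAdvancer in the
    // workflow model: it stamps the activation time (epoch millis) into the metadata.
    public class SetAbsoluteTimeStep implements WorkflowProcess {

        @Override
        public void execute(WorkItem item, WorkflowSession session, MetaDataMap args)
                throws WorkflowException {
            // Illustrative: activate 24 hours from now. In practice the target
            // date would come from the process arguments or the payload.
            long activateAt = System.currentTimeMillis() + 24L * 60 * 60 * 1000;
            item.getWorkflowData().getMetaDataMap().put("absoluteTime", activateAt);
        }
    }

In CQ5 this class would be registered as an OSGi service and then selected in the process step's dialog in the workflow model.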