I'm using Rundeck 2.6.4 and I'm trying to reuse jobs for common tasks, using option parameters to control the variables.
Task B has some option parameters defined, and I've added a workflow step in Task A that links to Task B. However, I don't see any way to set the option parameters in Task A to pass as input to Task B.
Any inputs on this?
I think I found the answer:
http://rundeck.org/1.6.2/manual/job-workflows.html#job-reference-step
Finally, if the Job defines Options, you can specify them in the commandline arguments text field and can include variable expansion to pass any input options for the current job. Format:
-optname <value> -optname <value> ...
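For example, assuming Task B defines options named environment and version (hypothetical names), the argument string in Task A's job reference step could pass a literal value and expand one of Task A's own options:

-environment ${option.environment} -version 1.2.3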
I have a question and I can't find anything about it in the Microsoft documentation.
I use this agentless job (the Invoke REST API task):
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/http-rest-api?view=azure-devops
I couldn't find out how to use the output variables of this task in another task without using any code or script.
Any help / document would be appreciated.
I think it is impossible.
The task runs in an agentless job, and agentless jobs don't support environment variables. The Invoke REST API task also has no option to write its result to an output variable.
One alternative approach is to use an agent job with a PowerShell task (which can invoke the REST API) and then use a custom variable to capture the output.
Otherwise, there is no built-in way to do this.
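A minimal sketch of that alternative in a YAML pipeline, assuming a hypothetical endpoint (step and variable names are illustrative):

steps:
- pwsh: |
    # call the REST API directly instead of using the agentless Invoke REST API task
    $response = Invoke-RestMethod -Uri 'https://example.com/api/status' -Method Get
    # expose part of the response as a pipeline variable for later steps
    Write-Host "##vso[task.setvariable variable=restStatus]$($response.status)"
  displayName: Call REST API
- script: echo "The API returned $(restStatus)"
  displayName: Use the captured value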
I have a runtime argument set at the namespace level, business_date: ${logicalStartTime(yyyy-MM-dd)}. I am using this argument in my pipeline and want to use the same value in other pipelines.
There are many pipelines running back to back, and I want the value to stay the same throughout the pipelines once it has been calculated in the first one.
Suppose the value is calculated as '2020-08-20 20:14:11'. Once pipeline 1 succeeds I pass this argument to pipeline 2, but because the argument is defined at the namespace level it gets overridden (recalculated) when pipeline 2 starts.
How can I prevent this value from being calculated again?
As was mentioned in the comments, you can set up one pipeline to trigger another; a runtime argument set in the first pipeline will also be set in the triggered pipelines. You can create an inbound trigger by following these steps:
Once you have created your pipelines, select the pipeline you want to run last. In my case it is the DataFusionQuickstart2 pipeline.
In the pipeline application, on the left side, click "Inbound triggers" -> "Set pipeline triggers" and you will see the pipelines that can trigger it. Check the event that will trigger the DataFusionQuickstart2 pipeline from DataFusionQuickstart and enable it.
If you take a look at the previous pipeline, DataFusionQuickstart, you will see in the outbound trigger option (right side) the pipelines that will be triggered by DataFusionQuickstart.
Finally, set your runtime argument.
Additional information
In this post, it was mentioned that there are three ways you can set the runtime argument of a pipeline:
Argument Setter plugin (you can write that value to a file in the first pipeline; in all subsequent pipelines, create a parameter that reads the file)
Passing runtime arguments when starting a pipeline (the approach described above; see the sketch after this list)
Setting Preferences (It provides the ability to save configuration information at various levels of the system, including the CDAP instance, namespace, application, and program levels.)
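A rough sketch of passing the already-calculated value when starting a pipeline programmatically, assuming the CDAP lifecycle REST API and a batch pipeline whose program is the DataPipelineWorkflow workflow (host, namespace and pipeline names are illustrative):

import requests

# hypothetical CDAP endpoint, namespace and pipeline name
cdap = "http://localhost:11015/v3"
url = cdap + "/namespaces/default/apps/DataFusionQuickstart2/workflows/DataPipelineWorkflow/start"

# pass the value calculated by pipeline 1 so pipeline 2 does not re-evaluate logicalStartTime
runtime_args = {"business_date": "2020-08-20 20:14:11"}
requests.post(url, json=runtime_args).raise_for_status()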
You can write that value to a file in the first pipeline. In all subsequent pipelines, create a parameter that reads the file. That way the objective should be achieved.
@Sudhir, you may explore Preferences: https://cdap.atlassian.net/wiki/spaces/DOCS/pages/477561058/Preferences+HTTP+RESTful+API
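For example, a sketch of storing the value once at the application level through the Preferences HTTP RESTful API (endpoint shape per the linked docs; host and names are illustrative):

import requests

cdap = "http://localhost:11015/v3"
prefs_url = cdap + "/namespaces/default/apps/DataFusionQuickstart2/preferences"

# store the calculated value once; later runs read business_date as a preference
requests.put(prefs_url, json={"business_date": "2020-08-20 20:14:11"}).raise_for_status()

# the stored preferences can be read back before subsequent runs
print(requests.get(prefs_url).json())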
You have set the variable at the namespace level, and as per your finding it is getting evaluated each time it is used.
Can you try setting it at the application level?
Then pass it to the next pipeline. I believe in that case it would be evaluated only once in that specific application (pipeline), and thereafter the value would be passed along.
Preferences are also available at the program level.
With a designer/classic build pipeline, you can define pipeline variables with default values to be passed into the tasks. How do I do the same for a YAML-based pipeline?
I want to create three build pipelines, each with a single variable set to a different value. All three point to a single YAML file. The documentation states:
You can choose which variables are allowed to be set at queue time and which are fixed by the pipeline author. If a variable appears in the variables block of a YAML file, it is fixed and cannot be overridden at queue time. To allow a variable to be set at queue time, make sure it doesn't appear in the variables block of a pipeline or job. You can set a default value in the editor, and that value can be overridden by the person queuing the pipeline.
It's not clear how to do this for a YAML file.
I can create a template YAML file and an individual YAML file for each config value that calls the template, but then I can't set the configuration value at run time.
This is done when you edit the build definition (not when you create it, at least with the default experience): click on the three dots and pick Variables from the list.
There you can define variables, and each one has a "Settable at queue time" checkbox.
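A minimal sketch of the YAML side, assuming a variable named buildConfig that is defined only in the pipeline editor (not in the YAML), so it remains settable at queue time:

# azure-pipelines.yml
# buildConfig is deliberately NOT declared in a variables block here;
# it is defined in the pipeline editor with "Settable at queue time" checked
trigger:
- master

steps:
- script: echo "Building configuration $(buildConfig)"
  displayName: Show queue-time variable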
I have to create a VSTS Build/Release task extension which takes a dictionary as the input type.
I need to pass an arbitrary number of parameters to the task, and both the parameter names and values should be given dynamically (while configuring CI and CD).
Is there an input type similar to the Variables section, where we can specify key-value pairs and add any number of variables?
You can take a look at the Azure RG task, which has a key-value input. See the overrideParameters input; it has a special editor extension.
EDIT: search for editorExtension and you will find more examples of grids.
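For reference, the overrideParameters input in that task's task.json looks roughly like this (a sketch from memory; the grid editor extension id is the one used by Microsoft's built-in task and may not be reusable from a custom extension):

{
  "name": "overrideParameters",
  "type": "multiLine",
  "label": "Override template parameters",
  "defaultValue": "",
  "required": false,
  "helpMarkDown": "Key/value pairs, e.g. -name value -otherName otherValue",
  "properties": {
    "editorExtension": "ms.vss-services-azure.parameters-grid"
  }
}

Note that the grid is only an editor convenience; the task still receives the whole input as a single multiLine string and has to parse the key/value pairs itself.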
In luigi, I know how to use its parameter mechanism to pass command-line parameters into a task. However, if I do so, the parameter becomes part of the task's signature.
But there are some cases -- for example, if I want to optionally pass a --debug or --verbose flag on the command line -- where I don't want the command-line parameter to become part of the task's signature.
I know I can do this outside of the luigi world, such as by running my tasks via a wrapper script which can optionally set environment variables to be read within my luigi code. However, is there a way I can accomplish this via luigi, directly?
Just declare them as insignificant parameters, i.e. instantiate the parameter class passing significant=False as a keyword argument.
Example:
import luigi

class MyTask(luigi.Task):  # DateTask in the original; any Task subclass works
    other = luigi.Parameter(significant=False)
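A slightly fuller sketch for the --debug/--verbose use case from the question (the flag name and task body are illustrative):

import luigi

class MyTask(luigi.Task):
    date = luigi.DateParameter()
    # insignificant: excluded from the task's signature/task_id, so runs with
    # different --verbose values are still treated as the same task
    verbose = luigi.BoolParameter(significant=False, default=False)

    def run(self):
        if self.verbose:
            print("Running MyTask for {}".format(self.date))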