How to get an option's name in a Rundeck JOB? - rundeck

I have a Rundeck job called "TEST".
I have an option called country.
This option retrieves a list of key/value pairs from a remote URL, such as:
[
{"name":"FRANCE", "value":"FR"},
{"name":"ITALY", "value":"IT"},
{"name":"ALGERIA", "value":"DZ"}
]
I would like to use both the name and the value in a job step.
echo ${option.country.name}
echo ${option.country.value}
But this doesn't work, and I'm not able to get the name of the parameter.
Getting the value can be done using ${option.country}.
Is there any trick to get the parameter name?

Just for the record: maybe the best approach is to create a script step that reads the JSON and extracts the name; alternatively, you can use the same string for both the name and the value (of course that is not applicable in all cases).
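A minimal sketch of such a script step, assuming the option JSON can be re-fetched from the same endpoint (the URL below is a placeholder) and that Rundeck exports the selected option to script steps as the RD_OPTION_COUNTRY environment variable:
# Hypothetical script step: look up the display name for the selected value.
import json
import os
import urllib.request

OPTIONS_URL = "https://example.com/countries.json"  # placeholder for your remote URL

selected = os.environ["RD_OPTION_COUNTRY"]  # Rundeck exports options as RD_OPTION_<NAME>

with urllib.request.urlopen(OPTIONS_URL) as resp:
    countries = json.load(resp)

# Find the entry whose value matches the selected option and print its name.
name = next((c["name"] for c in countries if c["value"] == selected), None)
print(f"country value={selected} name={name}")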

Related

How to select object attribute in ADF using variable

I'm trying to parametrize a pipeline in Azure Data Factory in order to enable certain functionality for multiple environments. The idea is that the current environment is always available through a global parameter. I'd like to use this parameter to look up an array of environments to process data to. Example:
targetEnvs = [{ "dev": ["dev"], "test": ["dev", "test"], "acc": [], "prod": ["acc", "prod"] }]
Then one should be able to select the targetEnv array with something like targetEnvs[environment] or targetEnvs.environment. Subsequently a ForEach is used to execute some logic on these target environments.
I tried setting this up with targetEnvs as a pipeline parameter (with a default value mapping each env directly to targetEnv, as follows: {"dev": ["dev"], "test": ["test"]}). Then I have a Set variable step that takes its value from the targetEnvs parameter.
I'm now looking for a way to use the current environment (stored in a global parameter) instead of hardcoding "dev" in the Set Variable expression, but I'm not sure how to do this.
Using this expression won't even start the pipeline.
Question: how do I select this attribute of the object? Any other suggestions on how to tackle this problem are welcome as well!
(The Python analogy would be to have a dictionary target_envs and take a value from it by using the key "current_env": target_envs[current_env].)
When I tried to access the object the same way as you, the same error occurred. I took a parameter targetEnv (the given array) and a global parameter environment with the value dev.
You can use the following dynamic content to access the key value.
@pipeline().parameters.targetEnv[0][pipeline().globalParameters.environment]
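To see why the extra [0] is needed, here is a rough Python analogy of that expression, using the array from the question (the values are only illustrative):
target_envs = [{"dev": ["dev"], "test": ["dev", "test"], "acc": [], "prod": ["acc", "prod"]}]
environment = "dev"  # stands in for pipeline().globalParameters.environment

# targetEnv is an array wrapping one mapping object, so index [0] first,
# then select the environment key to get the list of environments to process.
print(target_envs[0][environment])  # -> ['dev']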

How to get the id of the run from within a component?

I'm doing some experimentation with Kubeflow Pipelines and I'm interested in retrieving the run id to save along with some metadata about the pipeline execution. Is there any way I can do so from a component like a ContainerOp?
You can use kfp.dsl.EXECUTION_ID_PLACEHOLDER and kfp.dsl.RUN_ID_PLACEHOLDER as arguments for your component. At runtime they will be replaced with the actual values.
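A minimal sketch of how those placeholders might be passed to a ContainerOp (assuming the kfp v1 SDK; the image and command are only illustrative):
from kfp import dsl

@dsl.pipeline(name='run-id-demo')
def run_id_demo():
    dsl.ContainerOp(
        name='print-ids',
        image='alpine:3.18',
        command=['sh', '-c'],
        # At runtime the placeholders are substituted with the actual IDs.
        arguments=[f'echo "run: {dsl.RUN_ID_PLACEHOLDER} execution: {dsl.EXECUTION_ID_PLACEHOLDER}"'],
    )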
I tried to do this using the Python DSL, but it seems that isn't possible right now.
The only option I found is to use the method used in this sample code: you basically declare a string containing {{workflow.uid}}, and it will be replaced with the actual value at execution time.
You can also do this to get the pod name; it would be {{pod.name}}.
Since Kubeflow Pipelines relies on Argo, you can use Argo variables to get what you want.
For example,
from kfp import dsl
from kfp.components import func_to_container_op

@func_to_container_op
def dummy(run_id: str, run_name: str) -> str:
    return run_id + " " + run_name

@dsl.pipeline(
    name='test_pipeline',
)
def test_pipeline():
    dummy('{{workflow.labels.pipeline/runid}}', '{{workflow.annotations.pipelines.kubeflow.org/run_name}}')
You will find that the placeholders will be replaced with the correct run_id and run_name.
For more argo variables: https://github.com/argoproj/argo-workflows/blob/master/docs/variables.md
To know what is recorded in the labels and annotations of a Kubeflow pipeline run, just get the corresponding workflow from k8s:
kubectl get workflow/XXX -oyaml
Another option: create_run_from_pipeline_func returns a RunPipelineResult, which has a run_id attribute:
import kfp

client = kfp.Client(host)
result = client.create_run_from_pipeline_func(…)
result.run_id
Your component's container should have an environment variable called HOSTNAME that is set to its unique pod name, from which you can derive all the necessary metadata.
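For example, inside the component's code you could read it directly (a minimal sketch, assuming a Python-based component):
import os

pod_name = os.environ.get("HOSTNAME")  # Kubernetes sets this to the unique pod name
print(f"running in pod {pod_name}")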

Data factory lookup (dot) in the item() name

I have a Lookup activity containing a Salesforce query, and I use its elements (item()) in subsequent activities. Until now I had item().name or item().email, but now I have item().NVMStatsSF__Related_Lead__r.FirstName, which has a dot in the field name.
How should I pass it through the body so that it is read correctly?
So I have the following data in item()
{
"NVMStatsSF__Related_Lead__c": "00QE000egrtgrAK",
"NVMStatsSF__Agent__r.Name": "ABC",
"NVMStatsSF__Related_Lead__r.Email": "geggegg#gmail.com",
"NVMStatsSF__Related_Lead__r.FirstName": "ABC",
"NVMStatsSF__Related_Lead__r.OwnerId": "0025434535IIAW"
}
Now when I use item().NVMStatsSF__Agent__r.Name it will not parse because of the dot after NVMStatsSF__Agent__r, and it gives me the following error:
'item().NVMStatsSF__Related_Lead__r.Email' cannot be evaluated because property 'NVMStatsSF__Related_Lead__r' doesn't exist, available properties are 'NVMStatsSF__Related_Lead__c, NVMStatsSF__Agent__r.Name, NVMStatsSF__Related_Lead__r.Email, NVMStatsSF__Related_Lead__r.FirstName, NVMStatsSF__Related_Lead__r.OwnerId'.",
"failureType": "UserError",
"target": "WebActivityToAddPerson"
This is because ADF uses '.' for object property access.
Could you find a way to rename the field names that contain '.'?
It seems you need a built-in function to get the value of an object by key, like getValue(item(), 'key.nestkey'), but unfortunately there doesn't seem to be such a function. You may need to handle your key first.
Finally, it worked. I was being silly.
Instead of taking the value from the child table with the help of the dot operator, I just used a subquery.
And it worked.

using get-odbcdsn across all domain computers

So I need to create a report of which ODBC DSNs are on our computers.
The first hurdle is getting the CSV to output correctly on my own machine. Then I figure I'll just deploy a run-once group policy with the script set to append.
The problem is that Get-OdbcDsn returns an object. That object has:
Name: the friendly name of the ODBC DSN
Attribute: {Description}
I just want to be able to Format-List with the VALUE of the Description entry inside the Attribute property. I can't figure out how to do that.
Whenever I go | fl name, attribute
it returns 1sl-den-db01 and {description}. I would like it to actually parse out Description from Attribute. No idea how. Thoughts?

Exception while reading password from a file as a Talend context variable

I currently have a Talend job which reads from a context file and feeds the values into context variables. I have a field called ftppassword and store the hard-coded password in the context file. I then have a context variable in the job and refer to it in my job.
With this setup my job runs fine, but if I change the context file to contain the location of a password file instead of the hard-coded password, I get the following exception:
Exception in component tFTPConnection_1
com.enterprisedt.net.ftp.FTPException: 530 Login incorrect.
at com.enterprisedt.net.ftp.FTPControlSocket.validateReply(FTPControlSocket.java:1179)
at com.enterprisedt.net.ftp.FTPClient.password(FTPClient.java:1844)
at com.enterprisedt.net.ftp.FTPClient.login(FTPClient.java:1766)
Edit - 2014-12-08
Output of context parameters:
Implicit_Context_Context set key "ftphost" with value "ftp.host.com"
Implicit_Context_Context set key "ftpport" with value "21"
Implicit_Context_Context set key "ftpusername" with value "myuser"
Implicit_Context_Context set key "ftppassword" with value "/opt/password_files/DW/test1.password"
Implicit_Context_Context set key "ftpremotepath" with value "/Output/"
Implicit_Context_Context set key "ftpfilemask" with value "test_dn.zip"
I have also tried changing the data type of ftppassword to File and to Password, but had no luck with that.
The implicit tContextLoad option on the job is the equivalent of putting a tFileInputDelimited component at the start of your job with a schema of 2 columns: key and value. This is then read into a tContextLoad (hence the option name) to load the contexts in your job.
If your password file isn't in a key-value format then you can't use it this way.
The simplest option is to stick with the way you had it working before and use an implicit tContextLoad to load a delimited file with key-value pairs of your context variables.
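For reference, such a context file is just one key-value pair per line, using whatever field separator your Implicit Context Load settings expect (semicolons are shown here purely as an assumption), for example:
ftphost;ftp.host.com
ftpport;21
ftpusername;myuser
ftppassword;p4ssw0rd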
Another option would be to no longer do this using the implicit tContextLoad option and instead to do it explicitly.
To do this you'd want to read in your password file using an appropriate connector such as a tFileInputDelimited. If you were reading in something that looked like /etc/passwd then you could split it on : to get:
username
password
user id
group id
user id info
home directory
shell location
You could then use a tMap to populate an output schema of:
key
value
You would then enter "ftppassword" as the key and connect the password value to the value column. You'll also want to filter this record set so that only one password is set, so you might use something like "ftpUser".equals(row1.username) in the expression filter of the output table in the tMap.
Then just connect this to a tContextLoad component and your job should load the password from /etc/passwd for the "ftpUser" user account.
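Outside of Talend, the same parsing and filtering logic looks roughly like this in Python (just an analogy of the component flow described above, not Talend code):
# Read /etc/passwd, split each line on ':', keep only the ftpUser row,
# and emit a single key/value pair for the context load.
with open("/etc/passwd") as f:
    for line in f:
        username, password, *rest = line.rstrip("\n").split(":")
        if username == "ftpUser":
            print("ftppassword", password)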
If you are looking to pass a path to a separate file containing the password, so that one file can hold all the other contexts for the job while the password file is kept elsewhere, then you'd pass a context variable pointing to the password file, but you'd then have to consume it explicitly in the job.
In this case you might have a context file that is loaded at run time with contexts such as ftpremotepath, ftphost and ftpfilemask that can be set directly in the file, plus an ftpusercredentials context variable that is a file path to a separate credentials file.
This file could then be another delimited file containing key-value pairs of context name and value such as:
ftpuser,myuser
ftppasswd,p4ssw0rd
Then at the start of your job you would explicitly read this in using a tFileInputDelimited component with a schema of 2 columns: key and value. You could then connect this to a tContextLoad component and this will load the second set of context variables into memory as well.
You could then use these as normal by referring to them as context.ftpuser and context.ftppasswd.