How to print Talend job description and other details like job creation date?

Like the jobName variable — what is the variable for the job description? Along with jobName, I want to print the job description and the job creation date (see the attached image).

Job metadata such as the description, author, and creation date are not added to the generated Java code, so it is not possible to retrieve them from the running job; they are only stored in the source. You can, however, print the project name, job name, and job version using these variables:
job name: jobName
job version: jobVersion
project: projectName
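For example, you could print them from a tJava component. The snippet below is a standalone sketch: in a real Talend job, jobName, jobVersion, and projectName are fields of the generated job class, so only the print lines would go into the tJava component; the constants here are stand-ins so the example runs on its own.

```java
public class JobInfo {
    // Stand-ins for the fields Talend generates in the job class
    static final String jobName = "MyJob";
    static final String jobVersion = "0.1";
    static final String projectName = "MYPROJECT";

    static String info() {
        return "Job: " + jobName + " v" + jobVersion + " (project: " + projectName + ")";
    }

    public static void main(String[] args) {
        // In a tJava component, jobName/jobVersion/projectName are already in scope
        System.out.println(info());
    }
}
```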

Related

How to reference a previous task and stop the build in Azure DevOps if there is no new data to publish an artifact

Getsolution.exe will report either "new data available" or "no new data available". If new data is available, the next jobs should be executed; otherwise nothing should be executed. How should I do it? (I am working in the classic editor.)
For example, I have a set of tasks, say 4 tasks:
task-1: builds the solution
task-2: runs Getstatus.exe, which gets the status of data available or no data available
task-3: I should be able to use the result of the above task in a condition (or some API query) to proceed to publishing an artifact if data is available; if not, cleanly break out and stop the build. It shouldn't proceed to publish the artifact or move on to the next task.
task-4: publish artifact
First, what you need is to set a variable in the task where you run Getstatus.exe, and then set a condition on the next tasks. If you set doThing to a value other than Yes, the dependent tasks will be skipped.
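The original answer showed this with classic-editor screenshots; in YAML form it might look like the following sketch (the variable name doThing comes from the answer; the task names are illustrative):

```yaml
steps:
# Task that runs Getstatus.exe and sets a pipeline variable
- powershell: |
    # 'Yes' stands in for whatever Getstatus.exe actually reports
    Write-Host "##vso[task.setvariable variable=doThing]Yes"
  displayName: 'Run Getstatus.exe and set doThing'

# Later task runs only when doThing equals Yes
- task: PublishBuildArtifacts@1
  displayName: 'Publish artifact'
  condition: and(succeeded(), eq(variables['doThing'], 'Yes'))
```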
Since we need to execute different tasks depending on the result of running Getstatus.exe, we need to set the conditions based on that result.
To resolve this, just as Krzysztof Madej said, we could set variable(s) based on the return value of Getstatus.exe in an inline PowerShell task:
$dataAvailable = $(The value of the `Getstatus.exe`)
if ($dataAvailable -eq "True")
{
    Write-Host "##vso[task.setvariable variable=Status]Yes"
}
elseif ($dataAvailable -eq "False")
{
    Write-Host "##vso[task.setvariable variable=Status]No"
}
Then set the appropriate condition on each following task.
You could check the document Specify conditions for more details.
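For example (a sketch; the task names are illustrative), the follow-up tasks could test the Status variable set above:

```yaml
- task: PublishBuildArtifacts@1
  displayName: 'Publish artifact'
  condition: and(succeeded(), eq(variables['Status'], 'Yes'))

- script: echo "No new data - stopping here."
  condition: and(succeeded(), eq(variables['Status'], 'No'))
```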

get s3 url path of metaflow artifact

Is there a way to get the full s3 url path of a metaflow artifact, which was stored in a step?
I looked at Metaflow's DataArtifact class but didn't see an obvious s3 path property.
Yep, you can do
Flow('MyFlow')[42]['foo'].task.artifacts.bar._object['location']
where MyFlow is the name of your flow, 42 is the run ID, foo is the step under consideration and bar is the artifact from that step.
Based on @Savin's answer, I've written a helper function to get the S3 URL of an artifact given a Run ID and the artifact's name:
from typing import List, Union

from metaflow import Flow, Metaflow, Run, namespace


class DataArtifactNotFoundError(Exception):
    """Raised when no artifact with the given name is found. (Custom exception, not part of Metaflow.)"""


def get_artifact_s3url_from_run(
    run: Union[str, Run], name: str, legacy_names: List[str] = [], missing_ok: bool = False
) -> str:
    """
    Given a Metaflow Run and a key, scans the run's tasks and returns the S3 URL of the artifact with that key.

    NOTE: use get_artifact_from_run() if you want the artifact itself, not the S3 URL to the artifact.

    This allows us to find data artifacts even in flows that did not finish. If we change the name of an
    artifact, we can support backwards compatibility by also passing in the legacy keys. Note: we can avoid
    this by resuming a specific run and adding a node which re-maps the artifact to another key. This will
    assign the run a new ID.

    Args:
        run: a metaflow.Run() object, or a run ID ("FlowName/run_id")
        name: name of the attribute to look for in task.data
        legacy_names: backup names to check
        missing_ok: whether to allow the artifact to be missing

    Returns:
        the S3 URL of the artifact.

    Raises:
        DataArtifactNotFoundError: if the artifact is not found and missing_ok=False
        ValueError: if the Flow is not found, or the Flow is found but the run ID is not
    """
    namespace(None)  # allows us to access all runs in all namespaces
    names_to_check = [name] + legacy_names

    if isinstance(run, str):
        try:
            run = Run(run)
        except Exception as e:
            # run ID not found; see if we can find the flow and list its runs
            flow_name = run.split(sep="/")[0]
            try:
                flow = Flow(flow_name)
            except Exception as e2:
                raise ValueError(f"Could not find flow {flow_name}. Available flows: {Metaflow().flows}") from e2
            raise ValueError(f"Could not find run ID {run}. Possible values: {list(flow.runs())}") from e

    for name_ in names_to_check:
        for step_ in run:
            for task in step_:
                if task.artifacts is not None and name_ in task.artifacts:
                    # https://stackoverflow.com/a/66361249/4212158
                    return getattr(task.artifacts, name_)._object["location"]

    if not missing_ok:
        raise DataArtifactNotFoundError(
            f"No data artifact with name {name} found in {run}. Also checked legacy names: {legacy_names}"
        )

Variable in deployment job doesn't expand value

I am having trouble getting a deployment job in a template to expand a variable it is given via a parameter. I've used some shorthand below.
If you want to see the code, there is a prototype that shows the problem at https://github.com/ausfestivus/azureDevOpsPrototypes
The pipeline looks like this:
stage00
  buildjob00
    task produces output vars (name: taskName.VAR_NAME)
  buildjob01
    task is able to reference the variable and retrieve/display its value via
    dependency notation: [dep.buildjob00.taskName.VAR_NAME]
  template:
    parameters:
      bunchOfVarsAsSequenceFormat:
        var1: [dep.buildjob00.taskName.VAR_NAME]
        var2: [dep.buildjob00.taskName.VAR_NAME]
    template contains:
      buildjob02
        this build job sees the variable values fine
      deploymentjob00
        this deploy job sees the variable names but the values are empty
Apologies if this is not well explained, hopefully the above prototype helps illustrate it better than the pseudo code above.
It is a great help that you shared your YAML scripts here; otherwise it would be too difficult to understand your structure. :-)
To display the variable in tmpl: deploy, you need to change its corresponding dependsOn to job00, rather than templateJob:
- deployment: templateDeploy
  displayName: 'tmpl: deploy'
  continueOnError: false
  dependsOn: job00
Then you will see the value displayed successfully.
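Putting it together, a sketch of the deployment job (names follow the prototype above; note that when a deployment job consumes an output variable from a regular job, the job name is repeated inside outputs[...]):

```yaml
- deployment: templateDeploy
  displayName: 'tmpl: deploy'
  continueOnError: false
  dependsOn: job00
  variables:
    # Deployment jobs repeat the source job name inside outputs[...]
    theVar: $[ dependencies.job00.outputs['job00.taskName.VAR_NAME'] ]
```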

Azure DevOps: Getting variable value by concatenating other variables' values as task input

I have my release pipeline Variables tab set like:
I would like to access the my123 variable in a task's display name by concatenating initialVariable's value.
So far I have tried referencing only initialVariable, and it returned the proper value in the job's display name.
But when I try to build the my123 value using initialVariable (=123), I do not get the proper value (I was hoping that $(initialVariable) would resolve to 123 and $(my123) would then yield the proper "finalValue").
This is a known issue: nested variables (like $(my$(initialVariable))) are not yet supported in build/release pipelines.
Check my other thread for some details.
The workaround is to add an inline PowerShell task to set the variable based on the input pipeline variables, just as Josh answered.
For your case, I tested it with the following PowerShell script:
if ($(initialVariable) -eq "123")
{
    Write-Host "##vso[task.setvariable variable=my123]finalvalue"
}
else
{
    Write-Host "##vso[task.setvariable variable=my123]otherValue"
}
Then we can use the variable my123, based on the value of initialVariable, in a following task; I added a command-line task to display the value.
In the result, the value in the command-line task is the correct finalvalue, but the display name is still $(my123).
Important:
That is also the question in your comment. This behavior is expected, because a variable in a display name just gets the predefined value: it is resolved statically, not dynamically. The variable my123 is only assigned while the PowerShell task runs, and the static $(my123) in the display name never enters the environment where the PowerShell code is running.
So the my123 in the display name cannot get the value set in the PowerShell task, but other tasks can use it perfectly well.
Hope this clears things up.
It's ugly, but...
Like I mentioned in my comment, I don't think you're going to get this to work in the UI by default.
Luckily you can use PowerShell to hack this together if you REALLY need the ability to address a variable name based on the value of another variable.
All the variables (secrets are handled a little differently) in your build or release pipeline definition are made available to your PowerShell script FILE (not inline) via environment variables (i.e. $env:initialVariable).
Suppose your situation is thus:
selector = selectable1 //this is the value that can change
selectable1 = theFirstSelection
selectable2 = theSecondSelection
selectable3 = theThirdSelection
In this case (assuming I understand your request) you want to be able to change the value of the selector and force tasks to access the appropriate selectable variable.
So...
Define a new variable in your pipeline.
selector = selectable1 //this is the value that can change
selected = "" //this is the variable you use in your tasks
selectable1 = theFirstSelection
selectable2 = theSecondSelection
selectable3 = theThirdSelection
Write a VariableSelection.ps1 script. This powershell script will be what you need to run to assign the value of $(selected) before it gets used.
# VariableSelection.ps1
Write-Host "select variable: $env:selector"
$selectedValue = (gci env:"$env:selector").value
Write-Host "##vso[task.setvariable variable=selected]$selectedValue"
Note: it is my observation that if you write this script inline, it will not work, because the environment-variable behavior is different for scripts run from a file.
Given the value of $(selector) is selectable2, when the script is run, then the value of the $(selected) will be theSecondSelection.
Example in a pipeline (PowerShell script as above, YAML below):
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- master

pool:
  name: Hosted VS2017

variables:
- name: "selector"
  value: "var1"
- name: "selected"
  value: ""
- name: "var1"
  value: "var1_value"
- name: "var2"
  value: "var2_value"

steps:
- task: PowerShell@2
  inputs:
    filePath: '$(build.sourcesdirectory)/varSelector.ps1'
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "env:selected: $env:selected"
      Write-Host "selected: $(selected)"

Jbehave : Story File Execution with Meta Filter on Eclipse (Local)

I have a smoke suite with a number of test cases, and I have applied the meta filter to only the smoke-level test cases. But when I run with *.story as the argument in Eclipse, the execution hangs while excluding the test cases per the meta filter.
Environment variable: STORY_META_FILTER
Value: +smoke
Story file structure:

Scenario: test_xyz
Meta: @smoke

Given TEST1
When TEST2
Then TEST3
Does anyone know the correct way to implement this in Eclipse? What should the argument for Eclipse be?
Set the environment variable STORY_META_FILTER with the value +smoke,
and run the story file with *.story.
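Alternatively, the filter can be applied programmatically in the runner instead of through the environment variable. This is a sketch using JBehave's Embedder API; the story path is illustrative:

```java
import java.util.Arrays;

import org.jbehave.core.embedder.Embedder;

public class SmokeRunner {
    public static void main(String[] args) {
        Embedder embedder = new Embedder();
        // Only scenarios tagged with the @smoke meta property will run
        embedder.useMetaFilters(Arrays.asList("+smoke"));
        embedder.runStoriesAsPaths(Arrays.asList("stories/test_xyz.story"));
    }
}
```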