Argo Workflow not passing input parameters to WorkflowTemplate - kubernetes

I have broken my workflow scenario down into two separate WorkflowTemplates: outer-template just defines the steps, and inner-template holds the Job definition that spins up the desired container, with all the other fancy stuff. Now when I submit a request (request.yaml) that passes the message parameter down to the outer and inner templates, it fails with this error:
hello-59jg8-394098346:
  Boundary ID:    hello-59jg8-1953291600
  Children:
    hello-59jg8-534805352
  Display Name:   [0]
  Finished At:    2021-06-15T00:41:45Z
  Id:             hello-59jg8-394098346
  Message:        child 'hello-59jg8[0].init-step[0].step-1' errored
  Name:           hello-59jg8[0].init-step[0]
  Phase:          Error
  Started At:     2021-06-15T00:41:45Z
  Template Name:  HelloWorld
  Template Scope: namespaced/outer-template
  Type:           StepGroup
hello-59jg8-534805352:
  Boundary ID:    hello-59jg8-1953291600
  Display Name:   step-1
  Finished At:    2021-06-15T00:41:45Z
  Id:             hello-59jg8-534805352
  Message:        inputs.parameters.message was not supplied
  Name:           hello-59jg8[0].init-step[0].step-1
  Phase:          Error
  Started At:     2021-06-15T00:41:45Z
  Template Ref:
    Name:     inner-template
    Template: InnerJob
  Template Scope: namespaced/outer-template
  Type:           Skipped
Phase:      Failed
Started At: 2021-06-15T00:41:45Z
Stored Templates:
The first two manifests below are the WorkflowTemplates; the third is the request.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: inner-template
  namespace: cali
  labels:
    workflows.argoproj.io/controller-instanceid: cali
spec:
  templates:
    - name: InnerJob
      metadata:
        annotations:
          sidecar.istio.io/inject: "false"
      inputs:
        parameters:
          - name: message
          - name: stepName
            value: ""
      resource:
        action: create
        successCondition: status.succeeded > 0
        failureCondition: status.failed > 0
        manifest: |
          apiVersion: batch/v1
          kind: Job
          metadata:
            generateName: hello-pod-
            annotations:
              sidecar.istio.io/inject: "false"
          spec:
            template:
              metadata:
                annotations:
                  sidecar.istio.io/inject: "false"
              spec:
                containers:
                - name: hellopods
                  image: centos:7
                  command: [sh, -c]
                  args: ["echo ${message}; sleep 5; echo done; exit 0"]
                  env:
                  - name: message
                    value: "{{inputs.parameters.message}}"
                  - name: stepName
                    value: "{{inputs.parameters.stepName}}"
                restartPolicy: Never
      outputs:
        parameters:
          - name: job-name
            valueFrom:
              jsonPath: '{.metadata.name}'
          - name: job-obj
            valueFrom:
              jqFilter: '.'
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: outer-template
  namespace: cali
  labels:
    workflows.argoproj.io/controller-instanceid: cali
spec:
  entrypoint: HelloWorld
  templates:
    - name: HelloWorld
      inputs:
        parameters:
          - name: message
      steps:
        - - name: step-1
            templateRef:
              name: inner-template
              template: InnerJob
            arguments:
              parameters:
                - name: message
                - name: stepName
                  value: "this is step 1"
        - - name: step-2
            templateRef:
              name: inner-template
              template: InnerJob
            arguments:
              parameters:
                - name: message
                - name: stepName
                  value: "this is step 2"
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-
  namespace: cali
  labels:
    workflows.argoproj.io/controller-instanceid: cali
spec:
  entrypoint: HelloWorld
  serviceAccountName: argo
  templates:
    - name: HelloWorld
      steps:
        - - name: init-step
            arguments:
              parameters:
                - name: message
                  value: "Hello World....."
            templateRef:
              name: outer-template
              template: HelloWorld

When passing an argument to a template in a step, you have to set the argument's value explicitly.
In the outer-template WorkflowTemplate, you invoke inner-template twice, and in each case the message argument is only half-specified: it is listed by name, but no value is given.
Set value: "{{inputs.parameters.message}}" in both step-1 and step-2. That pulls the message input parameter from outer-template's HelloWorld template.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: outer-template
  namespace: cali
  labels:
    workflows.argoproj.io/controller-instanceid: cali
spec:
  entrypoint: HelloWorld
  templates:
    - name: HelloWorld
      inputs:
        parameters:
          - name: message
      steps:
        - - name: step-1
            templateRef:
              name: inner-template
              template: InnerJob
            arguments:
              parameters:
                - name: message
                  value: "{{inputs.parameters.message}}"
                - name: stepName
                  value: "this is step 1"
        - - name: step-2
            templateRef:
              name: inner-template
              template: InnerJob
            arguments:
              parameters:
                - name: message
                  value: "{{inputs.parameters.message}}"
                - name: stepName
                  value: "this is step 2"
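The failure mode maps neatly onto ordinary function calls (hypothetical Python, purely an analogy — none of these names exist in Argo): listing a parameter without a value is like naming an argument you never bind, and the fix is an explicit pass-through.

```python
def inner_job(message, step_name=""):
    # Like inner-template's inputs: 'message' has no default, so it must be supplied.
    return f"echo {message} ({step_name})"

def outer(message):
    # The broken step-1 listed `- name: message` with no value, which is like
    # calling inner_job(step_name=...) without 'message': a missing required argument.
    # The fix, value: "{{inputs.parameters.message}}", is this explicit pass-through:
    return inner_job(message=message, step_name="this is step 1")

print(outer("Hello World....."))
```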

Related

Tekton YAML TriggerTemplate - string substitution

I have this kind of yaml file to define a trigger
apiVersion: triggers.tekton.dev/v1alpha1
kind: TriggerTemplate
metadata:
  name: app-template-pr-deploy
spec:
  params:
    - name: target-branch
    - name: commit
    - name: actor
    - name: pull-request-number
    - name: namespace
  resourcetemplates:
    - apiVersion: tekton.dev/v1alpha1
      kind: PipelineRun
      metadata:
        generateName: app-pr-$(tt.params.actor)-
        labels:
          actor: $(tt.params.actor)
      spec:
        serviceAccountName: myaccount
        pipelineRef:
          name: app-pr-deploy
        podTemplate:
          nodeSelector:
            location: somelocation
        params:
          - name: branch
            value: $(tt.params.target-branch)
          - name: namespace
            value: $(tt.params.target-branch)
          - name: commit
            value: $(tt.params.commit)
          - name: pull-request-number
            value: $(tt.params.pull-request-number)
        resources:
          - name: app-cluster
            resourceRef:
              name: app-location-cluster
The issue is that sometimes target-branch is something like "integration/feature", and then the namespace is not valid.
I would like to check whether the value contains an invalid character and replace it if it does.
Is there any way to do this? I didn't find any reasonable way to do it besides creating a task that runs a shell script later in the pipeline.
This is something you could do from your EventListener, using something such as:
apiVersion: triggers.tekton.dev/v1alpha1
kind: EventListener
metadata:
  name: xx
spec:
  triggers:
    - name: demo
      interceptors:
        - name: addvar
          ref:
            name: cel
          params:
            - name: overlays
              value:
                - key: branch_name
                  expression: "body.ref.split('/')[2]"
      bindings:
        - ref: your-triggerbinding
      template:
        ref: your-triggertemplate
Then, from your TriggerTemplate, you would add the "branch_name" param, parsed from your EventListener.
Note: the payload of a git notification may vary; the sample above is valid for GitHub. It translates remote/origin/master into master, or abc/def/ghi/jkl into ghi.
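The CEL overlay above simply indexes the third path segment of the ref; a rough Python equivalent of that expression (illustrative only, not part of Tekton):

```python
def branch_name(ref: str) -> str:
    # Mimics the CEL expression body.ref.split('/')[2]
    # for GitHub-style refs such as "refs/heads/master".
    return ref.split('/')[2]

print(branch_name("refs/heads/master"))  # master
print(branch_name("abc/def/ghi/jkl"))    # ghi
```

Note it assumes the ref has at least three segments; a two-segment ref would raise an IndexError, just as the CEL expression would fail.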
I've created a separate task that does all the magic I needed and outputs a valid namespace name into a different variable.
Then, instead of using the namespace variable, I use valid-namespace all the way through the pipeline.
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: validate-namespace-task-v1
spec:
  description: >-
    This task will validate namespaces
  params:
    - name: namespace
      type: string
      default: undefined
  results:
    - name: valid-namespace
      description: this should be a valid namespace
  steps:
    - name: triage-validate-namespace
      image: some-image:0.0.1
      script: |
        #!/bin/bash
        echo -n "$(params.namespace)" | sed "s/[^[:alnum:]-]/-/g" | tr '[:upper:]' '[:lower:]' | tee $(results.valid-namespace.path)
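The sed + tr pipeline in that step can be sanity-checked with an equivalent Python one-liner (illustrative only; the Task itself uses the shell version):

```python
import re

def valid_namespace(ns: str) -> str:
    # Same effect as: sed "s/[^[:alnum:]-]/-/g" | tr '[:upper:]' '[:lower:]'
    # Every character that is not alphanumeric or '-' becomes '-', then lowercase.
    return re.sub(r"[^A-Za-z0-9-]", "-", ns).lower()

print(valid_namespace("integration/feature"))  # integration-feature
print(valid_namespace("Feature/ABC_123"))      # feature-abc-123
```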

Use environment variable as default for another env variable in Kubernetes

Is there a way to use an environment variable as the default for another? For example:
apiVersion: v1
kind: Pod
metadata:
  name: Work
spec:
  containers:
    - name: envar-demo-container
      image: gcr.io/google-samples/node-hello:1.0
      env:
        - name: ALWAYS_SET
          value: "default"
        - name: SOMETIMES_SET
          value: "custom"
        - name: RESULT
          value: "$(SOMETIMES_SET) ? $(SOMETIMES_SET) : $(ALWAYS_SET)"
I don't think there is a way to do that, but you can try something like this:
apiVersion: v1
kind: Pod
metadata:
  name: Work
spec:
  containers:
    - name: envar-demo-container
      image: gcr.io/google-samples/node-hello:1.0
      command:
        - sh
        - -c
      args:
        - RESULT=${SOMETIMES_SET:-${ALWAYS_SET}}; command_to_run_app
      env:
        - name: ALWAYS_SET
          value: "default"
        - name: SOMETIMES_SET
          value: "custom"
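The shell expansion `${SOMETIMES_SET:-${ALWAYS_SET}}` falls back to the second variable when the first is unset or empty. A quick Python model of that rule (illustrative; the names are just the ones from the Pod above):

```python
def default_expand(name: str, fallback: str, env: dict) -> str:
    # Models shell ${name:-fallback}: use 'name' only if it is set AND non-empty.
    value = env.get(name, "")
    return value if value else env.get(fallback, "")

print(default_expand("SOMETIMES_SET", "ALWAYS_SET",
                     {"ALWAYS_SET": "default"}))                             # default
print(default_expand("SOMETIMES_SET", "ALWAYS_SET",
                     {"ALWAYS_SET": "default", "SOMETIMES_SET": "custom"}))  # custom
```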

How can you trigger an existing workflow/workflow-template outside argo-events template or namespace?

Based on the documentation, we can trigger the creation of a workflow. Is there a way to trigger an existing workflow (deployed in the argo namespace) from a sensor in the argo-events namespace?
Something like:
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: webhook
spec:
  template:
    serviceAccountName: operate-workflow-sa
  dependencies:
    - name: test-dep
      eventSourceName: webhook
      eventName: example
  triggers:
    - template:
        name: webhook-workflow-trigger
        argoWorkflow:
          source:
            resource: existing-workflow-in-another-namespace
Existing Workflow:
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: sb1-
  labels:
    workflows.argoproj.io/archive-strategy: "false"
spec:
  entrypoint: full
  serviceAccountName: argo
  volumes:
    - name: kaniko-secret
      secret:
        secretName: regcred
        items:
          - key: .dockerconfigjson
            path: config.json
    - name: github-access
      secret:
        secretName: github-access
        items:
          - key: token
            path: token
  templates:
    - name: full
      dag:
        tasks:
          - name: build
            templateRef:
              name: container-image
              template: build-kaniko-git
              clusterScope: true
            arguments:
              parameters:
                - name: repo_url
                  value: git://github.com/letthefireflieslive/test-app-sb1
                - name: repo_ref
                  value: refs/heads/main
                - name: container_image
                  value: legnoban/test-app-sb1
                - name: container_tag
                  value: 1.0.2
          - name: promote-dev
            templateRef:
              name: promote
              template: promote
              clusterScope: true
            arguments:
              parameters:
                - name: repo_owner
                  value: letthefireflieslive
                - name: repo_name
                  value: vcs
                - name: repo_branch
                  value: master
                - name: deployment_path
                  value: overlays/eg/dev/sb1/deployment.yml
                - name: image_owner
                  value: legnoban
                - name: image_name
                  value: test-app-sb1
                - name: tag
                  value: 1.0.2
            dependencies:
              - build
In Argo, a Workflow is the representation of a job that is running or has completed, so triggering an existing Workflow is probably not what you want to do.
What you can do instead is create a WorkflowTemplate that defines the job, then refer to that template in your trigger. That way, each event creates a new Workflow based on the template.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: sb1-workflowtemplate
spec:
  entrypoint: full
  templates:
    - name: full
      dag:
        tasks:
          - name: build
            templateRef:
              name: container-image
              template: build-kaniko-git
              clusterScope: true
            arguments:
              parameters:
                - name: repo_url
                  value: git://github.com/letthefireflieslive/test-app-sb1
                - name: repo_ref
                  value: refs/heads/main
                - name: container_image
                  value: legnoban/test-app-sb1
                - name: container_tag
                  value: 1.0.2
          - name: promote-dev
            templateRef:
              name: promote
              template: promote
              clusterScope: true
            arguments:
              parameters:
                - name: repo_owner
                  value: letthefireflieslive
                - name: repo_name
                  value: vcs
                - name: repo_branch
                  value: master
                - name: deployment_path
                  value: overlays/eg/dev/sb1/deployment.yml
                - name: image_owner
                  value: legnoban
                - name: image_name
                  value: test-app-sb1
                - name: tag
                  value: 1.0.2
            dependencies:
              - build
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: webhook
spec:
  template:
    serviceAccountName: operate-workflow-sa
  dependencies:
    - name: test-dep
      eventSourceName: webhook
      eventName: example
  triggers:
    - template:
        name: webhook-workflow-trigger
        argoWorkflow:
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: sb1-
              spec:
                workflowTemplateRef:
                  name: sb1-workflowtemplate
You should be able to do this, but the Sensor needs a ServiceAccount that can manage workflows. That means a ClusterRole and ClusterRoleBinding assigned to this account:
apiVersion: v1
kind: ServiceAccount
metadata:
  name: argo-events-core
  namespace: argo-events
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: argo-events-core
rules:
  - apiGroups:
      - argoproj.io
    resources:
      - workflows
      - workflowtemplates
      - cronworkflows
      - clusterworkflowtemplates
    verbs:
      - "*"
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: argo-events-core
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: argo-events-core
subjects:
  - kind: ServiceAccount
    name: argo-events-core
    namespace: argo-events

Unable to pass output parameters from one workflowTemplate to a workflow via another workflowTemplate

I have two WorkflowTemplates, generate-output and lib-read-outputs, and one Workflow, output-paramter, as follows:
generate-output.yaml
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: generate-output
spec:
  entrypoint: main
  templates:
    - name: main
      dag:
        tasks:
          # Generate JSON for outputs
          - name: read-outputs
            arguments:
              parameters:
                - name: outdata
                  value: |
                    {
                      "version": 4,
                      "terraform_version": "0.14.11",
                      "serial": 0,
                      "lineage": "732322df-5bd43-6e92-8f46-56c0dddwe83cb4",
                      "outputs": {
                        "key_alias_arn": {
                          "value": "arn:aws:kms:us-west-2:123456789:alias/tetsing-key",
                          "type": "string",
                          "sensitive": true
                        },
                        "key_arn": {
                          "value": "arn:aws:kms:us-west-2:123456789:alias/tetsing-key",
                          "type": "string",
                          "sensitive": true
                        }
                      }
                    }
            template: retrieve-outputs
    # Create JSON
    - name: retrieve-outputs
      inputs:
        parameters:
          - name: outdata
      script:
        image: python
        command: [python]
        env:
          - name: OUTDATA
            value: "{{inputs.parameters.outdata}}"
        source: |
          import json
          import os
          OUTDATA = json.loads(os.environ["OUTDATA"])
          with open('/tmp/templates_lst.json', 'w') as outfile:
              outfile.write(str(json.dumps(OUTDATA['outputs'])))
        volumeMounts:
          - name: out
            mountPath: /tmp
      volumes:
        - name: out
          emptyDir: { }
      outputs:
        parameters:
          - name: message
            valueFrom:
              path: /tmp/templates_lst.json
lib-read-outputs.yaml
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: lib-read-outputs
spec:
  entrypoint: main
  templates:
    - name: main
      dag:
        tasks:
          # Read outputs
          - name: lib-wft
            templateRef:
              name: generate-output
              template: main
output-paramter.yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: output-paramter-
spec:
  entrypoint: main
  templates:
    - name: main
      dag:
        tasks:
          # JSON output data, task 1
          - name: wf
            templateRef:
              name: lib-read-outputs
              template: main
          - name: lib-wf2
            dependencies: [wf]
            arguments:
              parameters:
                - name: outputResult
                  value: "{{tasks.wf.outputs.parameters.message}}"
            template: whalesay
    - name: whalesay
      inputs:
        parameters:
          - name: outputResult
      container:
        image: docker/whalesay:latest
        command: [cowsay]
        args: ["{{inputs.parameters.outputResult}}"]
I am trying to pass the output parameters generated in the generate-output WorkflowTemplate to the output-paramter Workflow via lib-read-outputs.
When I execute them, I get the following error: Failed: invalid spec: templates.main.tasks.lib-wf2 failed to resolve {{tasks.wf.outputs.parameters.message}}
DAG and steps templates don't produce outputs by default
DAG and steps templates do not automatically produce their child templates' outputs, even if there is only one child template.
For example, the no-parameters template here does not produce an output, even though it invokes a template which does have an output.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
spec:
  templates:
    - name: no-parameters
      dag:
        tasks:
          - name: get-a-parameter
            template: get-a-parameter
This lack of outputs makes sense if you consider a DAG template with multiple tasks:
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
spec:
  templates:
    - name: no-parameters
      dag:
        tasks:
          - name: get-a-parameter
            template: get-a-parameter
          - name: get-another-parameter
            depends: get-a-parameter
            template: get-another-parameter
Which task's outputs should no-parameters produce? Since it's unclear, DAG and steps templates simply do not produce outputs by default.
You can think of templates as being like functions. You wouldn't expect a function to implicitly return the output of a function it calls.
def get_a_string():
    return "Hello, world!"

def call_get_a_string():
    get_a_string()

print(call_get_a_string())  # Prints "None": the inner return value is discarded.
But a DAG or steps template can forward outputs
You can make a DAG or a steps template forward an output by setting its outputs field.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: get-parameters-wftmpl
spec:
  templates:
    - name: get-parameters
      dag:
        tasks:
          - name: get-a-parameter
            template: get-a-parameter
          - name: get-another-parameter
            depends: get-a-parameter
            template: get-another-parameter
      # This is the critical part!
      outputs:
        parameters:
          - name: parameter-1
            valueFrom:
              expression: "tasks['get-a-parameter'].outputs.parameters['parameter-name']"
          - name: parameter-2
            valueFrom:
              expression: "tasks['get-another-parameter'].outputs.parameters['parameter-name']"
---
apiVersion: argoproj.io/v1alpha1
kind: Workflow
spec:
  templates:
    - name: print-parameter
      dag:
        tasks:
          - name: get-parameters
            templateRef:
              name: get-parameters-wftmpl
              template: get-parameters
          - name: print-parameter
            depends: get-parameters
            template: print-parameter
            arguments:
              parameters:
                - name: parameter
                  value: "{{tasks.get-parameters.outputs.parameters.parameter-1}}"
To continue the Python analogy:
def get_a_string():
    return "Hello, world!"

def call_get_a_string():
    return get_a_string()  # Add 'return'.

print(call_get_a_string())  # This prints "Hello, world!".
So, in your specific case...
Add an outputs section to the main template in the generate-output WorkflowTemplate to forward the output parameter from the retrieve-outputs template.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: generate-output
spec:
  entrypoint: main
  templates:
    - name: main
      outputs:
        parameters:
          - name: message
            valueFrom:
              expression: "tasks['read-outputs'].outputs.parameters.message"
      dag:
        tasks:
          # ... the rest of the file ...
Add an outputs section to the main template in the lib-read-outputs WorkflowTemplate to forward generate-output's parameter.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: lib-read-outputs
spec:
  entrypoint: main
  templates:
    - name: main
      outputs:
        parameters:
          - name: message
            valueFrom:
              expression: "tasks['lib-wft'].outputs.parameters.message"
      dag:
        tasks:
          # ... the rest of the file ...

How to trigger an existing Argo cronworkflow?

I have tried many versions of this template below
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: tibco-events-sensor
spec:
  template:
    metadata:
      annotations:
        sidecar.istio.io/inject: 'false'
    serviceAccountName: operate-workflow-sa
  dependencies:
    - name: tibco-dep
      eventSourceName: tibco-events-source
      eventName: whatever
  triggers:
    - template:
        name: has-wf-event-trigger
        argoWorkflow:
          group: argoproj.io
          version: v1alpha1
          resource: Workflow
          operation: resubmit
          metadata:
            generateName: has-wf-argo-events-
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                name: has-wf-full-refresh
I keep getting errors saying the workflow is not found:
rpc error: code = NotFound desc = workflows.argoproj.io "has-wf-full-refresh" not found
I have hundreds of workflows launched as CronWorkflows, and I would like to switch them to be event-driven rather than cron-based. I'd prefer not to change the already existing flows; I just want to submit or resubmit them.
I figured out that the argoWorkflow trigger template doesn't support CronWorkflows, so I ended up using the http trigger template.
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: tibco-events-sensor
spec:
  template:
    metadata:
      annotations:
        sidecar.istio.io/inject: 'false'
    serviceAccountName: operate-workflow-sa
  dependencies:
    - name: tibco-dep
      eventSourceName: tibco-events-source
      eventName: whatever
  triggers:
    - template:
        name: http-trigger
        http:
          url: http://argo-workflows.argo-workflows:2746/api/v1/workflows/lab-uat/submit
          secureHeaders:
            - name: Authorization
              valueFrom:
                secretKeyRef:
                  name: argo-workflows-sa-token
                  key: bearer-token
          payload:
            - src:
                dependencyName: tibco-dep
                value: CronWorkflow
              dest: resourceKind
            - src:
                dependencyName: tibco-dep
                value: coinflip
              dest: resourceName
            - src:
                dependencyName: tibco-dep
                value: coinflip-event-
              dest: submitOptions.generateName
          method: POST
      retryStrategy:
        steps: 3
        duration: 3s
      policy:
        status:
          allow:
            - 200
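The three payload entries above assemble into the JSON body sent to the workflow-submit endpoint (the URL in the Sensor). A sketch of the equivalent request body, with field names and values taken directly from the Sensor above:

```python
import json

# Body equivalent to what the Sensor's HTTP trigger posts to
# /api/v1/workflows/lab-uat/submit: submit a one-off run of the
# CronWorkflow named "coinflip".
payload = {
    "resourceKind": "CronWorkflow",
    "resourceName": "coinflip",
    "submitOptions": {"generateName": "coinflip-event-"},
}
body = json.dumps(payload)
print(body)
```

For ad-hoc testing, recent versions of the Argo CLI can do a similar one-off submission with `argo submit --from cronwf/coinflip` (check your CLI version's support).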