I am trying to pass parameters from an outer step template to an inner step template in Argo Workflows. Below is my workflow definition.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: fanout-
spec:
  templates:
    - name: process-workflows
      steps:
        - - name: fanout
            template: fan-out
        - - name: fanout-step
            template: parallel-process
            arguments:
              parameters:
                - name: batch
                  value: '{{item}}'
            withParam: '{{steps.fanout.outputs.result}}'
    - name: fan-out
      script:
        name: main
        image: 'node:lts-alpine3.14'
        command:
          - node
        resources: {}
        source: |
          inputlist = JSON.parse({{=toJson(workflow.parameters.inputlist)}})
          fanout = {{workflow.parameters.fanout}}
          var i,j, result=[];
          for (i = 0,j = inputlist.length; i < j; i += fanout) {
            result.push(inputlist.slice(i, i + fanout));
          }
          console.log(JSON.stringify(result))
    - name: parallel-process
      inputs:
        parameters:
          - name: batch
      steps:
        - - name: actualprocessor
            template: process
            arguments:
              parameters:
                - name: input
                  value: {{inputs.parameters.batch}}
        - - name: aggregate-result
            template: aggregate
            arguments:
              parameters:
                - name: aggregate
                  value: {{steps.actualprocessor.outputs.parameters.res}}
    - name: process
      inputs:
        parameters:
          - name: input
      outputs:
        parameters:
          - name: res
            valueFrom:
              path: /tmp/res.txt
      script:
        name: main
        image: 'alpine:latest'
        command:
          - sh
        source: |
          sleep 5
          echo 'awakened...'
          echo processing-{{=toJson(inputs.parameters.input)}}
          echo {{=toJson(inputs.parameters.input)}} > /tmp/res.txt
    - name: aggregate
      inputs:
        parameters:
          - name: aggregate
      container:
        name: main
        image: 'alpine:latest'
        command:
          - sh
          - '-c'
        args:
          - 'echo received {{inputs.parameters.aggregate}}'
  entrypoint: process-workflows
  arguments:
    parameters:
      - name: inputlist
        value: |
          [
            {"k" : "v1", "a" : [{ "k": true}]},
            {"k" : "v2", "a" : [{ "k": true}]}
          ]
      - name: fanout
        value: '1'
Use case:
The fanout-step step (outer step) uses the parallel-process template (inner step) and provides it a batch argument. The parallel-process template then needs to pass the value of batch on to the input parameters of its target step.
Issue: The input parameter inside the actualprocessor step is empty, even though I can see that the batch input parameter is populated correctly.
What am I missing here?
The issue is resolved by enclosing the parameters in quotes. Thanks to Tom Slabbaer for pointing it out.
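The quoting matters because `{{...}}` is significant to the YAML parser itself: an unquoted value starting with `{` is read as a YAML flow mapping, not as a string, so the template placeholder never reaches Argo intact. A quick check with PyYAML (assuming it is installed) illustrates this:

```python
import yaml

# Unquoted: '{' opens a YAML flow mapping, so '{{item}}' parses as a mapping
# nested inside a mapping key, which PyYAML rejects as unhashable.
try:
    yaml.safe_load("value: {{item}}")
    unquoted_parses = True
except yaml.YAMLError:
    unquoted_parses = False

# Quoted: the placeholder survives as a plain string for Argo to substitute.
quoted = yaml.safe_load("value: '{{item}}'")

print(unquoted_parses)  # False
print(quoted)           # {'value': '{{item}}'}
```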
Below is the working template.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: fanout-
spec:
  templates:
    - name: process-workflows
      steps:
        - - name: fanout
            template: fan-out
        - - name: fanout-step
            template: parallel-process
            arguments:
              parameters:
                - name: batch
                  value: '{{item}}'
            withParam: '{{steps.fanout.outputs.result}}'
    - name: fan-out
      script:
        name: main
        image: 'node:lts-alpine3.14'
        command:
          - node
        resources: {}
        source: |
          inputlist = JSON.parse({{=toJson(workflow.parameters.inputlist)}})
          fanout = {{workflow.parameters.fanout}}
          var i,j, result=[];
          for (i = 0,j = inputlist.length; i < j; i += fanout) {
            result.push(inputlist.slice(i, i + fanout));
          }
          console.log(JSON.stringify(result))
    - name: parallel-process
      inputs:
        parameters:
          - name: batch
      steps:
        - - name: actualprocessor
            template: process
            arguments:
              parameters:
                - name: input
                  value: "{{inputs.parameters.batch}}"
        - - name: aggregate-result
            template: aggregate
            arguments:
              parameters:
                - name: aggregate
                  value: "{{steps.actualprocessor.outputs.parameters.res}}"
    - name: process
      inputs:
        parameters:
          - name: input
      outputs:
        parameters:
          - name: res
            valueFrom:
              path: /tmp/res.txt
      script:
        name: main
        image: 'alpine:latest'
        command:
          - sh
        source: |
          sleep 5
          echo 'awakened...'
          echo processing-{{=toJson(inputs.parameters.input)}}
          echo {{=toJson(inputs.parameters.input)}} > /tmp/res.txt
    - name: aggregate
      inputs:
        parameters:
          - name: aggregate
      container:
        name: main
        image: 'alpine:latest'
        command:
          - sh
          - '-c'
        args:
          - 'echo received {{inputs.parameters.aggregate}}'
  entrypoint: process-workflows
  arguments:
    parameters:
      - name: inputlist
        value: |
          [
            {"k" : "v1", "a" : [{ "k": true}]},
            {"k" : "v2", "a" : [{ "k": true}]}
          ]
      - name: fanout
        value: '1'
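For reference, the batching that the fan-out script performs is plain list slicing; the same logic in standalone Python looks like this (the function name is illustrative, not part of the workflow):

```python
import json

def fan_out(inputlist, fanout):
    """Split inputlist into consecutive batches of size fanout."""
    return [inputlist[i:i + fanout] for i in range(0, len(inputlist), fanout)]

# With fanout=1, each batch holds a single item, so withParam spawns
# one parallel-process invocation per input element.
batches = fan_out([{"k": "v1"}, {"k": "v2"}], 1)
print(json.dumps(batches))  # [[{"k": "v1"}], [{"k": "v2"}]]
```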
Related
kind: Workflow
metadata:
  generateName: small-
spec:
  entrypoint: fan-out-in-params-workflow
  arguments:
    parameters:
      - name: jira-ticket
        value: INFRA-000
  templates:
    - name: fan-out-in-params-workflow
      steps:
        - - name: generate
            template: gen-host-list
        - - name: pre-conditions
            template: pre-conditions
            arguments:
              parameters:
                - name: host
                  value: "{{item}}"
            withParam: "{{steps.generate.outputs.result}}"
    - name: gen-host-list
      inputs:
        artifacts:
          - name: host
            path: /tmp/host.txt
            s3:
              key: host.txt
      script:
        image: python:alpine3.6
        command: [python]
        source: |
          import json
          import sys
          filename="{{ inputs.artifacts.host.path }}"
          with open(filename, 'r', encoding='UTF-8') as f:
              json.dump([line.rstrip() for line in f], sys.stdout)
    - name: pre-conditions
      inputs:
        parameters:
          - name: host
      steps:
        - - name: online-check
            template: online-check
            arguments:
              parameters:
                - name: host
                  value: {{inputs.parameters.host}}
    - name: online-check
      inputs:
        parameters:
          - name: host
      script:
        image: python:alpine3.6
        command:
          - python
        source: |
          print({{inputs.parameters.host}})
Hi there, I'm quite new to Argo Workflows. I'm trying to use the host input parameter of the pre-conditions template as posted above. The host parameter seems to reach the pre-conditions template successfully, but I can't access it in the online-check step. Can anyone give me some advice? Anything will be appreciated!
apiVersion: argoproj.io/v1alpha1
kind: Workflow
.
.
  - name: mytemplate
    steps:
      - - name: mytask
          templateRef:
            name: ABCDworkflowtemplate
            template: taskA
          arguments:
            parameters:
              - name: mylist
                value: [10,"some",false]
....................
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: ABCDworkflowtemplate
spec:
  templates:
    - name: taskA
      inputs:
        parameters:
          - name: mylist
.
My question is: how do I use each element of this list via {{input.parameters.?}}? Please point me to some reference. Thank you.
You didn't really specify what exactly you want to do with these values, so I'll explain both ways to use this input.
The more common usage with arrays is to iterate over them using withParam; with this syntax a new task (pod) is created for each item.
templates:
  - name: taskA
    inputs:
      parameters:
        - name: mylist
    steps:
      - - name: doSomeWithItem
          template: doSomeWithItem
          arguments:
            parameters:
              - name: item
                value: "{{item}}"
          withParam: "{{inputs.parameters.mylist}}"
  - name: doSomeWithItem
    inputs:
      parameters:
        - name: item
    script:  # 'source' requires a script template, not a container
      image: python:alpine3.6
      command: [ python ]
      source: |
        print("{{inputs.parameters.item}}")
The other option is to pass the entire array as a single variable to the pod and apply custom logic based on your needs:
templates:
  - name: taskA
    inputs:
      parameters:
        - name: mylist
    steps:
      - - name: doSomethingWithList
          template: doSomethingWithList
          arguments:
            parameters:
              - name: list
                value: "{{inputs.parameters.mylist}}"
  - name: doSomethingWithList
    inputs:
      parameters:
        - name: list
    script:  # 'source' requires a script template, not a container
      image: python:alpine3.6
      command: [ python ]
      source: |
        import json
        mylist = json.loads('{{inputs.parameters.list}}')
        if mylist[1] == 'some':
            print('second element is "some"')
        elif mylist[0] == 10:
            print('first element is 10')
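Either way, note that Argo substitutes the parameter as literal JSON text, so the script must parse it before indexing. A minimal standalone illustration of what the rendered script sees (the `rendered` string mimics the substituted parameter):

```python
import json

# What value: "{{inputs.parameters.list}}" renders to inside the pod.
rendered = '[10, "some", false]'

mylist = json.loads(rendered)  # -> [10, "some", False]
if mylist[1] == "some":
    print("second element is", mylist[1])
elif mylist[0] == 10:
    print("first element is 10")
```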
I am trying to run 2 Python files in one container and save the output as parameters:
- name: training
  serviceAccountName: argo-service
  outputs:
    parameters:
      - name: output-param-1
        valueFrom:
          path: ./tmp/output.txt
      - name: output-param-2
        valueFrom:
          path: ./tmp/output2.txt
  container:
    image: (image name)
    command: [python]
    args: ["/data/argo/model_build.py","/data/argo/model_build_2.py"]
Use that output as input in another container:
- name: evaluate
  serviceAccountName: argo-service
  inputs:
    parameters:
      - name: value-1
      - name: value-2
  container:
    image: (image name)
    command: ["python", "/data/argo/evaluate.py"]
    args:
      - '{{inputs.parameters.value-1}}'
      - '{{inputs.parameters.value-2}}'
and have defined the chain as:
- name: dag-chain
  dag:
    tasks:
      - name: src
        template: clone-repo
      - name: prep
        template: data-prep
        dependencies: [src]
      - name: train
        template: training
        dependencies: [prep]
      - name: eval
        template: evaluate
        dependencies: [train]
        args:
          parameters:
            - name: value-1
              value: '{{dag-chain.tasks.train.outputs.parameters.output-param-1}}'
            - name: value-2
              value: '{{dag-chain.tasks.train.outputs.parameters.output-param-2}}'
But with these steps I'm getting the error:
" Internal Server Error: templates.dag-chain.tasks.eval templates.evaluate inputs.parameters.value-1 was not supplied: "
Please help me identify the mistakes I'm making. I have tried the steps mentioned above, but it's not working.
I don't have Argo accessible just now to test, but here are a couple of things to try:
args in DAG tasks should be arguments (see the field reference for a DAG task here).
Try removing the dag-chain prefix from the parameter references (see example here).
- name: dag-chain
  dag:
    tasks:
      - name: src
        template: clone-repo
      - name: prep
        template: data-prep
        dependencies: [src]
      - name: train
        template: training
        dependencies: [prep]
      - name: eval
        template: evaluate
        dependencies: [train]
        arguments:
          parameters:
            - name: value-1
              value: '{{tasks.train.outputs.parameters.output-param-1}}'
            - name: value-2
              value: '{{tasks.train.outputs.parameters.output-param-2}}'
If that doesn't work I'll try some more steps with Argo.
I have two WorkflowTemplates, generate-output and lib-read-outputs, and one Workflow, output-paramter, as follows:
generate-output.yaml
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: generate-output
spec:
  entrypoint: main
  templates:
    - name: main
      dag:
        tasks:
          # Generate Json for Outputs
          - name: read-outputs
            arguments:
              parameters:
                - name: outdata
                  value: |
                    {
                      "version": 4,
                      "terraform_version": "0.14.11",
                      "serial": 0,
                      "lineage": "732322df-5bd43-6e92-8f46-56c0dddwe83cb4",
                      "outputs": {
                        "key_alias_arn": {
                          "value": "arn:aws:kms:us-west-2:123456789:alias/tetsing-key",
                          "type": "string",
                          "sensitive": true
                        },
                        "key_arn": {
                          "value": "arn:aws:kms:us-west-2:123456789:alias/tetsing-key",
                          "type": "string",
                          "sensitive": true
                        }
                      }
                    }
            template: retrieve-outputs
    # Create Json
    - name: retrieve-outputs
      inputs:
        parameters:
          - name: outdata
      script:
        image: python
        command: [python]
        env:
          - name: OUTDATA
            value: "{{inputs.parameters.outdata}}"
        source: |
          import json
          import os
          OUTDATA = json.loads(os.environ["OUTDATA"])
          with open('/tmp/templates_lst.json', 'w') as outfile:
              outfile.write(str(json.dumps(OUTDATA['outputs'])))
        volumeMounts:
          - name: out
            mountPath: /tmp
      volumes:
        - name: out
          emptyDir: { }
      outputs:
        parameters:
          - name: message
            valueFrom:
              path: /tmp/templates_lst.json
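At its core, the retrieve-outputs script just parses the JSON that Argo injects via the OUTDATA environment variable and writes back only its "outputs" key. A trimmed-down standalone sketch (the payload here is abbreviated, not the full Terraform state above):

```python
import json
import os

# Simulate Argo rendering {{inputs.parameters.outdata}} into the env var.
os.environ["OUTDATA"] = '{"version": 4, "outputs": {"key_arn": {"type": "string"}}}'

outdata = json.loads(os.environ["OUTDATA"])
# Keep only the "outputs" section, as the workflow script does.
print(json.dumps(outdata["outputs"]))  # {"key_arn": {"type": "string"}}
```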
lib-read-outputs.yaml
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: lib-read-outputs
spec:
  entrypoint: main
  templates:
    - name: main
      dag:
        tasks:
          # Read Outputs
          - name: lib-wft
            templateRef:
              name: generate-output
              template: main
output-paramter.yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: output-paramter-
spec:
  entrypoint: main
  templates:
    - name: main
      dag:
        tasks:
          # Json Output data task1
          - name: wf
            templateRef:
              name: lib-read-outputs
              template: main
          - name: lib-wf2
            dependencies: [wf]
            arguments:
              parameters:
                - name: outputResult
                  value: "{{tasks.wf.outputs.parameters.message}}"
            template: whalesay
    - name: whalesay
      inputs:
        parameters:
          - name: outputResult
      container:
        image: docker/whalesay:latest
        command: [cowsay]
        args: ["{{inputs.parameters.outputResult}}"]
I am trying to pass the output parameters generated in the WorkflowTemplate generate-output to the Workflow output-paramter via lib-read-outputs.
When I execute them, I get the following error: Failed: invalid spec: templates.main.tasks.lib-wf2 failed to resolve {{tasks.wf.outputs.parameters.message}}
DAG and steps templates don't produce outputs by default
DAG and steps templates do not automatically produce their child templates' outputs, even if there is only one child template.
For example, the no-parameters template here does not produce an output, even though it invokes a template which does have an output.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
spec:
  templates:
    - name: no-parameters
      dag:
        tasks:
          - name: get-a-parameter
            template: get-a-parameter
This lack of outputs makes sense if you consider a DAG template with multiple tasks:
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
spec:
  templates:
    - name: no-parameters
      dag:
        tasks:
          - name: get-a-parameter
            template: get-a-parameter
          - name: get-another-parameter
            depends: get-a-parameter
            template: get-another-parameter
Which task's outputs should no-parameters produce? Since it's unclear, DAG and steps templates simply do not produce outputs by default.
You can think of templates as being like functions. You wouldn't expect a function to implicitly return the output of a function it calls.
def get_a_string():
    return "Hello, world!"

def call_get_a_string():
    get_a_string()

print(call_get_a_string())  # This prints "None" - nothing is returned.
But a DAG or steps template can forward outputs
You can make a DAG or a steps template forward an output by setting its outputs field.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: get-parameters-wftmpl
spec:
  templates:
    - name: get-parameters
      dag:
        tasks:
          - name: get-a-parameter
            template: get-a-parameter
          - name: get-another-parameter
            depends: get-a-parameter
            template: get-another-parameter
      # This is the critical part!
      outputs:
        parameters:
          - name: parameter-1
            valueFrom:
              expression: "tasks['get-a-parameter'].outputs.parameters['parameter-name']"
          - name: parameter-2
            valueFrom:
              expression: "tasks['get-another-parameter'].outputs.parameters['parameter-name']"
---
apiVersion: argoproj.io/v1alpha1
kind: Workflow
spec:
  templates:
    - name: print-parameter
      dag:
        tasks:
          - name: get-parameters
            templateRef:
              name: get-parameters-wftmpl
              template: get-parameters
          - name: print-parameter
            depends: get-parameters
            template: print-parameter
            arguments:
              parameters:
                - name: parameter
                  value: "{{tasks.get-parameters.outputs.parameters.parameter-1}}"
To continue the Python analogy:
def get_a_string():
    return "Hello, world!"

def call_get_a_string():
    return get_a_string()  # Add 'return'.

print(call_get_a_string())  # This prints "Hello, world!".
So, in your specific case...
Add an outputs section to the main template in the generate-output WorkflowTemplate to forward the output parameter from the retrieve-outputs template.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: generate-output
spec:
  entrypoint: main
  templates:
    - name: main
      outputs:
        parameters:
          - name: message
            valueFrom:
              expression: "tasks['read-outputs'].outputs.parameters.message"
      dag:
        tasks:
          # ... the rest of the file ...
Add an outputs section to the main template in the lib-read-outputs WorkflowTemplate to forward generate-output's parameter.
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: lib-read-outputs
spec:
  entrypoint: main
  templates:
    - name: main
      outputs:
        parameters:
          - name: message
            valueFrom:
              expression: "tasks['lib-wft'].outputs.parameters.message"
      dag:
        tasks:
          # ... the rest of the file ...
I am trying to capture the response of an HTTP template request and pass it to the next task within a DAG workflow, but the argument to the next template is not being populated/expanded.
The problem isn't necessarily the capture of the HTTP response (although this is what I'm actually trying to test); it seems to be the 'injection' of the argument into my second step.
This is the workflow, with the two templates included:
metadata:
  name: blimin-annoying
  namespace: argo
  labels:
    example: 'true'
spec:
  entrypoint: passing
  templates:
    - name: passing
      dag:
        tasks:
          - name: step1
            template: execute
            arguments:
              parameters:
                - name: schema
                  value: kunveno
                - name: method
                  value: GET
          - name: step2
            depends: step1
            template: echo
            arguments:
              parameters:
                - name: message
                  value: '{{tasks.step1.outputs.parameters.response}}'
    #--------------------------
    - name: execute
      inputs:
        parameters:
          - name: schema
          - name: method
      outputs:
        parameters:
          - name: response
            value: '{{outputs.result}}'
      http:
        method: '{{inputs.parameters.method}}'
        url: 'http://host.minikube.internal:3007/query/$event_stores?countOnly=true'
        headers:
          - name: query-schema
            value: '{{inputs.parameters.schema}}'
          - name: content-type
            value: application/json
    #--------------------------
    - name: echo
      inputs:
        parameters:
          - name: message
      container:
        image: 'docker/whalesay:latest'
        command:
          - cowsay
        args:
          - '{{inputs.parameters.message}}'
And this is the output I get, regardless of how I quote or specify the 'message' argument of step2:
loegnhfdn1-1675440399: _________________________________________
loegnhfdn1-1675440399: / {{tasks.step1.outputs.parameters.resp \
loegnhfdn1-1675440399: \ onse}} /
loegnhfdn1-1675440399: -----------------------------------------
It's as if the variable is being treated as a plain string.
Ideally I'd like to pass the response as a file to the next step, but I am currently stuck at this basic stage.
Note: this is on v3.2.4 (2021-11-18).
After coding up my template creation through the Java SDK, I accidentally changed output.parameters.name to 'result', compared to 'response' as posed in the question. This 'typo' actually fixed the issue of passing the response to the next step.
It is unclear whether 'response' is a reserved word or 'result' is, but the following now works.
Hopefully of use to someone else!
metadata:
  name: now-working
  namespace: argo
  labels:
    example: 'true'
spec:
  entrypoint: passing
  templates:
    - name: passing
      dag:
        tasks:
          - name: step1
            template: execute
            arguments:
              parameters:
                - name: schema
                  value: public
                - name: method
                  value: GET
          - name: step2
            depends: step1
            template: echo
            arguments:
              parameters:
                - name: message
                  value: '{{tasks.step1.outputs.parameters.result}}'
    #--------------------------
    - name: execute
      inputs:
        parameters:
          - name: schema
          - name: method
      outputs:
        parameters:
          - name: response
            value: '{{outputs.result}}'
      http:
        method: '{{inputs.parameters.method}}'
        url: 'http://host.minikube.internal:3007/query/$event_stores?countOnly=true'
        headers:
          - name: query-schema
            value: '{{inputs.parameters.schema}}'
          - name: content-type
            value: application/json
    #--------------------------
    - name: echo
      inputs:
        parameters:
          - name: message
      container:
        image: 'docker/whalesay:latest'
        command:
          - cowsay
        args:
          - '{{inputs.parameters.message}}'