I am trying to capture the response of an HTTP template request and pass it to the next task within a DAG workflow, but the argument to the next template is not being populated/expanded.
The problem isn't necessarily the capture of the HTTP response (although that is what I'm actually trying to test); it seems to be the 'injection' of the argument in my second step.
This is the workflow, with the two templates included:
metadata:
name: blimin-annoying
namespace: argo
labels:
example: 'true'
spec:
entrypoint: passing
templates:
- name: passing
dag:
tasks:
- name: step1
template: execute
arguments:
parameters:
- name: schema
value: kunveno
- name: method
value: GET
- name: step2
depends: step1
template: echo
arguments:
parameters:
- name: message
value: '{{tasks.step1.outputs.parameters.response}}'
#--------------------------
- name: execute
inputs:
parameters:
- name: schema
- name: method
outputs:
parameters:
- name: response
value: '{{outputs.result}}'
http:
method: '{{inputs.parameters.method}}'
url: 'http://host.minikube.internal:3007/query/$event_stores?countOnly=true'
headers:
- name: query-schema
value: '{{inputs.parameters.schema}}'
- name: content-type
value: application/json
#--------------------------
- name: echo
inputs:
parameters:
- name: message
container:
image: 'docker/whalesay:latest'
command:
- cowsay
args:
- '{{inputs.parameters.message}}'
And this is the output I get, regardless of how I quote/specify the 'message' argument of step2:
loegnhfdn1-1675440399: _________________________________________
loegnhfdn1-1675440399: / {{tasks.step1.outputs.parameters.resp \
loegnhfdn1-1675440399: \ onse}} /
loegnhfdn1-1675440399: -----------------------------------------
It's as if the variable is being treated as a plain string.
Ideally I'd like to pass the response as a file to the next step, but I'm currently stuck at this basic stage.
Note: this is on v3.2.4 (2021-11-18).
After coding up my template creation through the Java SDK, I accidentally changed the output parameter's name to 'result', compared to 'response' as posed in the question. This 'typo' actually fixed the issue of passing the response to the next step.
It is unclear whether 'response' or 'result' is a reserved word, but the following now works.
Hopefully of use to someone else!
metadata:
name: now-working
namespace: argo
labels:
example: 'true'
spec:
entrypoint: passing
templates:
- name: passing
dag:
tasks:
- name: step1
template: execute
arguments:
parameters:
- name: schema
value: public
- name: method
value: GET
- name: step2
depends: step1
template: echo
arguments:
parameters:
- name: message
value: '{{tasks.step1.outputs.parameters.result}}'
#--------------------------
- name: execute
inputs:
parameters:
- name: schema
- name: method
outputs:
parameters:
- name: result
value: '{{outputs.result}}'
http:
method: '{{inputs.parameters.method}}'
url: 'http://host.minikube.internal:3007/query/$event_stores?countOnly=true'
headers:
- name: query-schema
value: '{{inputs.parameters.schema}}'
- name: content-type
value: application/json
#--------------------------
- name: echo
inputs:
parameters:
- name: message
container:
image: 'docker/whalesay:latest'
command:
- cowsay
args:
- '{{inputs.parameters.message}}'
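As an aside: Argo's HTTP templates expose the response body as the built-in outputs.result, so the explicit output parameter mapping may not be needed at all. A minimal sketch (untested here, assuming the built-in result behaves as documented for v3.2):

- name: step2
  depends: step1
  template: echo
  arguments:
    parameters:
      - name: message
        # built-in result of the HTTP template, i.e. the response body
        value: '{{tasks.step1.outputs.result}}'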
kind: Workflow
metadata:
generateName: small-
spec:
entrypoint: fan-out-in-params-workflow
arguments:
parameters:
- name: jira-ticket
value: INFRA-000
templates:
- name: fan-out-in-params-workflow
steps:
- - name: generate
template: gen-host-list
- - name: pre-conditions
template: pre-conditions
arguments:
parameters:
- name: host
value: "{{item}}"
withParam: "{{steps.generate.outputs.result}}"
- name: gen-host-list
inputs:
artifacts:
- name: host
path: /tmp/host.txt
s3:
key: host.txt
script:
image: python:alpine3.6
command: [python]
source: |
import json
import sys
filename="{{ inputs.artifacts.host.path }}"
with open(filename, 'r', encoding='UTF-8') as f:
json.dump([line.rstrip() for line in f], sys.stdout)
- name: pre-conditions
inputs:
parameters:
- name: host
steps:
- - name: online-check
template: online-check
arguments:
parameters:
- name: host
value: {{inputs.parameters.host}}
- name: online-check
inputs:
parameters:
- name: host
script:
image: python:alpine3.6
command:
- python
source: |
print({{inputs.parameters.host}})
Hi there, I'm quite new to Argo Workflows. I'm trying to pass the host input parameter through the pre-conditions template as posted above. It seems the host param passes to the pre-conditions template successfully, but I can't get it in the online-check step. Can anyone give me some advice? Anything would be appreciated!
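A likely fix, for anyone landing here (an untested sketch): quote templated values so the YAML parser doesn't read {{...}} as a flow mapping, and quote the substituted text inside the Python source so it remains a valid string literal:

- - name: online-check
    template: online-check
    arguments:
      parameters:
        - name: host
          # quoted: otherwise YAML parses {{...}} as a flow mapping
          value: "{{inputs.parameters.host}}"

and in the online-check script:

source: |
  # quoted: the substituted value must be a Python string literal
  print("{{inputs.parameters.host}}")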
apiVersion: argoproj.io/v1alpha1
kind: Workflow
.
.
- name: mytemplate
steps:
- - name: mytask
templateRef:
name: ABCDworkflowtemplate
template: taskA
arguments:
parameters:
- name: mylist
value: [10,"some",false]
....................
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
name: ABCDworkflowtemplate
spec:
templates:
- name: taskA
inputs:
parameters:
- name: mylist
.
My question is: how do I use each element of this list via {{inputs.parameters.?}}? Please point me to some reference. Thank you.
You didn't really specify what exactly you want to do with these values, so I'll explain both ways to use this input.
The more common usage with arrays is to iterate over them using withParam. With this syntax, a new task (pod) is created for each item.
templates:
- name: taskA
inputs:
parameters:
- name: mylist
steps:
- - name: doSomeWithItem
template: doSomeWithItem
arguments:
parameters:
- name: item
value: "{{item}}"
withParam: "{{inputs.parameters.mylist}}"
- name: doSomeWithItem
inputs:
parameters:
- name: item
script:
image: python:alpine3.6
command: [ python ]
source: |
print("{{inputs.parameters.item}}")
The other option is to pass the entire array as a single variable to the pod and apply custom logic based on your needs:
templates:
- name: taskA
inputs:
parameters:
- name: mylist
steps:
- - name: doSomethingWithList
template: doSomethingWithList
arguments:
parameters:
- name: list
value: "{{inputs.parameters.mylist}}"
- name: doSomethingWithList
inputs:
parameters:
- name: list
script:
  image: python:alpine3.6
  command: [ python ]
  source: |
    import json
    # parse the injected JSON array into a Python list
    lst = json.loads('{{inputs.parameters.list}}')
    if lst[1] == 'some':
        pass  # do something
    elif lst[0] == 10:
        pass  # do something else
I am trying to run 2 Python files in one container and save the output as parameters:
- name: training
serviceAccountName: argo-service
outputs:
parameters:
- name: output-param-1
valueFrom:
path: ./tmp/output.txt
- name: output-param-2
valueFrom:
path: ./tmp/output2.txt
container:
image: (image name)
command: [python]
args: ["/data/argo/model_build.py","/data/argo/model_build_2.py"]
Use that output as input in another container:
- name: evaluate
serviceAccountName: argo-service
inputs:
parameters:
- name: value-1
- name: value-2
container:
image: (image name)
command: ["python", "/data/argo/evaluate.py"]
args:
- '{{inputs.parameters.value-1}}'
- '{{inputs.parameters.value-2}}'
and have defined the chain as:
- name: dag-chain
dag:
tasks:
- name: src
template: clone-repo
- name: prep
template: data-prep
dependencies: [src]
- name: train
template: training
dependencies: [prep]
- name: eval
template: evaluate
dependencies: [train]
args:
parameters:
- name: value-1
value: '{{dag-chain.tasks.train.outputs.parameters.output-param-1}}'
- name: value-2
value: '{{dag-chain.tasks.train.outputs.parameters.output-param-2}}'
But with these steps I'm getting the error:
" Internal Server Error: templates.dag-chain.tasks.eval templates.evaluate inputs.parameters.value-1 was not supplied: "
Please help me identify the mistakes I'm making.
I have tried the steps mentioned above, but it's not working.
I don't have Argo accessible just now to test, but here are a couple of things to try:
args in DAG tasks should be arguments (see the field reference for a DAG task here).
Try removing dag-chain from the parameters (see the example here).
- name: dag-chain
dag:
tasks:
- name: src
template: clone-repo
- name: prep
template: data-prep
dependencies: [src]
- name: train
template: training
dependencies: [prep]
- name: eval
template: evaluate
dependencies: [train]
arguments:
parameters:
- name: value-1
value: '{{tasks.train.outputs.parameters.output-param-1}}'
- name: value-2
value: '{{tasks.train.outputs.parameters.output-param-2}}'
If that doesn't work I'll try some more steps with Argo.
I created a WorkflowTemplate in which I want to pass the result of a script template as an input parameter to another task.
Here is my WorkflowTemplate
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
name: dag-wft
spec:
entrypoint: whalesay
templates:
- name: whalesay
inputs:
parameters:
- name: message
default: '["tests/hello", "templates/hello", "manifests/hello"]'
dag:
tasks:
- name: prepare-lst
template: prepare-list-script
arguments:
parameters:
- name: message
value: "{{inputs.parameters.message}}"
- name: templates
depends: prepare-lst
templateRef:
name: final-dag-wft
template: whalesay-final
arguments:
parameters:
- name: fnl_message
value: "{{item}}"
withParam: "{{tasks.prepare-lst.outputs.parameters.templates_lst}}"
- name: manifests
depends: prepare-lst && (templates.Succeeded || templates.Skipped)
templateRef:
name: final-dag-wft
template: whalesay-final
arguments:
parameters:
- name: fnl_message
value: "{{item}}"
withParam: "{{tasks.prepare-lst.outputs.parameters.manifests_lst}}"
- name: prepare-list-script
inputs:
parameters:
- name: message
script:
image: python
command: [python]
source: |
manifests_lst = []
# templates list preparation
templates_lst = ['templates/' for template in "{{inputs.parameters.message}}" if 'templates/' in template]
print(templates_lst)
# manifests list preparation
for i in "{{inputs.parameters.message}}":
if 'templates/' not in i:
manifests_lst.append(i)
print(manifests_lst)
outputs:
parameters:
- name: templates_lst
- name: manifests_lst
In the above script template I've added print statements for two variables, templates_lst and manifests_lst. I want to pass these two variables' results as inputs to two other tasks in the DAG: templates and manifests.
The way I am accessing the output values is "{{tasks.prepare-lst.outputs.parameters.templates_lst}}" and "{{tasks.prepare-lst.outputs.parameters.manifests_lst}}", but it is not working.
How can I do this?
1. Fully define your output parameters
Your output parameter spec is incomplete. You need to specify where the output parameter comes from.
Since you have multiple output parameters, you can't just use standard out ({{tasks.prepare-lst.outputs.result}}). You have to write two files and derive an output parameter from each.
2. Load the JSON array so it's iterable
If you iterate over the string representation of the array, you'll just get one character at a time.
3. Use an environment variable to pass input to Python
Although it's not strictly necessary, I consider it best practice. If a malicious actor had the ability to set the message parameter, they could inject Python into your workflow. Pass the parameter as an environment variable so the string remains a string.
Changes:
- name: prepare-list-script
inputs:
parameters:
- name: message
script:
image: python
command: [python]
+ env:
+ - name: MESSAGE
+ value: "{{inputs.parameters.message}}"
source: |
+ import json
+ import os
+ message = json.loads(os.environ["MESSAGE"])
manifests_lst = []
# templates list preparation
- templates_lst = ['templates/' for template in "{{inputs.parameters.message}}" if 'templates/' in template]
+ templates_lst = [template for template in message if 'templates/' in template]
- print(templates_lst)
+ with open('/mnt/out/templates_lst.txt', 'w') as outfile:
+ outfile.write(str(json.dumps(templates_lst)))
# manifests list preparation
- for i in "{{inputs.parameters.message}}":
+ for i in message:
if 'templates/' not in i:
manifests_lst.append(i)
- print(manifests_lst)
+ with open('/mnt/out/manifests_lst.txt', 'w') as outfile:
+ outfile.write(str(json.dumps(manifests_lst)))
+ volumeMounts:
+ - name: out
+ mountPath: /mnt/out
+ volumes:
+ - name: out
+ emptyDir: { }
outputs:
parameters:
- name: templates_lst
+ valueFrom:
+ path: /mnt/out/templates_lst.txt
- name: manifests_lst
+ valueFrom:
+ path: /mnt/out/manifests_lst.txt
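For reference, here is a best-effort assembly of the full script template with the diff applied (same assumptions as the diff: the /mnt/out path and the out volume are illustrative choices, not requirements):

- name: prepare-list-script
  inputs:
    parameters:
      - name: message
  script:
    image: python
    command: [python]
    env:
      # pass the parameter as an env var so the string stays a string
      - name: MESSAGE
        value: "{{inputs.parameters.message}}"
    source: |
      import json
      import os
      message = json.loads(os.environ["MESSAGE"])
      # templates list preparation
      templates_lst = [template for template in message if 'templates/' in template]
      with open('/mnt/out/templates_lst.txt', 'w') as outfile:
          outfile.write(json.dumps(templates_lst))
      # manifests list preparation
      manifests_lst = []
      for i in message:
          if 'templates/' not in i:
              manifests_lst.append(i)
      with open('/mnt/out/manifests_lst.txt', 'w') as outfile:
          outfile.write(json.dumps(manifests_lst))
    volumeMounts:
      - name: out
        mountPath: /mnt/out
  volumes:
    - name: out
      emptyDir: {}
  outputs:
    parameters:
      - name: templates_lst
        valueFrom:
          path: /mnt/out/templates_lst.txt
      - name: manifests_lst
        valueFrom:
          path: /mnt/out/manifests_lst.txt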
I am trying to pass parameters from an outer step template to an inner step template in Argo. Below is my workflow definition.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
generateName: fanout-
spec:
templates:
- name: process-workflows
steps:
- - name: fanout
template: fan-out
- - name: fanout-step
template: parallel-process
arguments:
parameters:
- name: batch
value: '{{item}}'
withParam: '{{steps.fanout.outputs.result}}'
- name: fan-out
script:
name: main
image: 'node:lts-alpine3.14'
command:
- node
resources: {}
source: |
inputlist = JSON.parse({{=toJson(workflow.parameters.inputlist)}})
fanout = {{workflow.parameters.fanout}}
var i,j, result=[];
for (i = 0,j = inputlist.length; i < j; i += fanout) {
result.push(inputlist.slice(i, i + fanout));
}
console.log(JSON.stringify(result))
- name: parallel-process
inputs:
parameters:
- name: batch
steps:
- - name: actualprocessor
template: process
arguments:
parameters:
- name: input
value: {{inputs.parameters.batch}}
- - name: aggregate-result
template: aggregate
arguments:
parameters:
- name: aggregate
value: {{steps.actualprocessor.outputs.parameters.res}}
- name: process
inputs:
parameters:
- name: input
outputs:
parameters:
- name: res
valueFrom:
path: /tmp/res.txt
script:
name: main
image: 'alpine:latest'
command:
- sh
source: |
sleep 5
echo 'awakened...'
echo processing-{{=toJson(inputs.parameters.input)}}
echo {{=toJson(inputs.parameters.input)}} > /tmp/res.txt
- name: aggregate
inputs:
parameters:
- name: aggregate
container:
name: main
image: 'alpine:latest'
command:
- sh
- '-c'
args:
- 'echo received {{inputs.parameters.aggregate}}'
entrypoint: process-workflows
arguments:
parameters:
- name: inputlist
value: |
[
{"k" : "v1", "a" : [{ "k": true}]},
{"k" : "v2", "a" : [{ "k": true}]}
]
- name: fanout
value: '1'
Use case:
The fanout-step step (the outer step) uses the parallel-process template (the inner step) and provides a batch argument to it. The parallel-process template then needs to pass the value of batch to the input parameters of the target step.
Issue: the input parameter inside the actualprocessor step is empty, even though I can see the batch input param being populated correctly.
What am I missing here?
The issue is resolved by encasing the parameters in quotes. Thanks to Tom Slabbaer for pointing it out.
Below is the working template.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
generateName: fanout-
spec:
templates:
- name: process-workflows
steps:
- - name: fanout
template: fan-out
- - name: fanout-step
template: parallel-process
arguments:
parameters:
- name: batch
value: '{{item}}'
withParam: '{{steps.fanout.outputs.result}}'
- name: fan-out
script:
name: main
image: 'node:lts-alpine3.14'
command:
- node
resources: {}
source: |
inputlist = JSON.parse({{=toJson(workflow.parameters.inputlist)}})
fanout = {{workflow.parameters.fanout}}
var i,j, result=[];
for (i = 0,j = inputlist.length; i < j; i += fanout) {
result.push(inputlist.slice(i, i + fanout));
}
console.log(JSON.stringify(result))
- name: parallel-process
inputs:
parameters:
- name: batch
steps:
- - name: actualprocessor
template: process
arguments:
parameters:
- name: input
value: "{{inputs.parameters.batch}}"
- - name: aggregate-result
template: aggregate
arguments:
parameters:
- name: aggregate
value: "{{steps.actualprocessor.outputs.parameters.res}}"
- name: process
inputs:
parameters:
- name: input
outputs:
parameters:
- name: res
valueFrom:
path: /tmp/res.txt
script:
name: main
image: 'alpine:latest'
command:
- sh
source: |
sleep 5
echo 'awakened...'
echo processing-{{=toJson(inputs.parameters.input)}}
echo {{=toJson(inputs.parameters.input)}} > /tmp/res.txt
- name: aggregate
inputs:
parameters:
- name: aggregate
container:
name: main
image: 'alpine:latest'
command:
- sh
- '-c'
args:
- 'echo received {{inputs.parameters.aggregate}}'
entrypoint: process-workflows
arguments:
parameters:
- name: inputlist
value: |
[
{"k" : "v1", "a" : [{ "k": true}]},
{"k" : "v2", "a" : [{ "k": true}]}
]
- name: fanout
value: '1'