argo workflow submit error - duplicated node name - argo-workflows

I am trying to use Argo Events + Argo Workflows. However, I keep getting this "duplicated nodeName" error and I'm not sure why. I have a sensor which reacts to events and submits a DAG workflow.
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: argocd-dotnet-kafka-subscriber
spec:
  template:
    serviceAccountName: argo-events-sa
  dependencies:
    - name: github
      eventSourceName: github
      eventName: github-app # argocd-dotnet-kafka-event
  triggers:
    - template:
        name: trigger
        argoWorkflow:
          group: argoproj.io
          version: v1alpha1
          resource: workflows
          operation: submit
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: argocd-dotnet-kafka-
                namespace: workflows
              spec:
                entrypoint: build
                serviceAccountName: workflow
                volumes:
                  - name: regcred
                    secret:
                      secretName: regcred
                      items:
                        - key: .dockerconfigjson
                          path: config.json
                  - name: github-access
                    secret:
                      secretName: github-access
                      items:
                        - key: token
                          path: token
                        - key: user
                          path: user
                        - key: email
                          path: email
                templates:
                  - name: build
                    dag:
                      tasks:
                        - name: build
                          templateRef:
                            name: container-image
                            template: build-kaniko-git
                            clusterScope: true
                          arguments:
                            parameters:
                              - name: repo_url
                                value: "https://github.com/Workquark/argocd-dotnet-kafka-subscriber-deploy"
                              - name: repo_ref
                                value: ""
                              - name: repo_commit_id
                                value: ""
                              - name: container_image
                                value: joydeep1985/argocd-dotnet-kafka-subscriber-deploy
                              - name: container_tag
                                value: "latest"
                  - name: test
                    script:
                      image: alpine
                      command: [sh]
                      source: |
                        echo This is a testing simulation...
                        sleep 5
                      volumeMounts:
                        - name: github-access
                          mountPath: /.github/
          parameters:
            - src:
                dependencyName: github
                dataKey: body.repository.git_url
              dest: spec.templates.0.dag.tasks.0.arguments.parameters.0.value
            - src:
                dependencyName: github
                dataKey: body.ref
              dest: spec.templates.0.dag.tasks.0.arguments.parameters.1.value
            - src:
                dependencyName: github
                dataKey: body.after
              dest: spec.templates.0.dag.tasks.0.arguments.parameters.2.value
            - src:
                dependencyName: github
                dataKey: body.repository.name
              dest: spec.templates.0.dag.tasks.0.arguments.parameters.3.value
              operation: append
            - src:
                dependencyName: github
                dataKey: body.after
              dest: spec.templates.0.dag.tasks.0.arguments.parameters.4.value
            - src:
                dependencyName: github
                dataKey: body.repository.name
              dest: spec.templates.0.dag.tasks.1.arguments.parameters.4.value
            - src:
                dependencyName: github
                dataKey: body.after
              dest: spec.templates.0.dag.tasks.1.arguments.parameters.5.value
            - src:
                dependencyName: github
                dataKey: body.repository.name
              dest: spec.templates.0.dag.tasks.2.arguments.parameters.4.value
            - src:
                dependencyName: github
                dataKey: body.after
              dest: spec.templates.0.dag.tasks.2.arguments.parameters.5.value
Above is the Sensor code for it.
apiVersion: argoproj.io/v1alpha1
kind: ClusterWorkflowTemplate
metadata:
  name: container-image
spec:
  serviceAccountName: workflow
  templates:
    - name: build-kaniko-git
      inputs:
        parameters:
          - name: repo_url
          - name: repo_ref
            value: refs/heads/master
          - name: repo_commit_id
            value: HEAD
          - name: container_image
          - name: container_tag
      container:
        image: gcr.io/kaniko-project/executor:debug
        command: [/kaniko/executor]
        args:
          - --context={{inputs.parameters.repo_url}}#{{inputs.parameters.repo_ref}}#{{inputs.parameters.repo_commit_id}}
          - --destination={{inputs.parameters.container_image}}:{{inputs.parameters.container_tag}}
        volumeMounts:
          - name: regcred
            mountPath: /kaniko/.docker/
Above is the ClusterWorkflowTemplate that the workflow references for Kaniko. The error I keep getting is:
time="2022-04-20T01:25:40.089Z" level=fatal msg="Failed to submit workflow:
templates.build sorting failed: duplicated nodeName "
{"level":"error","ts":1650417940.0938516,"logger":"argo-events.sensor","caller":"sensors/listener.go:355","msg":"failed to execute a trigger","sensorName":"argocd-dotnet-kafka-subscriber","error":"failed to execute trigger: timed out waiting for the condition: failed to execute submit command for workflow : exit status 1","errorVerbose":"timed out waiting for the condition: failed to execute submit command for workflow : exit status 1\nfailed to execute trigger\ngithub.com/argoproj/argo-events/sensors.(*SensorContext).triggerOne\n\t/home/runner/work/argo-events/argo-events/sensors/listener.go:408\ngithub.com/argoproj/argo-events/sensors.(*SensorContext).triggerWithRateLimit\n\t/home/runner/work/argo-events/argo-events/sensors/listener.go:353\nruntime.goexit\n\t/opt/hostedtoolcache/go/1.17.1/x64/src/runtime/asm_amd64.s:1581","triggerName":"trigger","triggeredBy":["github"],"triggeredByEvents":["32623564393765662d343331612d346333342d613166352d346230613238613735353163"],"stacktrace":"github.com/argoproj/argo-events/sensors.(*SensorContext).triggerWithRateLimit\n\t/home/runner/work/argo-events/argo-events/sensors/listener.go:355"}

Related

Argoworkflow can't call template inputs parameters within steps

kind: Workflow
metadata:
  generateName: small-
spec:
  entrypoint: fan-out-in-params-workflow
  arguments:
    parameters:
      - name: jira-ticket
        value: INFRA-000
  templates:
    - name: fan-out-in-params-workflow
      steps:
        - - name: generate
            template: gen-host-list
        - - name: pre-conditions
            template: pre-conditions
            arguments:
              parameters:
                - name: host
                  value: "{{item}}"
            withParam: "{{steps.generate.outputs.result}}"
    - name: gen-host-list
      inputs:
        artifacts:
          - name: host
            path: /tmp/host.txt
            s3:
              key: host.txt
      script:
        image: python:alpine3.6
        command: [python]
        source: |
          import json
          import sys
          filename="{{ inputs.artifacts.host.path }}"
          with open(filename, 'r', encoding='UTF-8') as f:
              json.dump([line.rstrip() for line in f], sys.stdout)
    - name: pre-conditions
      inputs:
        parameters:
          - name: host
      steps:
        - - name: online-check
            template: online-check
            arguments:
              parameters:
                - name: host
                  value: {{inputs.parameters.host}}
    - name: online-check
      inputs:
        parameters:
          - name: host
      script:
        image: python:alpine3.6
        command:
          - python
        source: |
          print({{inputs.parameters.host}})
Hi there, I'm quite new to Argo Workflows. I'm trying to use the pre-conditions template's input parameter host as posted above. The host param seems to be passed to the pre-conditions template successfully, but I can't get it in the online-check step. Can anyone give me some advice? Anything would be appreciated!
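A likely culprit, going by the quoting fix in the "How to pass parameters in nested argo steps?" question further below, is that the templated value is unquoted: YAML parses a bare {{inputs.parameters.host}} as a flow mapping rather than a string, so the parameter can arrive empty. A minimal sketch of the pre-conditions template with the value quoted (the rest of the workflow is assumed unchanged):

- name: pre-conditions
  inputs:
    parameters:
      - name: host
  steps:
    - - name: online-check
        template: online-check
        arguments:
          parameters:
            - name: host
              # quoting keeps YAML from treating {{...}} as a mapping
              value: "{{inputs.parameters.host}}"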

How to run multiple files in one container image and use the output as input in another container in Argo workflow?

I am trying to run 2 Python files in one container and save the output as parameters:
- name: training
  serviceAccountName: argo-service
  outputs:
    parameters:
      - name: output-param-1
        valueFrom:
          path: ./tmp/output.txt
      - name: output-param-2
        valueFrom:
          path: ./tmp/output2.txt
  container:
    image: (image name)
    command: [python]
    args: ["/data/argo/model_build.py", "/data/argo/model_build_2.py"]
Use that output as input in another container:
- name: evaluate
  serviceAccountName: argo-service
  inputs:
    parameters:
      - name: value-1
      - name: value-2
  container:
    image: (image name)
    command: ["python", "/data/argo/evaluate.py"]
    args:
      - '{{inputs.parameters.value-1}}'
      - '{{inputs.parameters.value-2}}'
and have defined the chain as:
- name: dag-chain
  dag:
    tasks:
      - name: src
        template: clone-repo
      - name: prep
        template: data-prep
        dependencies: [src]
      - name: train
        template: training
        dependencies: [prep]
      - name: eval
        template: evaluate
        dependencies: [train]
        args:
          parameters:
            - name: value-1
              value: '{{dag-chain.tasks.train.outputs.parameters.output-param-1}}'
            - name: value-2
              value: '{{dag-chain.tasks.train.outputs.parameters.output-param-2}}'
But with these steps I'm getting the error:
" Internal Server Error: templates.dag-chain.tasks.eval templates.evaluate inputs.parameters.value-1 was not supplied: "
Please help me identify the mistakes I'm making.
I have tried the steps mentioned above but it's not working.
I don't have Argo accessible just now to test, but a couple of things to try:
args in DAG tasks should be arguments (see the field reference for a DAG task in the Argo docs).
Try removing the dag-chain prefix from the parameter references (see the examples in the Argo docs).
- name: dag-chain
  dag:
    tasks:
      - name: src
        template: clone-repo
      - name: prep
        template: data-prep
        dependencies: [src]
      - name: train
        template: training
        dependencies: [prep]
      - name: eval
        template: evaluate
        dependencies: [train]
        arguments:
          parameters:
            - name: value-1
              value: '{{tasks.train.outputs.parameters.output-param-1}}'
            - name: value-2
              value: '{{tasks.train.outputs.parameters.output-param-2}}'
If that doesn't work I'll try some more steps with Argo.

How to manage and reference multiple artifact repositories in argo workflow

I have multiple artifact repositories and have them configured in the configMap, like:
apiVersion: v1
kind: ConfigMap
metadata:
  name: artifact-repositories
data:
  bucket1: |
    s3:
      endpoint: ...
      bucket: bucket1
      accessKeySecret:
        ...
      secretKeySecret:
        ...
  bucket2: |
    s3:
      endpoint: ...
      bucket: bucket2
      accessKeySecret:
        ...
      secretKeySecret:
        ...
Then, I want to reference them in a key-only way in the same workflow:
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: artifact-
spec:
  entrypoint: main
  artifactRepositoryRef:
    configMap: artifact-repositories
    key: bucket1
  templates:
    - name: main
      steps:
        - - name: step1
            template: step1
        - - name: step2
            template: step2
    - name: step1
      container:
        ...
      outputs:
        artifacts:
          - name: art-output
            path: /tmp/s1.txt
            s3: # use bucket1 through artifactRepositoryRef
              key: argo/s1.txt
    - name: step2
      container:
        ...
      outputs:
        artifacts:
          - name: art-output
            path: /tmp/s2.txt
            s3:
              # how to use bucket2 in a key-only way
              key: argo/s2.txt
artifactRepositoryRef can only reference one artifact repository. How can I reference another artifact repository in a concise way?
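As far as I know, artifactRepositoryRef is set once at the spec level rather than per template, so a key-only reference can only point at one repository. One workaround, sketched below, is to keep bucket1 as the workflow-level default and spell out the second repository inline on the one artifact that needs it (the endpoint and secret names here are placeholders, not values from the ConfigMap above):

- name: step2
  container:
    ...
  outputs:
    artifacts:
      - name: art-output
        path: /tmp/s2.txt
        s3:
          endpoint: s3.example.com      # placeholder endpoint
          bucket: bucket2
          key: argo/s2.txt
          accessKeySecret:
            name: bucket2-creds         # placeholder secret
            key: accesskey
          secretKeySecret:
            name: bucket2-creds
            key: secretkey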

How to pass parameters in nested argo steps?

I am trying to pass parameters from an outer step template to an inner step template in argo. Below is my workflow definition.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: fanout-
spec:
  templates:
    - name: process-workflows
      steps:
        - - name: fanout
            template: fan-out
        - - name: fanout-step
            template: parallel-process
            arguments:
              parameters:
                - name: batch
                  value: '{{item}}'
            withParam: '{{steps.fanout.outputs.result}}'
    - name: fan-out
      script:
        name: main
        image: 'node:lts-alpine3.14'
        command:
          - node
        resources: {}
        source: |
          inputlist = JSON.parse({{=toJson(workflow.parameters.inputlist)}})
          fanout = {{workflow.parameters.fanout}}
          var i,j, result=[];
          for (i = 0,j = inputlist.length; i < j; i += fanout) {
            result.push(inputlist.slice(i, i + fanout));
          }
          console.log(JSON.stringify(result))
    - name: parallel-process
      inputs:
        parameters:
          - name: batch
      steps:
        - - name: actualprocessor
            template: process
            arguments:
              parameters:
                - name: input
                  value: {{inputs.parameters.batch}}
        - - name: aggregate-result
            template: aggregate
            arguments:
              parameters:
                - name: aggregate
                  value: {{steps.actualprocessor.outputs.parameters.res}}
    - name: process
      inputs:
        parameters:
          - name: input
      outputs:
        parameters:
          - name: res
            valueFrom:
              path: /tmp/res.txt
      script:
        name: main
        image: 'alpine:latest'
        command:
          - sh
        source: |
          sleep 5
          echo 'awakened...'
          echo processing-{{=toJson(inputs.parameters.input)}}
          echo {{=toJson(inputs.parameters.input)}} > /tmp/res.txt
    - name: aggregate
      inputs:
        parameters:
          - name: aggregate
      container:
        name: main
        image: 'alpine:latest'
        command:
          - sh
          - '-c'
        args:
          - 'echo received {{inputs.parameters.aggregate}}'
  entrypoint: process-workflows
  arguments:
    parameters:
      - name: inputlist
        value: |
          [
            {"k" : "v1", "a" : [{ "k": true}]},
            {"k" : "v2", "a" : [{ "k": true}]}
          ]
      - name: fanout
        value: '1'
Use case:
The fanout-step step (outer step) uses parallel-process template (inner step). It provides a batch argument to the parallel-process template. The parallel-process template needs to provide the value of batch to the input parameters in the target step.
Issue: The input parameter inside the actualprocessor step is empty. I can see that the batch input param is getting populated correctly.
What am I missing here?
The issue is resolved by encasing the parameters in quotes. Thanks to Tom Slabbaer for pointing it out.
Below is the working template.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: fanout-
spec:
  templates:
    - name: process-workflows
      steps:
        - - name: fanout
            template: fan-out
        - - name: fanout-step
            template: parallel-process
            arguments:
              parameters:
                - name: batch
                  value: '{{item}}'
            withParam: '{{steps.fanout.outputs.result}}'
    - name: fan-out
      script:
        name: main
        image: 'node:lts-alpine3.14'
        command:
          - node
        resources: {}
        source: |
          inputlist = JSON.parse({{=toJson(workflow.parameters.inputlist)}})
          fanout = {{workflow.parameters.fanout}}
          var i,j, result=[];
          for (i = 0,j = inputlist.length; i < j; i += fanout) {
            result.push(inputlist.slice(i, i + fanout));
          }
          console.log(JSON.stringify(result))
    - name: parallel-process
      inputs:
        parameters:
          - name: batch
      steps:
        - - name: actualprocessor
            template: process
            arguments:
              parameters:
                - name: input
                  value: "{{inputs.parameters.batch}}"
        - - name: aggregate-result
            template: aggregate
            arguments:
              parameters:
                - name: aggregate
                  value: "{{steps.actualprocessor.outputs.parameters.res}}"
    - name: process
      inputs:
        parameters:
          - name: input
      outputs:
        parameters:
          - name: res
            valueFrom:
              path: /tmp/res.txt
      script:
        name: main
        image: 'alpine:latest'
        command:
          - sh
        source: |
          sleep 5
          echo 'awakened...'
          echo processing-{{=toJson(inputs.parameters.input)}}
          echo {{=toJson(inputs.parameters.input)}} > /tmp/res.txt
    - name: aggregate
      inputs:
        parameters:
          - name: aggregate
      container:
        name: main
        image: 'alpine:latest'
        command:
          - sh
          - '-c'
        args:
          - 'echo received {{inputs.parameters.aggregate}}'
  entrypoint: process-workflows
  arguments:
    parameters:
      - name: inputlist
        value: |
          [
            {"k" : "v1", "a" : [{ "k": true}]},
            {"k" : "v2", "a" : [{ "k": true}]}
          ]
      - name: fanout
        value: '1'

How to trigger an existing Argo cronworkflow?

I have tried many versions of this template below
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: tibco-events-sensor
spec:
  template:
    metadata:
      annotations:
        sidecar.istio.io/inject: 'false'
    serviceAccountName: operate-workflow-sa
  dependencies:
    - name: tibco-dep
      eventSourceName: tibco-events-source
      eventName: whatever
  triggers:
    - template:
        name: has-wf-event-trigger
        argoWorkflow:
          group: argoproj.io
          version: v1alpha1
          resource: Workflow
          operation: resubmit
          metadata:
            generateName: has-wf-argo-events-
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                name: has-wf-full-refresh
I keep getting errors that the workflow is not found:
"rpc error: code = NotFound desc = workflows.argoproj.io \"has-wf-full-refresh\" not found"
I have hundreds of workflows launched as CronWorkflows, and I would like to switch them to be event-driven rather than cron-based. I'd prefer not to change the already-existing flows; I just want to submit or resubmit them.
I figured out that the argoWorkflow trigger template doesn't support CronWorkflows. I ended up using the http trigger template instead.
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: tibco-events-sensor
spec:
  template:
    metadata:
      annotations:
        sidecar.istio.io/inject: 'false'
    serviceAccountName: operate-workflow-sa
  dependencies:
    - name: tibco-dep
      eventSourceName: tibco-events-source
      eventName: whatever
  triggers:
    - template:
        name: http-trigger
        http:
          url: http://argo-workflows.argo-workflows:2746/api/v1/workflows/lab-uat/submit
          secureHeaders:
            - name: Authorization
              valueFrom:
                secretKeyRef:
                  name: argo-workflows-sa-token
                  key: bearer-token
          payload:
            - src:
                dependencyName: tibco-dep
                value: CronWorkflow
              dest: resourceKind
            - src:
                dependencyName: tibco-dep
                value: coinflip
              dest: resourceName
            - src:
                dependencyName: tibco-dep
                value: coinflip-event-
              dest: submitOptions.generateName
          method: POST
          retryStrategy:
            steps: 3
            duration: 3s
          policy:
            status:
              allow:
                - 200