Configuring Argo output artifacts defined in a WorkflowTemplate from a Workflow

I have the following WorkflowTemplate with an output artifact named messagejson, and I am trying to configure it to use S3 from a Workflow:
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: file-output
spec:
  entrypoint: writefile
  templates:
    - name: writefile
      container:
        image: alpine:latest
        command: ["/bin/sh", "-c"]
        args: ["echo hello | tee /tmp/message.json; ls -l /tmp; cat /tmp/message.json"]
      outputs:
        artifacts:
          - name: messagejson
            path: /tmp/message.json
---
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: read-file-
spec:
  entrypoint: read-file
  templates:
    - name: read-file
      steps:
        - - name: print-file-content
            templateRef:
              name: file-output
              template: writefile
            arguments:
              artifacts:
                - name: messagejson
                  s3:
                    endpoint: 1.2.3.4
                    bucket: mybucket
                    key: "/rabbit/message.json"
                    insecure: true
                    accessKeySecret:
                      name: my-s3-credentials
                      key: accessKey
                    secretKeySecret:
                      name: my-s3-credentials
                      key: secretKey
However, I get: Error (exit code 1): You need to configure artifact storage. More information on how to do this can be found in the docs: https://argoproj.github.io/argo-workflows/configure-artifact-repository/. The same approach works when I configure input artifacts from a Workflow, but not for output artifacts.
Any idea?
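One thing that may be worth checking (a sketch under assumptions, not verified against this exact setup): rather than overriding the output artifact's S3 location per step, the whole Workflow can point at an artifact repository via spec.artifactRepositoryRef. It references a key in a ConfigMap in the Workflow's namespace; the ConfigMap name artifact-repositories and the key my-s3-repository below are placeholders, and the key's value would hold the same s3 block (endpoint, bucket, accessKeySecret, secretKeySecret) that is inlined above:
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: read-file-
spec:
  entrypoint: read-file
  # store this Workflow's output artifacts in the referenced repository
  artifactRepositoryRef:
    configMap: artifact-repositories   # placeholder ConfigMap name
    key: my-s3-repository              # placeholder key holding the s3 config
  templates:
    - name: read-file
      steps:
        - - name: print-file-content
            templateRef:
              name: file-output
              template: writefile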

Related

error: unable to recognize "a.yaml": no matches for kind "Sensor" in version "argoproj.io/v1alpha1"

I got the following YAML from:
https://raw.githubusercontent.com/argoproj/argo-events/stable/examples/sensors/webhook.yaml
and saved it as a.yaml.
However, when I do
kubectl apply -f a.yaml
I get:
error: unable to recognize "a.yaml": no matches for kind "Sensor" in version "argoproj.io/v1alpha1"
Not sure why Sensor is not a valid Kind. This is the YAML:
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: webhook
spec:
  template:
    serviceAccountName: operate-workflow-sa
  dependencies:
    - name: test-dep
      eventSourceName: webhook
      eventName: example
  triggers:
    - template:
        name: webhook-workflow-trigger
        k8s:
          operation: create
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: webhook-
              spec:
                entrypoint: whalesay
                arguments:
                  parameters:
                    - name: message
                      # the value will get overridden by event payload from test-dep
                      value: hello world
                templates:
                  - name: whalesay
                    inputs:
                      parameters:
                        - name: message
                    container:
                      image: docker/whalesay:latest
                      command: [cowsay]
                      args: ["{{inputs.parameters.message}}"]
          parameters:
            - src:
                dependencyName: test-dep
                dataKey: body
              dest: spec.arguments.parameters.0.value
The Kubernetes API can be extended with custom resources. By default, Kubernetes does not know this kind; you have to install the custom resource definitions that provide it (check with your system administrator before doing this on a production cluster).
There are two requirements for this to work:
You need a cluster with alpha features enabled (for GKE, see https://cloud.google.com/kubernetes-engine/docs/how-to/creating-an-alpha-cluster),
AND
you need Argo Events installed:
kubectl create ns argo-events
kubectl apply -f https://raw.githubusercontent.com/argoproj/argo-events/stable/manifests/namespace-install.yaml
Then you can install a webhook Sensor.
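A quick way to check whether the Sensor CRD is actually installed, and which API version the cluster serves for it (plain kubectl commands; the CRD name sensors.argoproj.io assumes a standard Argo Events install):
# Is the Sensor CRD registered at all?
kubectl get crd sensors.argoproj.io
# Which group/version does the cluster serve for the Sensor kind?
kubectl api-resources | grep -i sensor
The apiVersion in the manifest has to match one of the versions reported there (argoproj.io/v1alpha1 in the example above).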

Argo Workflow Stuck in Progressing

I've created a test Argo Workflow to help me understand how I can use a CI/CD approach to deploy an Ansible playbook. When I create the app in Argo CD, it syncs fine, but then it just gets stuck in Progressing and never gets out of that state.
I tried digging around to see if there was any indication in the logs, but I'm fairly new to Argo. It doesn't even get to the point of creating any pods for any of the steps.
Thoughts?
Here is my workflow:
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  name: ansible-test
spec:
  entrypoint: ansible-test-ci
  arguments:
    parameters:
      - name: repo
        value: ****
      - name: revision
        value: '1.6'
  templates:
    - name: ansible-test-ci
      steps:
        - - name: checkout
            template: checkout
        #- - name: test-playbook
        #    template: test-playbook
        #    arguments:
        #      artifacts:
        #        - name: source
        #          from: "{{steps.checkout.outputs.artifacts.source}}"
        - - name: deploy
            template: deploy
            arguments:
              artifacts:
                - name: source
                  from: "{{steps.checkout.outputs.artifacts.source}}"
    - name: checkout
      inputs:
        artifacts:
          - name: source
            path: /src
            git:
              repo: "{{workflow.parameters.repo}}"
              #revision: "{{workflow.parameters.revision}}"
              #sshPrivateKeySecret:
              #  name: my-secret
              #  key: ssh-private-key # kubectl create secret generic my-secret --from-file=ssh-private-key=~/.ssh/id_rsa2
      outputs:
        artifacts:
          - name: source
            path: /src
      container:
        image: alpine/git:latest
        command: ["/bin/sh", "-c"]
        args: ["cd /src && git status && ls -l"]
    #- name: test-playbook
    #  inputs:
    #    artifacts:
    #      - name: source
    #        path: /ansible/
    #  container:
    #    image: ansible/ansible-runner:latest
    #    command: ["/bin/sh", "-c"]
    #    args: ["
    #      cd /ansible &&
    #      ansible-playbook playbook.yaml -i inventory
    #    "]
    - name: deploy
      inputs:
        artifacts:
          - name: source
            path: /ansible/
      container:
        image: ansible/ansible-runner:latest
        command: ["/bin/sh", "-c"]
        args: ["
          cd /ansible &&
          ansible-playbook playbook.yaml -i inventory
        "]
I ended up solving this by adding a ServiceAccount and the RBAC resources below to the namespace that Argo Workflows was trying to run in.
Here are the Role and RoleBinding I added:
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: workflow-role
rules:
  # pod get/watch is used to identify the container IDs of the current pod
  # pod patch is used to annotate the step's outputs back to controller (e.g. artifact location)
  - apiGroups:
      - ""
    resources:
      - pods
    verbs:
      - get
      - watch
      - patch
  # logs get/watch are used to get the pods logs for script outputs, and for log archival
  - apiGroups:
      - ""
    resources:
      - pods/log
    verbs:
      - get
      - watch
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: workflow-role-binding
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: workflow-role
subjects:
  - kind: ServiceAccount
    name: default
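The RoleBinding above grants these permissions to the default ServiceAccount. If the workflow pods should instead run under a dedicated ServiceAccount, a minimal sketch (the name workflow-sa is illustrative) would be to create the account, bind the same Role to it, and reference it from the Workflow spec:
apiVersion: v1
kind: ServiceAccount
metadata:
  name: workflow-sa            # illustrative name
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: workflow-sa-binding
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: workflow-role
subjects:
  - kind: ServiceAccount
    name: workflow-sa
Then, in the Workflow, set spec.serviceAccountName: workflow-sa so the pods run under that account.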

Argo Workflow error when using envFrom field

Workflow:
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: my-workflow-
spec:
  entrypoint: main
  arguments:
    parameters:
      - name: configmap
        value: my-configmap
      - name: secret
        value: my-secret
  templates:
    - name: main
      steps:
        - - name: main
            templateRef:
              name: my-template
              template: main
            arguments:
              parameters:
                - name: configmap
                  value: "{{workflow.parameters.configmap}}"
                - name: secret
                  value: "{{workflow.parameters.secret}}"
Template:
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: my-template
spec:
  entrypoint: main
  templates:
    - name: main
      inputs:
        parameters:
          - name: configmap
        parameters:
          - name: secret
      container:
        image: my-image:1.2.3
        envFrom:
          - configMapRef:
            name: "{{inputs.parameters.configmap}}"
          - secretRef:
            name: "{{inputs.parameters.secret}}"
When I deploy this through the Argo UI, I receive the following error from Kubernetes when the pod starts:
spec.containers[1].envFrom: Invalid value: "": must specify one of: `configMapRef` or `secretRef`
Using envFrom is supported and documented in the Argo documentation: https://argoproj.github.io/argo-workflows/fields/. Why is Kubernetes complaining here?
As mentioned in the comments, there are a couple of issues with your manifests. They are valid YAML, but that YAML does not deserialize into valid Argo custom resources.
In the WorkflowTemplate, you have duplicated the parameters key under spec.templates[0].inputs.
Also in the WorkflowTemplate, you have placed the configMapRef and secretRef names at the same indentation level as the keys themselves. configMapRef and secretRef are objects, so the name key must be nested under each of them.
These are the corrected manifests:
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: my-template
spec:
  entrypoint: main
  templates:
    - name: main
      inputs:
        parameters:
          - name: configmap
          - name: secret
      container:
        image: my-image:1.2.3
        envFrom:
          - configMapRef:
              name: "{{inputs.parameters.configmap}}"
          - secretRef:
              name: "{{inputs.parameters.secret}}"
---
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: my-workflow-
spec:
  entrypoint: main
  arguments:
    parameters:
      - name: configmap
        value: my-configmap
      - name: secret
        value: my-secret
  templates:
    - name: main
      steps:
        - - name: main
            templateRef:
              name: my-template
              template: main
            arguments:
              parameters:
                - name: configmap
                  value: "{{workflow.parameters.configmap}}"
                - name: secret
                  value: "{{workflow.parameters.secret}}"
Argo Workflows supports IDE-based validation, which should help you find and avoid these kinds of issues.
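If you prefer a command-line check over IDE validation, the argo CLI can also lint manifests before you submit them (this assumes the argo CLI is installed and configured against your cluster; the file names are placeholders):
# validate the manifests without creating anything
argo lint my-template.yaml my-workflow.yaml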

How can I make steps/tasks from WorkflowTemplate show up in Argo workflows UI?

Why does the workflow just end on an arrow pointing down?
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: workflow-template-whalesay-template
spec:
  templates:
    - name: whalesay-template
      inputs:
        parameters:
          - name: message
      container:
        image: docker/whalesay
        command: [cowsay]
This is the WorkflowTemplate I'm using. I applied it to Kubernetes before the next step.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  name: workflow-template-dag-diamond
  generateName: workflow-template-dag-diamond-
spec:
  entrypoint: diamond
  templates:
    - name: diamond
      dag:
        tasks:
          - name: A
            templateRef:
              name: workflow-template-whalesay-template
              template: whalesay-template
            arguments:
              parameters:
                - name: message
                  value: A
This Workflow references the template from the previous step. The Workflow does what it's supposed to do, but I can't see the green dots in the UI.
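A side observation on the manifests above (not necessarily the cause of the UI issue, just something that stands out): whalesay-template declares a message input parameter but never uses it. A sketch of the template with the parameter actually passed to cowsay would be:
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: workflow-template-whalesay-template
spec:
  templates:
    - name: whalesay-template
      inputs:
        parameters:
          - name: message
      container:
        image: docker/whalesay
        command: [cowsay]
        args: ["{{inputs.parameters.message}}"]   # consume the declared parameter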

Kubernetes job does not recognize environment

I am using the following job template:
apiVersion: batch/v1
kind: Job
metadata:
  name: rotatedevcreds2
spec:
  template:
    metadata:
      name: rotatedevcreds2
    spec:
      containers:
        - name: shell
          image: akanksha/dsserver:v7
          env:
            - name: DEMO
              value: "Hello from the environment"
            - name: personal_AWS_SECRET_ACCESS_KEY
              valueFrom:
                secretKeyRef:
                  name: rotatecreds-env
                  key: personal_aws_secret_access_key
            - name: personal_AWS_SECRET_ACCESS_KEY_ID
              valueFrom:
                secretKeyRef:
                  name: rotatecreds-env
                  key: personal_aws_secret_access_key_id
            - name: personal_GIT_TOKEN
              valueFrom:
                secretKeyRef:
                  name: rotatecreds-env
                  key: personal_git_token
          command:
            - "bin/bash"
            - "-c"
            - "whoami; pwd; /root/rotateCreds.sh"
      restartPolicy: Never
      imagePullSecrets:
        - name: regcred
The shell script runs some Ansible tasks, which results in:
TASK [Get the existing access keys for the functional backup ID] ***************
fatal: [localhost]: FAILED! => {"changed": false, "cmd": "aws iam list-access-keys --user-name ''", "failed_when_result": true, "msg": "[Errno 2] No such file or directory", "rc": 2}
However, if I spin up a pod using the same image with the following:
apiVersion: batch/v1
kind: Job
metadata:
  name: rotatedevcreds3
spec:
  template:
    metadata:
      name: rotatedevcreds3
    spec:
      containers:
        - name: shell
          image: akanksha/dsserver:v7
          env:
            - name: DEMO
              value: "Hello from the environment"
            - name: personal_AWS_SECRET_ACCESS_KEY
              valueFrom:
                secretKeyRef:
                  name: rotatecreds-env
                  key: personal_aws_secret_access_key
            - name: personal_AWS_SECRET_ACCESS_KEY_ID
              valueFrom:
                secretKeyRef:
                  name: rotatecreds-env
                  key: personal_aws_secret_access_key_id
            - name: personal_GIT_TOKEN
              valueFrom:
                secretKeyRef:
                  name: rotatecreds-env
                  key: personal_git_token
          command:
            - "bin/bash"
            - "-c"
            - "whoami; pwd; /root/rotateCreds.sh"
      restartPolicy: Never
      imagePullSecrets:
        - name: regcred
This creates a pod, and I am able to log in to the pod and run /root/rotateCreds.sh.
While running as a Job, it seems it is not able to recognize the aws CLI. I tried debugging: whoami and pwd return root and / respectively, which looks fine. Any pointers on what is missing? I am new to Jobs.
For further debugging, I added a sleep of 10000 seconds to the Job template so that I could log in to the container and see what's happening. I noticed that after logging in I was able to run the script manually, and the aws command was recognized properly.
It is likely your PATH is not set correctly. A quick fix is to call the aws CLI by its absolute path, e.g. /usr/local/bin/aws, in the /root/rotateCreds.sh script.
OK, so I added an export command to update the PATH, and that fixed the issue. The problem was that I was using the Ansible command module, so the task was not running in a bash environment. So either use the shell module with a bash executable, as described here:
https://docs.ansible.com/ansible/latest/modules/shell_module.html
or export the new PATH.
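For illustration, a minimal sketch of the two fixes described above as Ansible tasks (the backup_user variable and the task names are made up for the example):
# Option 1: run the command through bash via the shell module
- name: List access keys for the functional backup user
  shell: aws iam list-access-keys --user-name "{{ backup_user }}"
  args:
    executable: /bin/bash
# Option 2: keep the command module but call the CLI by its absolute path
- name: List access keys for the functional backup user
  command: /usr/local/bin/aws iam list-access-keys --user-name "{{ backup_user }}"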