How to run an existing Argo Workflow after completion of another Argo workflow?

So I have an Argo workflow_B that needs to run after completion of Argo workflow_A, which is used by another team. Both workflows already exist; I just want to chain them together.
How can I achieve that?
Is it possible to do such a thing using an exit handler?
Or should I use an event source like a webhook or AWS SNS to do that?

Have you looked at the workflow of workflows pattern?
It works if team A is able to create workflows in team B's namespace:
https://github.com/argoproj/argo-workflows/blob/master/examples/workflow-of-workflows.yaml
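An exit handler can also do the chaining without team A embedding workflow_B's steps. A minimal sketch, assuming workflow_B is published as a WorkflowTemplate named workflow-b and the workflow's service account is allowed to create Workflow objects (all names here are hypothetical):

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: workflow-a-
spec:
  entrypoint: main
  onExit: exit-handler            # runs after workflow_A finishes, success or failure
  templates:
  - name: main
    container:
      image: alpine:3.18
      command: [sh, -c, "echo workflow_A steps run here"]
  - name: exit-handler
    steps:
    - - name: trigger-workflow-b
        template: submit-workflow-b
        when: "{{workflow.status}} == Succeeded"   # only chain on success
  - name: submit-workflow-b
    resource:
      action: create              # creates a new Workflow object for workflow_B
      manifest: |
        apiVersion: argoproj.io/v1alpha1
        kind: Workflow
        metadata:
          generateName: workflow-b-
        spec:
          workflowTemplateRef:
            name: workflow-b

The workflow-of-workflows example linked above uses the same resource/create idea from a parent workflow instead of an exit handler.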

Related

Reversing the flow of jobs in a workflow

I'm working on some Terraform logic and using GitHub workflows to deploy multiple components sequentially, e.g. job2 (ALB) depending on the completion of job1 (creation of the VPC). This works fine during the apply phase. However, if I delete the infra using terraform destroy, the same sequence fails, since job1 (destroying the VPC) can't succeed while the resources from job2 (the ALB) still exist.
Is there a way to run the workflow in a bottom-up order based on an input?
I know that we can leverage Terraform to deploy these components and handle the dependencies at the Terraform level; this is just an example of the use case I'm working on.
You can control the flow of jobs by using the keyword “needs”. Read the docs here: https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idneeds
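A minimal sketch of how needs combined with job-level if conditions could give the bottom-up order on destroy; the workflow_dispatch input, job names and commands are hypothetical placeholders:

name: terraform-infra
on:
  workflow_dispatch:
    inputs:
      action:
        description: "apply or destroy"
        required: true
        default: apply

jobs:
  apply_vpc:                      # job1 on apply
    if: github.event.inputs.action == 'apply'
    runs-on: ubuntu-latest
    steps:
      - run: echo "terraform apply the VPC"

  apply_alb:                      # job2 waits for the VPC
    if: github.event.inputs.action == 'apply'
    needs: apply_vpc
    runs-on: ubuntu-latest
    steps:
      - run: echo "terraform apply the ALB"

  destroy_alb:                    # on destroy, the ALB goes first
    if: github.event.inputs.action == 'destroy'
    runs-on: ubuntu-latest
    steps:
      - run: echo "terraform destroy the ALB"

  destroy_vpc:                    # the VPC is removed only after the ALB is gone
    if: github.event.inputs.action == 'destroy'
    needs: destroy_alb
    runs-on: ubuntu-latest
    steps:
      - run: echo "terraform destroy the VPC"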

Is there a possibility to run a job in argocd after all the apps are deployed and are in sync

I want to trigger a workflow from Argo CD after all the apps are deployed and in Synced status. Is there a way to monitor the deployment status of the apps and then trigger the job?
I have tried using post-sync hooks, but it looks like they work only for a single component. Suppose I have 3 apps: I don't want to run the post-sync job 3 times; I need to trigger it once after all of them are deployed. Please advise.
Yes. There are post sync hooks.
See the documentation https://argo-cd.readthedocs.io/en/stable/user-guide/resource_hooks/
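For reference, a minimal sketch of what a PostSync hook looks like; to have it fire only once you would typically define it in a single place, such as the parent app in an app-of-apps setup, rather than in each of the 3 apps. The names, image and URL are hypothetical:

apiVersion: batch/v1
kind: Job
metadata:
  name: trigger-after-sync
  annotations:
    argocd.argoproj.io/hook: PostSync                 # runs after the app syncs and is healthy
    argocd.argoproj.io/hook-delete-policy: HookSucceeded
spec:
  template:
    spec:
      restartPolicy: Never
      containers:
      - name: trigger
        image: curlimages/curl:8.5.0
        command: [sh, -c, "curl -X POST https://example.com/trigger-workflow"]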

Notify completion of argo workflow

I have a use case where I am triggering an Argo workflow from a Python application. However, I need a mechanism for the Argo workflow to notify my Python application when the workflow execution is complete. I am already using a pub/sub mechanism in my Python application, so I want my Python app to subscribe to a Redis queue and take action once the workflow publishes a message on this queue announcing its completion.
This is the interaction flow I am looking for
Workflow -> Redis queue -> Python app
Thanks for help
You can use Argo Workflows' exit handler:
https://argoproj.github.io/argo-workflows/variables/#exit-handler
https://github.com/argoproj/argo-workflows/blob/master/examples/exit-handlers.yaml
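A minimal sketch of an exit handler that publishes to Redis when the workflow ends; the Redis service name, channel and message format are hypothetical and would need to match what your Python subscriber expects:

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: notify-on-exit-
spec:
  entrypoint: main
  onExit: notify-redis            # always runs when the workflow finishes
  templates:
  - name: main
    container:
      image: alpine:3.18
      command: [sh, -c, "echo the real work happens here"]
  - name: notify-redis
    container:
      image: redis:7-alpine       # reuses redis-cli from the Redis image
      command: [sh, -c]
      args:
        - >
          redis-cli -h my-redis.default.svc.cluster.local
          publish workflow-events
          '{"name": "{{workflow.name}}", "status": "{{workflow.status}}"}'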

Azure terraform pipeline

I hope somebody can help me to solve this issue and understand how to implement the best approach.
I have a production environment running tons of Azure services (SQL Server, databases, web apps, etc.).
All of that infra has been created with Terraform. As powerful as it is, I am terrified of using it in a pipeline for one reason.
Some of my friends often make changes to the infra manually, and since those changes are not in my Terraform state, automating this process might destroy resources ungracefully, which is something I don't want to face.
So I was wondering if anyone can shed some light on the following question:
Is it possible to automate Terraform so that it checks the infra state on every push to GitHub, and quits if the output of the plan reports any change?
To make my example clear:
Let's say I have a Terraform state containing 2 web apps, and somebody manually created a 3rd web app in that resource group, then developed some code and pushed it to GitHub. My pipeline triggers, and as a first step Terraform runs terraform plan and/or terraform apply. If this command reports any change, I want it to quit the pipeline (fail) so I know there is something new there; if terraform plan and/or terraform apply report no changes, the infra is up to date and the pipeline can continue with the code deployment.
Thank you in advance for any help and clarification.
Yes, you can just run
terraform plan -detailed-exitcode
The exit code tells you what happened: 0 means no changes, 2 means the plan contains changes, and 1 means Terraform hit an error. See here for details.
Let me point out that I would highly advise you to lock down your prod environment so that nobody can make manual changes! Your CI/CD pipeline should be the only way to make changes there.
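A minimal sketch of that check as a GitHub Actions job, assuming Terraform and the Azure credentials are already available on the runner; the workflow and step names are hypothetical:

name: terraform-drift-check
on: [push]

jobs:
  drift-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: terraform init
      - name: Fail if the plan reports any change
        run: terraform plan -detailed-exitcode
        # exit code 0: no changes, the pipeline continues
        # exit code 2: changes detected, this step (and the pipeline) fails
        # exit code 1: Terraform error, the step fails as well

A subsequent deployment job can then use needs: drift-check so it only runs when the plan is clean.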
Adding to the above answer, you can also use the terraform import command to bring the existing remote resources into your state file; terraform import is made for importing existing resources into Terraform. Then run plan to check whether the state is in sync.
Refer: https://www.terraform.io/docs/cli/commands/import.html

Azure Storage: Deploy Queue via ARM template

Is there a way to deploy Queues within a Storage Account via ARM templates? Haven't found an option so far.
If yes, how?
If not, is it planned to provide this capability?
If not and not planned, which approach do you recommend for automated deployments?
Thanks
Is there a way to deploy Queues within a Storage Account via ARM templates?
No, it's not supported at the moment.
ARM templates can deploy Azure Storage accounts and blob containers, but do not support deploying queues or tables within a storage account.
Here is a link to a sample template to create a container: https://azure.microsoft.com/en-us/resources/templates/101-storage-blob-container/
You can upvote this feedback item to help get the feature prioritized.
A few ways you can think about data-plane operations in deployments/apps:
1) the code that consumes the queue can create it on init if it doesn't exist
2) if you're deploying via a pipeline, use a pipeline task to perform the data-plane operations (see the sketch below)
3) in a template, use the deploymentScript resource to create it - this is a bit heavyweight for the task you need, but it will work.
Does that help?
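As an illustration of option 2, a minimal sketch of an Azure Pipelines step that creates the queue from the data plane with the Azure CLI; the service connection, storage account and queue names are hypothetical:

steps:
  - task: AzureCLI@2
    displayName: Create the storage queue (data plane)
    inputs:
      azureSubscription: my-service-connection    # hypothetical service connection
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        az storage queue create \
          --name my-queue \
          --account-name mystorageaccount \
          --auth-mode login

With --auth-mode login the pipeline's identity needs a data-plane role on the account, such as Storage Queue Data Contributor.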