How to run a workflow using github cli? - github

% gh workflow run test.yml
gives me:
could not create workflow dispatch event: HTTP 422: Workflow does not have 'workflow_dispatch' trigger (https://api.github.com/repos/bcpitutor/v2_lms/actions/workflows/20724896/dispatches)
Why do I need a workflow_dispatch trigger?

The gh workflow run man page does confirm:
The given workflow file must support a workflow_dispatch 'on' trigger in order to be run in this way.
It comes from cli/cli issue 1725, and issue 2889, implemented by PR 3303.
This PR implements gh workflow run, a command for creating workflow_dispatch events for workflows that support them
Again, a confirmation that your workflow needs to support the workflow_dispatch trigger.
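For example, a workflow file with the trigger added could look like this (a minimal sketch; the push trigger, job name and steps are placeholders, keep whatever your real workflow does):

# .github/workflows/test.yml
on:
  workflow_dispatch:        # enables manual runs via the UI, the REST API and `gh workflow run`
  push:
    branches: [main]        # any triggers you already have can stay alongside it

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: echo "running tests"   # placeholder step

Once the updated file exists on the branch you dispatch against (the repository's default branch unless you pass --ref), gh workflow run test.yml creates the dispatch event instead of returning HTTP 422.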

Related

GitHub action workflow executed without being called in on statement

I have made a pull request on GitHub for a workflow that should be invoked on the following event:
on:
  pull_request_review:
    types: [submitted, edited]
Even though no pull request review has been submitted yet, it seems the workflow created a status check in the PR that adds that particular file.
Why is that?
Shouldn't the workflow be executed only when a PR review is submitted?

Azure Devops pipelines to trigger ONLY on Merge

I'm looking for a way to trigger an Azure pipeline ONLY on a successful (or attempted) pull request merge.
Now I have :
trigger:
  branches:
    include:
    - DEV
steps:
- script: FOO
But this runs EVERY time there is a change on the DEV branch, and I would like to avoid that.
Besides, I want a programmatic solution, not going through the UI each time.
EDIT:
A weird thing is happening:
condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))
gets:
Expanded: and(True, eq('IndividualCI', 'PullRequest'))
when doing a PR, and thus doesn't work as intended.
I'm looking for a way to trigger an Azure pipeline ONLY on a successful (or attempted) pull request merge.
There is no such out-of-the-box way to achieve this at the moment.
We can only set the CI trigger on the target branch, but we can set a condition for the pipeline so that it does not run any tasks:
and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))
For example:
trigger:
  branches:
    include:
    - DEV
steps:
- script: FOO
  condition: and(succeeded(), eq(variables['Build.Reason'], 'PullRequest'))
Or you could set the condition on a stage, a job and so on.
Please check the document Specify conditions for some more details.
If there is a change on the DEV branch, the build tasks will be skipped because of the condition.
Note: With the above approach, the pipeline will still be triggered, but no task will be executed.
And if you do not even want the pipeline to be triggered, you could add a new pipeline with a PowerShell task that invokes the REST API to trigger the above pipeline, and set the condition on that PowerShell task.
In this way, the pipeline will only be triggered when the commit comes from a PR.
Update:
Doing a PR on the DEV branch results in: "Expanded: and(True, eq('IndividualCI', 'PullRequest'))"
Yes, you are correct. That is because Azure DevOps does not have a feature to trigger the pipeline after the PR is completed. Pull request triggers and Build Validation both trigger the pipeline when the PR starts.
To resolve this, we could try creating a service hook to monitor the PR status. When the PR status changes, the pipeline is triggered through the API or an application; you can check this document for some more details.
Another way to achieve this is to use the REST API.
The main idea is:
Create a pipeline and set it as Build Validation, but do not set it as Required; set it as Optional.
Add a PowerShell task in the above pipeline that invokes the REST API to monitor the PR status until it is completed, and add another task that invokes the REST API to trigger your current pipeline (a rough sketch follows).
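As a rough sketch of what that triggering task could do (shown here in Python rather than PowerShell; the organization, project, definition ID and PAT variable are placeholders, and the Builds - Queue REST call is just one way to queue a run):

import os
import requests

# Placeholders - replace with your own organization, project and pipeline definition id.
organization = "my-org"
project = "my-project"
definition_id = 42
pat = os.environ["AZDO_PAT"]   # personal access token with Build (read & execute) scope

# Queue a new run of the target pipeline via the Builds - Queue REST API.
url = f"https://dev.azure.com/{organization}/{project}/_apis/build/builds?api-version=6.0"
response = requests.post(url, json={"definition": {"id": definition_id}}, auth=("", pat))
response.raise_for_status()
print("Queued build", response.json()["id"])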
So, you could remove the:
trigger:
  branches:
    include:
    - DEV
in your current pipeline.
The trigger you have set is a CI trigger; it runs whenever the target branch gets a new commit.
Currently, there isn't a trigger that works when a pull request is completed.
The features closest to your needs are PR triggers and the build validation branch policy.
They work when a pull request is created or when it is changed.
If you are using Azure Repos Git, please use a branch policy for build validation. If you are using GitHub or Bitbucket Cloud, please use PR triggers. Click the documents for the detailed information.
Besides, you can use a branch policy to prevent direct commits. When you set a branch policy of any type, only users with the "Bypass policies" permission can commit to the branch directly. The rest of the users must commit to the branch through a pull request.
How to create branch policy: Branch policies and settings.
How to set "Bypass policies" permission: Set Git repository permissions.

Execute YAML templates from Azure DevOps classic pipeline

I will put my questions into the following points; hopefully this makes it clear now:
The application source code is in application_code repo.
The pipeline code (YAMLs) is in the pipeline_code repo, because I'd like to version it and don't want to keep it in the application_code repo, so as to avoid giving the Dev team control over managing it.
Problem statement:
The pipeline YAML won't be triggered by events such as PRs or commits unless it lives in the source code repository.
Can we trigger or execute a YAML file that is in the pipeline_code repo whenever an event is triggered in the application_code repo?
I've tried achieving the above using a classic pipeline and a YAML template, but these don't work together, as I can execute a YAML template only from a YAML pipeline, not from a classic pipeline, like below:
#azure-pipeline.yaml
jobs:
- job: NewJob
- template: job-template-bd1.yaml
Any ideas or better solution than above?
The feature Multi-repository support for YAML pipelines will be available soon for the Azure DevOps service. This feature will support triggering pipelines based on changes made in one of multiple repositories. Please check the Azure DevOps Feature Timeline or here. This feature is expected to be rolled out in 2020 Q1 for the Azure DevOps service.
Currently, you can follow the workaround below, which uses Build Completion (the pipeline will be triggered on the completion of another build).
1. Set up the triggering pipeline
Create an empty classic pipeline for the application_code repo as the triggering pipeline; it will always succeed and do nothing.
Then check Enable continuous integration under the Triggers tab and set up the Branch filters.
2. Set up the triggered pipeline
In the pipeline_code repo, use checkout to check out multiple repositories in your pipeline. You can specifically check out the source code of the application_code repo to build. Please refer to the example below:
steps:
- checkout: git://MyProject/application_code_repo#refs/heads/master # Azure Repos Git repository in the same organization
- task: TaskName
...
Then, on the YAML pipeline edit page, click the three dots in the top right corner and click Triggers. Then click +Add beside Build completion and select the triggering pipeline created in step 1 as the triggering build.
After finishing the above two steps, when changes are made to the application_code repo, the triggering pipeline will run and complete successfully. The triggered pipeline will then be triggered to run the real build job.
Update:
Show Azure DevOps Build Pipeline Status in Bitbucket.
You can add a Python script task at the end of the YAML pipeline to update the Bitbucket build status. You need to set condition: always() so that this task always runs, even if other tasks have failed.
You can get the build status from the Agent.JobStatus variable, as in the example below.
For more information, please refer to the document Integrate your build system with Bitbucket Cloud, and also this thread.
- task: PythonScript@0
  condition: always()
  inputs:
    scriptSource: inline
    script: |
      import os
      import requests
      # Use environment variables that your CI server provides for the key, name,
      # and url parameters, as well as the commit hash. (The values below are used by
      # Jenkins.)
      data = {
          'key': os.getenv('BUILD_ID'),
          # Azure DevOps exposes Agent.JobStatus as the AGENT_JOBSTATUS environment
          # variable; you may need to map its value to Bitbucket's SUCCESSFUL/FAILED states.
          'state': os.getenv('AGENT_JOBSTATUS'),
          'name': os.getenv('JOB_NAME'),
          'url': os.getenv('BUILD_URL'),
          'description': 'The build passed.'
      }
      # Construct the URL with the API endpoint where the commit status should be
      # posted (provide the appropriate owner and slug for your repo).
      api_url = ('https://api.bitbucket.org/2.0/repositories/'
                 '%(owner)s/%(repo_slug)s/commit/%(revision)s/statuses/build'
                 % {'owner': 'emmap1',
                    'repo_slug': 'MyRepo',
                    'revision': os.getenv('GIT_COMMIT')})
      # Post the status to Bitbucket. (Include valid credentials here for basic auth.
      # You could also use team name and API key.)
      requests.post(api_url, auth=('auth_user', 'auth_password'), json=data)

Concourse Webhook to Git

Environment:
BitBucket
Concourse 3.14.0
I'm wondering whether it is possible to configure a Concourse pipeline with a Git webhook that checks whether a new commit has happened and triggers a pipeline build based on that. I looked at https://concourse-ci.org/resources.html#resource-webhook-token, but it does not tell me how to get a webhook token from Concourse, or whether it supports what I am asking.
Any feedback is very much appreciated.
Concourse resources usually pull any new versions every minute or so. Whenever this frequency doesn't suit your needs, you can modify it with the check_every resource property. But values lower than 1m (one minute) are typically considered aggressive. GitHub enforces quotas for API calls, and when you have many pipelines, you don't want them to fail because you've hit some quota limit.
In case you want Concourse to immediately react on published new versions for the pipeline resources, you need to reverse the pattern. Instead of Concourse pulling any new versions at some defined frequency, you start pushing the information to Concourse that some new versions are to be pulled. This reversed “push” pattern involves triggering “resource checks” whenever new versions are created on the resource.
Trigger immediate resource checks
Every Concourse resource can enable a resource-check triggering URL with the webhook_token resource property. This URL includes the webhook_token secret in its query string, and is supposed to receive a mere POST HTTP request.
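For instance, a git resource exposing such a URL could be declared roughly like this (the resource name, token and repository are placeholders; check_every is raised because polling becomes a mere fallback):

resources:
- name: app-repo
  type: git
  webhook_token: some-secret-token   # enables the resource-check webhook URL
  check_every: 24h                   # polling becomes a fallback only
  source:
    uri: git@github.com:my-org/app-repo.git
    branch: master

The resulting URL then looks something like https://concourse.example.com/api/v1/teams/<team>/pipelines/<pipeline>/resources/app-repo/check/webhook?webhook_token=some-secret-token (adjust to your installation), and a plain POST to it triggers an immediate check of that resource.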
With Github repositories, you can POST to this URL with a Github workflow, relying on a standard Github action from the marketplace (recommended, first choice), or a Github webhook (second choice).
Using a Github workflow
You need to commit and push a YAML file in the .github/workflows folder of your Github repository, in order to define your workflow. Refer to the documentation of the “Trigger Concourse resource-check” action for detailed examples. It's very easy, as only five simple inputs need to be configured.
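If you prefer not to depend on the marketplace action, a hand-rolled workflow that simply POSTs to the resource-check URL could look roughly like this (the Concourse host, team, pipeline and resource names are placeholders, and the webhook_token is kept in a repository secret):

# .github/workflows/trigger-concourse-check.yml
name: Trigger Concourse resource check
on:
  push:
    branches: [master]

jobs:
  notify-concourse:
    runs-on: ubuntu-latest
    steps:
      - name: POST to the resource-check webhook URL
        run: |
          curl -fsS -X POST \
            "https://concourse.example.com/api/v1/teams/main/pipelines/my-pipeline/resources/app-repo/check/webhook?webhook_token=${{ secrets.CONCOURSE_WEBHOOK_TOKEN }}"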
Using a Github webhook
With this alternative, you can manually set up a GitHub webhook in your repository. The URL depends on the resource for which an immediate check is to be triggered, so you can't set it up at your GitHub organization level. The webhook_token secret is appended in clear text to the URL set up for the webhook, and can't be stored as a GitHub secret; GitHub webhooks don't support fetching any GitHub secret.
And in case you're tired of manually setting up webhooks, automated setup is possible with the github-webhook resource. You can even trigger the webhook re-creation whenever the webhook_token secret changes in Credhub, with the help of the Credhub resource. I've written some working code implementing this idea; see those example jobs and those example resource definitions.
But I definitely recommend using a GitHub workflow with the "Trigger Concourse resource-check" action as a first choice.
I think you are looking for this resource - https://github.com/concourse/git-resource
It automatically checks for any new commit in your git repository and you can run other jobs based on that.
Example pipeline.yml:
resources:
- name: git-repo
  type: git
  source:
    uri: git@github.com:concourse/git-resource.git
    branch: master
    private_key: {{GIT_KEY}}

jobs:
- name: run-on-new-commit
  plan:
  - get: git-repo
    trigger: true
  - task: do-something-else
    # (task config or file omitted here)

How to trigger a Jenkins pipeline stage when an authorized user makes a comment on a GitHub pull request?

I am familiar with the Jenkins Pull Request Builder, and in the past I set up a freestyle job with it to build my project based on a comment posted by an authorized user (for example, "test in prod").
Now I am trying to use Jenkins 2.0 with the GitHub Organization plugin for one of my projects.
This is the scenario:
A user makes a PR to master (or some other sensitive branch).
A test runs automatically.
After the test passes, an authorized user needs to go to the PR and post the comment Deploy to test environment, and then a Jenkinsfile that was waiting for this input needs to get triggered.
I just don't know how to do step 3. How do I make a Jenkins pipeline job listen for comments on GitHub pull requests? The Jenkins documentation is not really clear about the user-input part.
I read this thread's answer, but the documentation about Gates approval is really limited.
I know this is super late, but here's some info for future Googlers:
I have a GitHub webhook that sends the event to a Lambda function that parses the event for a specific comment string, then creates an HTTP POST request for the Jenkins job, which is configured to allow builds to be triggered remotely.
So: open PR > comment on PR 'Deploy to test environment' > webhook sends to AWS API Gateway > AWS SNS topic > AWS Lambda > parse the event for the comment > if the comment matches, create HTTP POST > Jenkins receives the request and runs the job
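As a rough sketch of the Lambda part (the comment string, authorized users, Jenkins URL and token below are all placeholders; the Jenkins job must have "Trigger builds remotely" enabled, and depending on your security setup you may also need user credentials or a crumb):

import json
import os
import urllib.request

TRIGGER_COMMENT = "Deploy to test environment"        # comment that should trigger the job
AUTHORIZED_USERS = {"alice", "bob"}                   # GitHub logins allowed to trigger it
JENKINS_BUILD_URL = "https://jenkins.example.com/job/deploy-to-test/build"
JENKINS_TOKEN = os.environ["JENKINS_TRIGGER_TOKEN"]   # token from "Trigger builds remotely"

def handler(event, context):
    # The GitHub issue_comment payload arrives wrapped in an SNS message.
    for record in event["Records"]:
        payload = json.loads(record["Sns"]["Message"])
        author = payload.get("comment", {}).get("user", {}).get("login", "")
        body = payload.get("comment", {}).get("body", "")
        if (payload.get("action") == "created"
                and author in AUTHORIZED_USERS
                and TRIGGER_COMMENT in body):
            # Fire Jenkins' remote build trigger endpoint.
            req = urllib.request.Request(f"{JENKINS_BUILD_URL}?token={JENKINS_TOKEN}", method="POST")
            urllib.request.urlopen(req)
    return {"statusCode": 200}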
There's a lot of documentation on this, but none of it together, so here are the resources that I used:
Regarding allowing jobs to be triggered remotely:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Using Github to trigger Lambda function:
https://aws.amazon.com/blogs/compute/dynamic-github-actions-with-aws-lambda/
Github API. You will want to pay particular attention to the Issues API:
https://developer.github.com/webhooks/