Custom GitHub Checks With Jenkins Pipeline - github

I need to create a custom GitHub check for tests and trigger the check from the Jenkins pipeline.
For example:
stage("tests") {
    // Tests go here.
    // Notify the GitHub PR check of the status.
}
I tried several plugins and followed some articles I read about it, but it didn't work.
Any suggestions?

Related

How to pull code from different branches at runtime and pass parameters to the NUnit.xml file?

We recently moved from Java (TestNG) to C#/.NET (NUnit). At the same time we migrated from Jenkins to TeamCity. Currently we are facing some challenges while configuring the new build pipeline in TeamCity.
Scenario 1: Our project has multiple branches; we generally pull code from different Git branches and then trigger the automation.
In Jenkins we used to create a build parameter (list); when a user executes the job, he/she selects the branch name from the list (build parameters), then Git pulls the code from the selected branch and triggers the execution.
Can you please help with how to implement a similar process in TeamCity?
How do I configure the default value in the list parameter?
Scenario 2: In Jenkins, build parameters used to be passed to TestNG.xml, e.g. browser, environment. When the user selects the browser and environment from the build parameters, TestNG pulls those values at execution time and initiates the regression.
How should I create build parameters (browser, environment) and pass those values to NUnit/the config file?
Thanks
Raghu

Trigger a Prow job with parameters read from the PR comment

GitHub link to the issue that I have raised:
https://github.com/kubernetes/test-infra/issues/25654
We have API tests that are triggered once a user comments /test smoke on a PR, but we want to make this job parameterized, which would help run these tests with some parameters.
E.g. /test smoke skip
Here we want to utilise the skip keyword in our job and take action accordingly.
This would enable jobs to run on user-specified runtime conditions, helping to create fewer jobs.
As of now I don't see any way to pass a parameter with the PR comment that can be utilised.
Any workaround to execute the job with parameters would be helpful.

Is there a way in Terraform Enterprise to read the payload from VCS?

I have configured a webhook between GitHub and Terraform Enterprise correctly, so each time I push a commit, the Terraform module gets executed. What I want to achieve is to use part of the branch name where the push was made and pass it as a variable to the Terraform module.
I have read that the value of a variable can be HCL code, but I am unable to find the correct object to access the payload (or at least the branch name), so at this moment I think it is not possible to get that value directly from the workspace configuration.
If you have a workaround for this, it may also work for me.
At this point the only idea I have is to call the Terraform webhook using an API call.
Thanks in advance
OK, after some trial and error I found out that it is not possible to get any information from the payload in the Terraform module if you are using VCS mode. So, in order to be able to get the branch, I found these options:
Use several workspaces
You can configure a workspace for each branch, so you can create a variable and select that branch in each workspace. The problem is that you will be repeating yourself with this option.
Use the Terraform CLI and a GitHub Action
I used this fine tutorial from HashiCorp for creating a GitHub Action that uses Terraform Cloud. It gets 99% of the job done. For passing a variable you must be aware that there are two methods: using a file or using an environment variable (check that information on the HashiCorp site here). So using:
terraform apply -var="branch=value"
won't work. In my case I used the tfvars approach, so in my GitHub Action I put this snippet:
- name: Setup Terraform variables
  id: vars
  run: |-
    cat > terraform.auto.tfvars <<EOF
    branch = "${GITHUB_REF#refs/*/}"
    EOF
Once I defined a variable within Terraform called branch, I was able to get and work with this value.
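On the Terraform side, the only piece needed to pick that file up is a matching string variable; a minimal sketch (the description text is illustrative):
variable "branch" {
  type        = string
  description = "Branch name written to terraform.auto.tfvars by the GitHub Action"
}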

Specify file_path while triggering AWS CodeBuild

I have created an AWS CodeBuild pipeline. It triggers automatically whenever I push to the Master branch.
Now, I want to trigger it only when something is changed in Dockerfile. Below is my project structure:
casestudy
|-> Docker -> Dockerfile
|-> Infrastructure -> infrastructure-files
Below is a screenshot of the CodeBuild webhook filter:
If I push something to the Dockerfile, the build does not get triggered.
Kindly note that if I remove the file_path filter, the build triggers with every push on the master branch.
My code is placed on GitHub.
I believe you want file_path = docker/Dockerfile
or perhaps, to prevent things like otherdir/docker/Dockerfile from triggering it, file_path = ^docker/Dockerfile
If you go through the GitHub web interface, view the repository's webhooks (Settings -> Webhooks), click "Edit" next to the webhook created by CodeBuild, and then scroll down to the bottom where it says "Recent Deliveries", you can click the "..." next to one of those deliveries to see the actual request sent to the webhook. You will see in there the list of files modified by the commit, and they will not have a leading slash (e.g. "docker/Dockerfile").
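For reference, an abridged, hypothetical excerpt of such a push delivery, showing where the modified file paths appear:
"commits": [
  {
    "id": "abc123",
    "message": "Update Dockerfile",
    "added": [],
    "removed": [],
    "modified": ["docker/Dockerfile"]
  }
]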

Is there a way to script repetitive tasks in Azure DevOps?

We have a number of tasks that we carry out every time we create a new Git repository in our project, and I would like to know if there's a way to script these out (PowerShell or any other method). For example, these are the steps we follow every time we create a new repo:
Create a new Git repo
Create a Build pipeline for Build validations during pull request
Add branch policies to Master including a step to validate build using the above build
Create a Build pipeline for releases
Create a Release pipeline
Is there a way to script repetitive tasks in Azure DevOps?
Of course yes! As Daniel said in a comment, the REST API can achieve all of these. But since there are quite a few steps you want to achieve, the script might be a little complex.
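Before walking through the individual calls, here is a minimal PowerShell sketch of the shared plumbing the later calls can reuse (the organization, project, and PAT values are placeholders; Azure DevOps accepts a PAT through Basic authentication with an empty user name):
$org     = "your-org-name"          # placeholder: Azure DevOps organization
$project = "your-project-name"      # placeholder: team project
$pat     = "<personal access token>" # placeholder; in a pipeline, read this from a secret variable
# Build the Basic auth header from the PAT (empty user name, PAT as password)
$token   = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$headers = @{ Authorization = "Basic $token"; "Content-Type" = "application/json" }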
Create a new Git repo
If you also want to use the API to finish this step, it takes 3 steps (since this is not documented, I will describe it in detail):
Step 1: Create the import repository validation
POST https://dev.azure.com/{org name}/{project name}/_apis/git/import/ImportRepositoryValidations?api-version=5.2-preview.1
Request body:
{
  "gitSource": {
    "url": "${ReposURL}",
    "overwrite": false
  },
  "tfvcSource": null,
  "username": "${username}" / null,
  "password": "${pw}" / "${PAT}" / null
}
Step 2: Create the new repo with its name
POST https://dev.azure.com/{org name}/{project name}/_apis/git/Repositories?api-version=5.2-preview.1
Request body:
{
  "name": "${ReposName}",
  "project": {
    "name": "{project name}",
    "id": "{this project id}"
  }
}
Step 3: Import the repo
POST https://dev.azure.com/{org name}/{project name}/_apis/git/repositories/{the new repo name you just created}/importRequests?api-version=5.2-preview.1
Request body:
{
  "parameters": {
    "deleteServiceEndpointAfterImportIsDone": true,
    "gitSource": {
      "url": "${ReposURL}",
      "overwrite": false
    },
    "tfvcSource": null,
    "serviceEndpointId": null
  }
}
In these scripts, you can set variables in the Variables tab, then use ${} to get them in the script.
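Putting the three calls together, a hedged PowerShell sketch that reuses the $headers built above (the source repository URL and the new repo name are placeholders):
$reposUrl  = "https://github.com/your-org/your-source-repo.git"   # placeholder: repo to import from
$reposName = "my-new-repo"                                        # placeholder: repo to create
$base      = "https://dev.azure.com/$org/$project/_apis/git"

# Step 1: validate that the source repository can be imported
# (add the username/password fields from the body above if the source repo is private)
$validateBody = @{ gitSource = @{ url = $reposUrl; overwrite = $false }; tfvcSource = $null } |
    ConvertTo-Json -Depth 5
Invoke-RestMethod -Method Post -Headers $headers -Body $validateBody `
    -Uri "$base/import/ImportRepositoryValidations?api-version=5.2-preview.1"

# Step 2: create the empty repository (the project id comes from the Projects API)
$projectId  = (Invoke-RestMethod -Method Get -Headers $headers `
    -Uri "https://dev.azure.com/$org/_apis/projects/$project?api-version=5.1").id
$createBody = @{ name = $reposName; project = @{ name = $project; id = $projectId } } | ConvertTo-Json -Depth 5
$newRepo    = Invoke-RestMethod -Method Post -Headers $headers -Body $createBody `
    -Uri "$base/Repositories?api-version=5.2-preview.1"

# Step 3: import the source repository into the repo created in step 2
$importBody = @{
    parameters = @{
        deleteServiceEndpointAfterImportIsDone = $true
        gitSource         = @{ url = $reposUrl; overwrite = $false }
        tfvcSource        = $null
        serviceEndpointId = $null
    }
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Method Post -Headers $headers -Body $importBody `
    -Uri "$base/repositories/$reposName/importRequests?api-version=5.2-preview.1"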
Create a Build pipeline for Build validations during pull request
You'd better finish this step manually, because you can configure more about tasks and triggers with the UI. If you still want to use the API, refer to this doc: create build definition. It has a detailed sample you can try.
Add branch policies to Master including a step to validate build using the above build
This API is documented in the doc: create build policy. Just refer to that, and ensure you use the correct policy type and the corresponding buildDefinitionId.
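As a hedged sketch, the same pattern works for the policy call; the build definition id is a placeholder, and the Build policy type id is looked up through the Policy Types API rather than hard-coded:
# Resolve the id of the "Build" policy type instead of hard-coding the GUID
$policyTypes = Invoke-RestMethod -Method Get -Headers $headers `
    -Uri "https://dev.azure.com/$org/$project/_apis/policy/types?api-version=5.1"
$buildPolicyTypeId = ($policyTypes.value | Where-Object { $_.displayName -eq "Build" }).id

$policyBody = @{
    isEnabled  = $true
    isBlocking = $true
    type       = @{ id = $buildPolicyTypeId }
    settings   = @{
        buildDefinitionId       = 123            # placeholder: id of the validation build definition
        displayName             = "PR build validation"
        queueOnSourceUpdateOnly = $true
        scope = @(@{ repositoryId = $newRepo.id; refName = "refs/heads/master"; matchKind = "Exact" })
    }
} | ConvertTo-Json -Depth 10
Invoke-RestMethod -Method Post -Headers $headers -Body $policyBody `
    -Uri "https://dev.azure.com/$org/$project/_apis/policy/configurations?api-version=5.1"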
Create a Build pipeline for releases
I still recommend you finish this manually, the same as with the build pipeline step mentioned above.
Create a Release pipeline
See this doc: create release.
Note: For a parameter which will be used many times, you can set it as a variable. For a parameter which needs to be taken from a previous API response, you can define a variable to capture its value and then pass that variable into the next API call. For example:
$resultT= $result.Headers.ETag
Write-Host "##vso[task.setvariable variable=etag;]$resultT"
Now, you can directly use $(etag) in the next API call.
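For example, a later PowerShell task could consume the captured value like this (the If-Match header is only a hypothetical use of it):
# In a later PowerShell task the agent expands $(etag) before the script runs
Write-Host "Reusing the ETag captured earlier: $(etag)"
# ...for example as a request header on the next call (hypothetical):
# Invoke-RestMethod -Uri $nextUri -Headers @{ "If-Match" = "$(etag)" }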