Jenkins: publish changes in repository to GitHub using pipeline and Groovy

I have a Jenkins organization pipeline job that executes on all repositories that have a "Jenkinsfile" defined. The job clones the repository from GitHub, then runs a PowerShell script that increments the version number in a file. I'm now trying to publish that updated file back to the original repository on GitHub, so that when a developer pulls the changes they get the latest version number.
I tried using the script (inside the Jenkinsfile) suggested in the Jenkins JIRA (https://issues.jenkins-ci.org/browse/JENKINS-28335), but to no avail. Any suggestions would be appreciated. Basically I need to execute "git commit" and "git push" using the same parameters defined for the job.
Just as a reference, here is a previous solution used for a freestyle (not pipeline) job: How to push changes to github after jenkins build completes?.

I actually found a couple of solutions. First, I modified the script from the Jenkins JIRA issue like this (some objects have changed in the workflow pipeline):
import hudson.FilePath
import org.eclipse.jgit.transport.URIish
node {
env.WORKSPACE = pwd()
stage 'Checkout'
checkout scm
def build = manager.build
def listener = manager.listener
def workspace = new FilePath(new File(env.WORKSPACE))
def environment = build.getEnvironment(listener)
final def project = build.getParent()
final def gitScm = project.getTypicalSCM()
final def gitClient = gitScm.createClient(listener, environment, build, workspace);
final def gitTagName = "TAG_NAME"
final def comment = "COMMENT"
final def remoteURI = new URIish("origin")
gitClient.tag(gitTagName, comment)
gitClient.push().tags(true).to(remoteURI).execute()
}
You need to run the script multiple times, approving each rejected signature in Jenkins (Manage Jenkins -> In-process Script Approval) until it runs through.
Another solution, much simpler (I'm using this one for now):
bat "\"${tool 'Git'}\" config user.email \"ci@virtocommerce.com\""
bat "\"${tool 'Git'}\" config user.name \"Virto Jenkins\""
bat "\"${tool 'Git'}\" commit -am \"Updated version number\""
bat "\"${tool 'Git'}\" push origin HEAD:master -f"
You have to have a Git tool named "Git" configured in Jenkins.
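If the agents are Linux rather than Windows, the same idea can be sketched with `sh` steps and an HTTPS credential; the credential id `github-creds`, the bot identity, and the repository URL below are assumptions, not values from the job:

```groovy
withCredentials([usernamePassword(credentialsId: 'github-creds',
                                  usernameVariable: 'GIT_USER',
                                  passwordVariable: 'GIT_PASS')]) {
    sh 'git config user.email "ci@example.com"'
    sh 'git config user.name "CI Bot"'
    sh 'git commit -am "Updated version number"'
    // single quotes: the shell (not Groovy) expands the credentials,
    // so they do not end up interpolated into the Jenkins build log
    sh 'git push https://$GIT_USER:$GIT_PASS@github.com/ORG/REPO.git HEAD:master'
}
```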

Related

How to read file through jenkins from github

I am trying to read a file from GitHub using readFile:
sshagent (credentials: ["${github_cred}"]) {
    script {
        sh """
        git checkout test_branch
        """
        def file = readFile(file: "/myrepo/${params.value}.txt")
    }
}
But in some cases this file will not be available, depending on the parameters passed. So I would like to check whether the file exists in the repository, proceed with the next steps if it is available, and otherwise skip the stage.
First Try:
When I try the above code, it throws a NoSuchFileException when the file is not available. I tried the fileExists function, but it only works on the node, not against the remote repository. Is there any way to achieve this?
Second Try:
I also tried git show as below, but I got an "illegal string body or character after dollar sign" error; I don't know what is wrong here.
git show HEAD~1:/myrepo/"${params.value}".txt > /dev/null 2>&1
fileExists runs on the node the Pipeline is currently executing on, so after the checkout it should ideally work:
if (fileExists("/myrepo/${params.value}.txt")) {
    def file = readFile(file: "/myrepo/${params.value}.txt")
}
Another easy workaround is to wrap your readFile in a try-catch, since readFile will fail if the file is not available:
def isSuccess = true
def file = null
try {
    file = readFile(file: "/myrepo/${params.value}.txt")
} catch (e) {
    isSuccess = false
}
Is the Jenkins machine Windows, macOS, or Linux?
sh """
pwd
git checkout test_branch
"""
If it's Linux or macOS, add pwd to see the full local path of the repo, and then use this full path in the readFile call.
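Alternatively, you can ask git itself whether the path exists on the branch and skip the stage cleanly; a minimal sketch assuming the layout from the question (the repo-relative path is an assumption):

```groovy
script {
    sh "git checkout test_branch"
    // 'git cat-file -e' exits non-zero when the path does not exist at HEAD;
    // returnStatus keeps that from failing the build
    def rc = sh(script: "git cat-file -e HEAD:myrepo/${params.value}.txt",
                returnStatus: true)
    if (rc == 0) {
        def file = readFile(file: "myrepo/${params.value}.txt")
    } else {
        echo "File not present on test_branch, skipping"
    }
}
```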

Is there a Mylyn connector for GitLab?

I worked with Bugzilla and Eclipse, and I used Mylyn to manage issues through Eclipse.
Now I use GitLab and GitLab issues, and I wonder if there is a Mylyn connector for GitLab.
I know that there is this one: gitlab connector, but it is no longer usable and I did not find another one.
Has anyone faced the same problem and found a solution?
After a while I can share my solution; maybe it will help others.
There is no Mylyn connector for GitLab that runs correctly. A solution could be to debug the buggy one, but in fact GitLab is not a powerful tool for managing issues.
I chose to use Bugzilla, for at least three reasons:
the bug workflow is easy to customize, which is important for adapting it to the company's processes
the Mylyn connector for Bugzilla has been available for a long time and runs correctly
Bugzilla is still a reference tool
The first step is to define Bugzilla as the issue management tool; this is done through the GitLab UI and the documentation is here.
In my view, if an external tool is used, it is best to deactivate GitLab issue tracking. On your project, go to Settings->General->Visibility, project features and deactivate Issues.
Note: if Bugzilla and GitLab are deployed on the same host, you have to accept requests to localhost. In the GitLab administration area, go to Settings->Network->Outbound requests and select the two options about the local network.
After that, you can annotate your commits with a message containing Ref #id, where id is a bug id in Bugzilla. As with GitLab issues, the commit will contain a hyperlink to the issue, but the hyperlink will open the Bugzilla bug page.
If you do not go further, you will lose one GitLab feature: a GitLab issue references all commits related to it.
A way to get a similar feature with Bugzilla is to add a comment to the bug with a hyperlink to the commits.
This can be achieved with a server hook, as described here.
Note: each time you change the gitlab.rb file, do not forget to execute gitlab-ctl reconfigure.
The hook has to handle both "standard" commits and merge commits.
The following Python code can be seen as a starting point for such a hook.
It assumes that development is done on branches named feature/id and that commit comments contain a string Ref #id, where id is a bug id.
It could be improved:
to manage exceptions better
to handle more push cases
to check more rules, such as:
the bug has to be in progress
the bug assignee has to be the git user who performs the push
the Bugzilla project has to be the one for the GitLab project
the bug is open on a version that is still under development or debug
...
#!/usr/bin/env python3
import sys
import subprocess
import requests

#
# Constants
#
G__GIT_CMD = ["git", "rev-list", "--pretty"]
G__SEP = "commit "
G__NL = '\n'
G__AUTHOR = 'Author'
G__AUTHOR_R = 'Author: '
G__DATE = 'Date'
G__DATE_R = 'Date: '
G__C_MSG = 'message'
G__URL_S = 'https://<<gitlab server url>>/<<project>>/-/commit/'
G__MERGE_S = 'Merge: '
G__MERGE = 'Merge'
G__URL = 'URL'
G__BUGZ_URL = 'http://<<bugzilla url>>/rest/bug/{}/comment'
G__HEADERS = {'Content-type': 'application/json'}
G__JSON = {"Bugzilla_login": "<<bugzilla user>>", "Bugzilla_password": "<<password>>", "comment": "{}"}
G__JSON_MR = {"Bugzilla_login": "<<bugzilla user>>", "Bugzilla_password": "<<password>>", "comment": "Merge request {}"}
G__COMMENT_ELEM = 'comment'
G__MSG_REF = "Ref #"
G__MSG_REF_MR = "feature/"
G__WHITE = " "
G__APOS = "'"

#
# Returns True if at least one part of the message is non-empty
#
def filter_message_elements(message_elements):
    flag = False
    for message_element in message_elements:
        if len(message_element) != 0:
            flag = True
    return flag

#
# Adds an element to the commits dictionary.
#
# If this is a merge commit, an extra element is stored.
#
def add_commit_in_dict(commits_dict, temp_list, flag_merge):
    url = G__URL_S + temp_list[0]
    commits_dict[temp_list[0]] = {}
    commits_dict[temp_list[0]][G__URL] = url
    if not flag_merge:
        commits_dict[temp_list[0]][G__AUTHOR] = temp_list[1].replace(G__AUTHOR_R, '')
        commits_dict[temp_list[0]][G__DATE] = temp_list[2].replace(G__DATE_R, '')
        commits_dict[temp_list[0]][G__C_MSG] = temp_list[3]
    else:
        commits_dict[temp_list[0]][G__MERGE] = temp_list[1]
        commits_dict[temp_list[0]][G__AUTHOR] = temp_list[2].replace(G__AUTHOR_R, '')
        commits_dict[temp_list[0]][G__DATE] = temp_list[3].replace(G__DATE_R, '')
        commits_dict[temp_list[0]][G__C_MSG] = temp_list[4]

#
# Fills commit data
#
def fills_commit_data(commits_dict, fileinput_line):
    params = fileinput_line[:-1].split()
    try:
        # Git command to get the list of pushed commits (new ref, excluding old ref)
        cmd = G__GIT_CMD + [params[1], "^" + params[0]]
        rev_message = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)
        # loop on commits
        messages_list = rev_message.stdout.split(G__SEP)
        for message in messages_list:
            if len(message) == 0:
                continue
            message_elements = message.split(G__NL)
            # filters empty messages
            flag = filter_message_elements(message_elements)
            if not flag:
                continue
            # extracts commit data and detects merge commits
            temp_list = []
            flag_merge = False
            for message_element in message_elements:
                text = message_element.strip()
                if 0 != len(text):
                    temp_list.append(text)
                    if -1 != text.find(G__MERGE):
                        flag_merge = True
            # adds the commit to the commits dictionary
            add_commit_in_dict(commits_dict, temp_list, flag_merge)
    except Exception:
        sys.exit(1)

#
# Extracts the bug id from the commit message
#
def find_bug_id(message):
    issue_int = -1
    pos = message.find(G__MSG_REF)
    if pos == -1:
        sys.exit(1)
    issue_nb = message[pos + len(G__MSG_REF):]
    pos2 = issue_nb.find(G__WHITE)
    # without a trailing space the id runs to the end of the message
    if pos2 != -1:
        issue_nb = issue_nb[:pos2]
    try:
        issue_int = int(issue_nb)
    except ValueError:
        sys.exit(1)
    return issue_int

#
# Extracts the bug id from the commit message
# in case of a merge request
#
def find_bug_id_mr(message):
    issue_int = -1
    pos = message.find(G__MSG_REF_MR)
    if pos == -1:
        sys.exit(1)
    issue_nb = message[pos + len(G__MSG_REF_MR):]
    pos2 = issue_nb.find(G__APOS)
    issue_nb = issue_nb[:pos2]
    try:
        issue_int = int(issue_nb)
    except ValueError:
        sys.exit(1)
    return issue_int

#
# Checks whether the commit list contains a merge request commit
#
def is_merge_request(commits_dict):
    flag = False
    for key in commits_dict:
        if G__MERGE in commits_dict[key]:
            flag = True
            break
    return flag

#
# Adds a comment to a bug
#
def add_comment_to_bug(commit_data):
    bug_id = find_bug_id(commit_data[G__C_MSG])
    url = G__BUGZ_URL.format(str(bug_id))
    # format a copy: formatting G__JSON in place would consume the '{}'
    # placeholder after the first processed commit
    payload = dict(G__JSON)
    payload[G__COMMENT_ELEM] = payload[G__COMMENT_ELEM].format(commit_data[G__URL])
    response = requests.post(url, json=payload, headers=G__HEADERS)

#
# Adds a comment in case of a merge request
#
def add_mr_comment_to_bug(commits_dict):
    commit_data = None
    for key in commits_dict:
        if G__MERGE in commits_dict[key]:
            commit_data = commits_dict[key]
            break
    bug_id = find_bug_id_mr(commit_data[G__C_MSG])
    url = G__BUGZ_URL.format(str(bug_id))
    payload = dict(G__JSON_MR)
    payload[G__COMMENT_ELEM] = payload[G__COMMENT_ELEM].format(commit_data[G__URL])
    response = requests.post(url, json=payload, headers=G__HEADERS)

#
# Main program
#
def main():
    # dictionary containing all commits
    commits_dict = {}
    # loop on input lines describing ref changes (old-sha new-sha ref)
    for fileinput_line in sys.stdin:
        fills_commit_data(commits_dict, fileinput_line)
    # find out whether this push is a merge request
    flag_merge_request = is_merge_request(commits_dict)
    if not flag_merge_request:
        # loop on commits to add comments to bugs
        for key in commits_dict.keys():
            add_comment_to_bug(commits_dict[key])
    else:
        # in case of a merge request, only the merge commit has to be added;
        # the other commits have been processed before
        add_mr_comment_to_bug(commits_dict)

if __name__ == "__main__":
    main()

GitLab CI pipeline failing: a tag issue

My GitLab CI pipeline is set up to run Maven tests from a Docker image created from my Maven project.
I have tested the pipeline on my master branch and it worked fine and ran the tests.
However, I have created a new feature branch, and when running the pipeline again I now get this error:
error checking push permissions -- make sure you entered the correct tag name, and that you are authenticated correctly, and try again: getting tag for destination: repository can only contain the runes `abcdefghijklmnopqrstuvwxyz0123456789_-./`: it2901/cs344-maven:feature/produce-allocation-pdf
ERROR: Job failed: command terminated with exit code 1
I can't seem to pinpoint the problem at all. I have also pushed the tag tut3 to the feature branch.
Here is my .gitlab-ci.yml: https://controlc.com/7a94a00f
Based on what you shared, you have this configured:
VERSIONLABELMETHOD: "tut3" # options: "","LastVersionTagInGit"
It should be either:
VERSIONLABELMETHOD: ""
or
VERSIONLABELMETHOD: "LastVersionTagInGit"
or
VERSIONLABELMETHOD: "OnlyIfThisCommitHasVersion"
When you specify "tut3", the script takes it as if it were "" (an empty string). Assuming you didn't define $VERSIONLABEL anywhere, $ADDITIONALTAGLIST will also be empty.
And later in the code you can see that this gets executed:
if [[ "$CI_COMMIT_BRANCH" == "$CI_DEFAULT_BRANCH" ]]; then ADDITIONALTAGLIST="$ADDITIONALTAGLIST latest"; fi
Assuming $CI_DEFAULT_BRANCH is set to master, if you use a separate branch mybranch the line above won't get executed, so it's likely that the Kaniko command line doesn't have a valid $FORMATTEDTAGLIST or $IMAGE_LABELS.
You can debug by echoing their values in the script, right before Kaniko is called:
...
echo $FORMATTEDTAGLIST
echo $IMAGE_LABELS
mkdir -p /kaniko/.docker
...
A hack would be to override $CI_DEFAULT_BRANCH with your custom branch.
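Note that the quoted error also complains about the tag itself: Docker tags may not contain `/`, which the branch-derived tag feature/produce-allocation-pdf does. GitLab provides CI_COMMIT_REF_SLUG (the ref name lower-cased with disallowed characters replaced by `-`) for exactly this; as a sketch, the same transformation for a sample branch name:

```shell
# Simulate CI_COMMIT_REF_SLUG for a sample branch name: replace '/' with '-'
# and lower-case, so the result only contains tag-safe characters.
BRANCH="feature/produce-allocation-pdf"
TAG=$(echo "$BRANCH" | tr '/' '-' | tr '[:upper:]' '[:lower:]')
echo "$TAG"
```

With that, the image reference becomes it2901/cs344-maven:feature-produce-allocation-pdf, which only contains the allowed runes.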

Fail a merge or build if a particular file has changed?

On VSTS, we have some files we want to protect on certain branches. How can one fail a merge/build if a particular file has changed?
First prize is to configure this on the build server, which in this case is VisualStudio.com (VSTS / GIT).
Scenario: we have various release branches v1, v2, v3. We want to protect the package.json file to prevent anyone updating NuGet packages on these branches. So if the package.json file has changed on a pull request into "v3", don't allow the merge.
For Git, you can protect a certain branch (not a certain file); then all the files in the branch are protected.
You can use branch security to control which users/groups can contribute to the branch.
Or you can use branch policies so that changes are not committed to the branch directly but made through pull requests, etc.
Or you can lock a branch to prevent updates.
To expand on Starain's answer:
First create a build definition for the branch you want to protect (e.g. select the v3 branch in the Get Sources step), and add a PowerShell task with the content below:
$head=$(git rev-parse HEAD)
$parents=$(git rev-list --parents -n 1 $head)
$p1,$p2,$p3=$parents.split(' ')
If ($p1 -eq $head)
{
$parent1=$p2
$parent2=$p3
}
ElseIf ($p2 -eq $head)
{
$parent1=$p1
$parent2=$p3
}
Else
{
$parent1=$p1
$parent2=$p2
}
$outp1=$(git diff $head $parent1 --name-only)
$outp2=$(git diff $head $parent2 --name-only)
If ($outp1 -contains 'package.json')
{
echo "changed the specified file on the branch which contains commit $parent1"
exit 1
}
If ($outp2 -contains 'package.json')
{
echo "changed the specified file on the branch which contains commit $parent2"
exit 1
}
So when the file package.json has been changed, the PowerShell script will fail the build.
Then add the branch policy for the branch which you want to protect.
Add build policy -> select the build definition you just created -> Policy requirement: Required -> Build expiration: 0.1 hours (6 min) or another value, since it's very fast to queue a build with a PowerShell task -> save.
You can try to do it in the build; a simple workflow:
Configure a branch policy requiring a succeeded build
Check whether the specific file changed in that build
Fail the build if the specific file has been changed
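The check in step 2 can also be done with plain git in a script step; a minimal sketch, assuming the build runs on the merge commit created for the pull request (so HEAD^1 is the target-branch parent):

```shell
# List the files the merge changed relative to the target-branch parent
# and fail the build if the protected file is among them.
if git diff --name-only HEAD^1 HEAD | grep -qx 'package.json'; then
  echo "package.json was modified; failing the build"
  exit 1
fi
```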
You can put a required reviewer for a particular folder/file in VSTS for a particular branch.
In this way the person won't be able to check in without getting approval from the required reviewer.
Git doesn't really work that way; individual files don't have any sort of security on them.
You could use a pre-commit hook, but it's important to note that those hooks are client-side, not server-side -- each user would have to set up a pre-commit hook.
VSTS/TFS doesn't support Git server hooks (at least, not to the extent that it can block a push), otherwise a pre-receive or update hook would be exactly what you want.

Terraform - Pass in Variable to "Source" Parameter

I'm using Terraform in a modular fashion in order to build out my infrastructure: I have a configuration file that calls in the different modules. I want to pass an infrastructure variable that picks up which tagged version of the GitHub repository the application should be built from. Most importantly, I'm trying to figure out how to concatenate a string in the "source" parameter of the configuration file.
module "athenaelb" {
source = "${concat("git::https://github.com/ORG/REPONAME.git?ref=",var.infra_version)}"
aws_access_key = "${var.aws_access_key}"
aws_secret_key = "${var.aws_secret_key}"
aws_region = "${var.aws_region}"
availability_zones = "${var.availability_zones}"
subnet_id = "${var.subnet_id}"
security_group = "${var.athenaelb_security_group}"
branch_name = "${var.branch_name}"
env = "${var.env}"
sns_topic = "${var.sns_topic}"
s3_bucket = "${var.elb_s3_bucket}"
athena_elb_sns_topic = "${var.athena_elb_sns_topic}"
infra_version = "${var.infra_version}"
}
I want it to compile and for the source to look like this (for example): git::https://github.com/ORG/REPONAME.git?ref=v1
Does anyone have any thoughts on how to make this work?
Thanks,
Keren
This is not possible currently in Terraform itself.
The only way to achieve something like this is to use a separate script to interact with the git repository that Terraform clones into a subdirectory of the .terraform/modules directory and switch it to a different tag depending on which version you need. This is non-ideal since Terraform organizes these into directories based on a hash of the module path, but if you can identify the module in question it is safe to run git checkout within these repositories as long as you do not run terraform get again afterwards.
For more details and discussion on this issue, see issue #1439 in Terraform's issue tracker, where this feature was requested.
You could use envsubst or Python Jinja wrapper scripts in your pipeline's deploy script to build the actual configuration from .envsubst and .jinja template files before terraform plan/apply:
https://github.com/uvoo/process-templates/tree/main/scripts
I wish Terraform supported this, but my guess is they never will, so just add some simple functions/files to your deploy scripts, which is usually the best way to deploy anyway.