Azure Pipelines Get Latest files only - azure-devops

I'm looking for a way to create an artifact, attached to a deployment pipeline, that contains only the files changed in the commits that triggered the build.
I have a repo that holds change scripts for database objects, and I want to package only the change scripts from the last commit into a zip file and attach it to the build outputs. That way I can take the zip file and apply each of the files on top of the database. That will be done later in a different step; right now I'm just trying to get all of the files that were changed.
Edited:
I have created the following steps in the YAML file based on the comments below:
- powershell: |
    #get the changed template
    echo "git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)"
    $a = git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)
    #assign the filename to a variable
    echo "Files"
    echo "##vso[task.setvariable variable=fileName]$a"
- powershell: |
    #Print Files
    echo "$env:fileName"
Below is the result: no files are listed as changed, even though I changed the Readme file, which is what triggered the build.

Not sure if this will help you, but hopefully it points you in the right direction.
Assuming you have Git as source control, have you considered querying those changes using Git instead? I haven't tried it, but I bet you'll be able to find in the pipeline metadata which merge triggered the build, and then use it to query Git for the file changes.
Have a look at this question on Stack Overflow.
Hope it helps.

If you are using Git version control, you can add a script task to your pipeline that gets the changed file names, copies the files to the artifact directory, and then publishes them.
It is easy to get the changed files with the git command git diff-tree --no-commit-id --name-only -r <commitId>. Once you have the changed file names, assign them to a variable using the logging command ##vso[task.setvariable variable=VariableName]value. You can then use this variable in the copy and publish tasks.
You can check the below YAML pipeline for an example:
- powershell: |
    #get the changed template
    $a = git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)
    #assign the filename to a variable
    echo "##vso[task.setvariable variable=fileName]$a"
- powershell: |
    echo "$env:fileName"

Related

Git Log in Powershell

My command pulls the log from the repository from a previous tag up to the latest HEAD. Below is the command I run in PowerShell:
git log 10.01.39.000..head --oneline --name-only --pretty=format:
When I try the same thing with a variable substituted in place of the tag name, it does not give me any output. Should the .. be escaped in PowerShell? PowerShell is treating the ..head as a member access on the variable:
git log $tag..head --oneline --name-only --pretty=format:
In this case, you want to:
Force expansion of $tag before passing the argument to git
Prevent evaluation of the .. operator
The easiest way to do this is to construct an expandable string literal using double quotes:
git log "$tag..head" --oneline --name-only --pretty=format:

Receiving "fatal: not a git repository (or any of the parent directories): .git" while using pat to push code into bitbucket using azure powershell

I am writing an Azure PowerShell script that consumes a JSON file holding the location of all my SQL scripts and a Migrationflag column (which marks whether each script is still to be executed), and executes the scripts in sequence.
Upon execution, the flag is changed to 'N' and the updated JSON file should be pushed back to Bitbucket.
Now I am stuck with the "fatal: not a git repository (or any of the parent directories): .git" error while trying to push.
I've created a PAT token and a service connection with username santhoshsreshta, and below is the code to push.
$v_JSON = Get-Content '$(system.defaultworkingdirectory)\locationToBuild\BuildOrder.json' -Raw | ConvertFrom-Json
$v_JSON | Sort-Object -Property OrderNo | Where-Object {$_.MigratedFlag -like 'Y'} | ForEach-Object {
    $Script = $_.Location
    Write-Host "Executing Script: $Script"
    Invoke-Sqlcmd -ServerInstance "myservername" -Database $(database) -Username $(testauto_username) -Password $(testauto_password) -InputFile $(system.defaultworkingdirectory)\$Script
    $_.MigratedFlag = 'N'
}
$v_JSON | ConvertTo-Json -Depth 32 | Set-Content '$(system.defaultworkingdirectory)\locationToBuild\BuildOrder.json'
$MyPat = 'mypatcode'
git push https://mygitusername:$MyPat@bitbucket.org/xyz/abcd.git
I am getting the error: ##[error]fatal: Not a git repository (or any of the parent directories): .git
But when issuing git clone https://mygitusername:$MyPat@bitbucket.org/xyz/abcd.git I get an invalid username/password error.
I believe we should not clone again, as my pipeline's "Get sources" task already clones the repo onto the self-hosted agent.
This is my git URL: https://mygitusername@bitbucket.org/xyz/abcd.git
Thanks a ton,
A DevOps, PowerShell newbie here.
That's because we are using a git command, but we are not in a folder managed by git.
When we use the Azure PowerShell task directly, the default working folder is:
PS C:\Users\<Username>
Obviously, there are no files managed by our git repo in this directory. That is the reason why you get the error Not a git repository.
So, to resolve this issue, we just need to switch the working folder to the repo folder with the command:
cd $(System.DefaultWorkingDirectory)
Check Use predefined variables and this thread for more details.
Update:
The test sample:
cd $(System.DefaultWorkingDirectory)
cd leotest
echo 123 > Test.txt
git add Test.txt
git commit -m "Add a test file"
git push https://Username:password@bitbucket.org/Lsgqazwsx/leotest.git HEAD:master
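Applied to the script in the question, the fix is just to change into the repo that the "Get sources" step already cloned before running any git command (the URL and variable names below are the asker's placeholders):
# switch from the default PowerShell working folder into the cloned repo
cd $(System.DefaultWorkingDirectory)
# a git identity may also be needed before committing (git config user.name/user.email)
git add locationToBuild\BuildOrder.json
git commit -m "Flip MigratedFlag after execution"
git push https://mygitusername:$MyPat@bitbucket.org/xyz/abcd.git HEAD:master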
It means that there is no local .git folder, so you need to do the following first:
git init
git commit -m "first commit"
git branch -M main
git remote add origin https://your-repo-url
git push -u origin main
I just copied these steps from the GitHub new-repository page, and they work for Azure Repos too :-) further showing the power and advantage of companies following universal standards.

Azure devops - update json file - powershell script

I have created a PowerShell script to update a JSON file with variables. The JSON file, var.json, is located in an Azure DevOps repo.
I am going to use this solution in Azure DevOps, so I built a pipeline and set a test variable in the Variables tab.
In my script I have param and variable blocks, presented below:
param(
    [Parameter (Mandatory=$true)]
    [String] $FileRes
)
#env variable
$Path = $Env:BUILD_SOURCESDIRECTORY
#download variables from the JSON file
$JsonBase = @()
$JsonPath = "$Path\Var.json"
$JsonBase = Get-Content $JsonPath | Out-String | ConvertFrom-Json
$JsonBase.FileNames[0].value = $FileRes
In my script I use $JsonBase | ConvertTo-Json | Set-Content -Path $JsonPath to write the output back to the JSON file.
Json file structure:
{
"FileNames": [
{
"value": "AAAbbbccc123",
"value1": "www",
"value3": "swd",
"value4": "xvb"
}
]
}
The pipeline's status at the end is OK and all steps are green, but the var.json file is not updated as I wanted. It still has the old value: "value": "AAAbbbccc123".
In fact, the value has been replaced, but you need to look for this change in the output repo on the agent.
To see this more clearly, you could use a private agent to run this build, then go to the corresponding local repo and check the Var.json file after the build has finished.
In your script, Set-Content writes to the file under $(Build.SourcesDirectory)\Var.json, not to the one stored in the VSTS repo. So, to check whether the value was replaced successfully, look at the output repo on the agent.
If you are using a hosted agent, you may not be able to view the output repo at all, since the hosted image is recycled by the server after the pipeline finishes.
In that case, you can add another script to print the JSON file content, and then check whether the value was replaced successfully:
$content = Get-Content -Path $JsonPath
Write-Host $content
In addition, please make a small change to your script:
$JsonBase.FileNames[0].value = "$(FileRes)"
Here, use $(FileRes) instead of $FileRes, since you specified the value in the Variables tab, and do not forget the double quotes.
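Putting those pieces together, a minimal corrected sketch of the script body (paths and the FileRes variable as defined above), with a final print-out so a hosted-agent run still shows whether the replacement happened:
# read the JSON, update the value from the pipeline variable, write it back
$Path = $Env:BUILD_SOURCESDIRECTORY
$JsonPath = "$Path\Var.json"
$JsonBase = Get-Content $JsonPath | Out-String | ConvertFrom-Json
$JsonBase.FileNames[0].value = "$(FileRes)"
$JsonBase | ConvertTo-Json -Depth 32 | Set-Content -Path $JsonPath
# echo the file content so the build log shows the replaced value
Write-Host (Get-Content -Path $JsonPath -Raw)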
Update:
To sync the change in the output repo back into the VSTS repo, try the following:
(1) In the first command line task:
git config --global user.email "xxx@xx.com"
git config --global user.name "Merlin"
cd $(Build.SourcesDirectory)
git init
(2) In the PowerShell task, execute the Set-Content script.
(3) In a second command line task, push the changes:
git add Var.json
git commit -m "aaaa"
git remote rm origin
git remote add origin https://xxx@dev.azure.com/xxx/xxx/_git/xxxx
git push -u origin HEAD:master
In addition, to run git scripts successfully in the pipeline, besides enabling "Allow scripts to access…" you should also follow this permission setting.

Bash script that pulls subfolder from github

I've made a simple bash script to get subfolders from my repo on GitHub.
The problem I'm having is that I want to be able to run the script several times to get several subfolders. The script currently runs only once; after that I get the message "branch master already up to date". How can I change it so this does not happen and I can pull several folders?
getFolder()
{
    repository="$1"
    folder="$2"
    remote="$3"
    branch="$4"
    if [ "$repository" = school ]
    then
        repository=https://github.com/mergin/School
    fi
    git init
    git remote add "$remote" "$repository"
    git config core.sparsecheckout true
    echo "$folder"/ > .git/info/sparse-checkout
    git pull "$remote" "$branch"
}
This is the current function that I use.

SCM environment variables missing

Usually, when using an SCM like the Git Plugin, there are a bunch of environment variables that you can use (e.g. see these).
But neither the Git step nor the Generic SCM seems to set them.
Is there a way to get these variables into the Groovy env.* so that they can be used?
Something like this would be useful:
def commitMessage = sh 'git log --max-count=1 --oneline --no-merges | cut -b9-'
I can think of writing the results to a file and reading them via the readFile() method, but is there an easier way to achieve this?
For the record: I have the following code to get the branch name:
stage 'preparation'
node {
    // checkout branch
    git branch: 'origin/master', url: 'git@example.net:project.git'
    // write current branch-name to file
    sh 'git branch -a --contains `git rev-parse HEAD` | grep origin | sed \'s!\\s*remotes/origin/\\(.*\\)!\\1!\' > git-branch.txt'
    // read data from file into environment-variable
    env.gitBranch = readFile('git-branch.txt').trim()
    // let people know what's up
    echo "testing branch ${env.gitBranch}"
}
The remainder of the flow script is composed of several parameterized jobs, which get env.gitBranch passed as a parameter (among others, if needed).
Be sure to allow concurrent builds for the workflow so it catches every updated branch.
See JENKINS-24141; these variables are not yet available from Workflow.
In the meantime, you are on the right track: run a git command to record any information you need, and use readFile to load it (see also JENKINS-26133).