JFrog Pipelines not syncing when values.yaml changed

I updated values.yaml (but not pipelines.yaml), and the change was not synced.
To make the change take effect, I had to sync the pipeline manually. Is there a solution to this? How can I automatically sync the pipeline on values.yaml changes as well?

You will have to move to the config folder structure: create a folder called .jfrog-pipelines and place your values.yaml and pipelines.yaml files under it.
Check the sample example here in the JFrog public repository.
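A minimal sketch of that layout, assuming a single pipeline that reads one value from values.yaml (the key myImageTag and the pipeline name my_pipeline are illustrative, not from the original question):

```yaml
# .jfrog-pipelines/values.yaml
myImageTag: "1.0.0"

# .jfrog-pipelines/pipelines.yaml
# Values from values.yaml are referenced with {{ .Values.<key> }}.
pipelines:
  - name: my_pipeline
    steps:
      - name: build
        type: Bash
        configuration:
          environmentVariables:
            IMAGE_TAG: {{ .Values.myImageTag }}
        execution:
          onExecute:
            - echo "Building tag ${IMAGE_TAG}"
```

With both files under .jfrog-pipelines, a change to either file should trigger a sync.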

Related

Can you move the location of azure-pipelines.yaml to another Azure Repository?

Here is my scenario:
I have an Azure repo A containing three folders (A1, A2, A3); each folder has a CI file and a CD file.
I want to split the repo and reconfigure my pipelines to point to the new repos while keeping the same paths.
Changing the path within the same repo is possible, but I was wondering whether it's possible to reconfigure an existing pipeline to use a YAML file from another repo?
Screenshot:
Here I can choose another file, but not a file from another repo.
Thanks in advance :)
Sure, you can use templates as described here (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops).
E.g., you have a repository called "YAML" containing the file build.yml, and a repository called "UsingYAML" that needs build.yml.
To reference build.yml in the YAML repo, declare it as a repository resource and reference the template with the @ syntax:
name: 'Pipeline_using_a_template'
resources:
  repositories:
    - repository: YAML      # alias used in the template reference below
      type: git
      name: MyProject/YAML  # <project>/<repository>; adjust to yours
steps:
- template: build.yml@YAML  # Template reference for build
I just found where you can do it.
Edit your pipeline; it's in the Triggers tab.
Then go to the YAML tab => Get sources => Repository.

How do I use an Azure DevOps Services Build Pipeline to retrieve TFVC source files based upon a Label value and then zip those files?

This is a TFVC repo in Azure, not Git. It is running in Azure DevOps Services, not local in Azure DevOps Server (2019). This is a classic pipeline, not YAML.
I got as far as adding a variable that contains the Label value I am looking to package into the zip file.
I can't figure out how to get the sources by Label value. In the Pipeline Get Sources step, I've narrowed the path down, but then I need to recursively get source files that have the Label in the variable I defined.
The next step is to zip those source files up, I've added an Archive task to which I will change the root folder from "build binaries" to the sources folder.
This is necessary for this particular project because we must pass the source files to the vendor as a zip for them to compile and install for us. The developers create/update the source files, build and test them locally, then apply a Label to the sources for a given push to the vendor.
When configuring the 'Get sources' step, there is no option or method that maps only the source files with a specified label.
As a workaround, in the pipeline job you can add steps to filter the source files with the specified label, use the Copy Files task to copy them to a folder, and then run the Archive Files task against that folder.
[UPDATE]
Normally, a pipeline run will automatically check out the file version (changeset) that triggered the run. If you trigger the pipeline manually, by default the run checks out the latest changeset unless you specify one.
Labels mark a version of files or folders, so you can also get a specific version of files or folders via a label.
In your case, you can try using the 'tf get' command to download the files with the specified label.
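A sketch of the command, assuming a label named MyLabel and the server path shown (both illustrative); the L prefix in the versionspec tells tf to fetch the version marked by that label:

```shell
rem Run from a local folder that is mapped to the server path in a TFVC workspace.
tf get "$/MyProject/Sources" /version:LMyLabel /recursive
```

After the get, point the Copy Files and Archive Files tasks at that local folder to produce the zip.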

how to set default branch in Azure DevOps yaml?

This is my YAML file, as you can see, the YAML file is in the master branch.
When I manually queue a build, by default, it uses master as the branch to build.
I am wondering if I can set the default branch to R_current_sprint?
When you edit the YAML pipeline in Azure DevOps, click on Triggers:
Then go to the YAML tab, click on Get sources, and you will see the option to choose the default branch:
You will want to put a copy of this YAML file in your other branch, R_current_sprint. Be sure to change all instances of master in the R_current_sprint branch's YAML file to R_current_sprint.

Azure DevOps - How to update pipeline repo source without re-create a new one?

I am planning to move my Azure pipeline source files to a new Azure repo. How can I update the existing pipeline settings to point to the new Azure private repo? I'd prefer not to re-create the pipeline and its variables. Right now, if I edit the current pipeline settings, it only allows me to select .yml files from the current repo. There's no option to change the repo.
Thanks
From your description, the pipeline is a YAML pipeline.
Navigate to Triggers -> YAML -> Get sources.
There you can select the target Azure repo.
If the YAML file has a different name, you can also select the target YAML file under Settings.

How to maintain hundred different Terraform configs?

We created a Terraform template which we will deploy many dozens of times in different workspaces in the future. Each workspace will have a separate configuration file.
Now we would like to automate this procedure and are thinking about keeping the configuration files in a Git repo.
Does anyone have a best practice for storing the configuration files in a Git repo and triggering a CI/CD workflow (Azure DevOps)?
In general, we would only like to apply changes for workspaces whose configuration has changed.
The terraform plan and apply commands have an option for passing in the tfvars file you want to use. So something like this:
terraform apply -var-file="workspace.tfvars"
So in the pipeline you would grab your Terraform template artifacts and your config files. I would then set a TF_WORKSPACE variable to force your workspace, and I would also make your tfvars file names match the workspace names so you can reuse the variable in your apply command. This forces your workspace and configuration file to match.
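The steps above can be sketched as a pipeline script step, assuming the tfvars files are named after the workspaces (the workspace name dev is illustrative):

```shell
# Force the workspace via environment variable; Terraform will use
# (or require) a workspace with this name.
export TF_WORKSPACE=dev

terraform init
# The tfvars file name matches the workspace name, so the same
# command line works unchanged for every workspace.
terraform plan -var-file="${TF_WORKSPACE}.tfvars" -out=tfplan
terraform apply tfplan
```

In the pipeline, TF_WORKSPACE would come from a pipeline variable rather than being hard-coded.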
Triggering this only when those files have changed would require a path trigger that fires on those changes.
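In Azure Pipelines YAML, such a path trigger could look like the following (the branch and folder names are illustrative, assuming one config folder per workspace):

```yaml
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - configs/dev/*
      - configs/prod/*
```

With one pipeline per workspace, each pipeline's paths filter would list only that workspace's config folder.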
I don't see any harm in running Terraform every time, regardless of whether changes occurred. The worst possible outcome would be that someone made a change outside Terraform and it gets undone.