What is the easiest way to apply the changes from a specific changeset on one TFS instance to another instance?
What I want is to get some sort of patch file from instance A that I can apply to instance B. Since these are two different instances, a traditional branch/merge approach cannot be used. And as far as I know, TFS has poor support for patch files in the traditional Unix sense.
Do I really need to inspect a changeset on instance A and manually zip the relevant files which I can then extract into the source tree of instance B?
Turns out that the "patch" route was a dead end due to lack of support in TFS. The solution we ended up with was to run a nightly job which basically does the following (a rough sketch follows after the list):
Get all code from remote repo with a read-only user.
Overwrite all content of a separate branch in our repo with the content from the other instance.
Perform a merge from that separate branch to the main branch whenever we want to bring their changes into our main branch.
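A rough sketch of that nightly job as a shell script; the tf client is assumed to be on the path, and all local paths, server paths, branch names, and credentials below are placeholders:

#!/usr/bin/env bash
# Nightly sync sketch. All paths, branch names and credentials are hypothetical.

# 1. Get all code from the remote repo (instance A) with a read-only user.
cd /c/sync/remote-ws
tf get . /recursive /login:readonlyuser,secret

# 2. Overwrite the content of the separate vendor branch in our repo (instance B).
cd /c/sync/our-ws/VendorDrop
tf checkout . /recursive
cp -R /c/sync/remote-ws/. .
# Files added or deleted on instance A would additionally need 'tf add' / 'tf delete' here.
tf checkin . /recursive /comment:"Nightly sync from instance A"

# 3. On demand: merge the vendor branch into our main branch.
tf merge '$/Ours/VendorDrop' '$/Ours/Main' /recursive
tf checkin /comment:"Merge vendor drop into Main"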
This answer explains how to create a patch file using the tf diff command. However, there is no built-in way to apply that patch file to another instance or branch. I have not seen any third-party tools to do so either.
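For reference, a sketch of producing such a patch file from a changeset with tf diff; the server path and changeset numbers are placeholders:

# Unified diff of everything changed between changesets 1233 and 1234 (placeholders).
tf diff '$/Project/Src' /recursive /version:C1233~C1234 /format:unified > changeset-1234.patch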
I wrote a blog post about a similar issue where I used the TF.exe command and 7-Zip to create a TFS patch file that could then be applied on another TFS server or workspace. I posted the PowerShell scripts at GitHub; they can be used to zip up any pending changes on one workspace and then apply them to a different server. They would have to be modified to use a changeset instead of pending changes, but that shouldn't be too difficult to accomplish.
I want to create an automated deployment pipeline for Azure Data Factory.
For one stream of development we can configure it using this doc: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
But when it comes to deploying to two different test data factories for parallel feature development (in two different branches), it does not work, because the adf_publish branch that gets generated is specific to only one data factory.
Currently we are doing deployment using PowerShell scripts, passing the list of objects that need to be deployed.
Our repo is in Azure DevOps.
I tried:
Linking the repo to multiple data factories, but that causes issues, perhaps when finding the deltas to publish.
Creating forks of the repo instead of branches, so that adf_publish can be separate for every data factory. But this approach does not work when there is a conflict that needs a manual merge, because the testing would then have to be repeated instead of moving on to prod.
adf_publish gets generated whenever you publish. Publishing takes whatever you have in your repo and updates the data factory with it.
To develop multiple features in parallel, you just need to use "Save". Save will commit your changes to the branch you are actually working on, and the other feature branches do the same. Whenever you want to publish, you first need to make a pull request from your branch to master, then publish. Any merge conflicts should be resolved when merging everything into the master branch. Then just publish; there shouldn't be any conflicts, and adf_publish will be generated after that.
Hope this helped!
A GitHub repository can be associated with only one data factory, and you are only allowed to publish to the Data Factory service from your collaboration branch. Check this.
It seems there is no direct and easy way to accomplish this. If you fork the repo as a workaround, you may have to resolve the conflicts before merging, as @Martin suggested.
Seeking advice about the following problem.
We have a stack of microservices written in NodeJS and running on a Kubernetes cluster. We have a separate GitHub repository for each of them and currently use CircleCI for our CI/CD process. As of now we have about 25-30 repos, but their number will increase. The problem we face is that we need a CircleCI config YAML in each repository, so if we need to change something globally in our CI/CD pipeline, we have to update it in every repository, which is obviously a pretty painful process, and CircleCI doesn't support having one config file for multiple repos.
I believe our situation/setup in terms of multiple repos is not unique. Does anybody have experience with or ideas about which CI tool supports the described scenario of having one config file for multiple repos?
Below are 2 approaches that I considered when I had to deal with a similar situation. You'd need to define for yourself what you want to optimize for and make a decision based on that.
Optimizing for flexibility and isolation. In this scenario, instead of making all repos use the same config file, you keep the file in each repo and automate how you manage it.
For example, you'll have to create a CLI tool or a script that automates copying the circle file and committing it to the appropriate repos whenever a change needs to happen.
PROS: isolation - all repos have their own configuration; if you're ever going to have a Golang microservice, or a different config in one of your NodeJS services, modifying the CI pipeline won't be an issue.
CONS: a bit of extra work to write automation around managing this config separately
Optimizing for easier maintainability. Figure out how to share a single pipeline configuration across your repos.
For example: use git submodules for keeping the circle.yml file, or use a separate npm package containing it (a sketch of the submodule option follows below). Another alternative is to use a CI tool that supports templating: define a pipeline template and re-use it for each individual pipeline (one CI tool that supports this is TeamCity).
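A sketch of the submodule option, assuming a hypothetical shared-config repo; note that CircleCI expects its config at a fixed path in each repo, so the shared file still has to be copied there:

# Add the shared config repo as a submodule in each service repo (URL is hypothetical).
git submodule add https://github.com/example-org/shared-ci-config.git ci-shared
# Copy the shared file to where the CI tool expects it.
cp ci-shared/circle.yml circle.yml
git add .gitmodules ci-shared circle.yml
git commit -m "Use shared CI config"
# Later, to pull in upstream config changes:
git submodule update --remote ci-shared && cp ci-shared/circle.yml circle.yml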
I personally picked approach #1 in a similar situation. IMHO, this is a price one has to pay when deciding to go with microservices, so as not to end up with a platform that is really a distributed monolith :) Also, I really like it when all repos are descriptive and self-contained, and CI-pipeline-as-code is one of the ways to help achieve that.
In my mind you have 2 options: you could have a single CI job/config that can deploy any single service or several of them (if all the services are alike), or, if every service is different, you need a separate job/config for each. If it's somewhere in the middle, it's a question of whether you want a single job with a bunch of if/then statements, e.g. "if repo = user then do this special thing". The if/then approach (sketched below) worked fine for me up to a point, but eventually there were too many special cases and it was easier to just go with a unique config for each service.
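A minimal sketch of that if/then style in a single deploy script; the repo names and flags are made up:

#!/usr/bin/env bash
# One deploy job with per-repo special cases (names and flags are hypothetical).
case "$REPO_NAME" in
  user)    ./deploy.sh --run-db-migrations ;;  # special thing for the user service
  gateway) ./deploy.sh --public-ingress ;;
  *)       ./deploy.sh ;;                      # default path for every other service
esac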
I solved the issue of it being hard to make a one-line change across 30 git repos by having a git superuser. Basically, normal users can only merge using PRs, but the superuser can commit directly. Since I'm only changing things like config files, there are rarely merge conflicts or broken test cases, so it works. Here's some sample code:
#!/usr/bin/env bash
# Loop over every repo cloned under /temp and push the same one-line change to each.
for dir in /temp/*/
do
    cd "$dir" || continue
    git pull
    # Make the change in place (example: rename Nick to John in report.txt).
    sed -i 's/Nick/John/g' report.txt
    git add report.txt
    git commit -m "CI change" && git push
    cd ..
done
I'm using Visual Studio Team Services to build my project, which is stored in GitHub (here). The master branch contains multiple projects which make up the solution. Amongst those are a WebAPI project and a Cordova project. I need to build those using two separate build definitions in VSTS.
Previously I had set up my build definition and used the branch filters to filter on what had been pushed to the repo. For instance:
master/src/API
This worked, but it doesn't any more; it seems as if the underlying code has changed. A filter of 'master' still works, and I understand this feature is probably meant to filter specifically on branches, not on folders within a branch.
It's not a huge problem, but at this time all of my builds trigger with every check-in, even if nothing changed for that source code in the meantime. So I'm now wondering what a good solution for this issue would be:
Put every project in its own branch. Seems like a workaround.
Some other filter option or maybe another syntax or something?
Leave it as is and don't worry about the extra builds (but that itches, you know...)
Anyone running a similar set-up?
Path filters are not supported for VSTS GitHub CI builds; they are available for Git CI builds on VSTS. You can vote for this User Voice suggestion: https://visualstudio.uservoice.com/forums/330519-team-services/suggestions/15140571-enable-continuous-integration-path-filters-for-git
The workaround is, as you said, to put every project in its own branch.
This is for a repository containing a library. The library version number is incremented (manually) each time a Merge Request to master is accepted.
However, if I want to access a file from version X.Y.Z, I have to look for the commit that incremented the version number to X.Y.Z, get its date, and then look in the history of the file for the version at that date.
I would like to create a tag per version, automatically when the Merge Request to master is created. Is this possible?
I hoped it would be possible with the new GitLab slash commands, but there is currently no support for tags.
Is there any other possibility than using web hooks?
While facing the same challenge, I stumbled upon this suggestion on GitLab's former issue tracker on GitHub1:
“You can write up a script to use GitLab API to accept a merge request, get the commit of the merge and then tag that commit.” --MadhavGitlab
(just to mention that — for me that's not sufficient)
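For anyone for whom that suggestion is sufficient, a rough sketch against the GitLab v4 REST API; the host, project ID, MR IID, tag name, and token are all placeholders:

#!/usr/bin/env bash
# Accept an MR, read its merge commit, tag that commit (all IDs and the token are placeholders).
GITLAB=https://gitlab.example.com/api/v4
PROJECT=42; MR_IID=7; TOKEN=xxxx

# 1. Accept the merge request.
curl -s -X PUT -H "PRIVATE-TOKEN: $TOKEN" "$GITLAB/projects/$PROJECT/merge_requests/$MR_IID/merge"

# 2. Fetch the merge commit SHA.
SHA=$(curl -s -H "PRIVATE-TOKEN: $TOKEN" "$GITLAB/projects/$PROJECT/merge_requests/$MR_IID" | jq -r .merge_commit_sha)

# 3. Create the tag on that commit.
curl -s -X POST -H "PRIVATE-TOKEN: $TOKEN" "$GITLAB/projects/$PROJECT/repository/tags?tag_name=v1.2.3&ref=$SHA"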
1 EDIT:
Looks like all issues have been purged from the GitHub mirror, so this link no longer works, but luckily the relevant quote persists right here.
I first tried to do it the GitLab way, by creating a .gitlab-ci.yml file in the project's top-level directory. That file can contain the commands that create the version tag (sketched below). The user executing the script has to have sufficient permission to push to the git project and must be configured with authoring information.
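A sketch of the tagging commands such a CI script could run; the VERSION file location and the bot identity are assumptions:

#!/usr/bin/env bash
# Tag the current commit with the library version (VERSION file and bot identity are assumptions).
git config user.name  "ci-bot"                # authoring information for the annotated tag
git config user.email "ci-bot@example.com"
VERSION=$(cat VERSION)                        # assumes the version number lives in a VERSION file
git tag -a "v$VERSION" -m "Release v$VERSION"
git push origin "v$VERSION"                   # the CI user needs permission to push tags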
I finally did it on a Jenkins server, where I created a job that is invoked when commits are pushed to a specific branch. The tag can be created in the job's "Execute shell" build step.
Creating a patch is very easy in Subversion: with TortoiseSVN, you right-click and select Create Patch. But for the life of me, I can't find this functionality in TFS. Is this possible?
If not, what's the standard way to submit patches in open source TFS hosted projects (a la CodePlex)?
tf diff /shelveset:shelveset /format:unified
Edit: This writes to standard output. You can pipe the output to a file.
For more options, see Difference Command.
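For example, redirecting the output into a patch file (the shelveset name is a placeholder):

# Write the unified diff of a shelveset to a patch file.
tf diff /shelveset:MyShelveset /format:unified > myshelveset.patch
# A standard patch tool may be able to apply it, though the server paths ($/...) may need rewriting first:
# patch -p0 < myshelveset.patch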
Because TFS doesn't natively support patch files, the most common thing I see people do on CodePlex is simply zip the modified files and upload the zip. The project coordinator then does a diff against their own checkout.
However since CodePlex also supports TortoiseSVN, more and more people are using that to create their patch files.