Why is OpsHub Visual Studio Online Migration Utility kicking off builds?

We have started making use of the OpsHub Visual Studio Online Migration Utility to migrate a few projects from TFS (on-premises) to Visual Studio Online (TFVC to TFVC).
We have now migrated two projects, and in both cases the migration process kicked off literally thousands of builds on our VSO build agents. The projects it kicks off builds for seem to be random (projects that are not part of the migration). In many cases they are CI builds running against commits from weeks or months ago, and the majority of the builds are started on behalf of "Anonymous".
Has anyone else experienced this with OpsHub? If so, how do we get it to stop doing this? We have four more projects to migrate and we'd love to resolve this before starting another migration because it makes our build servers unusable while we kill the (thousands of) builds.
Thanks!

You'd want to either disable all of your build definitions until the migration is completed (then delete any queued builds and re-enable the definitions), or have ***NO_CI*** added to the check-in comments to prevent triggering builds. I'd recommend simply disabling the build definitions during the migration.
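For the ***NO_CI*** route, the token goes in the TFVC check-in comment itself. A minimal dry-run sketch (assuming tf.exe from Team Explorer would be on the PATH in a real run; here the command is only echoed, and the comment text is a made-up example):

```shell
# The ***NO_CI*** token in a TFVC check-in comment tells TFS/VSO not to
# queue continuous-integration builds for that changeset.
# Dry run: the check-in command is echoed rather than executed.
COMMENT='OpsHub migration import ***NO_CI***'
echo tf checkin /recursive /comment:"$COMMENT"
```

Note that this only helps if the tool performing the check-ins can be made to append the token; disabling the build definitions avoids that dependency entirely.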

Please check to make sure continuous builds are not enabled for the projects or source collection which is being migrated.

Related

How do you combine artefacts from multiple builds in Team Services?

I have a project where I need to perform multiple builds across different agents: a Visual Studio project, an Xcode project, and an NPM+Gulp project, and I need to combine the artefacts into a single archive.
I have set up builds for each. My first try was to set up a Release which downloaded the artefacts from each build and packaged them all up, but I didn't figure out how to get that package out of the Release. (I had tried to use the Publish Artefact step, but got an error because apparently it can only be used in a Build.)
So now I'm looking at creating a separate Build that does the packaging - and then puts the result in its Artefacts. But this made me wonder if there are better ways to deal with this in Team Services?
What I also want to achieve here is to trigger builds of my VS, Xcode and NPM builds when I push to my git repo, then have "something" kick off a packaging step when all three builds are done. What's a good way to do that in Team Services?
The scenario you're describing sounds like you want everything to be a part of a single build, not separate builds.
Since you have some pieces that run on Windows and some that run on MacOS, you can use multi-phase builds to run different "sections" of the build across different agents.

Sitecore deploy changes from local to another remote env and source controlling

I am using Sitecore 6.6.0, we have multiple environments
Local
DEV
QA
PROD
I have to deploy a few changes directly from Local to PROD (don't ask me why directly to PROD; even if it were for QA, my question remains the same). What I am doing is creating a package on my local machine with all the items, separately creating the folder structure for all files related to the fix, and deploying that to PROD.
There is always a chance of human error, since I have to remember all the associated items and files for a fix. Is there a better, automated way that will not skip any changed items or files?
On another note, I am using Bitbucket for source-controlling the Sitecore code, but what about the Sitecore DBs? Most Sitecore development lives in the DBs. What is the best approach to source control the Sitecore DBs?
Update
Installed packages from NuGet
After installing Unicorn from NuGet, along with unicorn.default.config, I get the following error:
Attempt by method 'Unicorn.Data.DataProvider.UnicornDataProvider..ctor(Unicorn.Data.ITargetDataStore, Unicorn.Data.ISourceDataStore, Unicorn.Predicates.IPredicate, Rainbow.Filtering.IFieldFilter, Unicorn.Data.DataProvider.IUnicornDataProviderLogger, Unicorn.Data.DataProvider.IUnicornDataProviderConfiguration, Unicorn.Predicates.PredicateRootPathResolver)' to access method 'System.Action`1<System.__Canon>..ctor(System.Object, IntPtr)' failed.
Further, after following the ReadMe on GitHub, when I do a sync via site/unicorn.aspx:
[P] Auto-publishing of synced items is beginning.
ERROR: Method not found: 'Sitecore.Publishing.Pipelines.Publish.PublishResult Sitecore.Publishing.Publisher.PublishWithResult()'. (System.MissingMethodException)
at Unicorn.Publishing.ManualPublishQueueHandler.PublishQueuedItems(Item triggerItem, Database[] targets, IProgressStatus progress)
at Unicorn.Pipelines.UnicornSyncEnd.TriggerAutoPublishSyncedItems.Process(UnicornSyncEndPipelineArgs args)
at (Object , Object[] )
at Sitecore.Pipelines.CorePipeline.Run(PipelineArgs args)
at Unicorn.ControlPanel.SyncConsole.Process(IProgressStatus progress)
Solution:
For older Sitecore versions (pre-7.2, IIRC) you need to disable the auto-publish config file, as it relies on a method added later by Sitecore.
https://github.com/kamsar/Unicorn/issues/103
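A sketch of what that fix amounts to, assuming the Unicorn 3 default include path (verify the exact file name and path against your install; the command is echoed as a dry run):

```shell
# Disable Unicorn's auto-publish pipeline on pre-7.2 Sitecore by renaming
# the include file so Sitecore no longer loads it.
# Dry run: the rename is echoed rather than executed.
CONFIG='App_Config/Include/Unicorn/Unicorn.AutoPublish.config'
echo mv "$CONFIG" "$CONFIG.disabled"
```

Sitecore only loads files under App_Config/Include that end in .config, so the .disabled suffix effectively turns the patch off without deleting it.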
In order to track the database changes you are making, you will first need to install software that will be able to help you serialize your changes and store in source control. Team Development for Sitecore (TDS) and Unicorn are the two most popular options.
You will also want to make sure you have your own local database where you are making your changes so you can isolate those changes from your QA, PROD, etc. allowing you to maintain the same level of isolation you do for developing code.
Automation of this process helps reduce the human error you mention for the deployment by introducing a repeatable and known process. Here are a few blogs that can help you get started:
Jason Bert - Continuous Deployment (Git/TDS/TeamCity)
Jason St-Cyr - Automating with TeamCity and TFS (TFS/TDS/Team Build)
Andrew Lansdowne - Auto deploy Sitecore Items using Unicorn and TeamCity (Unicorn/TeamCity)
Brian Beckham - TDS and Build Configurations
You may also want to look into configuration transforms to support different values in your Sitecore Include patch files. The SlowCheetah plugin will let you create the transforms in Visual Studio (it might be built into Visual Studio 2015 now). TDS can pick up those transforms automatically and execute them on the build server for you, or you can do it with Visual Studio itself to create published packages.
For Sitecore versioning and deployment Unicorn is also a good option.
https://github.com/kamsar/Unicorn
Cheers,
Bo

Migrate VSO to VSO with history and work items

I'm using VSO xxx.visualstudio.com. We have to migrate with history to yyy.visualstudio.com. I know there is no direct tool. Looking for good approach or solution.
Unique differences from the question below:
Visual Studio Online migration (VSO to VSO)
Export Source code with work Items
TFS Integration Platform tool is not working.
OpsHub company is taking too long to reply.
You can use the TFS Integration Tools to move TFVC code and Work Items from one Team Project to another regardless of TFS/VSO version.
I have done this a number of times and it works pretty well.

Migrating on-premise TFS 2012 to VSO with free opshub tool

We are moving from on-premises TFS 2012 to the Visual Studio Online environment.
For this we need to move a lot of projects, most of them aren’t a problem with the free opshub tool.
But 2 of our projects are really big and the tool would take about 8~9 days to complete the migration, which is too long for us.
What I would like is the following:
Start the migration from on-premises to VSO.
Keep working in the on-premises instance.
After the migration has finished, migrate the commits that were made during those 8–9 days.
I’m not sure the free opshub tool supports something like this.
Is there a way to do this with the free tool? If so how does this work?
That is feasible. Start the migration initially; once the initial run completes and the migration status changes to "not running", go to the View Progress page and start the already-configured migration again.
This will pick up the new changes made to the on-premises instance after the initial migration started.
Note: you will see a processed vs. total revision count mismatch on the status page in such cases, as the total count is not calculated only once at the start of the migration.

Automated Building and Release Management VS2012

Trying to make my life easier: currently we have 4 developers working in Visual Studio 2012, and we are using TFS 2012 for source control. The project we work on is a multi-tenant web application (a single source directory with multiple DBs) that is a mixture of legacy ASP and VB6 COM components coupled with new C# code. We use TFS for source control and for managing User Stories and Bugs. Because of the way our site works, it cannot be run or debugged locally, only on the server.
Source control is currently set up with a separate branch for each developer, whose working directory is mapped to a shared network path on the dev server with a website pointed at it in IIS (Dev01–Dev05, etc.). The developers work on projects in their branch, test using their dev website, then check in changes to their own branch and merge them into the trunk. The trunk's workspace is mapped to the main dev website so that the developers can test their changes against the other customers' dev domains, to cover customizations and variances in functionality based on the specific DBs they are connected to.
Very long explanation but basically each dev has a branch and a site, that are then merged into the trunk with its own site.
In order to deploy to our staging server, I:
1. Compile the trunk's website via a bat file on the server.
2. Run a Windows app I built to query TFS for changesets associated with specific WorkItems in a certain status, and copy all the files for those changesets from the publish folder to a deployment folder.
3. Run another bat file on the server that uses Red Gate's Deployment Manager to create a package from those new files.
4. Go to the DM site on our network to create and deploy that release (I haven't been able to get the command-line tools to work for this, so I have to do it manually).
5. Run any SQL scripts that have been saved off in folders matching ticket numbers on each database (10 or so customer DBs) to support the release.
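The changeset-driven copy step described above could be sketched roughly as follows (files.txt stands in for the output of the custom TFS query app, which is hypothetical here, with one relative path per line; the copy itself is echoed as a dry run):

```shell
# Copy only the files touched by the selected changesets from the
# publish folder into a deployment folder.
# files.txt simulates the custom app's output for this sketch.
printf 'bin/App.dll\nDefault.aspx\n' > files.txt
mkdir -p deploy
while IFS= read -r f; do
  # Recreate the relative directory structure under deploy/.
  mkdir -p "deploy/$(dirname "$f")"
  echo cp "publish/$f" "deploy/$f"   # cp would run for real here
done < files.txt
```

Keeping the file list as a plain manifest makes the step repeatable and auditable, which is most of what the custom Windows app is providing.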
I have tried using TFS automated build stuff and never really got it to build the website correctly. Played around with Cruise Control also with little success. Using a mishmash of skunk works projects to do this is very time consuming and unreliable at best.
My perfect scenario would be:
Gated check-in: attempt a build/publish every time a developer merges into the trunk; reject and notify the developer if the build fails.
At the end of the day, collect the TFS items of a certain status and deploy the files associated with them to the staging site.
Deploy SQL scripts for those TFS items across all the customer DBs in staging.
Eventually: run automated regression UI tests, and create new WorkItems or email the devs if they fail.
Update the TFS WorkItems to a new state so QA/customers know their items are ready to test in our staging environment.
Send a report of which items were deployed successfully.
How can I get there so that I am not spending hours preparing and deploying releases to staging and eventually production? I'm pretty open to potential solutions; the one thing that would be hard to change is the source control we are using. We can't really switch to Subversion or something else, so we are pretty stuck with TFS.
Thanks
Went back in and started trying to get TFS to build/publish my web solution, and I was able to get a build to complete successfully. Adding the msbuild argument /p:DeployOnBuild=True and setting the msbuild platform to x86 seemed to do the trick.
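On a plain command line, the equivalent invocation would look roughly like this (the solution name is a placeholder, and the command is echoed as a dry run since msbuild is normally invoked by the TFS build definition here):

```shell
# /p:DeployOnBuild=True triggers the web publish pipeline during the
# build; /p:Platform=x86 matches the platform setting described above.
# Dry run: the command is echoed rather than executed.
MSBUILD_ARGS='/p:DeployOnBuild=True /p:Platform=x86'
echo msbuild OurWebSite.sln $MSBUILD_ARGS
```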
Then I found https://github.com/red-gate/deployment-manager-tfs which gives you a build process template to do the package and deployment using the redgate tools. After playing with that for a bit I finally got it to create, package and deploy my build to our staging environment.
Next up will be to modify the template to run some custom scripts to collect only the correct items to deploy, deploy all the sql files and then to set the workitems to the appropriate statuses after completion.
Really detailed description of your process. Thanks for sharing!
I believe you can set up TFS to have gated check-in on a single branch; if you can set that up on the trunk, it would make sure the merges build successfully. That could trigger msbuild, if you can get that working, or a custom build job.
If you can get that working then you'd be able to use that trunk code as the artifact to send to Deployment Manager. That avoids having to assemble the files for deployment through the TFS change sets, as you'd be confident that the trunk could always build.
Are you using Deployment Manager to deploy the database from source control as well as the application?
That could be a way to further automate the process. SQL Source Control and SQL CI allow you to source control the structure of a database, keep a database up to date on each check-in, and run database unit tests. They also produce database packages for Deployment Manager, so you can deploy a release that contains both the application and the database.
If you want to send me the command you're using in step 4 to deploy the release using Deployment Manager I can help out with that. The commands I use are:
DeploymentManager.exe --create-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1
DeploymentManager.exe --deploy-release --server=http://localhost:81 --project="Project Name" --apiKey=XXXXXXXXXXX --version=1.1 --deployto=CI-Environment-Name
That will create a release version 1.1 using the latest available packages for that project. You can optionally specify the package to be used when creating the release with:
--packageversion=<package name>=<version>
e.g. --packageversion="application=1.5"