Can you continuously sync project backlogs with the tool? - vsts-sync-migrator

I am looking at a scenario where I need to continuously sync parts of a backlog in one project to another project. I can't see a processor that compares source and target work items and syncs them after the first-time migration has been done.
Is there such a processor?
Many thanks, Valentin

UPDATE: The tool supports re-running a migration and only pulling across changes. It is still only one-way and is designed to support migration.
https://github.com/nkdAgility/azure-devops-migration-tools
The intended purpose of this feature is to allow you to migrate many work items over a few weeks while users are still able to access the source. You can then re-run the migration with a special flag and have it pull the small number of changes.
WARNING: For sync you should note that the Source always wins. So if both the target and the source have been updated, the Source will overwrite the target.
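To make the "source always wins" behaviour concrete, here is a minimal conceptual sketch in Python. It is not the tool's code; the work item shape and field names are invented purely for illustration.

```python
# Conceptual sketch of one-way, source-wins synchronisation.
# NOT the migrator's real code; work item fields are hypothetical.

def sync_one_way(source_items, target_items):
    """Copy every source work item onto the target, source always winning."""
    target_by_id = {wi["ReflectedSourceId"]: wi for wi in target_items}
    for src in source_items:
        tgt = target_by_id.get(src["Id"])
        if tgt is None:
            # First-time migration: create a new target item.
            target_items.append({**src, "ReflectedSourceId": src["Id"]})
        else:
            # Re-run: the source overwrites the target, even if the target
            # was edited in the meantime ("source always wins").
            tgt.update({k: v for k, v in src.items() if k != "Id"})
    # Note: nothing flows back from the target to the source; it is one-way.
    return target_items
```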

Related

Any suggestions on the latest trend in version control for SQL Server 2014 and above?

For example, when a developer changes any database object in a business-critical database, they should be forced to commit the code before applying the change to the database itself. I came across Redgate SQL Source Control, which partly matches my expectations. Are there any other tools or effective database practices that I am missing here?
If you use SQL Source Control or a tool like it (eg, ReadyRoll or VS Database Projects) I'd recommend also using DLM Dashboard.
The reason for this is that no tool can enforce changes to go through a process if people are given (too many) rights and are able to apply changes to production. It's then up to these people to correctly follow the process.
Although DLM Dashboard doesn't enforce changes to your database, it will alert you to changes made to production, warning you when out-of-process changes (a.k.a. "drift") occur.
DLM Dashboard is free, which is another reason to use it!
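Conceptually, drift detection boils down to snapshotting the schema and diffing snapshots over time; DLM Dashboard does this (and much more) for you. A rough sketch of the idea, assuming pyodbc and a SQL Server connection string (this is not how DLM Dashboard is implemented):

```python
# Rough drift-detection sketch: compare object modification dates between runs.
# Assumes pyodbc and a valid SQL Server connection string; alerting and storage
# details are left out. DLM Dashboard does all of this (and more) for you.
import pyodbc

SCHEMA_QUERY = """
    SELECT s.name + '.' + o.name AS object_name, o.modify_date
    FROM sys.objects o
    JOIN sys.schemas s ON s.schema_id = o.schema_id
    WHERE o.is_ms_shipped = 0
"""

def snapshot(conn_str):
    """Return {object_name: modify_date} for all user objects."""
    with pyodbc.connect(conn_str) as conn:
        return {name: modified for name, modified in conn.execute(SCHEMA_QUERY)}

def detect_drift(previous, current):
    """Report objects added, dropped, or altered since the last snapshot."""
    added = current.keys() - previous.keys()
    dropped = previous.keys() - current.keys()
    altered = {o for o in current.keys() & previous.keys() if current[o] != previous[o]}
    return added, dropped, altered
```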

Version-control in a large SSIS ETL project

We're about to do a data transformation from one system to another using SSIS. We are four people who will continuously be working on this for two years, and therefore we need some sort of versioning system. We cannot use Team Foundation. We're currently configuring an SVN server, but digging into it I've seen some big risks.
It seems that a solution is stored in one huge XML file. This must be a huge problem in a combined code/drag-and-drop environment such as SSIS, as it will be impossible for SVN to merge the changes correctly, and whenever we get an error when committing we will have to look inside that huge XML file and correct the mistakes manually.
One way to solve this problem is to create many solution projects in SSIS. However, this is not really the setup we want, as we are creating one big monster which will take two days to execute, and we want to follow its progress as it executes. If we have to create several solutions, are there ways to link their execution and still have a visual overview of what's going on and how well the execution is doing?
Has anyone had similar problems and/or do you have any suggestions as to how to solve them?
Just how many packages are you talking about? If it is hundreds of packages, then what is the specific problem you are trying to avoid? Here are a few things you might be trying to avoid based on your post:
Slow solution and project load time at startup in BIDS. I suppose this could be irritating from time to time. But if you keep BIDS open all day, that seems like a once a day cost.
Slow solution and project load time when you get latest solution definition from your version control system. Again, I suppose this could be irritating from time to time, but how frequently do you need to refresh the whole solution? If you break the solution into separate projects, then you only need to refresh a project. You would only need to refresh the whole solution if you want to get access to a new project within the solution.
What do you mean by "one huge XML file"? The solution file is an XML file that keeps track of the projects. Each project file is an XML file that keeps track of its SSIS packages. So if you have 1,000 SSIS packages evenly distributed across 10 projects in 1 solution, then each file would have no more than 100 objects to track. I can tell you from experience that I've had Reporting Services projects with more RDL files than this and it only took seconds to load the solution properly in BIDS. And as @revelator pointed out, the actual SSIS packages are their own individual XML files. Any version control system should track each of these as separate files and won't combine them into "one huge XML file". If you clarify what you mean by this point, then I think you will get better help on the question.
Whether you are running one package or 1,000 packages, you won't be doing this interactively from BIDS. You will probably deploy the packages to server first and then have the server run the packages. If that's the case, then you will need to call the packages probably with a SQL Server Agent job. Whether you chain the packages by making each package call another package or if you chain the packages by having the job call each package as a separate job step, you can still track where you are in the chain with logging. If you are calling the packages with jobs, then you can track it with job steps too. I run a data warehouse that has scores of packages and I primarily rely on separating processes into jobs that each contain one or more packages. I also chain jobs with start job commands so that I can more easily monitor performance of logical groups of loads. Also, each package shows its execution time in the job history at the step level. Furthermore, I have custom logging in each stored procedure and package that shows how many seconds and rows an individual data load or stored procedure took so that I can troubleshoot performance bottlenecks.
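The rows-and-seconds logging described above can be very lightweight. A minimal sketch of the idea, in Python with pyodbc purely for illustration; the etl_log table and the load callable are hypothetical, and in a real SSIS solution this logic usually lives in the packages and stored procedures themselves.

```python
# Minimal sketch of per-step ETL logging: record rows and seconds for each load.
# The dbo.etl_log table and the load_step callable are hypothetical.
import time
import pyodbc

def run_logged_step(conn_str, step_name, load_step):
    """Run one load step and record its duration and row count."""
    started = time.time()
    rows_loaded = load_step()          # the step returns how many rows it moved
    seconds = int(time.time() - started)
    with pyodbc.connect(conn_str) as conn:
        conn.execute(
            "INSERT INTO dbo.etl_log (step_name, rows_loaded, seconds) VALUES (?, ?, ?)",
            step_name, rows_loaded, seconds,
        )
        conn.commit()
```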
Whatever you do, don't rely on running packages interactively as a way to track performance! You won't get optimal performance running ETL on your machine, let alone running it with a GUI. Run packages in jobs on servers, not desktops. Interactively running packages is just there to help build and troubleshoot individual packages, not to administer daily ETL.
If you are building generic packages that change their targets and sources based on parameters, then you probably need to build a control table in a database that tracks progress. If you are simply moving data from one large system to another as a one-time event, then you are probably going to divide the load into small sets of packages and have separate jobs for each so that you can more easily manage recovering from failures. If you intend to build something that runs regularly to move data, then how could 2 days of constant running for one process even make sense? It sounds like the underlying data will change on you within 2 days...
If you are concerned about which version control system to use for managing SSIS package projects, then I can say that just about any will do. I've used Visual SourceSafe and Perforce at different companies and both have the same basic features of checking in and checking out individual packages. I'm sure just about any version control system that integrates with Visual Studios will do this for you.
Hope you find something useful in the above and good luck with your project.
Version control makes it possible to have multiple people developing together and working on same project. If I am working on something, a fellow ETL developer will not be able to check it out and make changes to it until I am finished with my changes and check those back in. This addresses the common situation where one developer’s project artifact and code changes clobber that of another developer by accident.
http://blog.sqlauthority.com/2011/08/10/sql-server-who-needs-etl-version-control/
Most ETL projects I work on use SVN as the source control repository. The best method I have found is to break each project or solution down into smaller, distinct (and often independently runnable) packages. So for example, say you had a process called ManufacturingImport; this could be your project. Within this you would have a Master package, which then calls other packages as required. This means that members of the team can work on distinct packages or pieces of work, rather than everyone trying to edit the same package and getting into troublesome situations with merging.
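The same parent/child chaining can also be driven from outside SSIS, which can be handy during a long migration run. A hedged sketch, assuming file-deployed packages and dtexec.exe on the PATH (the package paths are made up):

```python
# Sketch of chaining file-deployed SSIS packages from a "master" script.
# Assumes dtexec.exe is on the PATH; the package paths are made up.
# dtexec returns 0 on success, so check=True stops the chain on the first failure.
import subprocess

CHILD_PACKAGES = [
    r"C:\etl\ManufacturingImport\StageSource.dtsx",
    r"C:\etl\ManufacturingImport\TransformCore.dtsx",
    r"C:\etl\ManufacturingImport\LoadTarget.dtsx",
]

for package in CHILD_PACKAGES:
    print(f"Running {package} ...")
    subprocess.run(["dtexec", "/F", package], check=True)
print("All packages completed.")
```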

How to move to a new version control system

My employer has tasked me with becoming our new version control admin. We are currently using two different version control systems for two different code bases. The code/functionality in the two code bases overlap in some areas. We will be moving both code bases to a new version control system.
I am soliciting ideas on how to do this. I suppose we could add the two code bases to the new version control system as siblings in the new depot's hierarchy and remove redundancy by gradually promoting code to a third sibling in the hierarchy, ultimately working out of the third sibling exclusively. However, this is just a 30,000 ft view of the problem, not a solution. Any ideas, gotchas, or procedures to avoid catastrophe?
Thanks
Git can be set up in such a way that svn, git, and cvs clients can all connect. This way you can move over to a central Git repo, but people who are still used to svn can continue to use it.
It sounds like, in your specific situation with two code bases you want to combine, you should make three repositories and start to combine the first two into the third one.
My advice is to experiment with a few "test" migrations. See how it goes and adjust your scripts as necessary.
Then once you're set, you can execute it for real and you're done. Archive your old repos too.
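A simple way to sanity-check each test migration is to diff a fresh checkout from the old system against one from the new before flipping the switch. A sketch using only the Python standard library (the checkout paths are placeholders):

```python
# Sketch: compare two working copies (old VCS vs. new VCS) after a test migration.
# Directory paths are placeholders; VCS metadata folders are skipped.
import hashlib
from pathlib import Path

SKIP_DIRS = {".svn", ".git", ".hg", "CVS"}

def file_hashes(root):
    """Map relative path -> SHA-1 of contents for every file under root."""
    root = Path(root)
    hashes = {}
    for path in root.rglob("*"):
        if path.is_file() and not (SKIP_DIRS & {p.name for p in path.parents}):
            hashes[str(path.relative_to(root))] = hashlib.sha1(path.read_bytes()).hexdigest()
    return hashes

old = file_hashes(r"C:\checkouts\old-vcs-head")
new = file_hashes(r"C:\checkouts\new-vcs-head")

print("Missing in new repo:", sorted(old.keys() - new.keys()))
print("Unexpected in new repo:", sorted(new.keys() - old.keys()))
print("Content differs:", sorted(k for k in old.keys() & new.keys() if old[k] != new[k]))
```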
Another place you might find inspiration is OpenOffice.org. They are in the middle of going from SVN to Mercurial. They probably have published information on their migration work.
Issues to consider:
How much history will you migrate?
How long will you need to continue using the old systems for patch-work, etc?
How long will you need to keep the old systems around to access historical information?
Does the new target VCS provide an automatic or quasi-automatic migration method from either of the two old VCS?
How will you reconcile branching systems in the two old VCS with the model used in the new VCS?
Will tagging work properly?
Can tags be transferred (which will not matter if you are not importing much history)?
What access controls are applied to the old VCS that must be reproduced in the new?
What access controls are to be applied to the new VCS?
This is at least a starting point - I've no doubt forgotten many important topics.

Merging and branching shared code between projects in TFS

I'm currently in charge of migrating our ASP.NET applications from SourceSafe to TFS. We have three or four very similar apps (let us say e-commerce) that currently share a core library (services, business logic, entities, data access, etc.).
The applications are similar but not identical so one app might get a feature set the others won't get etc.
I want to stop the sharing of code and instead set up branches (if that fits), so if I change something in Application A's core library I will need to merge the changes into the other branches instead of them getting the changes automatically. This is to avoid surprises when you update from your trunk and suddenly the core has changed for another project and this project breaks in some way.
Any suggestions on how I should set this up in TFS? Should I have a "main" Core that is not directly used in any project that is the parent of all the other cores so I can push changes up to that one from one core and then distribute it to the other cores? Does that make sense and would it be easy to set up in TFS?
In response to your comment, I'd suggest you read up on feature branches on the CodePlex website.
Scenario 4 – Branch for Feature
In this scenario, you create a development branch, perform work in that branch, and then merge your work back into your main source tree. You organize your development branches based on product features. The following is a physical view showing branching for feature development:
My Team Project
    Development -> Isolated development branch container
        Feature A -> Feature branch
            Source
        Feature B -> Feature branch
            Source
        Feature C -> Feature branch
            Source
    Main -> Main Integration branch
        Source
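In TFS this layout is created with ordinary branch operations. As a rough illustration only, one of the feature branches above could be scripted with the tf.exe command line (the server paths are illustrative, a mapped workspace is assumed, and the pending branch still has to be checked in):

```python
# Rough sketch: create one feature branch with tf.exe (TFS command line).
# Server paths are illustrative; assumes the script runs inside a mapped workspace.
import subprocess

# Branch Main into a new feature branch (this pends the branch locally).
subprocess.run(
    ["tf", "branch", "$/MyTeamProject/Main", "$/MyTeamProject/Development/FeatureA"],
    check=True,
)

# Commit the pending branch to the server.
subprocess.run(
    ["tf", "checkin", "/comment:Create Feature A branch", "/noprompt"],
    check=True,
)
```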
We are also moving from SourceSafe to TFS in the near future.
As I perceive it, we are going to keep our SourceSafe repository online and start fresh over in TFS. Our framework will probably get its own project in TFS. Project-specific shared units will need to get merged from time to time.
The way you structure your repository depends on your specific situation. Every branch scenario has its specific advantages and drawbacks.
How many projects?
How many developers?
Are the developers dedicated?
Do you need concurrent hot fixes?
Do you need service packs?
Take a look at the CodePlex branching guide for all the information you need to make an informed decision about your TFS structure. Print out the cheat sheets and pin them to your wall for quick reference.
Before executing on your branch plan, pay attention to this cautionary message: every branch you create does have a cost, so make sure you get some value from it. The mechanics of branching in TFS are simplified to a single right-click branch command. However, the total cost of branching is paid in reduced code velocity to main; merge conflicts and additional testing can be expensive.
I am assuming you have already investigated whether you truly need to make your "copies" separate team projects. Remember the TFS concept of a "Team Project" is a VERY LARGE high-level container. It is not the same thing as what most IT shops consider a "project". Think of "Microsoft Vista" or "Office 2007" as a project, not, say, "a new release of Company XYZ's Accounts Receivable System", in the Team Project sense.
I have a client that decided on one single Team Project for TFS. There is nothing wrong with this - and it is truly the best scenario in many circumstances.
If you truly need very strong isolation between your copies of the application (perhaps they are separate clients and you need very strong security separation), then separate team projects may well be justified.
That said, you still, as you've stated, need to share code between instances of your application. The first thing I would strongly recommend is to get away from "cut and paste" sharing. I would try to isolate the shared code into a separate solution and generate binaries for it (perhaps you've already done this!).
This is covered in the CodePlex TFS Guide: http://tfsguide.codeplex.com/
Another approach I've used for several clients is to have a Team Project that contains the shared code. The "Build" creates the binaries for the shared code, and the "Deploy" simply copies those to a "known location" (i.e. a UNC share on the build machine).
For the applications that are "Consumers" of the "Framework", we simply used the "AdditionalReferencesPath" item group to include the location of that known location.
Furthermore, this tool can be helpful: http://tfsdepreplicator.codeplex.com/. It allows builds to be automatically triggered for your "Consumer" projects whenever the "Framework" solution is built.
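A minimal sketch of that "deploy to a known location" step, assuming the shared build output is simply copied to a UNC drop share (the folder paths below are made up; in practice this ran as part of the Team Build for the shared solution):

```python
# Minimal sketch of the "deploy binaries to a known location" step.
# The build output folder and UNC share are made up placeholders.
import shutil
from pathlib import Path

BUILD_OUTPUT = Path(r"C:\Builds\Framework\LatestRelease\bin")   # hypothetical build output
DROP_LOCATION = Path(r"\\buildserver\FrameworkDrop\latest")     # hypothetical "known location"

DROP_LOCATION.mkdir(parents=True, exist_ok=True)
copied = 0
for assembly in BUILD_OUTPUT.glob("*.dll"):
    shutil.copy2(assembly, DROP_LOCATION / assembly.name)  # copy binary plus timestamps
    copied += 1
print(f"Copied {copied} assemblies to {DROP_LOCATION}")
```

Consumer projects then reference the assemblies in that drop location rather than the shared source.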
My brief answer is that you should only set up one TFS project and simply organize your different projects, i.e. your individual applications and each shared library, as separate folders under that one TFS project. The alternative is to include specific (binary) builds of the shared libraries in each individual application; if you do that, then you can organize each application into its own TFS project, though you can't merge changes or branch those projects without using the TFS command line (and some non-obvious commands to boot).
I was trying to determine the same information; this guide on CodePlex is perfect:
http://vsarbranchingguide.codeplex.com/releases
Includes terminology and different branching workflow approaches as well as cheat sheets.

Visual Source Safe --> TFS Migration

Around here we have been working with a bunch of Visual SourceSafe repositories for about 10 years.
Now I want to get rid of SourceSafe and move on to Team Foundation Server.
Do you have any tips or tricks for me before I embark on this migration? What are the things I have to be careful about?
I am sure this migration will mean that our working habits have to be modified in some way. Do you think that these changes could be a problem for the organization? Think about a group of about 20 .NET developers in a single site.
There are a few different ways you can migrate. The tool will pull your history, etc. over, but the more pragmatic and simple way is to lock VSS as a history archive and start fresh:
Have everyone check in all changes into VSS, make sure everything builds, etc.
Set all VSS databases to "locked" (read-only rights for all users)
Get Latest on the entire VSS database into a "clean" set of folders on a workstation
Check all of the files into TFS from the workstation
For any history prior to the conversion, folks need to go to VSS, but after a week or two it's realistically unlikely to happen all that often. And you know that the history in VSS is accurate and not corrupted by the conversion process.
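If you go this route, the last two steps can be scripted. A rough sketch driving the tf.exe command line from Python (the folder path and comment are placeholders; it assumes the clean folder is already mapped into a TFS workspace and tf.exe is on the PATH):

```python
# Rough sketch: bulk-add the clean "Get Latest" folder into TFS and check it in.
# Assumes C:\clean-source is already mapped into a TFS workspace; the folder
# path and check-in comment are placeholders.
import subprocess

CLEAN_FOLDER = r"C:\clean-source"

# Pend an add for everything under the clean folder.
subprocess.run(["tf", "add", CLEAN_FOLDER, "/recursive"], check=True)

# Check the pending adds in as the initial TFS version.
subprocess.run(
    ["tf", "checkin", "/comment:Initial import from locked VSS databases", "/noprompt"],
    check=True,
)
```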
Be aware that TFS does not support sharing files between different projects, as VSS does. If you have any such shared files the link between them will be broken during the migration, resulting in initially identical, but now distinct files in each project. Updates to one of these files in TFS will no longer propagate to the copies in the other projects.
If you do choose to use the VSSConverter.exe tool that ships with Visual Studio Team Foundation Server, then you should install TFS 2008 SP1 first as it includes a number of improvements as detailed on this blog by the migration tools team.
Some of the key features of the release include:
Elimination of namespace conflicts. I previously blogged about this as "the rename problem" and we have fixed the converter to correctly migrate files with overlapping namespaces. This was the biggest pain point for most users trying to use previous versions of the tool.
Automatic solution rebinding. In this latest version, VS solution files will be automatically upgraded to the 9.0 version and checked back in to version control. Previously users were required to do this manually.
Correcting of timestamp inconsistencies. The use of client timestamps by VSS can lead to revisions being recorded in the opposite order that they actually occurred in. The tool now recognizes this issue and continues migrating changes where it would previously fail.
Improved logging. Although we've fixed a lot of issues, providing better, more detailed logging will help users that do run into issues diagnose the problems.
I just googled, but this walkthrough seems like a good reference, and it mentions the tool VSSConverter which should help you make the migration as painless as possible.
I would like to recommend one thing though: Backup. Backup everything before you do this. Should anything go wrong it's better to be safe than sorry.
The walkthrough is at http://msdn.microsoft.com/en-us/library/ms181247(VS.80).aspx
We are currently in the process of doing this at my day job. We are actually making the switch over in about a month. I am a main part of the migration and a big part of why we are getting off of SourceSafe. To help in the migration, I used the Visual Studio® Team System 2008 Team Foundation Server and Team Suite VPC Image. It was very useful. Right off the bat, the image contains a full working TFS installation for you to play and demo with. It also includes Hands on Labs and one of the labs is running the VSS -> TFS migration tool. If you have an MSDN subscription, once you have played with the image, the next step would be to install the TFS Small Team edition that comes with your subscription.
One thing to note is to make sure you get the latest Service Packs for Visual Studio 2008 and the .NET Framework installed on the image. The service packs fixed some annoying bugs and definitely increased the usability of the system. We have a fairly large SourceSafe database with about 90+ projects and the migration tool took about 32 hours to complete. First I made a backup of our SourceSafe database for testing. Then I ran the migration on the test SourceSafe database. Afterwards, I checked the source tree in TFS and everything transferred fine. We kept all the history for our source files from VSS, which was great. No need to keep that stinking VSS database around after we go live.
We are taking the migration in steps. First the source control, letting our developers get used to using it. After that we will migrate the QA and Business Analysts over to use the Work Item tracking features.
My advice is to take the migration in steps. Don't do too much at one time. Give time for people who will be using the system to train up.
VSS Converter is a far from perfect solution. And there are significant differences between the 2005 and the 2008SP1 version of the converter.
For example, in a VSS DB that's been in use for a long time, there will have been a large number of users contributing to VSS. Many of these users will have left the organisation a long time ago and therefore will no longer have domain accounts. TFS requires mapping VSS users to domain accounts, so you will have to decide whether you map old users to a single 'dummy' domain account or to a current team member.
In addition, VSS Converter 2008 requires these domain accounts to be valid TFS accounts. Whereas the 2005 converter does not enforce this.
If your VSS history contains significant folder Moves, then it's likely you will lose all history before the Move. For example, if you Move a folder to a new location, then Delete the previous parent, you will lose all history. See this article for more explanation:
http://msdn.microsoft.com/en-us/library/ms253166.aspx
In one migration I was involved with, we had a 10 year old VSS database that lost all history prior to 6 months ago. This was due to a significant tidy up that took place 6 months ago.
Use the TFS conversion tool.
I've used this tool a few times already; the results are pretty satisfactory, as it brings across the history of changesets from SourceSafe if you want it to.
Whichever way you go, always pay attention to errors and warnings in the log, and check that everything built okay / passed okay.
It's recommended to also run Analyze on the SourceSafe database before running the conversion.
Hope it helps
Good guidance there from my former colleague Guy Starbuck. Another thing to add with that approach: you may have decided over time that you want to refactor the way your application is organized (folders, etc.), and this will give you an opportunity to do so.
I've been in situations where we organized a solution haphazardly without thought (let alone major changes in the application), which led to a desire to organize things differently, and the move from VSS to TFS is a great opportunity to do so.
As far as the original question:
"This migration will for sure mean that our working habits have to be modified in some way. Do you think that these changes could be a problem for the organization? Think about a group of about 20 .NET developers in a single site."
I would say - yes your working habits will change but much more for the better.
You should no longer use "check-out" locks and "Get Latest on Check-out".
You can now effectively Branch and Merge
You will now have "changesets": all files checked in at the same time are grouped together. This makes historical change tracking much easier, but more importantly, rollbacks are much easier (i.e. find all files checked in at the same time and roll them back).
Associating Check-ins to Work Items. Don't overlook Work Items! The biggest mistake you can make is to only use TFS as a VSS replacement. The Build and Project Management features are excellent - you paid for them - USE THEM!
As far as details on how your experience will change, another former colleague of mine (and Team System MVP) Steve St. Jean wrote a detailed article on the differences: From VSS to TFS