Team services with many small projects and single developer - azure-devops

My company has used a cloud TFS host for many years. Now the host has disappeared from the internet, and a lot of code and all history has been lost. People are working on it, so it might still be resolved, but in any case I need to fairly quickly set up new source code handling in Visual Studio Online and need some hints on how to do it.
The current solution was set up long before I started, and for various reasons I am currently the only developer. That might change in the future, but there will never be more than one developer per Visual Studio solution.
I work with many small customer-specific projects in Visual Studio (Windows applications, Windows services, WebAPI, SSRS, SQL, Entity Framework). The average size of a project is maybe 20 hours from start of coding to deployment (there are a few larger projects as well). New features and bug fixes are sometimes added after deployment (sometimes years later), but those are usually 2-6 hour projects.
The current process has one TFS project per customer, and each contains at most a handful of Visual Studio solutions. There are no dependencies between the solutions, and common code is handled with NuGet.
We had around 250 projects in the cloud, and even though I have so far only recovered 50 of them (the ones I had locally), we will sooner or later end up with similar numbers. The total size was in the region of 30 GB (a lot of that comes from TFS checking in the NuGet packages folder by default).
For most projects there is no need for work items, Kanban, reporting or other ALM features. Only developers will ever use Visual Studio Online. I would like to work with a branch/pull-request/merge process; coming from Git/Mercurial, I have never felt comfortable with TFS.
So my questions are now:
What is a good way to structure the projects?
Single VSTS-project for everything
A VSTS-project for each customer (as today)
A VSTS-project per Visual Studio solution
What is a good way to structure the repositories?
One repository for everything
One repository per customer
One repository per Visual Studio solution
How do Visual Studio and the online portal cope with hundreds of projects/repositories where 90% are not active? I usually have 3-5 instances of Visual Studio running with different solutions at any time.
I have read a lot of recommendations, but they all seem to deal with long-lived projects and/or teams of developers.
My main concerns are:
How much work it is to add a new customer or Visual Studio solution (this happens weekly)
Getting-started time. Sometimes an external developer is involved. It is not common, but when it happens I don't want them to spend a lot of time on clone/pull (security is not an issue)
Standards. I want the process to follow standards/best practice as much as possible to make it easier to document for other developers, e.g. not encoding information in project names or forcing a folder structure.

I would suggest:
Create a project for each customer, named after the customer, to hold that customer's work (WebAPI, SSRS, SQL, etc.).
Since a VSTS project then belongs to a specific customer, all the repositories in that team project should relate to that customer. The repositories can be structured as a separate repository for each case/solution of that customer.
There are only two kinds of version control system hosted on VSTS/TFS: Git and TFVC. Since you are familiar with Git and Mercurial, Git is the natural choice for your projects.
Git repositories hosted on VSTS work like other remote repositories (GitHub, Bitbucket, etc.): they are bare repositories without working directories. The content is stored as Git objects identified by 40-character SHA-1 hashes rather than as checked-out solution folders. Most of the time you work in your local repository with no connection/communication with the remote repo; your local Git repo only communicates with the remote when you clone, pull or push.

Project(s) structure: Single project for everything.
Repositories structure: One repository per VS solution
Regarding how VS works with these projects/solutions: you can close one solution and then open another (you can't open multiple solutions at the same time in the same instance of VS), or you can simply open a single file in VS and edit it. For commit and push you can use Git commands (e.g. git commit, git push).
When external developers are involved, you need to add them to VSTS, and they then clone/pull the source code from the remote.
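A minimal sketch of that clone/commit/push flow from the command line, assuming a Git repo hosted in VSTS (the URL and all names below are placeholders):

```
# One-time: clone the customer/solution repo from VSTS (placeholder URL)
git clone https://youraccount.visualstudio.com/YourProject/_git/CustomerA-SolutionX
cd CustomerA-SolutionX

# Day-to-day: work entirely in the local repo, no connection needed
git checkout -b feature/invoice-export     # hypothetical feature branch
git add .
git commit -m "Add invoice export"

# Only these commands talk to the remote repo on VSTS
git push -u origin feature/invoice-export
git pull
```

External developers follow the same steps, starting with the clone, once they have been added to the VSTS project.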

Related

Azure DevOps & copying code base from one project to another or finding a better way of doing this

I'm seeking advice on the following:
In my development shop we provide a SaaS solution to our customers. We currently have 10 sites that we develop and provide technical support for. We're a small team, just 2 of us. We're using Azure DevOps Services to host and manage our code; right now we're just using it as a code repo. Within our organization we have multiple projects, one per site. Each site uses the same code base except for the web.config, which is used to change the UI/theme for each customer. When we get a request to create a new site, we first create a new site project and then copy our code base from the "golden copy" project.
We use a "golden copy" code base to make feature changes and bug fixes. Once we develop a new feature (or fix an issue) to the golden copy, and then we push it to test, QA beings testing. If testing is successful, then the development team copies the entire "golden copy" code files and copies the code to each site project, build and deploy to test for QA to ensure that site works with the new changes . This can be time consuming and prone to errors.
I would like to know the following:
- Is there a way in Azure DevOps to merge/copy from our golden copy to our other site projects' repos?
- Can you offer a better way of organizing our Organization\Projects setup based on our current setup\workflow?
Thank you,
As Shayki mentioned, you can consider adopting a Git branching strategy. Distributed version control systems like Git give you flexibility in how you use version control to share and manage code.
Keep your branch strategy simple. Build your strategy from these three concepts:
Use feature branches for all new features and bug fixes.
Merge feature branches into the master branch using pull requests.
Keep a high quality, up-to-date master branch.
A strategy that extends these concepts and avoids contradictions will result in a version control workflow for your team that is consistent and easy to follow. For details, please refer to this official document.
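A minimal sketch of that feature-branch / pull-request flow from the command line (the branch and remote names below are illustrative, not taken from your setup):

```
# Start each feature or bug fix from an up-to-date master
git checkout master
git pull origin master
git checkout -b feature/new-theme-option    # hypothetical branch name

# ...commit work on the branch...
git add .
git commit -m "Add new theme option"

# Publish the branch, then open a pull request into master in Azure DevOps
git push -u origin feature/new-theme-option
# After review, complete the pull request in the web portal to merge into master
```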
Is there a way in Azure DevOps to merge/copy from our golden copy to our other site projects' repos?
For this issue, do you mean synchronizing the changes in the golden copy to the other projects' repos? If so, I think it can only be done manually (copying the entire "golden copy" code files to each site project) or by cloning the entire repo into the other projects through the following steps.
In the other projects, select the Import repository option.
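Alternatively, changes in the golden copy can be pulled into an existing site repo from the command line by adding the golden copy as a second remote. A rough sketch, with placeholder URLs and names:

```
# Inside a clone of one of the site repos
git remote add golden https://dev.azure.com/YourOrg/GoldenCopy/_git/GoldenCopy   # placeholder URL
git fetch golden

# Merge the golden copy's master into the site's master
git checkout master
git merge golden/master      # resolve conflicts, e.g. keep the site-specific web.config
git push origin master
```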

"Best Practices" doc for Devs reconfiguring Team Explorer when migrating Collection to new server?

I am planning to soon migrate a couple of collections from an on-premises TFS 2017 server to an on-premises Azure DevOps 2019 server. These collections have multiple Git repos and no older VSTS-style code repos.
I've found all sorts of good documents covering how to migrate the collection, and I am able to do that with ease. I took snapshots of my old and new servers and did a temporary test move over a weekend; everything came up just fine. I then reverted to the snapshots.
Does anyone know of a good document or URL I can provide as instructions to my 20+ developers for reconfiguring their Team Explorer in Visual Studio? The collections on the old server will be detached, so there's no need for them to keep the old server configured. I don't want anyone to have to switch to the new server in a way where they lose any git branches they only have locally (not pushed up to the server).
I myself only use TortoiseGit to interact with the git repos. For my own repos I go to the .git folder, change the URL in the file named "config", and the repo is switched over painlessly.
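For reference, the same re-pointing can be done without editing the config file by hand; a sketch, assuming the remote is called origin and using a placeholder URL for the new server:

```
# Show the current remote URL
git remote -v

# Point the existing remote at the new Azure DevOps 2019 server (placeholder URL)
git remote set-url origin http://newserver:8080/tfs/DefaultCollection/MyProject/_git/MyRepo

# Verify and test the connection
git remote -v
git fetch origin
```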
Almost all of the devs only use Team Explorer. If anyone knows of a good guide that I can walk them through to make the switch from within Team Explorer, instead of having to edit text files and registry keys, I'd appreciate it.
Thanks!
If I understand you correctly, you are looking for guides on connecting your devs' Team Explorer in Visual Studio to the projects on the new on-premises Azure DevOps 2019 server.
You can check the documentation on the Microsoft site: Connect from Visual Studio or Team Explorer.
For more detailed steps you can check out this thread.
You can ask your devs to follow the steps in the above thread to add the URL of the new server to Team Explorer. They can then switch to the code repos on the new server.

Is There A Way To Backup Visual Studio Team Services Projects?

I'm advocating using Visual Studio Team Services for our source control solution, and have actually started doing so. However, my manager, who is somewhat apprehensive when it comes to cloud-hosted storage and services, wants to know what our contingency plan is in the event of Team Services ceasing to be accessible for whatever reason.
I've pointed out that we have our source code on our developers' computers, in their mapped workspaces, but admittedly, if we ended up with just that and no access to Team Services, we'd certainly be in a bit of a bind. They might all be working on different parts of the same solution, and we wouldn't be able to check all of their changes back into the central repository or merge changes made in separate branches. We also wouldn't have access to the comments associated with previous check-ins, or our backlog, tests, etc.
So, the question is, is there a way to backup everything that we're hosting in Team Services so that, in the event of something going wrong, we'd be able to restore all of that to a locally-hosted installation of TFS (or somewhere else)?
I'm a bit late to the party, but we developed a Team Services backup tool. We run it as a scheduled task once a night; it simply clones all our repositories to disk.
Taken from this blog:
We use the VSO Rest API to query our VSO account and get all the data we need. Since in VSO you can only have one Team Project Collection, we retrieve all the team projects of the default collection. Each of these team projects can have multiple repositories that need to be backed up. A folder is created for each team project and saved to a location on disk that can be configured in the app.config. When the team project folder is created, the task loops over each repository in the team project and creates folders for each repository.
You can also fork it on GitHub here
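For reference, a minimal sketch of the same idea with plain git, assuming you maintain a text file (repos.txt) with one clone URL per line (the actual tool discovers the repos through the REST API instead):

```
#!/bin/bash
# Nightly backup: mirror-clone (or refresh) every repo listed in repos.txt
BACKUP_ROOT=/backups/vsts            # backup location, configurable
mkdir -p "$BACKUP_ROOT"

while read -r url; do
  name=$(basename "$url" .git)
  if [ -d "$BACKUP_ROOT/$name.git" ]; then
    git -C "$BACKUP_ROOT/$name.git" remote update --prune   # refresh an existing mirror
  else
    git clone --mirror "$url" "$BACKUP_ROOT/$name.git"      # first run: full copy with all branches and tags
  fi
done < repos.txt
```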
There's no out of the box backup ability.
Now, if you are only referring to source control, and not work items, pull requests, builds, test plans or anything else that the service offers, then I'd suggest you migrate your code over to git.
With git every developer will have a complete copy of the source repository, including all history and commit comments. From there, it's a simple task to push the git repository to a different git hoster (such as bitbucket or github) and make them your new centrally hosted git repository.
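For example, moving an existing full clone to a new central host is just a matter of adding a remote and pushing everything (the URL below is a placeholder):

```
# From any developer's complete local clone
git remote add neworigin https://bitbucket.org/yourteam/yourrepo.git   # placeholder URL
git push neworigin --all     # push all local branches with full history
git push neworigin --tags    # push tags as well
```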
On a historical note, Visual Studio Team Services offered a data export option for a period of time. You might want to add a vote or three to this related UserVoice idea to help raise the importance of the feature with Microsoft.
Side comment: The business risks in using Visual Studio Team Services will come from either Microsoft shutting down the Visual Studio Team Services service or that the underlying Azure infrastructure has such a catastrophic failure that your Visual Studio Team Services account is unrecoverable. Both of those are extremely low risk, and very likely lower than the risks you'd have running TFS on-premises, in your own data centre, unless of course, your infrastructure and staff are better than Microsoft's :-)
Not a full backup in terms of a restore of the service, but you can take a full zip from the root down using the Code web site. Right-click the root folder and there is a zip download option. Pretty neat feature.
The easiest way to back up everything is to use something like the TFS Integration Platform to periodically pull off all your data into an on-premises TFS solution. I've set this up using an Azure VM that we turned off when we weren't actively backing up, which makes it really low cost. For more info on using the TFS IP with Team Services, see this: http://nakedalm.com/migration-from-tf-service-to-tf-server-with-the-tfs-integration-platform/

DVCS with Central Build/ automatic push to server feature?

I am looking for alternatives to NWDI (SAP's NetWeaver Development Infrastructure) as a source control system for developing Java EE applications, primarily because:
NWDI is not a DVCS: developers have to be online to do just about anything.
User interface: It's very difficult to use and to train developers on.
Tracking changes / generating reports: Very limited support for this. For example, I can't find out which projects (files within a project) have been changed in the last 2 weeks.
Code review: You can do code reviews and it has a good diff utility, but that's about it; there is no way to attach a code review to a change request.
Branching and merging are extremely painful.
However the current system has a few handy features:
Automatic builds: No need to write any build scripts; everything is built in. When a new repository (we call it a track) is created, the build is automatically configured based on the types of components (supported by the repository) selected at creation.
A central build is triggered whenever a developer commits (activates the changes). Irrespective of the status of the build, the changes are then inflicted on the entire team.
Automatic push to a central test server: When creating a repository you can define all the servers (central test, QA, prod). A developer can push his changes to the central test server with the click of a button. Again, everything is built in and there is no need to write any hooks like you have to in Mercurial.
I have been exploring Mercurial and Kiln but couldn't find anything helpful. In Mercurial, hooks can be used to do the same, but I guess some customization effort is required.
Are there any DVCSs like Mercurial that do the above two things as well, or is this something I have to customize to make work?
I don't know of a DVCS offering everything built in.
The only alternative (not a DVCS, but with some DVCS characteristics) is Rational Team Concert (RTC), which is free for up to 10 developers.
With a DVCS alone, the usual setup for CI and reviews is:
Git
Gerrit (review)
Jenkins (scheduler)
See "Using Gerrit Git Review with Jenkins CI Server"
Looks like there is nothing useful out of the box. I am going to try out Kiln, as it appears easy to use, and try customizing it.

Eclipse / Aptana File Sync Solutions

Our development team uses Eclipse + Aptana to do their web development work. Currently, most of them map their Eclipse projects directly to the web server. I'd rather they create a local project and use that to sync to the web server project directory they are working on.
The issue is that there aren't any good solutions for this, which is appalling given the popularity of the two.
The FileSync plugin for Eclipse is only one-way, meaning that if another developer makes a change to a file on the server, the first dev isn't even notified and could overwrite the change.
The File Transfer option in Aptana 2.0 doesn't support any sort of Sync, just manually uploading/downloading files.
The Sync option in Aptana 1.5.1 doesn't allow you to merge files when they differ; you can only update one or the other. It does, however, let you view a diff (but only if you right-click and select it), and in that diff you can't make any changes.
I did find a way to upload files to their Aptana Sync repositories using Eclipse Monkey. However, it doesn't work if a user saves multiple files at once ('Save All'). Additionally, there is no notification if a user opens a local file that has an updated copy on the server. I tried to add one using Eclipse Monkey, but I couldn't find any listener in the Eclipse API to do it, and Eclipse Monkey documentation is few and far between.
My only solution at this point is to let them continue mapping directly to the server, or to ask them to do a manual download before they do any work (but again, what if someone uploads a change right after they do that?).
Anyone have any ideas?
April 2010
Add EGit to your Eclipse+Aptana setup, and:
let developers push their developments to a local bare repo (see also this post)
let your local project be updated by a git pull from that same local bare repo, creating/updating a local working directory with the sources merged/updated (or by using a post-update hook, as described in my previous SO link)
let your local Aptana+Eclipse (+EGit) reference that local working directory, which is also used by your web server.
In short, when you are speaking of file synchronization + merges, this is a job for a (D)VCS (a Version Control System, centralized or distributed).
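A rough sketch of that setup with plain git commands, assuming the shared bare repo and the web server's working directory live on the same machine (all paths below are illustrative):

```
# One-time: create the shared bare repo
git init --bare /srv/git/site.git

# Each developer pushes their work to it from their own clone
git push /srv/git/site.git master

# The web server's directory is a normal working copy that gets updated
cd /var/www/site
git pull /srv/git/site.git master
# (alternatively, a post-update hook in /srv/git/site.git/hooks/ can run this pull automatically)
```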
Oct 2011: as xmedeko mentions in the comments, Aptana3 has its own Git plugin.
And it isn't very compatible with EGit: See bug 1988.
Adding to VonC's answer (which is correct IMHO): what probably lies beneath this scenario is that the process you have adopted is not correct in itself, apart from the tools used.
If I understand correctly, you should neither allow nor perform a direct upload from a development version of the project to the web server. Merging is not a job for remote synchronization tools; it should happen well before the deployment phase (an upload to the web server is effectively a deploy).
You should have a dedicated repository taken from some point in the development history (according to your release timeline), a point where merging has already happened. Then deploy it (by means of file synchronization if you want, but that is not mandatory) to a local/staging web server.
There, perform any tests you run against the actively running web site (i.e. integration and/or functional tests). If there are bugs to fix, there are different ways to apply the fixes to the development and staging code repositories. Only after that do you deploy the staging repository to the production web server (again, synchronization tools are one way to do that).
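For instance, with git that "point in development history" can be pinned with a tag and checked out on the staging server (the tag name and remote below are hypothetical):

```
# In the development repo: mark the merged, release-ready point
git tag -a v1.4.0 -m "Release candidate for staging"
git push origin v1.4.0

# On the staging web server: update its clone to exactly that point
git fetch origin
git checkout v1.4.0      # run integration/functional tests against this
# Once staging passes, deploy the same tag to the production server
```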