Nutshell :
I'd like some help / info / resources regarding setting up an automated Team Build / MSBuild / TFS 2010 deployment of my web app to Azure (including all the DB bits).
Ideally I'd like a process I can fire off from the VS 2010 Team Explorer UI via "Queue New Build", then just keep an eye on its progress while I work on something else, with the option to delve into the logs for any issues, knowing that the process is robust, complete and totally non-manual (a rough sketch of the packaging/publish step follows this list), i.e. it:
Backs up all my live data (SQL Azure and Azure Blobs)
Deploys any DB schema changes (contained in my DB project)
Deploys any changes to my core data (e.g. config data, which I keep in my post-deployment scripts)
Does things sensibly (e.g. using compression for deployment packages to save time & bandwidth)
Covers my silly backside (e.g. seamlessly rolling back failed changes)
Keeps the app 100% running during deployments (failed or successful), e.g. sessions left intact, minimal chance of data loss
Keeps detailed logs of the process's progress at each stage for fixing any issues
Keeps everything that should be source controlled... well, source controlled
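To be concrete, here's a rough sketch of what I imagine the packaging/publish step looking like in PowerShell. The target, cmdlet and parameter names are from memory of the Azure SDK 1.x "Publish" target and the Windows Azure Service Management CmdLets (linked further down), so they need verifying against the current SDK - this is just the shape of it, not a working solution:

    # Sketch only - target/cmdlet/parameter names are assumptions to verify.
    # 1. Package the cloud service (produces .cspkg + ServiceConfiguration.cscfg).
    #    Depending on SDK version the target is "Publish" or "CorePublish".
    & msbuild "MyAzureService.ccproj" /t:Publish /p:Configuration=Release

    # 2. Push the package via the Service Management API using the WASM cmdlets
    #    (load their snap-in/module first; its name varies by release).
    New-Deployment -serviceName "myservice" -slot "staging" `
        -storageServiceName "mystorage" `
        -package "bin\Release\Publish\MyAzureService.cspkg" `
        -configuration "bin\Release\Publish\ServiceConfiguration.cscfg" `
        -label "Automated build $(Get-Date -Format s)" `
        -subscriptionId $subscriptionId -certificate $managementCert

The DB bits (backup, schema deploy, reference data) are what I can't see how to hang cleanly off the back of a script like this.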
Background / Dream / Goal :
At my last FT job we had a pretty sweet automated deployment setup for our hosted web apps, using CC.NET (to manage the process), CCTray & the CC.NET web UI to monitor and control, code generation (CodeSmith + NetTiers templates for data access & entities), MSBuild, VS Database Projects, a few .bat scripts, and some handy utilities like PsExec to help out with little bits and pieces.
I didn't set it up, but have some experience managing it, dealing with issues etc.
It was (98% of the time) a lovely experience to deploy. You'd make sure TFS was up to date, double-click CCTray, right-click a project, click "Force Build", then sit back and watch Green => Yellow => Green.
Great !!
Current Situation :
I run my own Micro ISV, and my main project is an App on Azure (in Beta).
I'd very much like to replicate the kind of deployment experience I had before - I'm even considering moving out of Azure to dedicated servers - just because I know I can set up an automated deployment system there.
My main stopping point is the DB bits, which seem like a nightmare. Maybe I'm missing some great free tool or library which would do the job - I really hope so - but I could also really do with someone experienced in this pointing me in the right direction towards a "best practice" solution that wraps up all the little bits neatly.
I have scoured Google, read and read, and burnt hours and hours, but what I seem to find at the moment is half-solutions: not quite right for my project and needs, based on expensive tools I can't afford (near-$0 budget), or plain over my head and a bit incomprehensible & scary.
Now I'm NO sys admin, but with enough time I can generally work out what I need to do for these sorts of things.
However, I don't have ANY time right now, and the success of my whole project really depends on being able to cut out the horrendous 40+ minutes of manual deployment wastage I currently endure.
I want to be able to get some user feedback, find a bug, or code an improvement and confidently just fire off a deploy and crack on with something else.
The extra issues with developing for Azure in its current state (as opposed to dedicated servers), and the currently fairly poor tooling support from MS (I know lots of improvements are coming, but I need something right now), have left me swimming in a sea of "I don't know"s and "I'm not sure"s, and it tends to end in one big:
"I give up + a manual deploy for almost an hour + a little sobbing inside as my dream of deployment heaven dies once again" :(
But I know people out there more proficient, knowledgeable and experienced than myself have cracked this one for themselves. I just can’t seem to find the info I am sure is out there.
So if anyone has some good resources, tips, links, comments, or opinions on this, I'd love to hear.
Details Of My Setup :
The app is up and running in Azure (in beta - partly due to not having the auto-deploy set up), in a production slot. I haven't bothered with a staging slot, as some issues with subdomains / DNS / the auto-generated URL have made that look painful / not feasible.
Azure / App :
App is:
1 Web Role
- ASP.NET 4
- MVC 2
- EF 4
- SQL Azure
- Azure Blob Storage
1 Worker Role - runs some scheduled tasks and works with the same DB and Blob Storage
- SQL Azure
- Azure Blob Storage
The two roles communicate via the Azure Queue service (or will do shortly)
Locally :
Datacenter 08 (DC) + Hyper-V
- VM for TFS 2010
- VM for a Linux firewall
Dev Box 1 (Win 7)
- VS 2010 / VS 08
- SQL 08 R2 / 05
Dev Laptop 2
- as above.
I tend to run these together all the time (so I never need to stop and wait for anything), with the FANTASTIC free tool Synergy binding keyboard, mouse and clipboard together across them.
Some Of The Stuff I've Read :
I have read what I have found, and some of it is great stuff, so I am also posting the links here to help others struggling with this - but none of it quite seems to do the trick. Or maybe I don't get the trick; maybe I'm missing something?
http://deploytoazure.codeplex.com/
How do I manage and publish a database with my MVC2 application on Azure?
How can I automate the "generate scripts" task in SQL Server Management Studio 2008?
http://www.koltovich.com/blog/DeployingAzureProjectFromTFS2010BuildServer.aspx
http://msdn.microsoft.com/en-us/library/ff803365.aspx
http://msdn.microsoft.com/en-us/library/gg432988.aspx
http://www.jimzimmerman.com/blog/2010/03/16/Deploying+An+Azure+Project+Using+TFS+2010.aspx
http://archive.msdn.microsoft.com/azurecmdlets
http://selfpacedazure.web.officelive.com/Documents/Windows%20Azure%20Platform%20Articles%20from%20the%20Trenches.pdf
http://msdn.microsoft.com/en-us/library/gg651132.aspx
http://social.technet.microsoft.com/wiki/contents/articles/overview-of-tools-to-use-with-sql-azure.aspx
http://msdn.microsoft.com/en-us/library/ms178078.aspx
http://blog.syntaxc4.net/post/2011/05/13/Continuous-Integration-in-the-Cloud.aspx
http://blog.syntaxc4.net/post/2009/12/31/Synchronizing-a-Local-Database-with-the-Cloud-using-SQL-Azure-Sync-Framework.aspx
http://social.technet.microsoft.com/wiki/contents/articles/developing-and-deploying-with-sql-azure.aspx
http://blogs.msdn.com/b/tomholl/archive/2011/02/23/using-msbuild-to-deploy-to-multiple-windows-azure-environments.aspx
http://www.scarydba.com/2011/04/25/sql-azure-deployments/
Disclaimer / Forum Abuse Minimisation Blurb :
Like I say, I am NO sys admin, NO script magician, and NO CI guru - I'm a simple-minded web dev, so please be nice if this is mindlessly easy to you, or if I'm stoopidly missing the point. I don't mean to be all "Does You Haz the codes?", but I've basically spent 6 months dreaming that one day soon someone will post a nice clear simple blog entry with an "idiot's guide" that solves all my woes, and an hour later I'll be in deployment heaven again - but I am still waiting (or Googling badly), and it's breaking my little developer's heart :(
P.S. I promise that if I get a good answer here I'll do my bit for the fantastic SO community and spend at least 8 hours scouring for questions I might be able to help with and contributing back.
Great - it seems the new SQL Server (code-named Denali), along with the new SQL Server Developer Tools (code-named Juneau), and specifically the 2.0 release of DAC projects, may well have filled the gap between development and deployment to SQL Azure.
The new v2.0 of the DAC framework expands the set of supported objects to full support of SQL Azure schema objects and data types across all DAC services: extract, deploy, and upgrade
From SQL Azure Import/Export
Also see :
Bob Beauchemin's blog post suggesting schema upgrades on SQL Azure are now supported with the new DACImportExportCli.exe utility.
Other suggestions that the new DAC 2.0 solves the major issues with upgrade deployments to SQL Azure.
And it looks like it should all run side by side with my current setup. Will check it out and update here on progress. Brilliant.
For the database deployments I use Red Gate SQL Compare, which works well with Azure. There is a command-line edition which can be used as part of an automated build process. Regarding keeping the site always running: deploy to staging, so the production site is never down; once deployed, you can swap staging over to production.
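As a minimal sketch of what that command-line step might look like in a build script (the server names, credentials and install path are placeholders, and the switch names should be checked against your SQL Compare version - this is an illustration, not Red Gate's documented recipe):

    # Sketch only - verify switch names against your SQL Compare edition.
    # Pushes schema changes from the build/dev database to the SQL Azure target.
    & "C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe" `
        /Server1:"localhost" /Database1:"MyAppDb_Build" `
        /Server2:"myserver.database.windows.net" /Database2:"MyAppDb" `
        /Username2:"admin@myserver" /Password2:$azureDbPassword `
        /Synchronize

    if ($LASTEXITCODE -ne 0) {
        # Fail the build here so a broken schema sync never reaches the swap step.
        throw "Schema sync failed with exit code $LASTEXITCODE"
    }

A failed sync then stops the script before the staging-to-production swap, so broken schema changes never reach production.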
Related
I'm working on a website using PhpStorm. For a long time I developed it locally, but then I got hosting and a remote ftp server.
I created a new project in PhpStorm with the settings for the remote host, and I found that deploying code takes a long time (over a minute) before I can see the result, which is quite uncomfortable when debugging.
Is there any way to work with the code on a local server and, when I think the project is ready to deploy, just send it to the remote server?
I understand that I could just work in two different projects and deploy the "ready" version to the server via FTP, but maybe there is a more comfortable way?
There are several answers to this question, and most of them are opinion-based, but I will try to keep it objective.
Case 1
A big corporation gives every developer a sandbox to test their code on, and requires every developer to keep their code on that sandbox.
Using mounted drives could be extremely slow, especially when PhpStorm is indexing.
Case 2
An easy way to keep an automatic backup of your code is to use the built-in (S)FTP(S) upload/deploy.
Solution
In both cases you could use the automatic deployment feature that uploads every change to the server; that way the deploy doesn't take over a minute - it is usually already there before you know it.
I cannot recommend using this deployment for production, as it will not pass through your version control, SAT, security setup etc. For that I would suggest something like Rocketeer.
EDIT:
As for two projects: you can define two different deployment servers, use the default one for your testing (with auto-upload or the like), and select the other one from the deployment menu when needed.
I've gone down a rabbit hole and I'm finding it hard to know where to go now.
I have a report project I'm trying to script up for deployment to a SharePoint site on a highly controlled production server. On my dev box I can just deploy my project from BIDS and the reports run. If I upload my RDLs, datasets and data source to the document library directly, they don't. I've done some digging and found that the uploaded files aren't linked in any way, and that BIDS does some extra steps to set the data source for the shared datasets and then sets the reference to those datasets on the RDLs.
So I've been poking around and can see that I need to call SetItemReferences on the ReportingService2010 web service to define the links, but I'm lost using PowerShell. Some scripts I've found are focused on setting data sources, so I'm trying to adapt them using bits from other scripts but getting lost. One example does $Reference = New-Object -TypeName SSRS.ReportingService2010.ItemReference but I don't know where they're getting the SSRS. namespace from.
Incidentally, the structure I have is:
- One Shared DataSource points to a SharePoint List
- One DataSet pointing to the shared DataSource
- Four reports with NO embedded DataSources and five embedded DataSet references each pointing to the shared DataSet applying various filters.
Is there already a built in way to do this so I can avoid hassles?
Requirements here are that I need something extremely simple that doesn't require extra PowerShell modules to be installed (if possible). The network is highly controlled and it's difficult enough to get scripts we run ourselves approved, let alone some third-party module installed on the farm of machines in prod. Basically it will take at least six months to scan, test and formally approve any add-ons, but if we write a very simple script it's much easier.
Yes - deploy with your browser. I have written three separate report projects with SSRS on a highly controlled SharePoint 2010 production environment, and I deployed each of them using the browser.
Deploying with your browser is simpler than using PowerShell. Follow the general steps outlined in the last part of this thread. Doing it via PowerShell is possible, but a far more difficult task.
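For a sense of what the PowerShell route involves, here is a rough sketch (the site URL, document library paths and item names are placeholders, not taken from your environment). The "SSRS." prefix you were wondering about simply comes from the -Namespace parameter of New-WebServiceProxy, which generates the proxy types into whatever namespace you name:

    # Sketch only - all URLs and paths below are hypothetical examples.
    $rs = New-WebServiceProxy `
        -Uri "http://sharepoint/sites/reports/_vti_bin/ReportServer/ReportService2010.asmx" `
        -Namespace "SSRS.ReportingService2010" `
        -UseDefaultCredential

    # Point the shared dataset at the shared data source.
    $dsRef = New-Object SSRS.ReportingService2010.ItemReference
    $dsRef.Name      = "DataSource1"                                         # name used inside the .rsd
    $dsRef.Reference = "http://sharepoint/sites/reports/Docs/MyList.rsds"    # shared data source item
    $rs.SetItemReferences("http://sharepoint/sites/reports/Docs/MainData.rsd", @($dsRef))

    # Point a report's dataset reference at the shared dataset (repeat per report/dataset).
    $setRef = New-Object SSRS.ReportingService2010.ItemReference
    $setRef.Name      = "MainData"                                           # dataset name inside the .rdl
    $setRef.Reference = "http://sharepoint/sites/reports/Docs/MainData.rsd"
    $rs.SetItemReferences("http://sharepoint/sites/reports/Docs/Report1.rdl", @($setRef))

Getting all of those item paths and reference names exactly right for every report is where the difficulty lies, which is why the browser route is usually the simpler one.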
If the admins have this production environment so highly controlled, then there should be a parallel staging environment kept in precise configuration with production and available to you for DevOps of your SSRS reports. You should request to test your install on the staging environment in order to work out your deployment issues (either by browser or PowerShell). If the request gets denied, request again; it's impossible to get things perfect if you don't have access to develop on a similar system.
DevOps on these reports is the last mile of the race and can be difficult if you are the first to do it at your organization. You can do it - just keep going and your reports will be installed. Keep good notes so that when you develop future reports you can repeat this process, and you'll be the go-to person for getting it done in the future. Don't lose faith.
What is the best way to achieve DevOps with XPages?
- Multiple developers working as a team
- On-premises servers [Dev, QA, Prod] - can we replicate to Bluemix?
- Source control
- Automated testing of the UI / application
- Unit testing business logic with a testing framework
- Automated deployment
IDE/Tools
Domino Designer; are there other ways?
Note: use of Views when the data is in an NSF; otherwise the data is in the cloud or SQL. No Forms or other classic Notes design elements.
What are your approaches to this?
This is a high level overview of the topics required to attempt what you're describing. I'm breezing past lots of details, so please search them out; I've tried to reference what I'm currently aware of as far as supporting documentation and blog posts, etc. of others. If anyone has anything good to add, I'm happy to add it in.
There are several components involved with what you're describing, generally amounting to:
scm workflow
building the app (NSF)
deploying the built app to a Domino server
Everything else, such as release workflow through a QA/QC environment, is secondary to the primary steps above. I'll outline what I'm currently doing, attempting to highlight where I'm working on improving the process.
1. SCM Workflow
This can be incredibly opinionated and will depend a lot on how your team does/wants to use source control with your deployment / release process. Below I'll touch on performing tests, conceptually, during/around the build step.
I've switched from a fairly generic scm server implementation to a GitLab instance. Even running a CE instance is pretty fantastic with their CI runner capabilities. Previously, I had a Jenkins CI instance performing about the same tasks, but had to bake more "workflow" into the Jenkins task, whereas now most of that logic is in a unified script, referenced from a config file (.gitlab-ci.yml). This is similar to how a Travis CI or other similar CI config file works.
This config calls some additional helper work, but ultimately revolves around an adapted version of Egor Margineanu's PowerShell script which invokes the headless DDE build task.
2. Building an NSF from Source
I've blogged about my general build process, with my previous Jenkins CI implementation. I followed the blogging of Cameron Gregor and Martin Pradny for this. Ultimately, you need to:
configure a Windows environment with Domino Designer
set up Domino Designer to import from ODP (disable export), ensuring Build Automatically is enabled
the notes.ini will need a flag of DESIGNER_AUTO_ENABLED=true
the Jenkins CI or GitLab CI runner (or other) will need to run as the logged in user, not a Windows service; this allows it to invoke the "headless dde" command correctly, since it runs in the background as opposed to a true headless invocation
ensure that Domino Designer can start without prompting for a user's password
My blog post covers additional topics, such as flagging the build as a success or failure by scanning the output logs for failure markers (a minimal sketch of that check follows the reference below). It also touches on how to submit the code to a SonarQube instance.
Ref: IBM Notes/Domino App Dev Wiki page on headless designer
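As an illustration of that log-scan step, here's a stripped-down version; the log path and the failure pattern are placeholders, not what my actual script uses:

    # Sketch only - adjust the log path and the marker text your build actually emits.
    $log            = "C:\build\logs\headless-designer.log"
    $failurePattern = "Build failed|ERROR"     # hypothetical markers

    if (Select-String -Path $log -Pattern $failurePattern -Quiet) {
        Write-Host "Headless DDE build reported errors - failing the CI job."
        exit 1   # a non-zero exit code marks the GitLab CI / Jenkins job as failed
    }
    Write-Host "Headless DDE build looks clean."
    exit 0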
Testing
Any additional testing or other workflow considerations (e.g.- QA/QC approval) should go around the build phase, depending on how you set up your SCM workflow. A lot of the implementation will revolve around the specifics of your setup. A general idea is to allow/prevent deployment based on the outcome of the build + test phase.
Bluemix Concerns
IBM Bluemix, the only PaaS that runs IBM XPages applications, will require some additional consideration, such as:
their Git deploy process will only accept a pre-built NSF
the NSF must be signed by the account owner's Bluemix ID
Ref:
- IBM XPages on Bluemix
- Bluemix Docs: Building XPages apps for the Bluemix Runtime
3. Deploy
To Bluemix
If you're looking to deploy an XPages app to run on Bluemix, you would want to either ensure your headless build runs with the Bluemix ID, or is at least signed with it, and then deploy it for a production push either via a git connection or the cf/bluemix command line utility. Bluemix's receive hooks handle all the rest of the deployment concerns, such as starting/stopping the server instance, etc.
To On-Premise Server
A user ID with appropriate credentials needs to either perform a design replace/refresh, or stop a dev/test/staging server, copy the .nsf across, and start it back up. I've heard rumors of Cameron Gregor making use of a plugin to Domino Designer to perform the operations needed for OSGi plugin development, which sounds pretty useful. As most of my Domino application development is almost purely NSF based, I'm focusing more on an approach of deploying to a staging/dev/test server, which I can then run a design task against to do the needed refresh/replace - closer to the "normal" Domino way of doing things.
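A bare-bones sketch of the stop/copy/start variant; the Windows service name and paths here are made up for illustration, and on a real server you'd want to verify the service actually stopped (or prefer the design refresh route) before touching files:

    # Sketch only - service name, share and data paths are hypothetical.
    $service = "Lotus Domino Server (DominoData)"
    $source  = "\\buildserver\drop\MyApp.nsf"
    $target  = "D:\Domino\Data\apps\MyApp.nsf"

    Stop-Service  -Name $service                            # take the dev/test/staging server down
    Copy-Item -Path $source -Destination $target -Force     # drop in the freshly built NSF
    Start-Service -Name $service                            # bring it back up with the new design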
Summary
Again, there are a lot of moving pieces involved here, some of which get rather opinionated rather quickly. For example, I'm currently virtualizing my build machine so I can spool up a couple of virtual machines from it, allowing more than one build at a time. If there are major gaps in the process, let me know and I'll fill in what I can.
I am trying to migrate from TFS 2013 Update 4 to Visual Studio Online. I have had real difficulties using OpsHub Migration Utility, so I have come now seeking advice on how I may go forward.
I have 65 team projects to migrate, some with source code, others fairly empty with just a few work items, so they differ in size a fair bit. I want to maintain the branches we currently have which span team projects, so I initially tried selecting all 65 projects for migration. This took 11 hours at the "creating configuration" stage of the utility, and then at the end the tool complained it could not communicate with the OpsHub Visual Studio Online Migration Utility service. So back to square one.
My next approach was to batch the projects into sets of 10. The validation and configuration-creation stages took 10 minutes per set, which was a big leap forward, and the migration actually started. It has currently been running for about 17 hours and has done 29,000 work item revisions out of 480,000 across all team projects, so I think this may well take a couple of weeks to clear. It hasn't started on the version control data yet, which I'm hopeful it will do at a later stage, but that currently shows as not running.
This is running on a fast i7 box with 16 GB RAM and SSDs, on a 100 Mbps internet connection.
I have emailed OpsHub to find out if the commercial version will perform better, but any suggestions on what I could do better are welcome, including alternatives to the migration tool.
For the configuration creation taking a long time, the assumed reason is an issue communicating with the OpsHub Visual Studio Online service and waiting until the timeout.
The work item migration time seems to be close to expectations; migration of version control data will start once the work item migration is completed.
Also, as there is a large amount of data it is taking time, but the migration will keep running in the background, and you can use your source TFS instance (if required) without stopping the migration.
For more details on professional services you can drop an email to support#opshub.com
There is another tool named TFS Integration Platform, developed by Microsoft for TFS migration. You can try it to see if it can migrate the data faster.
Is there any way to move custom queries I've set up on my company's locally hosted TFS server to my instance of TFS on visualstudio.com? I've Googled/Binged/Yahoo'd and even DuckDuckGo'ed around, and asked other devs using the service, but none of them had any saved queries they wanted to move, so no one had done any research yet. After a few fruitless searches I've turned to the experts here on SO. Has anyone found anything about this they can share?
The usual suspects when it comes to TFS migration (namely the TFS Integration Platform) do not support moving project or personal queries. Depending on the quantity of queries, manual recreation is obviously possible. However, if there are a significant number, another option is to use the TFS SDK (Microsoft.TeamFoundation.Client and Microsoft.TeamFoundation.WorkItemTracking.Client). With it you can access the "My Queries" and project queries, including their folder hierarchy.
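As a rough sketch of the reading side of that approach (the assembly versions, collection URL and project name are placeholders, and you would still need matching code to recreate the queries on the Visual Studio Online side):

    # Sketch only - dumps query folders and WIQL from the source server.
    Add-Type -AssemblyName "Microsoft.TeamFoundation.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
    Add-Type -AssemblyName "Microsoft.TeamFoundation.WorkItemTracking.Client, Version=12.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"

    $tpc   = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection([uri]"http://tfs:8080/tfs/DefaultCollection")
    $store = $tpc.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])

    function Show-Queries($item, $indent = "") {
        if ($item -is [Microsoft.TeamFoundation.WorkItemTracking.Client.QueryFolder]) {
            Write-Host "$indent[$($item.Name)]"              # folder, e.g. "My Queries" / "Shared Queries"
            foreach ($child in $item) { Show-Queries $child ($indent + "  ") }
        } else {
            Write-Host "$indent$($item.Name)"                # query definition
            Write-Host "$indent  $($item.QueryText)"         # the WIQL to recreate on the target
        }
    }

    Show-Queries $store.Projects["MyProject"].QueryHierarchy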
One example of this is available on Mike Poulson's blog, where he shows going from a TFS 2010 server to a TFS 2012 server. While that example targets on-premises servers, the same approach holds for a move from on-premises TFS to Visual Studio Online.
Some of the queries may need "translation" in the migration process (naming differences, etc.), and using the SDK can help with that too. So at the end of the day it's a tradeoff between manual recreation and the effort to code/debug/test a solution with the SDK.