I have a small team working on a web site project using Visual Studio 2010 and Team Foundation Server 2012.
In order to have proper control over deployment, I would like to implement my dream deployment strategy as shown in the figure ( https://www.dropbox.com/sc/foy5fh7pntreiha/AAB4L4hhbpjcm1zHi6VBLSa6a )
My team has no problem checking code in and out between their development PCs and the TFS server, but I have a problem deploying code from the TFS server to the target web server.
I have read many articles about build-and-deploy, but I don't think I need a build step because mine is not a web application project and all the code basically already lives on the target web server. We don't need to compile the project into a DLL and then upload it to the web server.
I tried the "Copy Web Site" feature in Visual Studio 2010, but in the Copy Web Site panel it is always the programmer's local PC code on the left-hand side and the target web server on the right-hand side.
I want this deployment flow because I think it is the safest: no one can accidentally upload the wrong version of the code to the web server, and everyone is forced to check their code in to the TFS server before it can be uploaded to the web server.
Please kindly help me.
Thanks
Don't do that.
Instead, use staging and production servers with stage and master Git branches. Tell your team to work exclusively out of stage while you control the merge to master, and use DeployHQ or a similar service to hook into Git (GitHub) and trigger automatic deployments.
Much better than Visual Studio, and much safer: should a deploy fail due to a file error, DeployHQ will abort the entire deployment and revert to the old state.
Related
We've got some legacy on-premises apps that we're considering moving off-site, and we are evaluating all our options. I understand that Azure Web Sites would be a lot easier to set up, but at this point it looks like we may need some of the additional control that Cloud Services gives us.
However, everything I've read about Cloud Services so far demonstrates how you build an app and then deploy the build to the cloud. Similarly, you can connect to a Visual Studio Online repository, define builds in VSO, and after a commit, a build is performed and the build is deployed to the cloud.
However, in our case some of our pages are Classic ASP pages. In the event that one of these pages changes, I have not been able to figure out a workflow that allows us to deploy the updated files. Remember, Classic ASP files do not have a "build" process; like a PowerShell script, they are interpreted at runtime.
There is no Visual Studio solution or project involved with these apps. It's just a package of files we want to upload. For a "proof of concept" I decided to start with the simplest possible "app," a simple "hello.txt" file, and I have not been able to figure out a way to deploy this without "wrapping" it in a Visual Studio solution.
I was hoping that I could use, e.g., Publish-AzureServiceProject, but this appears to need a ServiceDefinition.csdef file, and again, I'm not sure how to create one without setting up a solution in Visual Studio--a solution that wouldn't be used for anything.
I have a feeling I'm missing something and just need to find the appropriate publish settings file, or proper use of an Azure cmdlet. Is there a straightforward way to publish a package of files to an Azure Cloud Service?
Josh, you will need to bundle the files into a deployable package. This can be achieved using the cspack command-line tool and a hand-crafted service definition file. Your ASP files would be treated as 'content' in this case.
The easiest way would be just to create a stub Visual Studio solution and include a 'Cloud Service' project to which you add all the ASP files. This way all your files will be redeployed in the event that your Web Roles require recycling by the Azure fabric.
While this might seem like a big overhead if you need to tweak just a single file, it is the correct way to manage PaaS deployments in Azure. If this process doesn't work for you then you should consider moving to an IaaS VM you fully manage yourself.
One thing that may be helpful is to realize that the web roles in Cloud Services are just VMs running IIS. For that reason, you can connect to them just like any other server, via RDP, FTP, etc. Our team often bypasses the overhead for simple things, like deploying a new CSS file or an image, by simply copying it over the old-school way.
Again, not sure if this helps you, but old-school techniques work just as well. :-)
What is currently the "best" way to develop a back-end system in Azure Mobile Services?
Specifically, what tools are available? From what I've seen, most examples just go to the Management portal and manually add a few lines into the script window. This is worse than using just Notepad, and doesn't have any concept of version control...
Is there any way to make a project in VS 2012 that contains all the Node.js code that will run in the Azure Mobile Service? Is there a way of fully running that code in a local development environment that mimics Mobile Services?
I need to have server-side code with much more complexity than is shown in most of the Mobile Services samples or documentation that I've been able to find.
I have a web site and a Windows 8 Store app that need to authenticate against, and access relatively complex data structures from, a back-end database. The solutions being pushed right now all seem to put Mobile Services at the center, using simple REST against raw tables, but all the examples are too simple to be useful.
Can someone point me to a "real-life" sample of using Mobile Services, and a "mature" way of developing and testing such a system using the tools in Visual Studio?
Thanks.
Why you have no option other than the Management portal is really beyond me. It seems very awkward for a C#/.NET developer to go back to Notepad-style programming with console.log() debugging.
What I would love to see is some Node.js entry points that you could connect to a regular C# assembly which could fulfill the request (as in ASP.NET MVC or Web API), giving you the full .NET Framework at your disposal.
What I could see as a possible architecture is to have:
ASP.NET MVC hosted on Azure ---- writes processed data with logic ----> Azure SQL DB <---- reads from ---- Azure Mobile Services ---- bridge to ----> Mobile devices
Or
Cloud Worker Role on Azure ---- crunching/processing ----> Azure SQL DB <---- reading/writing raw data ---- Azure Mobile Services ---- bridge to ---> Mobile devices
You can use Mobile Services for the device-facing pieces, scheduling, and push notifications with limited code, and do most of the coding in a managed .NET environment.
AMS (Azure Mobile Services), along with Azure in general, has advanced dramatically since this question and its answers were written.
Some of this still holds true. If you have a lot of Node.js code written outside the Azure portal, you will want to copy and paste it into the portal online -- into the custom API section, and perhaps also into the scripts on the SQL back-end tables for CRUD operations.
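For orientation, a custom API script in that section is just a Node module that exports HTTP verb handlers. A rough sketch of one (the route, table name, and response shape here are illustrative assumptions, not taken from the posts above):

    // Hypothetical custom API script (e.g. api/orders.js) for the Node.js back end.
    // The table name, filter and response shape are made up for illustration.
    exports.get = function (request, response) {
        var orders = request.service.tables.getTable('orders');
        orders.where({ userId: request.user.userId }).read({
            success: function (results) {
                response.send(200, results);
            },
            error: function (err) {
                console.error('orders read failed:', err);
                response.send(500, { error: 'internal error' });
            }
        });
    };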
The hope for C# developers is that there is now a preview in which you can skip Node.js entirely and, very shortly, build everything without it. There are some bugs to work out, but in six months this should be fairly solid.
I had questions and issues, and a guy named Carlos (carlosfigueira) was very helpful:
Azure Mobile Services - Getting more user information
Josh covers unit testing server scripts here: http://www.thejoyofcode.com/Unit_testing_Mobile_Services_scripts_Day_7_.aspx
In this tutorial, he uses the Mocha testing framework for JS (in TDD mode) and walks through an example of testing an insert script that encrypts the value of a particular property (text) and a read script that decrypts it (the value is encrypted at rest in the SQL database).
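To give a flavour of the approach, a stripped-down test in that style might look like the following. The script path, property name, and the fake request object are assumptions for illustration, and it presumes the insert function has been wrapped so it can be require()d:

    // Rough Mocha sketch (run with "mocha --ui tdd") for an insert script that
    // encrypts the "text" property. Paths and names are illustrative assumptions.
    var assert = require('assert');
    var script = require('../scripts/table/message.insert'); // hypothetical wrapper exporting insert()

    suite('message insert script', function () {
        test('encrypts the text property before executing the insert', function (done) {
            var item = { text: 'secret message' };
            var fakeRequest = {
                execute: function () {
                    // By the time execute() is called, the clear text should be gone.
                    assert.notEqual(item.text, 'secret message');
                    done();
                }
            };
            script.insert(item, { userId: 'test-user' }, fakeRequest);
        });
    });

The same idea works for the read script: stub the request to feed back a fake encrypted row and assert that the script decrypts it before responding.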
You can also find an aggregation of links and tutorials here.
I would suggest that you build this solution using Windows Azure Mobile Services, especially since it now supports Node.js npm packages. This means you can create the API you want on Windows Azure using npm modules and work with it easily through WAMS. Have a look at the following link; it will help explain what I mean:
http://weblogs.asp.net/scottgu/archive/2013/06/14/windows-azure-major-updates-for-mobile-backend-development.aspx
For the client, I also suggest building it using SignalR, which is designed for cases such as yours, where real-time applications require a lot of transactions from the server side.
http://www.asp.net/signalr
You can also find more details about how to integrate the two at the following link: http://hhaggan.wordpress.com/2013/07/12/signalr-node-js/
I hope this helps; let me know if you need anything else.
For running locally, the mobile service has the same Kudu environment that is available in Azure Websites, so you can browse to https://your_service_name.scm.azure-mobile.net. If you navigate to the Debug Console from the top nav, you can download everything running in the site/wwwroot folder.
You can run this Node.js project locally (on Windows only if you require the SQL Server npm package). Your code is in App_Data/config/scripts. If you replace the downloaded content with your current local Git working copy, you can develop and debug locally, and then push changes as usual.
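If you just want to poke at one of the downloaded scripts outside the service, a throwaway harness along these lines works for console.log-style debugging. Note that the raw table scripts are bare function declarations (no module.exports), and anything that touches the AMS globals (tables, push, statusCodes) would need stubbing too; the path, table name and stub objects below are assumptions:

    // Hypothetical local harness (debug.js) for a downloaded table script.
    var fs = require('fs');

    // Direct eval of the (non-strict) script pulls its insert() declaration into this scope.
    var source = fs.readFileSync('./App_Data/config/scripts/table/todoitem.insert.js', 'utf8');
    eval(source); // defines insert(item, user, request)

    // Stub just enough of the runtime to step through the script.
    var item = { text: 'buy milk', complete: false };
    var user = { userId: 'sid:debug-user' };
    var request = {
        execute: function () {
            console.log('insert would continue with item:', item);
        },
        respond: function (statusCode, body) {
            console.log('script responded:', statusCode, body);
        }
    };

    insert(item, user, request);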
Tools I use:
Eclipse with a JS environment (or any Node.js IDE).
Git
Postman
Steps:
Enable source control for your Azure Mobile Service.
Pull to your local machine and create an Eclipse project from the source.
Make changes and push.
Test with Postman.
This procedure lets me develop really fast, and Eclipse flags the common JS errors. But it has obvious downsides:
No debugging (I use console.log)
The project ends up with a lot of commits (it's hard to use Git for proper source control)
I just did a blog post on running Azure Mobile Services locally: http://www.mikelanzetta.com/2014/09/running-azure-mobile-services-locally/ - basically it interrogates the API and starts up Express, and allows you to run Mocha yourself locally. It's a bit cleaner than pulling down the full wwwroot from the scm link, and I found that using my local runner as a Git submodule made it easy to work with (and easy for me to use VSO for managing my tests).
Anyway, for actual development, I use the Git integration and WebStorm - it automatically figures out the tasks in my local Gruntfile and makes it easy to run and test. Once it's deployed, Postman is helpful.
I have some questions about the best mechanism for deploying MVC web applications to different environments. Previously I used setup projects (MSIs), but as these have been discontinued in VS 2012 I am looking for an alternative.
Let me explain my current setup. I have CI using TFS Build 2010, with Team Foundation Server for source control.
A number of developers work on their local machines and check in to the TFS server. We regularly deploy to a single-server dev environment and a load-balanced QA environment with two servers. Our current process involves installing an MSI which carries out custom actions such as the following:
bring the current app offline with an app_offline.htm file
run database scripts (from the database project in the solution)
modify web.config (different for each QA web server)
label the code
warm up each deployed file via HTTP request
etc.
This is the current process. Now I would like to make some changes. Firstly, I need an alternative to MSIs. From some research I believe that Web Deploy via IIS, using MSDeploy, is the best alternative, and that I can use web.config transforms for the web.config modifications. Is this correct, and if so, could I get an outline of what I need to do?
Secondly, I want to set up continuous delivery via TFS Build. I have no idea how this may be achieved; would it be possible to get an outline of how it can be integrated into my current setup? Rather than being check-in driven, I would like it to be user driven following a check-in. Also, would it be possible for this to run the database scripts from the database project in the solution?
Finally, there is also a production environment, but I would like to deploy to it manually - can my process also produce an artifact that I can install by hand?
Vishal Joshi has some information on his blog that is reasonably good: http://vishaljoshi.blogspot.com/2010/11/team-build-web-deployment-web-deploy-vs.html. It does have the downside that your deployment password is included in the properties you pass to MSBuild.
Syed Hashimi has also posted some information on this in another question: Team Build: Publish locally using MSDeploy.
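For context, the approach in that post boils down to passing Web Deploy properties straight to MSBuild in the build definition. A rough illustration of the property set (server URL, IIS path and credentials are placeholders, and it all goes on one line in the MSBuild Arguments field) -- note the password sitting in plain text, which is the downside mentioned above:

    /p:DeployOnBuild=True /p:DeployTarget=MSDeployPublish
    /p:MSDeployServiceUrl=https://webserver:8172/msdeploy.axd
    /p:DeployIisAppPath="Default Web Site/MyWebApp"
    /p:MSDeployPublishMethod=WMSVC /p:AllowUntrustedCertificate=True
    /p:CreatePackageOnPublish=True /p:UserName=deployuser /p:Password=secret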
I'm undecided about CRM at the moment. It's a great tool for business users, but so far for development it's been a bit against the grain. The next problem I need to tackle is how to easily source-control the JavaScript used within forms. We use TFS for our source control.
Has anyone had any experience or got any ideas on how to do this?
The obvious choice would be to copy and paste the JS into source control, but that's also an obvious pain in the rear.
A couple of things that we do in our projects:
We use the Web Resource Utility included with the CRM SDK (actually a modified version of it) to deploy JavaScript web resources to a particular solution. This makes it very easy to keep script files checked in to source control as normal and avoid copying and pasting.
We wrote a custom HTTP module that we use on local deployments. It intercepts requests for JavaScript libraries and redirects them to a location on local disk. That way, we don't have to actually redeploy the web resources as we test; we just copy the JavaScript files to disk. (Note that this would be unsupported in a production environment. We just do it in our development environments to ease the pain of JavaScript deployment.)
I answered a very similar question here - Version Control for Visual Studio projects and MS Dynamics CRM (javascript)
My choice for source control is TFS, holding each of the CRM 2011 JScript libraries.
We try to mirror the file structure that Dynamics uses for web resources in a basic library project. Version control works as normal; we just don't use the output from the project.
You can also try the new "CRM Solution" project template (installed from the SDK), which gives you the ability to deploy from the context menu of the project.
I've had some issues with the template, but it's something to check out.
Hope this helps.
You can take a look at my answer to my own question here.
The MS Dynamics CRM 2011 SDK has the SolutionPackager.exe utility, which can split all CRM resources into a file tree that you can store either in Git or in TFS.
Any web resource in CRM 2011 is a pain to manage. We just end up doing a lot of copying and pasting in and out of TFS 2010 (which has actually caused some problems with bad pastes).
Currently out of the box there isn't an easy way to do it.
Only worry about this if you really need the ability to go back to old versions of web resources. I've found that I don't often have to do this. Remember that the web resources are stored in SQL Server just like they would be if you put them in TFS, so as long as your CRM database is being backed up, you won't lose the web resources. In traditional development, it is important to keep the source in TFS because you can't easily get back to it once you compile and release. With CRM development, your web resources are mostly HTML or JavaScript, so you can always get at the source.
If you really need version control, why not build a quick little console app that downloads all customizations every night and stores that zip file in TFS? True, it wouldn't be as easy to get at older versions, but you should gain a lot of productivity by not having to manually keep TFS in sync. This also has the benefit of storing all customizations in TFS, not just web resources.
Silverlight is the obvious exception here - I would definitely store Silverlight web resource source code in TFS, because it is a "compiled" web resource. You are already in Visual Studio, so TFS is a natural fit anyway.
Hope that helps!
With Visual Studio Publish, CruiseControl.NET, MSBuild, aspnet_compiler.exe, and Web Deployment Projects out there, how would one know which tool to use to ultimately get a .NET 2.0 web application into a testing/production environment?
With .NET 1.1, I simply copied all the files over to a directory on the server and set it up as a virtual directory in IIS. Unless I am really missing something, that seemed to work just fine. Now I'm reading about how important it is to put some good thought into 2.0 deployment, and the more I read, the more confused I get.
Please break down how to choose which tool to use, and why you would use that tool. If more than one tool is needed, please explain how they relate to this process.
CC.NET is for continuous integration; it can build your setup projects as artifacts, but that is not its main purpose. MSBuild is the Microsoft build system -- again, not related to deployment. aspnet_compiler precompiles your web sites, which may make deployment easier, but is not in itself deployment.
Web Deployment Projects are what you should be looking at. Here's a decent little post that goes over some of the options for deployment, and a reference from MSDN. There are also commercial products.
In most cases, you can right-click on the project in VS.NET and choose "Publish". This will give you a few options for deploying via FTP or file path.
(Screenshot of the Publish Web dialog: http://img26.imageshack.us/img26/1261/screencfl.png)
What we do is publish to an SVN repository, then run svn update on the machines it needs to go to...
I use TeamCity, which implements:
Rebuilding the solution with devenv.exe on the command line
Changing settings in web.config (connection strings and debug mode) with sed.exe
Precompiling the web site with aspnet_compiler on the command line
Copying the solution to FTP (with an internal tool)