We have WCF services deployed and running in the Azure cloud. We have changes in some DLLs and want to update them on the VM, but don't want to go through the regular deployment/redeployment process.
We are thinking of manually copying the DLLs to the approot and sitesroot folders. Will it work?
Will it pick up the new DLLs when the VM restarts at any point in the future?
To answer your questions:
Will manually copying DLLs to the approot and sitesroot folders work: Yes (make sure you do this on each instance if you have multiple instances running).
Will these DLLs survive a reboot: Yes (see Reboot Role Instance: ... Any data that is written to the local disk is persisted across reboots. ...)
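If you go this route, the copy itself is trivial to script. A minimal sketch follows; the drive letter and folder layout are assumptions (they vary between deployments, so verify them on the instance first), and it must be run on every instance:

    # Hypothetical paths: verify the actual drive letter (often E: or F:) on the instance.
    $updatedDlls = 'C:\updates\*.dll'   # wherever you staged the new DLLs

    foreach ($target in 'E:\approot\bin', 'E:\sitesroot\0\bin') {
        # Overwrite the existing assemblies in both the role root and the IIS site root.
        Copy-Item -Path $updatedDlls -Destination $target -Force
    }

    # Restart IIS so the worker process picks up the new assemblies.
    iisreset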
But I would suggest doing this only when you're testing things out while developing your service.
Do NOT plan to use this for production deployments, because if something goes wrong with your instance, the Fabric Controller might decide to destroy that instance and deploy a new one (the same could apply for Windows Updates). This new instance would go back to the initial state of your deployment (the contents of the .cspkg you deployed).
To make your development deployments even easier you could also activate Web Deploy on your Web Role to deploy from Visual Studio: Enabling Web Deploy for Windows Azure Web Roles with Visual Studio (again, do not use this for real deployments; this is only for when you're testing things out).
Note: Web Deploy will not work with multiple instances.
No, and this is not the way to go. If you want to be more dynamic, you have to take the approach of the Windows Azure Accelerator for Web Roles. Although the project is no longer supported or developed, it will give you a good foundation for dynamically loading assemblies (in this case, entire sites) from Blob storage.
We are running out of space on the D:\Temporary Storage drive on our Service Fabric cluster VMs (5 nodes). I have conversed back and forth with MS support about what is safe to delete from this drive, and the answers I'm getting are ambiguous at best.
I've noticed that we have many older versions of our applications and services on the VMs that we don't need anymore. Getting rid of these will definitely help free up space. I've asked MS support if it's safe to delete the old versions of the applications and they said yes, but then directed me to these links:
https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-deploy-remove-applications#remove-an-application-package-from-the-image-store
https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-deploy-remove-applications#remove-an-application
https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-deploy-remove-applications#unregister-an-application-type
So the three sections we have are:
Remove an application package from the image store
Remove an application
Unregister an application type
These all involve running PowerShell scripts, which I am very much a novice with. I have direct RDP access to the VMs and have the ability to simply delete the files via Windows File Explorer. Is it OK to do it this way, or do I need to go the PowerShell route for deleting and unregistering the application? At least for #1, removing the application package from the image store, there shouldn't be any issue with me just deleting it via Windows File Explorer on the VMs, correct?
EDIT: this is not a duplicate of Run out of storage on Service Fabric scale set:
I am asking about manually clearing space on the Service Fabric cluster VMs; the above thread is about setting up your application deployment to auto-delete old versions of applications. These are not duplicates.
You shouldn't delete files manually from within the VM; Service Fabric should handle it, and you may cause issues.
The right way to remove them is the way the documentation describes, using PowerShell, for example:
Remove-ServiceFabricApplicationPackage -ApplicationPackagePathInImageStore MyApplicationV1
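For context, a fuller sequence covering all three documented steps might look like the sketch below; the cluster endpoint, application name, type name, and versions are all placeholders. Steps 1 and 2 only apply if an application is still running on, or registered for, the version you want to get rid of.

    # Connect to the cluster first (endpoint is a placeholder).
    Connect-ServiceFabricCluster -ConnectionEndpoint 'mycluster.region.cloudapp.azure.com:19000'

    # 1. Remove a running application that still uses the old version (name is hypothetical).
    Remove-ServiceFabricApplication -ApplicationName 'fabric:/MyApplication' -Force

    # 2. Unregister the now-unused application type/version.
    Unregister-ServiceFabricApplicationType -ApplicationTypeName 'MyApplicationType' -ApplicationTypeVersion '1.0.0' -Force

    # 3. Remove the package from the image store to free the disk space.
    Remove-ServiceFabricApplicationPackage -ApplicationPackagePathInImageStore 'MyApplicationV1' -ImageStoreConnectionString 'fabric:ImageStore'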
You can also remove it via Service Fabric Explorer:
One option will try to delete all the application package versions registered in the cluster (if none is in use); the other will delete a specific version (if it is not in use).
Keep in mind that to remove a package you must first remove any running application that is using that package version.
The other option is to delete the old version whenever you deploy a new one; see this other SO question: Run out of storage on Service Fabric scale set
I'm running an ASP.NET Core web service that's dead easy to deploy to IIS on a Windows Server using WebDeploy: right-click the project in VS, choose Publish, done. The currently running site is backed up and then updated. Couldn't be easier.
However, Windows services are another matter. I'm taking over maintenance of a service that is currently updated by building the Release configuration, remote desktoping to the server, stopping the current service, renaming the folder the service executable resides in (for easy rollback in case something goes wrong), copying over the build output from the dev machine, renaming the copied folder to the original service folder name so the service finds the executable, and starting the service.
Is there a way to publish/update Windows services that is as easy as running WebDeploy? I haven't found any. There's always the possibility of abusing WebDeploy and just replacing the Windows service head project with an ASP.NET head project running on IIS, but the service has nothing to do with the web and I have no idea whether there would be any side effects to this solution.
(I could of course just write a script that performs the copying etc., but a built-in VS deployment process as with web projects would be optimal. As for continuous integration/deployment, it is unfortunately not something we have ever done at my company.)
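For reference, the manual routine described above is easy to capture in a small script as a stopgap. A minimal sketch, where the service name and all paths are hypothetical:

    # Hypothetical names/paths; adjust for the real service and build share.
    $serviceName = 'MyWindowsService'
    $serviceDir  = 'C:\Services\MyWindowsService'
    $backupDir   = "$serviceDir.bak"
    $buildOutput = '\\devmachine\drops\MyWindowsService\Release'

    Stop-Service -Name $serviceName

    # Keep the previous version around for easy rollback.
    if (Test-Path $backupDir) { Remove-Item $backupDir -Recurse -Force }
    Rename-Item -Path $serviceDir -NewName (Split-Path $backupDir -Leaf)

    # Copy the new build into place and restart the service.
    Copy-Item -Path $buildOutput -Destination $serviceDir -Recurse
    Start-Service -Name $serviceName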
I have a master data model with some entities, and it is deployed on the production server.
Now I have created two more new entities on the development server and want to move only these two entities.
If anyone has any ideas, please share them.
Thanks!
You have two options.
Web app (easiest): On your dev server, go to System Administration, click Deployment, and create a package. You then deploy this package on the production server by following the same steps, but choosing deploy instead of create under the 'deployment' button.
The alternative is to use MDSModelDeploy.exe. You can find it on the server in the appropriate folder; generally it's at this path: C:\Program Files\Microsoft SQL Server\130\Master Data Services\Configuration.
I recommend you use this method, as you have more control: you can choose to deploy with data or without, or clone your model. You can read more here: https://learn.microsoft.com/en-us/sql/master-data-services/deploy-a-model-deployment-package-by-using-mdsmodeldeploy
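For illustration, creating a package on the dev server and deploying it on production with MDSModelDeploy.exe looks roughly like this; the model, version, service, and package names are placeholders:

    # On the dev server: create a package, including master data (names are placeholders).
    .\MDSModelDeploy.exe createpackage -model "MyModel" -version "VERSION_1" -service "MDS1" -package "C:\temp\MyModel.pkg" -includedata

    # On the production server: deploy the package as an update to the existing model.
    .\MDSModelDeploy.exe deployupdate -package "C:\temp\MyModel.pkg" -version "VERSION_1"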
I can also recommend you consider the ModelPackageEditor when your model starts getting big. It gives you control over exactly what you deploy: entities, views, business rules, etc.
You need to have a deployment strategy in place, because if your development and production environments are not exactly the same, you run into deployment errors. That normally happens when, for example, business rules exist on the environment to which you are deploying but not on your dev environment. MDS uses copious amounts of IDs, and if the models are not in sync, you run into problems.
We've got some legacy on-premises apps that we're evaluating moving off-site, and we are evaluating all our options. I understand that Azure Web Sites would be a lot easier to set up, but at this point it looks like we may need some of the additional control that Cloud Services gives us.
However, everything I've read about Cloud Services so far demonstrates how you build an app and then deploy the build to the cloud. Similarly, you can connect to a Visual Studio Online repository, define builds in VSO, and after a commit, a build is performed and the build is deployed to the cloud.
However, in our case some of our pages are Classic ASP pages. In the event that one of these pages changes, I have not been able to figure out a workflow that allows us to deploy the updated files. Remember, Classic ASP files do not have a "build" process; like a PowerShell script, they are interpreted at runtime.
There is no Visual Studio solution or project involved with these apps. It's just a package of files we want to upload. For a "proof of concept" I decided to start with the simplest possible "app," a simple "hello.txt" file, and I have not been able to figure out a way to deploy this without "wrapping" it in a Visual Studio solution.
I was hoping that I could use, e.g., Publish-AzureServiceProject; however, this appears to need a ServiceDefinition.csdef file, and again, I'm not sure how to do this without setting up a solution in Visual Studio, a solution that wouldn't be used for anything.
I have a feeling I'm missing something and just need to find the appropriate publish settings file, or proper use of an Azure cmdlet. Is there a straightforward way to publish a package of files to an Azure Cloud Service?
Josh, you will need to package the files into a deployable package. This can be achieved using the cspack command-line tool and a hand-crafted definition file. Your ASP files would be treated as 'content' in this case.
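For example, the packaging step might look roughly like the sketch below. The SDK path, role name, site name, and folder names are all assumptions; ServiceDefinition.csdef here is a small hand-written file declaring a single web role whose site points at the .\content folder.

    # Package a folder of Classic ASP files into a .cspkg (names and paths are assumptions).
    & 'C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\<version>\bin\cspack.exe' `
        ServiceDefinition.csdef `
        /role:WebRole1;content `
        /sites:WebRole1;Web;content `
        /out:MyPackage.cspkg

The resulting MyPackage.cspkg (plus a matching ServiceConfiguration.cscfg) can then be uploaded through the portal or with the management cmdlets.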
The easiest way would be just to create a stub Visual Studio solution and include a 'Cloud Service' project to which you add all the ASP files. This way all your files will be redeployed in the event that your Web Roles require recycling by the Azure fabric.
While this might seem like a big overhead if you need to tweak just a single file, it is the correct way to manage PaaS deployments in Azure. If this process doesn't work for you then you should consider moving to an IaaS VM you fully manage yourself.
One thing that may be helpful is to realize that the web roles in Cloud Services are just VMs running IIS. For that reason, you can connect to them just like any other server, via RDP, FTP, etc. Our team often bypasses the overhead for simple things, like deploying a new CSS file or an image, by simply copying it over the old-school way.
Again, not sure if this helps you, but old-school techniques work just as well. :-)
We have a C#, .NET 4.0, Windows application which we deploy to a terminal server (developed using VS 2010). This application makes use of several WCF services sitting on another server.
Our users access the front-end via remote desktop session. (They all have a .RDP file on their desktops.)
My question is regarding the deployment of this front-end. Currently, if we need to do an emergency deployment during business hours, we need to kick off all the users that are hooked into the app (as they are using the DLLs that we need to replace). This is not ideal, obviously. We work in quite a business-critical environment, so these deployments are unavoidable. I've investigated ClickOnce, but have read that you cannot use it with terminal services applications. (Which kind of makes sense, since it's essentially one app being "accessed" by several clients...)
I would like to be able to do a "silent" deployment whereby the user knows nothing about the fix until they restart their instance of the application. I'm not sure whether this is even possible.
I would appreciate any guidance or suggestions on this!
Yep, I do this all the time with an RD app: you just need to move or rename the DLLs instead of deleting them. Windows allows moves and renames while DLLs are in use, but prevents you from deleting them. If you use Windows Installer to deploy your app, it will do the moves automatically (and delete the old versions when the system is next rebooted).
Once you replace the DLLs this way, existing sessions will continue to use the old, renamed versions, and new sessions will use the new versions. Of course, depending on how many DLLs you have, how long it takes your app to load them into memory, and how much activity you have on your server, you could run into a scenario where the app loads some of the old DLLs and some of the new ones when you're in the middle of updating them, but that would likely be rare.
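If you script the swap by hand rather than using Windows Installer, the rename-then-copy step could look something like this sketch (folder and share paths are hypothetical):

    # Hypothetical paths; adjust for the real app folder and build share.
    $appDir   = 'C:\Apps\MyTerminalApp'
    $newBuild = '\\buildserver\drops\MyTerminalApp'

    Get-ChildItem -Path $appDir -Filter '*.dll' | ForEach-Object {
        # Windows permits renaming a loaded DLL, just not deleting it.
        Rename-Item -Path $_.FullName -NewName ($_.Name + '.old') -Force
    }

    # Drop in the new versions; new sessions will load these.
    Copy-Item -Path (Join-Path $newBuild '*.dll') -Destination $appDir

    # The .old files can be cleaned up after the next reboot or once sessions end.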