If you have both a web and a worker role in an Azure solution, waiting for the update package to be built, uploaded to cloud storage, and then deployed can be exhausting and waste a lot of time.
How can you upload/deploy only the worker or web role of a Microsoft Azure solution that contains both roles, and save both bandwidth and time?
There is no option to build a package for only one of the two roles, but if you have limited bandwidth or traffic and want to cut down the upload time (which can be quite a big portion if you have a lot of static content; look here for an example), there is one workaround.
As you may know, the deployment package generated by Visual Studio (the 'cspkg' file) is nothing more than an archive file.
Suppose you want to update the WORKER role only. The steps are:
Create the update package as normal
Open it with your archive manager of choice (e.g. 7-Zip File Manager)
Inside, among the other files, there are two 'cssx' files, one for each role. Delete the cssx file of the role you do not want to update.
Upload to Azure Blob Storage (optional)
Update the instances from the Azure Management Portal using the 'local' or 'storage' source as normal
On the Role dropdown, select only the role you want to update
Press OK :)
Hope this helps.
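If you would rather script this edit than open the package in an archive manager each time, a rough sketch along these lines should work. It treats the cspkg as a plain ZIP archive, and the package and role file names are only placeholders for whatever your solution actually produces:

    # Rough sketch: copy a .cspkg (a ZIP archive) while leaving out the
    # .cssx of the role you are NOT updating. Names below are placeholders.
    import zipfile

    src = "MyCloudService.cspkg"              # package built by Visual Studio
    dst = "MyCloudService.worker-only.cspkg"  # trimmed copy to upload
    drop = "WebRole1.cssx"                    # role payload to exclude

    with zipfile.ZipFile(src) as zin, \
         zipfile.ZipFile(dst, "w", zipfile.ZIP_DEFLATED) as zout:
        for item in zin.infolist():
            if item.filename != drop:
                zout.writestr(item, zin.read(item.filename))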
It is a lot easier to just add two additional cloud projects to your solution. In one project, have it reference only your web role. In the other project, have it reference only your worker role.
You can keep the cloud project that references both roles and use that for local debugging, but when it is time to deploy, right-click the cloud project that references only the role you wish to deploy and click "Publish".
You will end up maintaining configuration files for each cloud project but that sounds a lot easier than manually messing around with editing the package file each time.
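For example, the solution might end up looking something like this (project names are illustrative); each cloud project keeps its own ServiceDefinition.csdef and ServiceConfiguration.*.cscfg:

    MySolution.sln
        AzureService.Both\        references WebRole1 + WorkerRole1 (local debugging)
        AzureService.WebOnly\     references WebRole1 only (publish web-only updates)
        AzureService.WorkerOnly\  references WorkerRole1 only (publish worker-only updates)
        WebRole1\
        WorkerRole1\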
Related
I need help to better understand how to create a complete CI/CD pipeline with Azure DevOps for APIM. I have already explored the tools and read the docs:
https://github.com/Azure/azure-api-management-devops-resource-kit
But I still have questions. My scenario:
An APIM dev instance with APIs, operations, and other settings already created, and an APIM prod instance that is created but empty.
I ran the extractor tool and got the templates (not all of them; I still need the master/linked templates), and that is where my doubt sits. I also already have two repos (dev and prod).
How can I create the master template, and how will my changes from the dev environment be applied to prod automatically?
I didn't use the policyXMLBaseUrl parameter and am not sure what path to insert there, although it seems #miaojiang inserted a folder from Azure Storage.
After some searching and attempts I managed to deploy APIs and operations from one environment to another, but we still don't have a fully automated scenario where I make a change in one instance and it automatically becomes available in the other. Is it necessary to edit policies and definitions directly in the repositories, or to run the extractor tool again?
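For reference, this is roughly the extractor configuration I am working with. Everything below is a placeholder, and the exact parameter names may differ between versions of the resource kit; the last two base-URL parameters (where I believe the generated linked templates and policy XML files have to live, e.g. in an Azure Storage container) are the part I am unsure about:

    {
      "sourceApimName": "my-apim-dev",
      "destinationApimName": "my-apim-prod",
      "resourceGroup": "my-apim-dev-rg",
      "fileFolder": "./extracted-templates",
      "linkedTemplatesBaseUrl": "https://<storageaccount>.blob.core.windows.net/templates",
      "policyXMLBaseUrl": "https://<storageaccount>.blob.core.windows.net/policies"
    }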
The company I work for currently has several projects in TFS 2015, each with their own build definitions.
We are in the process of transitioning from TFS to VSTS.
One of the features of VSTS that we were trying to utilize in the builds is the Secure Files in the projects' Library, to store the certificate issued by our company used for digitally signing assets in different installers.
It is currently duplicated in each project.
We were planning on uploading the certificate to one project and then use the 'Download Secure File' task in all of the projects' builds (that need it), to eliminate the maintenance of having the same certificate in multiple projects.
Not surprisingly, after uploading the certificate to one project, the file is not listed in any other project's Library, nor is it available for download as part of those projects' builds, even if I try to grant the other projects/teams a security role on the secure file (even the Administrator role).
Is there a way to have a secure file in one project's Library be shared across other projects, so that it can be downloaded as part of a build task?
It's not a big secret that the VSTS team has been working towards making individual projects portable, so that you can take a project, with everything that belongs to it, and move it to another account. No clue when, or even if, this will ever be released, but it is part of the reasoning behind some of the separation between team projects.
In order to make this seamless, direct links and dependencies between projects are actively discouraged, and old cross-project features are slowly disappearing from the UI (even if the API still supports them in many cases).
I suspect that if you configure the build's authorization scope to be "Collection", the REST API will be able to access the secure file from the other project, but it would require a custom task.
The guidance would be to replicate the secure file to each team project that needs access to it.
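As a very rough sketch of what such a custom task could do; the endpoint shape, query string, and api-version below are assumptions from memory, so verify them against the Secure Files REST reference before relying on this:

    # Assumed Secure Files endpoint -- verify against the official REST docs.
    import requests

    org = "https://dev.azure.com/your-organization"           # placeholder
    project = "ProjectThatOwnsTheCertificate"                 # placeholder
    secure_file_id = "00000000-0000-0000-0000-000000000000"   # placeholder
    pat = "personal-access-token-or-build-OAuth-token"        # placeholder

    url = (f"{org}/{project}/_apis/distributedtask/securefiles/"
           f"{secure_file_id}?download=true&api-version=5.0-preview.1")
    resp = requests.get(url, auth=("", pat))
    resp.raise_for_status()

    with open("signing-cert.pfx", "wb") as f:
        f.write(resp.content)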
I'm using Visual Studio Team Services and I'm trying to set up Release Management to allow automated deployments for our Azure Web App to multiple environments. I would like the same source to be deployed to each environment, but with modified configuration settings.
I was hoping that I could create a single Build for my application, and then modify the configuration at deployment time for each environment. I'm aware that this can be done for appSettings and connectionStrings (either through Tokenization, or even managing those settings via the Azure portal), but I'd like to be able to make more general changes to the web.config file. For example, I want to be able to:
Update 'simple' settings such as appSettings/connectionStrings
Update multiple attributes on elements (like httpErrors)
Insert or rewrite sections of the config file itself (for example to add IIS rewrite rules, or to remove unwanted HTTP handlers for production)
Currently we achieve this by using config file transformations and separate publish profiles for each environment (manual deployment). I can't see a way to re-use this if I want a single release pipeline.
I'm hoping someone can help point me in the right direction. I'm also happy to accept alternative solutions - ultimately I just want to be able to deploy the same source (e.g. from the same commit in source control) to multiple environments, with different configuration, while keeping some kind of flow from dev, to test, to eventually production.
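For reference, the kind of transform we rely on today looks roughly like this (the elements and handler names are just examples):

    <!-- Web.Release.config (illustrative fragment) -->
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <system.webServer>
        <httpErrors errorMode="Custom" existingResponse="Replace"
                    xdt:Transform="SetAttributes(errorMode,existingResponse)" />
        <handlers>
          <remove name="TRACEVerbHandler" xdt:Transform="Insert" />
        </handlers>
      </system.webServer>
    </configuration>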
You can use the Tokenization Task to update the files based on the environment variables.
Other similar extensions: Replace Tokens and Colin's ALM Corner Build & Release Tools.
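For example, with the Replace Tokens task you could put placeholders in web.config and define one release variable per environment for each value; the #{...}# delimiters below are just a common default, so match whatever pattern your task is configured with:

    <!-- web.config fragment with placeholder tokens (illustrative) -->
    <connectionStrings>
      <add name="DefaultConnection" connectionString="#{DefaultConnectionString}#" />
    </connectionStrings>
    <appSettings>
      <add key="EnvironmentName" value="#{EnvironmentName}#" />
    </appSettings>

The release definition then supplies DefaultConnectionString and EnvironmentName per environment, and the task rewrites the file during each deployment. Note that token replacement only substitutes values; for larger structural changes (rewrite rules, removing handlers) you would still need a transform or a scripted step.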
I am a newbie to ATG and have a question. Please read through my understanding below; my questions are at the end. Also correct me if my understanding is wrong.
As per my ATG learning so far, a typical staging-enabled ATG setup looks like this (at a basic level):
Asset Management Server - stores/manages internal users (BCC/CA/Merchandising/ACC users), versioned commerce assets, and other versioned repositories
Staging server - unversioned commerce items and other repositories
Production server - unversioned commerce items and other repositories; also stores/manages external users (customers) in the "core schema"
In this setup, the external (customer) profiles are stored only on the production site.
Since the staging site is usually described as a replica of the production site:
Should the Store (customer-facing) application be deployed on the staging server as well?
If so, how will it point to the production core schema?
Setting that aside, I have also heard of a 'preview feature/server'. Isn't that staging? What is the difference?
Using the 'Asset Management Server' you are able to create or update assets in the scope of a single project. These changes can only be viewed within the context of the project in which they are edited, and as such you are able to 'preview' them on the 'Asset Management Server'. This preview also only renders the asset in a popup and does not give you access to the navigation of the site around the asset.
Assume you want to be able to 'preview' your changes in the context of other projects but don't yet want to make this go live. In this instance you will create a 'Staging server' and through your project workflow publish your changes to the 'Staging server' for 'review'. Now you are able to see your changes (ie. 'preview') along with other projects that have also been published to the 'Staging server' without exposing this to your external customers. This is particularly useful when you are also using Endeca in the scope of an Oracle Commerce solution.
Once you are happy with your project(s) in the 'Staging server' you would typically then approve and deploy to your 'Production server'.
Your 'Staging server' will need its own Core and Switching Schemas. It will also require a code deployment, similar to what you deploy onto your 'Production Server'. You will need to configure additional Data Sources within your application container and add new components, pointing at these data sources in your environment layer. For example you will need a new JTDataSource_staging.properties, to be added into the 'Asset Management Server' environment. You will also need to add pointers in your repositories to access the new environment, for example ProductCatalog_staging.properties.
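As a very rough illustration of those properties files (component paths, class names, JNDI names and data source wiring will differ per installation and application server):

    # /atg/dynamo/service/jdbc/JTDataSource_staging.properties (illustrative)
    $class=atg.nucleus.JNDIReference
    JNDIName=java:/comp/env/jdbc/ATGStagingDS

    # /atg/commerce/catalog/ProductCatalog_staging.properties (illustrative)
    $basedOn=/atg/commerce/catalog/ProductCatalog
    dataSource=/atg/dynamo/service/jdbc/JTDataSource_staging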
So overall your 'Staging server' is a copy of your 'Production server' but with access to your published projects prior to them being made accessible to your external customers.
I think I may be missing something and hope you can advise.
I have been developing a project using VS2013 with EF6. I use Visual Studio each time I want to deploy the latest version of the system to my Azure Website.
The Azure Website has a linked database resource (SQL Azure database).
This has been going great. However, yesterday I decided to create a Virtual Machine and move the SQL database to it. So I did this, and now I have a new database as well as the old linked-resource one.
So I'm ready to publish the app with the new database settings pointing at the VM.
I changed the connection string in the publish wizard and published, making sure I had the right settings, i.e. 'use this connection string at runtime', 'execute Code First migrations', etc.
However, it took me a while to realise that the app on the cloud server I just published to is still pointing to the OLD linked-resource Azure database.
I'm not sure what else I have to do; I thought it was only a matter of changing the publish setting for the database connection string.
Am I missing something? Should I delete the linked resource in the Azure Website settings, and if I do, would that make it work? It just seems odd because, like I say, I'm publishing the site again with new settings. Or does Azure read the portal's publish settings and somehow override which database I want it to point to?
Please advise, many thanks
John
PS: I can connect fine to the new database from my local Management Studio. I get no errors; I'm just not sure how to tell Azure to use the connection string in the publish profile, beyond what I am already doing.
The "linked resource" in the Windows Azure management portal should have no impact on your application's functionality. It is really just a way to help you understand / visualize the resources your application is using.