Automate backup of Azure API Management services using Jenkins or any other tool - powershell

I have written multiple PowerShell scripts to back up an Azure API Management service. One script calls another and exports all the products and properties to my local machine.
I want to automate this process so that everything is backed up directly to Bitbucket. To achieve this I have configured Jenkins, which is running on a CentOS server, but I don't know how to automate things with it.
I tried installing the PowerShell plugin on Jenkins, but because I have multiple scripts that depend on one another, I can't just paste the whole thing in as it is and run it.
So is there a way I can run these multiple scripts with the Jenkins PowerShell plugin instead of composing one single script and running that?
Second, should Jenkins be installed on my local machine rather than on the CentOS server in order to achieve this?
Or
does Jenkins have a plugin that can link to my Azure account and export the services to Bitbucket (with the PowerShell scripts out of the picture)?
Is there any other tool or alternative that is a little less complicated?
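For the first part of the question (running several dependent scripts from a single Jenkins PowerShell build step), one rough approach is a thin wrapper script that Jenkins runs after checking the repository out into its workspace. This is only a sketch: the script names, the backup folder and the assumption that the backup folder is a clone of the Bitbucket repository are placeholders, not the actual scripts.

# Wrapper script a Jenkins "PowerShell" build step could run.
# Script names and paths below are placeholders for the real scripts.
$ErrorActionPreference = 'Stop'

# Jenkins checks the repository out into the workspace, so run the scripts
# from there in dependency order instead of pasting them into the plugin.
$scriptRoot = $env:WORKSPACE
$backupDir  = Join-Path $scriptRoot 'backup'   # assumed to be a clone of the Bitbucket repo

& (Join-Path $scriptRoot 'Export-ApimProducts.ps1')   -OutputPath $backupDir
& (Join-Path $scriptRoot 'Export-ApimProperties.ps1') -OutputPath $backupDir

# Commit and push the exported files to the Bitbucket remote configured for this job.
Set-Location $backupDir
git add .
git commit -m "APIM backup $(Get-Date -Format 'yyyy-MM-dd')"
git push origin master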

Related

How to integrate JMeter with Azure DevOps Pipeline when the scripts are kept in a git repository?

I have a requirement to integrate JMeter scripts, checked into a Git repository, with a DevOps pipeline so that I can run them on a specific VM in Azure.
Basically, I should have all my .jmx and .csv files in a Git repository, and when I run the pipeline with the script name as a parameter, it should run that script on a specific VM (one without a static IP) and copy the .jtl to some storage.
What is the best way to achieve this?
With a DevOps pipeline so that I can run the JMeter scripts using a specific VM in Azure. What is the best way to achieve this?
If the specific VM exists before the current pipeline runs, you can consider installing a self-hosted agent on it.
To do CI/CD with Azure Pipelines, we need at least one agent. If we use a Microsoft-hosted agent, it provides one fresh VM for us to run jobs on. Since you need to run the script on your own specific VM, I suggest using a self-hosted agent. You can follow the steps here to install an agent on your own VM. (The steps are quite easy and only take a few minutes.)
After making your VM a self-hosted agent, the pipeline will call your VM to run the jobs. Now your original issue turns into how to run JMeter locally from the command line. See similar issues here: Five Ways To Launch a JMeter Test without Using the JMeter GUI and Run .jmx file through command line ....
1. So now we can use a command-line task in the pipeline to run the JMeter commands shared in the similar topics above (a sketch follows this list). These jobs are done on your specific VM.
2. I'm not sure which location you want to copy the .jtl to, but you can use the Azure File Copy task to copy files to Microsoft Azure storage blobs or virtual machines (VMs), or use a simple copy/xcopy command in your command-line task to copy files to another location on the same machine (the specific VM).
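As a minimal sketch of what that command-line/PowerShell step could look like on the self-hosted agent (the JMeter install path, the test plan name and the destination share are assumptions, not values from the question):

# Run the .jmx in non-GUI mode and copy the .jtl somewhere durable.
$jmeter  = 'C:\jmeter\bin\jmeter.bat'                                    # JMeter already installed on the VM (assumed path)
$jmx     = Join-Path $env:BUILD_SOURCESDIRECTORY 'tests\load-test.jmx'   # test plan from the Git checkout (placeholder name)
$results = Join-Path $env:AGENT_TEMPDIRECTORY 'results.jtl'

# -n = non-GUI mode, -t = test plan, -l = results (.jtl) file
& $jmeter -n -t $jmx -l $results

# Hand the .jtl to an Azure File Copy task afterwards, or copy it directly to a share.
Copy-Item $results -Destination '\\fileshare\jmeter-results\'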
Hope all above helps :)
I have used the following tasks in an Azure CD pipeline.
The "Run Taurus" task is configured as follows,
where "_WM WebClient TestArtifacts" is the Git/Azure Repos directory where the .jmx file is kept (in code).

Creating a release pipeline for several VMs

I have multiple ASP.NET web applications that I want to release to different VMs. Manually installing a DevOps agent on every single VM seems really inefficient. Is there a way to make this process faster? Is it possible to create a release pipeline that could push the code directly to the public IP of the VM?
As a workaround, you can prepare a script to register each new agent. You can find the parameters here: Self-hosted Windows agents - Unattended config.
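A rough sketch of such a registration script, using the unattended options from that page (the organization URL, personal access token, pool name and local paths are placeholders):

# Extract the downloaded agent package and register it without prompts.
Expand-Archive -Path 'C:\Downloads\vsts-agent-win-x64.zip' -DestinationPath 'C:\agent'
Set-Location 'C:\agent'

.\config.cmd --unattended `
    --url https://dev.azure.com/yourorganization `
    --auth pat --token '<personal-access-token>' `
    --pool 'Default' `
    --agent $env:COMPUTERNAME `
    --runAsService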
If you deploy your web application with an IIS deployment task, you can try the Manage IIS task, which can create a website on a remote machine.
You can then add a Windows machine file copy task to copy the build artifacts to the website's physical path on the remote machine.
Another workaround is to manage IIS with a PowerShell script, so you can add a PowerShell on target machines task to run a script that manages the IIS website. You can refer to the example scripts on this page and this page. For more information about the IIS PowerShell commands, you can refer here.
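For example, a minimal sketch of the kind of script such a task could run (the site name, port and physical path are placeholders):

# Create the application pool and website if they do not exist yet.
Import-Module WebAdministration

if (-not (Test-Path 'IIS:\AppPools\MyAppPool')) {
    New-WebAppPool -Name 'MyAppPool'
}

if (-not (Test-Path 'IIS:\Sites\MyWebSite')) {
    New-Website -Name 'MyWebSite' -Port 8080 `
                -PhysicalPath 'C:\inetpub\MyWebSite' `
                -ApplicationPool 'MyAppPool'
}

# Restart the site so it picks up freshly copied files.
Stop-Website -Name 'MyWebSite'
Start-Website -Name 'MyWebSite'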

Install or use software on an Azure DevOps hosted agent

I am new to Azure DevOps and hoping this is a simple fix. I have a PowerShell script that uses Tabular Editor to deploy a .bim file to Azure Analysis Services. This works great on my local machine. I have tried to get this working in the DevOps pipelines with no luck. I haven't found a way to install the software on the hosted agent. Question 1) Can I install software on a hosted agent, e.g. on Hosted VS2017?
Failing being able to install software on Microsoft's hosted agent, I checked the TabularEditor.exe file into the source code (I know this isn't best practice). The executable gets put into the build artifact and published. Then, in the release, when my PowerShell script is called it just hangs; the script gets stuck there. The PowerShell script reads from a config file and also uses the path to the Tabular Editor executable.
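For what it's worth, a rough sketch of how the bundled executable could be invoked from the release so that its output and exit code show up in the log rather than the step appearing to hang. The artifact path and the Tabular Editor arguments below are placeholders, not a verified command line:

# Launch the exe from the published artifact, capture its output and fail fast.
$exe       = Join-Path $env:SYSTEM_ARTIFACTSDIRECTORY 'drop\TabularEditor\TabularEditor.exe'
$arguments = '<deployment arguments per the Tabular Editor docs>'   # placeholder
$stdout    = Join-Path $env:AGENT_TEMPDIRECTORY 'te-out.log'
$stderr    = Join-Path $env:AGENT_TEMPDIRECTORY 'te-err.log'

$proc = Start-Process -FilePath $exe -ArgumentList $arguments `
                      -NoNewWindow -Wait -PassThru `
                      -RedirectStandardOutput $stdout `
                      -RedirectStandardError  $stderr

Get-Content $stdout
if ($proc.ExitCode -ne 0) {
    Get-Content $stderr
    throw "Tabular Editor exited with code $($proc.ExitCode)"
}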
The script I am using works fine if you use a self-hosted machine, assuming the agent has the correct permissions.
I have another Analysis Services script that is ready and works provided someone creates an XMLA of the model first; then we provide that as an input instead of a .bim file. But this is not quite the automated route I am looking for.
Also, I am aware that there is a third-party task that does Azure Analysis Services deployment, but I want to avoid using that.
In summary, I am looking to find out:
1) whether I can indeed install software on Microsoft's hosted agents
2) whether I should be able to use the executable in my build artifact instead
3) whether there is a better way to deploy Analysis Services with a .bim file
I appreciate this is long-winded and slightly unique, but any insight or information would be appreciated.
Thanks

OnPrem TFS 2015.1 vNext - What step to release to an on-premises IIS server?

I'm trying to use TFS 2015.1 on-premises to build a CI pipeline for our dev and UAT environments. I've created a vNext CI build, which builds fine. But when I want to add a deploy step for an on-prem IIS server, I only see Azure Web Deployment options.
Ideally I wanted to add a step that uses the existing deploy (MS Deploy) profiles, which I'm able to use from VS2015 directly via 'Publish'. However, I see no option to do so.
How can I deploy the latest build to internal dev servers (not Azure)? I would like to use the MS Deploy option, unless there's a better way of doing it.
The fact that there is no option to do so starts to make me think there's probably a different way to accomplish it!
Thanks.
If you're able to upgrade to TFS 2015.2, web-based Release Management came out with it; it works similarly to Build vNext with flexible and open-source tasks. You can also customize tasks.
Here's a link for IIS Web App Deployment from the vso-agent-tasks GitHub repo, where Microsoft stores updated versions of the tasks that you can download for web-based Build and Release Management.
I'll be publishing a blog about web-based RM with TFS 2015 Update 2 or VSTS on my website in the next few weeks. To give you an idea, though, the starting point (for a web application) is a folder in your web project called WebDeploy (no significance - any name will do) that contains a PowerShell DSC script which configures the server, deploys the web files and then replaces any tokenised configs. To get a feel for this, see this post about how to use DSC to configure servers. (It only covers part of the final script, though!) The next steps are as follows (a minimal DSC sketch appears after them):
In the build hub, create a Website artifact containing your web files and the DSC script.
In the release hub, for an environment, use a Windows Machine File Copy task to deploy the artifact to a temp folder on the target node.
Then use a PowerShell on Target Machines task to execute the DSC script. After configuring the server, the script copies the web files to their proper location, sorts out the config using xReleaseManagement and cleans up the WebDeploy folder.
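A minimal sketch of that kind of DSC script, assuming it runs locally on the target node and using placeholder source and destination paths (it is not the script from the blog):

Configuration WebDeploy
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost'
    {
        # Make sure IIS is installed on the target server.
        WindowsFeature IIS
        {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }

        # Copy the web files from the temp folder the release copied them to
        # into the site's physical path.
        File WebContent
        {
            Ensure          = 'Present'
            Type            = 'Directory'
            Recurse         = $true
            SourcePath      = 'C:\Temp\WebDeploy\Website'
            DestinationPath = 'C:\inetpub\wwwroot\MyApp'
            DependsOn       = '[WindowsFeature]IIS'
        }
    }
}

# Compile the configuration to a MOF and apply it on this node.
WebDeploy -OutputPath 'C:\Temp\WebDeploy\Mof'
Start-DscConfiguration -Path 'C:\Temp\WebDeploy\Mof' -Wait -Verbose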
See this article for general details of the route I'm taking, but watch out as it has some errors, e.g. the firewall instructions are incomplete (file and print sharing through the firewall needs to be enabled).
I can thoroughly recommend the PowerShell DSC route - I've had a few glitches but on the whole it feels very productive and the right way to be going.

Source control for server-side scripts in Azure Mobile Services

I am using Azure Mobile Services as the backend for my mobile app. Despite my best efforts, my server-side scripts are getting complex now. Is there a way I can keep the insert, update, read and delete scripts for the tables in my service in source control, and maybe have a way to deploy them from within Visual Studio?
Have you checked out the Node-based Azure Command Line Tools? They will likely hold the solution to your problem. These tools allow you to neatly manage your mobile service from your dev machine. The newly added CLI tools for Mobile Services also support downloading your scripts. Just run the following command in your Azure PowerShell:
azure mobile script download <service_name> <script_name>
The script name syntax is as follows:
For tables: table/<tablename>.{insert|read|update|delete}
For Apple Push Notification feedback: shared/apnsFeedback
For the scheduler: scheduler/<jobname>
Once you have downloaded your scripts and placed them on your local filesystem, you could put them in source control alongside the client that consumes your mobile service, or just give them their own Git repository. You can't, however, sync your source control repository with your mobile service. In order to upload any changes you've made to your scripts, you'd need to execute the following command in the Azure CLI again:
azure mobile script upload <service_name> <script_name>
I'm not sure if you can upload multiple scripts at once, though. You could probably use some of the Azure CLI automation scripts I saw Glenn Block post on GitHub. This could allow you to automate uploading the scripts as part of your build workflow.
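For instance, a rough sketch of a loop that pushes every downloaded table script back up with the same CLI command (the service name and local folder layout are assumptions):

# Upload each table script in the local folder back to the mobile service.
$serviceName = 'mymobileservice'   # placeholder service name
$scriptRoot  = '.\table'           # folder the download command produced (assumed layout)

Get-ChildItem -Path $scriptRoot -Filter '*.js' | ForEach-Object {
    # Rebuild the CLI script name, e.g. todoitem.insert.js -> table/todoitem.insert
    $scriptName = 'table/' + $_.BaseName
    azure mobile script upload $serviceName $scriptName
}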
Edit:
I found a few more resources that might help you with this:
Getting started with the CLI and backing up your scripts
More CLI – changing your Mobile Services workflow
These are some great resources from Josh Twist. I'm sure they will push you in the right direction.
Since this question was answered, a new feature has been added to Azure Mobile Services: integration with Git source control. Basically, you enable this feature in the dashboard of your mobile service, and it converts the storage into a Git repository which you can clone, pull from and push updates to.
You can find more information in the tutorial at http://www.windowsazure.com/en-us/develop/mobile/tutorials/store-scripts-in-source-control/.