I am new to Azure DevOps and hoping this is a simple fix. I have a PowerShell script that uses Tabular Editor to deploy a .bim file to Azure Analysis Services. This works great on my local machine, but I have had no luck getting it working in the DevOps pipelines. I haven't found a way to install the software on the hosted agent. Question 1: can I install software on a hosted agent, e.g. on Hosted VS2017?
Failing being able to install software on Microsoft's hosted agent, I checked the TabularEditor.exe file into the source code (I know this isn't best practice). The executable gets put into the build artifact and publishes fine, but in the release, when my PowerShell script is called, it just hangs; the script gets stuck at that point. The script reads from a config file and also uses the path to the Tabular Editor executable.
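For reference, the call the script makes looks roughly like this. The connection string, server, and database names are placeholders read from my config file, and as I understand the Tabular Editor command line, -D deploys the .bim and -O allows overwriting an existing database:

```powershell
# Sketch of the deployment call - all names below are placeholders.
$exe  = ".\Tools\TabularEditor.exe"
$bim  = ".\Model\Model.bim"
$conn = "Provider=MSOLAP;Data Source=asazure://myregion.asazure.windows.net/myserver;User ID=myuser;Password=mypassword"

# -D deploys the .bim to the target database; -O allows overwrite
& $exe $bim -D $conn "MyDatabase" -O

# Surface the exit code so the pipeline step fails if the deployment fails
exit $LASTEXITCODE
```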
The script I am using works fine on a self-hosted machine, assuming the agent has the correct permissions.
I have another Analysis Services script that is ready and works, provided someone creates an XMLA of the model first; we then provide that as the input instead of a .bim file. But this is not quite the automated route I am looking for.
Also, I am aware that there is a third-party task that does Azure Analysis Services deployment, but I want to avoid using that.
In summary, I am looking to find out:
1) if I can indeed install software on Microsoft's hosted agents
2) whether I should be able to use the executable in my build artifact instead
3) whether there is a better way to deploy Analysis Services with a .bim file
I appreciate this is long-winded and slightly unusual, but any insight or information would be appreciated.
Thanks
Is it possible to browse the file system in Azure DevOps, like when using SSH to connect to a server, or to browse it using Explorer?
It would really simplify things if I could see what files were created and where they end up after builds. As it stands, I don't feel I have any way to know which files ended up where after the builds are done.
Thanks!
I don't think so. You may add build steps (Build and release tasks - Utility) and create a cmd or batch file to browse the file system of the build servers.
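For example, a minimal inline PowerShell build step that dumps the contents of the standard working folders (the environment variables below are the predefined Azure DevOps build variables):

```powershell
# List everything under the sources and artifact staging folders so the
# build log shows which files ended up where.
Write-Host "Sources: $env:BUILD_SOURCESDIRECTORY"
Get-ChildItem -Path $env:BUILD_SOURCESDIRECTORY -Recurse | ForEach-Object { $_.FullName }

Write-Host "Artifact staging: $env:BUILD_ARTIFACTSTAGINGDIRECTORY"
Get-ChildItem -Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY -Recurse | ForEach-Object { $_.FullName }
```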
As an alternative, you may use your own build server (self-hosted agents) on Azure VMs, and you will have full control.
I have written multiple PowerShell scripts to take backups of the Azure API Management service. One script calls another and exports all the products and properties to my local machine.
I want to automate this process of backing up everything directly to Bitbucket. To achieve this I have configured Jenkins, which is running on a CentOS server, but I don't know how to automate things using it.
I tried installing the PowerShell Plugin on Jenkins, but as I have multiple scripts that depend on one another, I can't just paste the whole thing as-is and run it.
So is there a way I can run these multiple scripts on the Jenkins PowerShell plugin, instead of composing one single script and running that?
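For reference, the dependency between the scripts looks roughly like this (the script and file names are illustrative, not my real ones):

```powershell
# Backup-All.ps1 - entry point a Jenkins job could call.

# Dot-source the helper so its functions are available here
. "$PSScriptRoot\Backup-Helpers.ps1"

# Each child script depends on the output of the one before it
& "$PSScriptRoot\Export-Products.ps1"   -OutputPath "$PSScriptRoot\backup"
& "$PSScriptRoot\Export-Properties.ps1" -OutputPath "$PSScriptRoot\backup"

# Finally push the exported files to Bitbucket
Set-Location "$PSScriptRoot\backup"
git add -A
git commit -m "APIM backup $(Get-Date -Format yyyy-MM-dd)"
git push origin master
```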
Secondly, should Jenkins be installed on my local machine rather than on the CentOS server in order to achieve this?
Or does Jenkins have a plugin to link with my Azure account and export services to Bitbucket (with the PowerShell scripts out of the picture)?
Is there any other tool or alternative that is a little less complicated?
I'm trying to use TFS 2015.1 on-premises to build a CI pipeline for our dev and UAT environments. I've created a vNext CI build, which builds fine. But when I want to add a deploy step for an on-prem IIS server, I only see Azure Web Deployment options.
Ideally I wanted to add a step which uses the existing deploy (MS Deploy) profiles, which I'm able to use from VS2015 directly using 'Publish'. However, I see no option to do so.
How can I deploy the latest build to internal dev servers (not Azure)? I would like to use the MS Deploy option, unless there's a better way of doing it?
The fact that there is no option starts to make me think there's probably a different way to accomplish it!
Thanks.
If you're able to upgrade to TFS 2015.2, web-based Release Management came out with it, and it works similarly to Build vNext with flexible and open-source tasks. You can also customize tasks.
Here's a link for IIS Web App Deployment from the vso-agent-tasks GitHub repo, where Microsoft stores updated versions of their tasks that you can download for web-based Build and Release Management.
I'll be publishing a blog about web-based RM with TFS 2015 Update 2 or VSTS on my website in the next few weeks. To give you an idea, though: the starting point (for a web application) is a folder in your web project called WebDeploy (no significance, any name will do) that contains a PowerShell DSC script which configures the server, deploys the web files, and then replaces any tokenised configs. For how to use DSC to configure servers, see this post. (It only covers part of the final script, though!) The next steps are:
In the build hub, create a Website artifact containing your web files and DSC script.
In the release hub for an environment use a Windows Machine File Copy task to deploy the artifact to a temp folder on the target node.
Then use a PowerShell on Target Machines task to execute the DSC script (a minimal sketch follows below). After configuring the server, the script copies the web files to their proper location, sorts out config using xReleaseManagement, and cleans up the WebDeploy folder.
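A minimal sketch of the shape of such a DSC script, assuming illustrative paths and a plain IIS website (the real script would also handle the config tokenisation mentioned above):

```powershell
# Sketch only - node, source, and destination paths are illustrative.
Configuration WebSiteDeploy
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost'
    {
        # Make sure IIS is installed on the target node
        WindowsFeature IIS
        {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }

        # Copy the web files from the temp folder the release copied to
        File WebContent
        {
            Ensure          = 'Present'
            Type            = 'Directory'
            Recurse         = $true
            SourcePath      = 'C:\Temp\WebDeploy\Website'
            DestinationPath = 'C:\inetpub\wwwroot\MyApp'
            DependsOn       = '[WindowsFeature]IIS'
        }
    }
}

# Compile the configuration to a MOF and apply it on the node
WebSiteDeploy -OutputPath 'C:\Temp\WebDeploy\Mof'
Start-DscConfiguration -Path 'C:\Temp\WebDeploy\Mof' -Wait -Verbose
```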
See this article for general details of the route I'm taking, but watch out, as it has some errors, e.g. the firewall instructions are incomplete (file and print sharing through the firewall needs to be enabled).
I can thoroughly recommend the PowerShell DSC route - I've had a few glitches but on the whole it feels very productive and the right way to be going.
I'm trying to deploy an Azure WebJob using the Azure PowerShell SDK or REST API, and I'm having trouble finding support for deploying WebJobs. It's super simple to do through the UI in VS or the Azure Management Portal, but there doesn't seem to be much automation support.
According to some sites, when you deploy an Azure Website with an associated WebJob, it's supposed to deploy the WebJob automatically, but I'm not seeing that happen when I publish the web project in VS, and I don't see how that would work after the bits are compiled through TFS.
I've found some great resources that I've tried following, but they don't seem to be working for me:
http://azure.microsoft.com/en-us/documentation/articles/websites-dotnet-deploy-webjobs/#deploy
http://azure.microsoft.com/blog/2014/08/18/enabling-command-line-or-continuous-delivery-of-azure-webjobs/
Unfortunately, I need to integrate this automation into a standalone deployment orchestration so I can't tie it to MSBuild.
I'd be happy with deploying both the Azure Website and the WebJob together if that's possible. I would imagine there must be a way to automate uploading a zip file containing the WebJob executable to the website, like you can through the portal, but I haven't had any luck finding it.
Apparently, according to this post, the job just gets put in a folder matching the job name under your website's App_Data folder (App_Data\jobs\{continuous|triggered}\{job name}).
You might have to manually upload the job as a zip file once, but afterwards I assume you could do some magic in your deployment or in a post-build task to build the WebJob console app and stick it in the appropriate App_Data folder.
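For instance, a rough post-build sketch along those lines (the paths, the job name, and the "continuous" job type are all assumptions to adjust to your solution layout):

```powershell
# Post-build step: copy the compiled WebJob into the web project's
# App_Data folder so it deploys together with the site.
# All paths and names below are placeholders.
$jobOutput   = "MyWebJob\bin\Release"
$destination = "MyWebApp\App_Data\jobs\continuous\MyWebJob"

New-Item -ItemType Directory -Path $destination -Force | Out-Null
Copy-Item -Path (Join-Path $jobOutput '*') -Destination $destination -Recurse -Force
```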
We decided to use Amazon AWS cloud services to host our main application and other tools.
Basically, we have an architecture like this:

TESTSERVER: The EC2 instance which our main application is deployed to. Testers have access to the application.

SVNSERVER: The EC2 instance hosting our Subversion repository.

CISERVER: The EC2 instance where JetBrains TeamCity is installed and configured.
Right now, I need CISERVER to check out code from SVNSERVER, build it, unit test it if the build is successful, and, after all tests pass, deploy the artifacts of the successful build to TESTSERVER.
I have completed configuring CISERVER to pull the code, build, test, and produce artifacts, but I couldn't figure out how to deploy the artifacts to TESTSERVER.
Do you have any suggestions or a procedure to accomplish this?
Thanks for the help.
P.S.: I have read this question and am not satisfied.
Update: There is a deployer plugin for TeamCity which allows publishing artifacts in a number of ways.
Old answer:
Here is a workaround for the issue that TeamCity doesn't have built-in artifact publishing via FTP:
http://youtrack.jetbrains.net/issue/TW-1558#comment=27-1967
You can:

1. Create a configuration which produces build artifacts.
2. Create a configuration which publishes artifacts via FTP.
3. Set an artifact dependency in TeamCity from configuration 2 to configuration 1.

Use either manual or automatic triggering to run configuration 2 with artifacts produced by configuration 1. This way, your artifacts will be downloaded from build 1 to configuration 2 and published to your FTP host.
Another way is to create an additional build step in TeamCity for configuration 1 which publishes your files via FTP.
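As a rough sketch, such a build step could be a PowerShell script like the following (the host, credentials, and artifacts folder are placeholders):

```powershell
# Upload every file in the artifacts folder to the FTP host.
$ftpRoot  = "ftp://testserver.example.com/deploy/"
$user     = "deployuser"
$password = "secret"   # in practice, inject this via a build parameter

$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential($user, $password)

Get-ChildItem -Path ".\artifacts" -File | ForEach-Object {
    # STOR each artifact file on the FTP host
    $client.UploadFile("$ftpRoot$($_.Name)", $_.FullName)
}
```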
Hope this helps,
KIR
What we do for deployment is that the QA people log on to the system and run a script that deploys by pulling from the TeamCity repository whenever they want. They can see in TeamCity (and get an e-mail) if a new build happened, but regardless they just deploy when they want. In terms of how to construct such a script, the TeamCity component involves retrieving the artifact. That is why my answer references getting the artifacts by URL: that is something any reasonable script can do using wget (which has a Windows port as well) or similar tools.
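For instance, a minimal sketch of the retrieval step, assuming TeamCity's REST artifact URL scheme and placeholder names:

```powershell
# Download the latest successful build's artifact by URL.
$server    = "http://ciserver:8111"
$buildType = "MyProject_Build"     # TeamCity build configuration ID
$artifact  = "app.zip"
$url = "$server/httpAuth/app/rest/builds/buildType:$buildType,status:SUCCESS/artifacts/content/$artifact"

# Credentials for a TeamCity user allowed to download artifacts
$cred = Get-Credential
Invoke-WebRequest -Uri $url -Credential $cred -OutFile "C:\deploy\$artifact"
```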
If you want an automated deployment, you can schedule a cron job (or use the Windows scheduler) to run the script at regular intervals. If nothing changed, it doesn't matter much. I question the wisdom of this, though, given that it may mess up someone's testing by restarting the system involved.
Having TeamCity push the changes as they happen is not something TeamCity does out of the box (as far as I know), but you could roll your own, for example by triggering something via one of TeamCity's notification methods, such as e-mail. I just question the utility of that. Do you want your system changing at random intervals just because someone happened to check something in? I would think it preferable to actually request the new version.