I am trying to automate deployment to Azure Service Fabric with Jenkins and the ServiceFabric PowerShell extension. The Jenkins ServiceFabric plugin is not a good option in my case due to its lack of control and flexibility over the deployment process.
I've faced the following issue: Jenkins can't recognize the Service Fabric PowerShell cmdlets
Connect-ServiceFabricCluster : The term 'Connect-ServiceFabricCluster'
is not recognized as the name of a cmdlet, function, script file, or
operable program. Check the spelling of the name, or if a path was
included, verify that the path is correct and try again
The ServiceFabric setup is correct, because it works like a charm when I run the script locally from PowerShell.
So I've tried running Jenkins locally instead of in service mode, as suggested in different posts over the internet, but this hasn't resolved the issue.
The other things I've tried:
running the script with self-elevation to admin
running PowerShell in x86 and x64 modes
running the script by calling powershell.exe from the cmd runner instead of the PowerShell plugin
forcing "unrestricted" execution policy mode
putting a double dot before the script name
I'm still receiving the same result.
So I tried the Service Fabric Python CLI as an alternative, but faced another issue: it returns "Bad SSL handshake" on "sfctl cluster select" with a certificate that worked locally with the Service Fabric PowerShell cmdlets.
Any ideas?
This is similar to Azure/service-fabric-issues issue 491 which was about a mismatch between the Azure Service Fabric SDK and the Service Fabric runtime.
For instance:
The 2.7 SDK will work against a version 6.0 cluster, but the task will not work with the 2.8 SDK installed on the agent.
Plus:
The Service Fabric PowerShell cmdlets require PowerShell 3.0 or higher.
Service Fabric uses Windows PowerShell scripts for creating a local development cluster and for deploying applications from Visual Studio. By default, Windows blocks these scripts from running.
To enable them, you must modify your PowerShell execution policy. Open PowerShell as an administrator and enter the following command:
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Force -Scope CurrentUser
So: if that script is working locally, but not through a Jenkins job on a Jenkins agent, look for differences between the local execution environment (where it is working) and the Jenkins one (where it fails).
The user might not be the same and/or the runtime version might not be compatible with the SDK version.
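A quick way to spot those differences is to run the same few diagnostic lines both from your local PowerShell session and from the Jenkins PowerShell step, then compare the output (a minimal sketch; nothing here is specific to a particular Jenkins plugin):
whoami                                     # the account the Jenkins service runs under is often different
$PSVersionTable.PSVersion                  # PowerShell version seen by the job
$env:PSModulePath -split ';'               # where this session looks for modules
Get-Module -ListAvailable ServiceFabric    # is the Service Fabric module visible here at all?
Get-ExecutionPolicy -List                  # effective execution policy per scope
If Get-Module returns nothing in the Jenkins session, the cmdlets simply aren't on that user's module path, which would match the "not recognized" error above.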
Do you have the Jenkins PowerShell Plugin installed on your system?
If so, can you add your commands into the PowerShell dialog box and see if it works? :)
Related
I'm actually facing an issue: I have to run a PowerShell script on all my VMs, but this script has to be run with at least version 3.
Our VMs are using version 2 and the Hyper-V host version 4.
I attempted remote access from PowerShell using "Enter-PSSession", but when I try to run the script it uses the VM's PowerShell.
Is it possible to force the script to run on the VM using the Hyper-V host's PowerShell version? Upgrading the version on the VMs is not feasible for the moment.
Thank you
No, this is not possible. PowerShell Remoting executes the code locally on the target computer.
If your code doesn't work and you can't upgrade WMF (PowerShell) on the target computers, then you will have to rewrite your code. Maybe you can use WMI, C#, legacy command-line utilities and the like to achieve the same thing?
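For example, a cmdlet that only exists in PowerShell 3.0 and later, such as Get-CimInstance, can often be swapped for its WMI counterpart, which PowerShell 2.0 already ships with (a minimal sketch; Win32_OperatingSystem is just an illustrative class, substitute whatever your script actually queries):
# Works under PowerShell 2.0 on the target VM; Get-CimInstance would require v3 or later
Get-WmiObject -Class Win32_OperatingSystem |
    Select-Object Caption, Version, LastBootUpTime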
I installed the VSTS build agent on a Mac to build a Xamarin iOS project. Builds worked fine until I added a PowerShell build step.
Even though I installed PowerShell for Mac (https://github.com/PowerShell/PowerShell) and re-installed the agent, VSTS complains that it does not have an agent capable of running the build.
No agent could be found with the following capabilities:
DotNetFramework, Xamarin.iOS, npm
When I disable the build step, builds work just fine.
Is it possible to run powershell build step on Mac?
As MrHinsh clarified, the PowerShell task cannot be used on Mac.
As a workaround I used ShellScript task:
With the following bash script:
#!/bin/bash
powershell ./SetAppVersion.ps1
Also, the powershell installer did not seem to add powershell to my PATH so I had to add it:
$ export PATH=$PATH:/usr/local/microsoft/powershell/6.0.0-alpha.16
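Alternatively, a sketch that avoids touching PATH at all: call the binary by its full install path from the shell script (the versioned folder below matches the install location above; adjust it to your version):
#!/bin/bash
# Assumes PowerShell was installed under this versioned folder by the macOS package
/usr/local/microsoft/powershell/6.0.0-alpha.16/powershell ./SetAppVersion.ps1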
If you're sure that DotNetFramework is installed then you can go to the Agent Queues settings and add a custom Capability to it called exactly that.
That should allow the build to run. It might still fail afterwards if the agent can't actually find the framework, but it might also succeed, so it's worth a try.
No, you can't use a PowerShell task on a Mac; only node tasks are supported.
PowerShell tasks are currently written in PowerShell 3, which is not supported on Mac. You can request that the team implement this on http://visualstudio.uservoice.com
In the TFS build, go to Agent Queues => Capabilities => add a variable named DotNetFramework and give it the value of the Mac agent's dotnet framework path.
This fixes the issue "No agent could be found with the following capabilities: DotNetFramework".
This is a follow-up to the accepted answer to address a question in a comment which I also had.
Thanks to spatialguy for posting and finding a simple solution to this problem. I had the same problem as KeithA45:
QUESTION: What if you wanted to do the same, but also pass arguments to the Bash script which passes them to the Powershell script?
I found a solution to this. First off, I modified the shell script task to include the Visual Studio Team Services (VSTS) environment variables that I wanted to pass to the PowerShell script.
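For instance, the arguments box of the ShellScript task could contain something like the following ($(Build.BuildNumber) and $(Build.SourcesDirectory) are just illustrative predefined variables; use whichever ones your script actually needs):
$(Build.BuildNumber) $(Build.SourcesDirectory)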
Next, I pass the arguments through to the called PowerShell script by slightly modifying the shell script from the accepted answer.
#!/bin/bash
powershell ./Version.ps1 $1 $2
Finally, in the PowerShell script, I catch the arguments that have been passed through using param, like this:
param([string]$version, [string]$path)
I can now use the variables $version and $path, which contain the original arguments entered in VSTS, wherever my PowerShell script needs them.
Things seem to have moved forward, because today I successfully ran a PowerShell#2 task on a Mac self-hosted agent from an Azure DevOps build pipeline.
By checking "Enable system diagnostics" when queuing the build, the log shows that the task found the path to PowerShell Core (pwsh) on its own; I had installed it on my Mac with Homebrew (brew cask install powershell - see https://learn.microsoft.com/fr-fr/powershell/scripting/install/installing-powershell-core-on-macos).
When trying to Deploy IIS content files using powershell, I am seeing the following error:
Failing task since return code of [powershell -ExecutionPolicy bypass
-Command /bin/sh file.ps1
We are trying to deliver web content (several web service folders) and a set of associated web.config files to an IIS server using a PowerShell script. The script and config files are stored in SCM (Stash) and pulled across as part of a checkout task in Bamboo, then published as artifacts to the Bamboo deploy job.
The deploy job is failing.
Does anyone know the correct way to set this up both in the Bamboo Deploy job or on the Windows Server?
Many thanks in advance.
For anyone trying to resolve this issue in future, I resolved this issue by installing a Bamboo agent onto the Windows Server.
This means that the PowerShell script gets checked out onto the Windows server as part of the deploy job (using the Source Code checkout task) and can then be run as PowerShell, instead of the Bamboo local agent, which runs on a Linux box, trying to run the PowerShell script as a Linux shell script.
The only other issue that had to be overcome was that SSL certs needed to be installed into the JKS (cacerts) file on the Windows Server (where the remote agent is running), as we have Bamboo running over https.
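For anyone needing the same, a certificate can be imported into the JKS with keytool along these lines (the alias, certificate file and cacerts path are placeholders for your own setup; changeit is the default store password unless it has been changed):
keytool -importcert -alias bamboo -file bamboo-server.crt -keystore "%JAVA_HOME%\jre\lib\security\cacerts" -storepass changeit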
Currently we have an application that ships on a DVD. There is a setup.exe; the user clicks on it and fills in the inputs it asks for, such as the path where the application should be installed, the SQL Server instance where the db will be created, and the port numbers that need to be bound.
I hear that PowerShell DSC can be used for application deployment, but it is not like running some setup.exe and collecting some inputs for the installation.
Can PowerShell DSC really be used for application deployment, or is it only for environment preparation?
If it is used for application deployment, how is that achieved? Is the end user told to fill the data into some configuration data psd1 file manually and then run the script?
You can use the built-in Package resource. However, you may want to explore cChoco instead, as Chocolatey is much more geared towards software management (application deployment), handling installs, upgrades and uninstallation.
https://github.com/PowerShellOrg/cChoco
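For example, a minimal DSC sketch using the built-in Package resource (the application name, installer path and product GUID below are placeholders for your own setup.exe/MSI):
Configuration InstallMyApp {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost' {
        # Installs the package if it is not already present; all values are illustrative
        Package MyApp {
            Ensure    = 'Present'
            Name      = 'My Application'                       # display name in Programs and Features
            Path      = 'D:\setup.msi'                         # e.g. the installer on the DVD
            ProductId = '00000000-0000-0000-0000-000000000000' # MSI product GUID
            Arguments = '/quiet'
        }
    }
}

InstallMyApp -OutputPath .\InstallMyApp
Start-DscConfiguration -Path .\InstallMyApp -Wait -Verbose
The inputs the user currently types into setup.exe (install path, SQL instance, ports) would then come in as parameters or configuration data for the configuration, for example from a .psd1 file.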
PowerShell DSC is for application deployment, but if you want to keep the setup.exe experience, what you can do is create a simple console or Windows Forms EXE program that embeds the script as a resource; upon loading, the EXE retrieves the script and throws it at a PowerShell runspace to execute.
Here is a link about it: Make PSexe
I am trying to configure TFS for Continuous Delivery to Azure following this article.
In the article, TFS publishes the package to Azure with a PowerShell script.
When the build starts I get errors like ObjectNotFound: (Set-AzureDeployment:String) [], CommandNotFoundException. It looks like I didn't install the Azure cmdlets, but I installed everything from the Web Platform Installer.
And when I try to run the script locally on the server, it works and deploys the package.
In the article, PowerShell is started by adding InvokeProcess to the template with Filename="PowerShell".
I think I'm just not running PowerShell correctly.
Maybe somebody has some ideas about which command I should use?
Found a solution.
PowerShell can't find the Azure module.
Add this before the Import-Module Azure command in the script:
$env:PSModulePath=$env:PSModulePath+";"+"C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell"
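Put together (the path is the default Azure PowerShell SDK install location; adjust it if the module lives elsewhere on your build server):
$env:PSModulePath = $env:PSModulePath + ";" + "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell"  # make the Azure module discoverable
Import-Module Azure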
It could be that you installed the cmdlets in the user profile. Try re-installing after logging in with the account running the build service.