I have a PowerShell script that orchestrates a deployment to servers all over the place, but don't fear, it never gets past line 2.
It's the same setup on two project build configurations. However, while it works on one, it fails immediately on the other when performing a very early check for a required component.
The script runs Get-Command New-SshSession and checks the output to see whether that cmdlet is available, i.e. whether the SSH module is set up.
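For context, the check is essentially this (a sketch; New-SshSession comes from whatever SSH module the script expects):

# Sketch: bail out early if the SSH cmdlet is not available.
if (-not (Get-Command New-SshSession -ErrorAction SilentlyContinue)) {
    Write-Error "Required SSH module is not available (New-SshSession not found)."
    exit 1
}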
Considering that the same script runs under the same .. I've just worked it out. I'm going to continue on and post an answer!
Anyway, considering the script path is the same and the params are almost the same, how can it fail on one and not on the other?
Luke
Have you checked that the x86|x64 combobox for the build step is the same for both? Otherwise PowerShell will run from SysWOW64, which has a different modules folder than the 'normal' 64-bit version under System32.
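A quick way to confirm which PowerShell a build step actually runs (a sketch to drop into the failing step):

# Sketch: print bitness and install location of the current PowerShell process.
[Environment]::Is64BitProcess   # False means the 32-bit host from SysWOW64
$PSHOME                         # e.g. C:\Windows\SysWOW64\WindowsPowerShell\v1.0
$env:PSModulePath               # the module folders this process searches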
Good luck,
Luke
I installed the VSTS build agent on a Mac to build a Xamarin iOS project. Builds worked fine until I added a PowerShell build step.
Even though I installed PowerShell for Mac (https://github.com/PowerShell/PowerShell) and re-installed the agent, VSTS complains it does not have an agent capable of running the build:
No agent could be found with the following capabilities:
DotNetFramework, Xamarin.iOS, npm
When I disable the build step, builds work just fine.
Is it possible to run a PowerShell build step on a Mac?
As MrHinsh clarified, the PowerShell task cannot be used on Mac.
As a workaround I used the ShellScript task:
With the following bash script:
#!/bin/bash
powershell ./SetAppVersion.ps1
Also, the PowerShell installer did not seem to add powershell to my PATH, so I had to add it:
$ export PATH=$PATH:/usr/local/microsoft/powershell/6.0.0-alpha.16
If you're sure that DotNetFramework is installed then you can go to the Agent Queues settings and add a custom Capability to it called exactly that.
That should allow the build to run. It might still fail afterwards if the agent can't actually find those capabilities, but it might also succeed, so it's probably worth a try.
No, you can't use a PowerShell task on a Mac, only node tasks are supported.
PowerShell tasks are currently written in PowerShell 3, which is not supported on Mac. You can request that the team implement this on http://visualstudio.uservoice.com
In TFS build, go to Agent Queues => Capabilities => Add variable, name it DotNetFramework, and give it the value of the Mac agent's .NET framework path.
This fixes the issue "No agent could be found with the following capabilities: DotNetFramework".
This is a follow-up to the accepted answer to address a question in a comment which I also had.
Thanks to spatialguy for posting and finding a simple solution to this problem. I had the same problem as KeithA45:
QUESTION: What if you wanted to do the same, but also pass arguments to the Bash script which passes them to the Powershell script?
I found a solution to this. First off, I modified the shell script task to include the Visual Studio Team Services (VSTS) environment variables that I wanted to pass to the PowerShell script.
Next, I pass the arguments through to the called PowerShell script by slightly modifying the shell script from the accepted answer.
#!/bin/bash
powershell ./Version.ps1 $1 $2
Finally, in the PowerShell script, I catch the arguments that have been passed through using param, like this:
param([string]$version, [string]$path)
I can now use the variables $version and $path, which contain the original arguments entered in VSTS, for the needs of my PowerShell script.
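Putting it together, a minimal sketch of what the PowerShell side might look like (the ShellScript task's Arguments field would hold the VSTS variables, e.g. $(Build.BuildNumber) $(Build.SourcesDirectory); the body below is hypothetical):

# Version.ps1 - receives the two arguments forwarded by the bash wrapper.
param([string]$version, [string]$path)

# Hypothetical use of the forwarded VSTS values.
Write-Host "Stamping version $version onto files under $path"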
Things seem to have moved forward because I ran successfully today a PowerShell#2 task on a Mac Self-Hosted Agent from an Azure DevOps build pipeline.
By checking "Enable system diagnostics" when queuing the build, the log shows me that the task found the path to PowerShell Core (pwsh) on its own; I installed it on my Mac with Homebrew (brew cask install powershell - see https://learn.microsoft.com/fr-fr/powershell/scripting/install/installing-powershell-core-on-macos).
Is there any difference between running a Powershell script:
From the command line: powershell.exe -File my_script.ps1
From Powershell ISE (Open the script in the editor and press the green play button to run)
From a Windows Powershell Host application?
And if there is a difference, is there a way in Powershell to check it?
The reason for asking this is that we are seeing one script have slightly different behaviour in these three environments, even though we had expected to see the same outcome. The behaviour is that a (3rd party non-public) .Net library we are using crashes in the second two environments, but works fine in the first one.
We have checked the obvious things, such as:
working directory of the powershell process set the same (which we set via [System.IO.Directory]::SetCurrentDirectory($my_path) in the script)
Powershell and .Net version (confirmed via identical $PSVersionTable)
System path
My hope in asking this question is that there is some difference which we are unaware of, and that by identifying it we can resolve the crash we are seeing. I'd also be interested to hear of similar experiences from anyone here.
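For what it's worth, a few properties that commonly differ between hosts can be inspected from the script itself; a sketch:

# Sketch: values that often differ between PowerShell hosts.
$Host.Name                                               # 'ConsoleHost' at the command line, 'Windows PowerShell ISE Host' in the ISE
[System.Threading.Thread]::CurrentThread.ApartmentState  # STA in the ISE; the console default has varied by version
[Environment]::CurrentDirectory                          # the process working directory
Get-Location                                             # PowerShell's own location, which can differ from the above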
We have an application that needs to simply copy some files from source to destination and manipulate config files based on the environment. We use Jenkins for deployment. Since I am comfortable with C#, I thought of writing a simple console application (.exe) and invoking that exe post-deployment, passing some command-line arguments. I think this would work.
But I see people recommending PowerShell for deployment, and I have used PowerShell for deployment on other projects.
I just want to know: what can PowerShell do that a Windows console application cannot?
Since PowerShell can be wholly embedded (not really the right term, but it works for this explanation) in C#, there's nothing you could do in PowerShell that couldn't also be achieved in C#.
You can also embed C# in PowerShell, but for various reasons you don't get exactly the same scope of functionality that you can with an .exe.
The point of using PowerShell has to do with the context of it being part of a deployment step.
A PowerShell command or script is more easily changed. A build process is not required.
Its contents are more readily visible and readable to someone who wants to understand the process.
The code written will (likely) be less verbose, further making it easier to understand. For deployment steps it may be much more straightforward to use PowerShell (a single cmdlet may do what would take several (dozen) lines of C#).
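As an illustration, a copy-plus-config-patch deployment step (paths and token are made up) is only a couple of lines:

# Sketch: copy build output and patch a config value; all names hypothetical.
Copy-Item -Path .\build\* -Destination \\server\app -Recurse -Force
(Get-Content \\server\app\app.config) -replace '__ENV__', 'Production' |
    Set-Content \\server\app\app.config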
I'm installing a PowerShell module via Octopus Deploy onto a number of different servers. For testing purposes, I went with the guidance of Microsoft's documentation for installing PowerShell Modules.
This worked fine, but as the documentation stated, my changes would be visible only for the current session. That is, if I were to do the following:
$modulePath = [Environment]::GetEnvironmentVariable("PSModulePath", [EnvironmentVariableTarget]::Machine)
# More practically, this would be some logic to install only if not present
$modulePath += ";C:\CustomModules"
[Environment]::SetEnvironmentVariable("PSModulePath", $modulePath, [EnvironmentVariableTarget]::Machine)
When running this installer automatically on tentacle servers, future PowerShell sessions do not appear to see the newly installed modules.
How can I install a PowerShell module in a profile agnostic way so that every PowerShell session started can see it?
PowerShell can only "see" modules installed in one of the directories listed in $env:PSModulePath. Otherwise you'll have to import the module with its full path.
To make a new module visible to all users you basically have two options:
Install the module to the default system-wide module directory (C:\Windows\system32\WindowsPowerShell\v1.0\Modules); see the sketch after this list.
Modify the system environment so that PSModulePath variable already contains your custom module directory (e.g. via a group policy preference).
The latter will only become effective for PowerShell sessions started after the modification was made, though.
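For the first option, the install can be as simple as copying the module folder into place (a sketch; the module name is hypothetical and an elevated prompt is required):

# Sketch: copy a module into the system-wide module directory.
Copy-Item -Path .\MyModule -Destination "$env:windir\System32\WindowsPowerShell\v1.0\Modules" -Recurse -Force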
This profile applies to all users and all shells.
%windir%\system32\WindowsPowerShell\v1.0\profile.ps1
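For example, that profile could extend the module path for every session without touching the machine environment (a sketch, reusing the directory from the question):

# Contents of %windir%\System32\WindowsPowerShell\v1.0\profile.ps1 - runs for all users, all hosts.
$env:PSModulePath += ";C:\CustomModules"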
After taking the steps you spelled out in your question (which I think is the general way to go), I found two ways to get the new module source recognized by Powershell:
Restart the machine. (Works every time.)
Reset the PSModulePath in each open session.
$env:PSModulePath=[Environment]::GetEnvironmentVariable("PSModulePath", "Machine")
I found it necessary to run this in both normal and elevated prompts to get it working in each type of prompt without a restart. (See also the conversation # Topic: PSModulePath.)
My build script runs on linux and invokes things like gcc, shell scripts, etc.
Part of the solution is written in Mono and could be compiled easily on Linux.
But I want to obfuscate the code. Not manually, but as part of the build process.
Therefore I need to invoke Dotfuscator, and Dotfuscator so far only runs on Windows.
Is there a good solution for invoking command-line based workers/build scripts on a Windows machine remotely from Linux? I don't just want to run a command remotely, but also pass files along.
Something like a Windows service that is accessed via simple curl uploads of a tar file, creates a temp folder for each concurrently connected client (or blocks concurrent calls), unpacks the file, invokes something on those files, and packages the result as a tar file again to give back to the caller? And clears the temp folder even in case of failure?
Maybe someone knows a good solution that saves me from writing this myself!
It should not be so uncommon for a build process to span multiple platforms, yet the common build-server answers I found mainly talk about a single build script.
Also think about running, e.g., the NSIS setup builder from a Linux-driven build script, if part of your solution has a tiny Windows component.
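For what it's worth, PowerShell remoting over SSH (available in PowerShell Core 6+) can cover the "pass files along and invoke" part. A sketch, assuming the Windows machine runs an OpenSSH server with PowerShell remoting enabled; the host, user, and Dotfuscator paths are all hypothetical:

# From the Linux build script, using pwsh: push inputs to the Windows box,
# run the Windows-only step there, and pull the results back.
$s = New-PSSession -HostName winbuild.example.com -UserName builder
Copy-Item -Path ./mono-output -Destination 'C:\temp\job' -Recurse -ToSession $s
Invoke-Command -Session $s -ScriptBlock {
    # Placeholder for the real Dotfuscator command line.
    & 'C:\tools\Dotfuscator\dotfuscator.exe' 'C:\temp\job\config.xml'
}
Copy-Item -Path 'C:\temp\job\obfuscated' -Destination ./result -Recurse -FromSession $s
Remove-PSSession $s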