In TFS 2017 Update 2 Release Management, what advantages does the "PowerShell on Remote Machines" task provide over PSRemote PowerShell scripts executed from the deployment agent machine?
As part of our release process, we use PowerShell scripts to validate and configure the servers being deployed to (e.g. installing SSL certs, checking the .NET version, etc.). We've historically used PowerShell remoting (New-PSSession/Invoke-Command) with CredSSP to execute scripts from the deployment agent that configure the destination machines.
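The pattern we use today looks roughly like this (a minimal sketch; the server name, credential, and script path are placeholders):

# Minimal sketch of our current PSRemote approach; names are placeholders.
$cred = Get-Credential
$session = New-PSSession -ComputerName web01.contoso.local -Authentication Credssp -Credential $cred
Invoke-Command -Session $session -FilePath .\Configure-Server.ps1 -ArgumentList 'Production'
Remove-PSSession $session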
The PowerShell on Remote Machines task appears to involve copying your ps1 to the destination machine and then executing it in that machine's context. Compared to the PSRemote method, it looks like all we gain is simpler syntax. It also looks harder to trace and troubleshoot from RM if we copy script files to a collection of servers and then let those boxes execute the scripts while we wait for the result. Given how many websites reference this task, I feel I must be missing something.
The PowerShell on Target Machines task makes deployment more convenient and effective.
This task can run both PowerShell scripts and PowerShell DSC scripts. It executes PowerShell scripts on remote machines specified as a comma-separated list of machine FQDNs or IP addresses, optionally including the port number. You can also pass other arguments easily.
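For example (with hypothetical hosts), the machine list accepts a value like:

web01.contoso.local:5986,192.168.12.34,db01.contoso.local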
Refer to Deploy: PowerShell on Target Machines for the details.
And see this article for the PSRemote approach: How to Run PowerShell Commands on Remote Computers.
You can then compare the two.
Related
I am trying to build a CI/CD pipeline with Azure. The deployment works until the final stage, where I need to run a PowerShell/cmd script on the machine that is running the deployment group agent. Can someone please assist with how to run a cmd/PowerShell script on the machine that is running the deployment group agent?
I have tried using remote PowerShell, but that requires a username and password, which I cannot use for security reasons.
For context:
I have a local server. I have a repo on Azure. I have created a pipeline that builds the repo, and the artifacts of the build are then copied to my local server. Now I want to run a PowerShell/cmd script on the local server through the pipeline.
Refer to the documentation here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/powershell?view=azure-devops&tabs=yaml#add-a-powershell-script
The syntax for including PowerShell Core is slightly different from the syntax for Windows PowerShell.
Push your PowerShell script to your repo.
Add a pwsh or powershell step. The pwsh keyword is a shortcut for the PowerShell task for PowerShell Core. The powershell keyword is another shortcut for the PowerShell task, but for Windows PowerShell, and it will only work on a Windows agent.
# for PowerShell Core
steps:
- pwsh: ./my-script.ps1
# for Windows PowerShell
steps:
- powershell: .\my-script.ps1
However, as you would notice, this only runs on the agent.
You can also use the classic alternative, also described in the same documentation, using the UI provided by Azure DevOps.
Another alternative which may be suited for your case is to create a VM extension by navigating to the virtual machine in the Azure Portal, clicking on "Extensions" in the left sidebar, and then clicking the "+Add" button.
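If you'd rather script that than click through the portal, a rough sketch with the Az PowerShell module might look like this (all resource names and the script URI are placeholders, and I'm assuming the Custom Script Extension):

# Hypothetical resource names; the script URI must be reachable from the VM.
Set-AzVMCustomScriptExtension -ResourceGroupName 'my-rg' -VMName 'my-vm' `
    -Location 'westeurope' -Name 'RunDeployScript' `
    -FileUri 'https://example.com/scripts/deploy.ps1' -Run 'deploy.ps1'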
Otherwise, your only option may be "Azure Remote Run"; however, you mention you cannot get the credentials for that.
An SSIS package runs multiple Process Tasks (5) in parallel that call/invoke the same PowerShell script (with different parameters passed to it). The package works great when run locally on my machine, but when deployed to an Integration Services Catalog in SQL Server on a Windows server, only 4 out of the 5 Process Tasks call the PowerShell script. Does anyone know of a setting that might be preventing the 5th Process Task from calling the PowerShell script? Or is there a way I should be calling a single PowerShell script so that it can be processed simultaneously on a server?
I have a simple .exe on a network share that merely creates a dummy file on a network share. The program works. I've wrapped it in a .bat file, a .ps1 file, and a .vbs file, and they all work. However, when I create a SCOM rule to invoke any of these beasts, it does not run. Am I missing a management pack, or am I building the rule wrong such that SCOM doesn't run my module? What's the secret to having SCOM run an external module? Thanks.
First, does your SCOM agent's Run As account have permission to access the file?
Most folks deploy the SCOM agent and leave it running under a local account.
Second, if this is a custom-authored rule, is your rule properly configured to run on the target system, or is it running on the management server? (What is your target?)
With the basics covered, I have a hunch that your SCOM rule is executing PowerShell, based on your use of 'invoke'. If you run PowerShell remotely without enabling CredSSP, then you won't be able to make an authenticated connection to the file share downstream.
This guy explains it better than I can: https://4sysops.com/archives/using-credssp-for-second-hop-powershell-remoting/
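In short, CredSSP has to be enabled on both ends before the second hop will authenticate. A minimal sketch (the machine names and the share path are placeholders):

# On the machine initiating the remote session (run elevated):
Enable-WSManCredSSP -Role Client -DelegateComputer 'target01.contoso.local' -Force
# On the machine receiving the connection (run elevated):
Enable-WSManCredSSP -Role Server -Force
# The second hop to the file share should then authenticate:
Invoke-Command -ComputerName target01.contoso.local -Authentication Credssp `
    -Credential (Get-Credential) -ScriptBlock { Test-Path '\\fileserver\share\tool.exe' }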
If this is not the issue, can you paste in the actual action the rule is taking?
I use a PowerShell script, triggered by TeamCity, to spin up new Windows Server VMs. Currently, when the machine is up and running, I need to log in via the VMM console to make a couple of configuration changes (enable file sharing, network discovery, MSDeploy, and remoting over WinRM) in order to allow other TeamCity jobs to deploy enterprise apps to the VM.
I haven't found any way to run my config setup scripts on the new VM other than by using the GUI console in VMM. For VM hosts there is Invoke-SCScriptCommand, but this doesn't work for the virtual machines themselves. Am I missing something, or do I have to alter the template that my VMs are built from in order to get the required config on the VMs?
One way you could achieve what you require is by putting all your config changes in a PowerShell script that sits inside the VM template and adding it to the VM's startup scripts.
The script's first step checks whether the config changes have been applied in the past by looking for some kind of flag (i.e. a file c:\deployed.flag), and its last step is to create the flag.
if(Test-Path c:\deployed.flag){
    ## deployment script has run already, do nothing
}
else{
    ## your config-changing code block goes here

    ## create the flag so the block is skipped on subsequent boots
    New-Item c:\deployed.flag -ItemType File
}
In VMware/PowerCLI you can run Invoke-VMScript, which executes a command directly on a VM via VMware Tools, but alas, Hyper-V Integration Services don't have such functionality.
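For comparison, the VMware side looks roughly like this (a hedged PowerCLI sketch; the vCenter address, VM name, and guest command are placeholders):

# Runs a command inside the guest via VMware Tools; no network path to the guest is needed.
Connect-VIServer -Server vcenter.contoso.local
$vm = Get-VM -Name 'NewWinServer01'
Invoke-VMScript -VM $vm -ScriptText 'Enable-PSRemoting -Force' -GuestCredential (Get-Credential)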
I am working on an automated deployment process for a web application. The deployment will need to:
Deploy DB changes to database using sqlpackage.exe
Deploy reporting services reports to the reports server using the web service
Deploy web app to web server(s)
Deploy fonts for reports
among other things
The first two are reasonably straightforward to run from the web server, as the web service and db are contactable, and the tools to deploy run over the network.
From reading, it appears that PowerShell remoting should be the way to go, and internally this would not be a problem. However, when deploying to production, this will be carried out in a datacentre where the machines (2 web, 1 DB) are not on a domain at all. I'd like to come up with a generic process that can run both internally and externally with the appropriate configuration. PowerShell remoting with machines not in a domain appears to require a fair bit of configuration using HTTPS etc., as NT credentials can't be validated.
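For a sense of the configuration involved, here is a minimal sketch of the workgroup setup as I understand it (machine names are placeholders; a production setup would presumably use an HTTPS listener with a proper certificate rather than plain TrustedHosts):

# On each target machine (run elevated):
Enable-PSRemoting -Force
# On the machine driving the deployment, trust the targets by name or IP:
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'web01,web02,db01' -Force
# Explicit credentials are then required, since NT credentials can't be validated:
Invoke-Command -ComputerName web01 -Credential (Get-Credential) -ScriptBlock { hostname }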
Should I battle through configuring PowerShell remoting, or would the best way to go be to use psexec to execute a PowerShell script directly on the remote machine, after copying the deployment artifacts to a drop folder on that machine?
psexec seems to "just work", whereas PowerShell remoting appears to come with a lot more pain.
Why not use psexec then? You can restrict its role to just getting you onto the remote machine, and not let it infect your scripts. I have not attempted to get PS remoting working on a non-domain setup, but in general I have found it to be fairly high-effort to get going. psexec, as you say, can often be simpler.
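A minimal sketch of the psexec route (the host, account, password, and paths are all placeholders):

# Copy the artifacts to a drop folder on the target first, then:
psexec \\web01 -u web01\deployuser -p PlaceholderPassword powershell.exe -ExecutionPolicy Bypass -File C:\drop\deploy.ps1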
Excuse the peddling, but the open-source framework I helped build, called PowerUp, essentially does all this for you. It uses a model in which the PowerShell (well, psake) scripts can move execution to another machine by calling a specific function. This can be done either with PowerShell remoting or with psexec; you wouldn't need to change the script, as it just requires a setting per environment to say which you would like to use.
Check out the sample at https://github.com/AffinityID/PowerUpSamples/tree/master/SimpleWebsite.
Hopefully that shows you enough, but if not, let me know and we can go into more detail.