How to run a PowerShell script remotely using Chef?

I have a PowerShell script on the Chef server that needs to run on a remote Windows server. How can I run this script from the Chef server against the remote Windows server?

Chef doesn't do anything like this. First, the Chef Server never remotely accesses servers directly; all it does is store data. Second, Chef doesn't really do "run a thing in a place right now". We offer workstation tools like knife ssh and knife winrm as simplistic wrappers, but they aren't made for anything complex. The Chef-y way to do this would be to write a recipe and run your script using the powershell_script resource.
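For a quick one-off, a knife winrm invocation from your workstation might look something like the following; the node search query, credentials, and script path are made-up placeholders, and the exact flags can vary between knife-windows versions:

knife winrm 'name:web-*' 'powershell.exe -File C:\scripts\mytask.ps1' --winrm-user Administrator --winrm-password 'S3cret!'

That is still just a wrapper, though; for anything repeatable, put it in a recipe.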

Does that mean Chef is also running on the Windows server?
If yes, why not use PsExec from the Windows Sysinternals PsTools?
https://learn.microsoft.com/en-us/sysinternals/downloads/psexec

Here is my understanding of what you are trying to achieve. If I'm wrong then please correct me in a comment and I will update my answer.
You have a powershell script that you need to run on a specific server or set of servers.
It would be convenient to have a central management solution for running this script instead of logging into each server and running it manually.
Ergo, you either need to run this script in many places when a condition isn't met (such as a file being missing), or you need to run this script often, or you need it to run with a certain timing relative to other processes you have going on.
Without knowing precisely what you're trying to achieve with your script, the best solution I know of is to write a cookbook and do one of the following:
If your script is complex, place it in your cookbook's files folder (assuming the script will be identical on all computers it runs on) or in the cookbook's templates folder (if you need to inject information into it at write time). You can then write the .ps1 file to the local computer during a Chef converge with one of the following code snippets. After you write it to disk you will also have to call it with one of the resources in the next bullet.
Monomorphic file:
cookbook_file '<destination>' do
  source '<filename.ps1>'
  <other options>
end
Options can be found at https://docs.chef.io/resource_cookbook_file.html
Polymorphic file:
template '<destination>' do
  source '<template.ps1.erb>'
  variables(<hash of variables and values>)
  <other options>
end
Options can be found at https://docs.chef.io/resource_template.html
If your script is a simple one-liner you can instead use powershell_script, powershell_out! or execute. powershell_out! has all the same options and features as shell_out!, with the added advantage that your converge will pause until it receives an exit status for the command, if that is desirable. The documentation on it is a bit spottier, though, so spend some time experimenting with it and googling.
https://docs.chef.io/resource_powershell_script.html
https://docs.chef.io/resource_execute.html
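As a loose illustration of powershell_out! in a recipe (whether it is available directly in the recipe DSL depends on your Chef Client version; on older clients you may need to include Chef::Mixin::PowershellOut yourself), a hypothetical snippet could look like:

# Capture the status of a Windows service at converge time.
# powershell_out! raises if the command exits non-zero, just like shell_out!.
status = powershell_out!('(Get-Service -Name WinRM).Status').stdout.strip
Chef::Log.info("WinRM service status: #{status}")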
Whichever option you end up going with, you will probably want to guard your resource with conditions on when it should not run, such as when a file already exists, a registry key is set, or whatever else your script changes that you can test for. If you truly want the script to execute on every single converge then you can skip this step, but that is a code smell and I urge you to reconsider your plans.
https://docs.chef.io/resource_common.html#guards
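For example, a guarded powershell_script resource could look roughly like this; the script path and marker file are hypothetical placeholders:

# Run a previously written script, but only if the marker file it creates
# does not already exist on the node.
powershell_script 'run deploy script' do
  code '& "C:\scripts\deploy.ps1"'
  not_if { ::File.exist?('C:/scripts/deploy.done') }
end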
It's important to note that this is not an exhaustive list of how to run a powershell script on your nodes, just a collection of common patterns I've seen.
Hope this helped.

Related

Newby Trouble With Remote PowerShell scripts

I do know about the double hop issue. My scenario is: I have a script I want to run remotely that calls another script located on a network share that calls a third script located on a second network share in a different domain.
Currently what I am doing is using CredSSP (I've read there can be security issues, but this environment is not public facing) to pass credentials for the first network share that has script2. I do not have access to the computer in the second domain, so I cannot set up CredSSP on it. To work around that, inside script2 I am using the "net use" command against the third script's share so the path can be resolved. I am then using "Copy-Item" to copy the third script onto the machine running script2 (the remote machine).
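Roughly, that workaround inside script2 looks something like this (share names, paths, and credentials are placeholders):

# Map the share in the second domain so its path resolves for this session
net use \\otherdomain-server\scripts <password> /user:OTHERDOMAIN\svc_account

# Copy script3 locally, then run the local copy and capture its output
Copy-Item -Path \\otherdomain-server\scripts\script3.ps1 -Destination C:\temp\script3.ps1
& C:\temp\script3.ps1 | Out-File C:\temp\script3-output.txt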
Up to this step, everything is working when I run script1. I can see script3 is copied over onto the remote machine. When script3 is called, it should make a web request that sends text to stdout (which I pipe to Out-File in script2). However, whenever I try to run the copy of script3 (located on the remote machine) from script2 (running on the remote machine) it does not seem to do anything. If I run script2 locally on the remote machine then it works fine (file is generated from script3's output).
Any ideas on why this won't work? I've tried running script3 using several variations of Invoke-Expression, Invoke-Command, Start-Process, and even trying to run it with cmd. I'm also having trouble getting output on what exactly is causing the issue (stdout and stderr are often empty when using the different commands). Am I missing some command or tool that may make this easier to troubleshoot? It almost seems like script3 is still running into a double-hop issue despite it only making a web request? And if it was running into that, I thought it would have returned an error.
There may be a better design for doing what I'm trying to do. I'm fairly new to PowerShell and may be overcomplicating this.
Edit: Rewrote my scripts in python and got it working.

How to run a powershell script on Amazon EC2 instance at Startup?

I have to think this is a solved issue, but I am just not getting it to work. So I have come to you, StackOverflow, with this issue:
I have a Windows Server 2016 machine running in Amazon EC2. I have a machine.ps1 script in a config directory.
I create an image of the box. (I have tried both with the no-reboot option checked and unchecked.)
When I create a new instance of the image I want it to run machine.ps1 at launch to set the computer name and then set routes and some config settings for the box. The goal is to do this without logging into the box.
I have read and tried:
Running Powershell scripts at Start up
and used this to ensure user data was getting passed in:
EC2 Powershell Launch Tools
I have tried setting up a scheduled task that runs machine.ps1 at startup (it just hangs).
I see the InitializeInstance.ps1 startup task and have even tried to co-opt that, replacing the line that runs user data with the line that runs my script. Nothing.
If I log into the box and run machine.ps1, it will restart the computer and set the computer name and then I need to run it once more to set routes. This works manually. I just need to find a way to do it automagically.
I want to launch these instances from powershell not with launch configurations and auto scale.
You can use user data.
Whenever you deploy a new server, workstation or virtual machine there is nearly always a requirement to make final changes to the system before it's ready for use. Typically this is done with a post-deployment script that might be triggered manually on start-up, or it might be a final step in a Configuration Manager task sequence, or if you are using Azure you may use the Custom Script Extension. So how do you achieve similar functionality using EC2 instances in Amazon Web Services (AWS)? If you've created your own Amazon Machine Image (AMI) you can set the script to run from the RunOnce registry key, but that can be a cumbersome approach, particularly if you want to make changes to the script and it's been embedded into the image. AWS offers a much more dynamic method of injecting a script to run upon start-up through a feature called user data.
Please refer to the following link for the same:
Powershell User data
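As a rough illustration, Windows user data wraps PowerShell in <powershell> tags, so the user data for the instance could be as simple as the following (the script path is just an assumed example of where machine.ps1 lives on your image):

<powershell>
# Executed at first boot by the EC2 launch agent (EC2Config/EC2Launch)
& "C:\config\machine.ps1"
</powershell>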
Windows typically won't let a powershell script call another powershell script unless it is being run as Administrator. It is a weird 'safety' feature. But it is perfectly okay to load the ps1 files and use any functions inside them.
The UserData script is typically run as "system". You would THINK that would pass muster. But it fails...
The SOLUTION: Make ALL of your scripts into powershell functions instead.
In your machine.ps1 - wrap the contents with function syntax
function MyDescriptiveName { <original script contents> }
Then in UserData - use the functions like this
# To use a relative path
Set-Location -Path <my location>
# Load script file into process memory
. <full-or-relpath>/machine.ps1
# Call function
MyDescriptiveName <params-if-applicable>
If the function needs to call other functions (aka scripts), you'll need to make those scripts into functions and load the script file into process memory in UserData also.

Can RemoteSigned run scripts created on same domain?

I'm creating and testing some powershell scripts to do some basic file copying. I've set my executionpolicy to RemoteSigned. According to the help, this should allow me to run scripts that were not downloaded from the internet. However, my observations seem to indicate that this will run only scripts created on the local machine.
For instance, if I create a script on my development machine and try to copy to my server (on my same domain), the script will not run. However, if I open up the Powershell ISE on the server and open my script, copy the code and paste it into a new file window and save it to the server, the script then runs. Further, if I want to create a self-signed certificate, it will not run on other computers (per the help).
So, this all seems a bit cumbersome: I have to develop my scripts on the machine they are to be run on, or go through the copy/paste routine mentioned above to get them to run on my server. I just want to know that I've understood all of this correctly and that there is no other way to create a script within the same domain and run it under the RemoteSigned execution policy without paying the fee for a certificate.
This post here provides a method for executing a script from a shared folder. Hope this could help you :-)
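Two common workarounds I've seen for this (these are not from the linked post, and the paths are placeholders) are to unblock the copied file so RemoteSigned stops treating it as remote, or to launch it with the policy overridden for that one process:

# Remove the Zone.Identifier "this came from elsewhere" flag from the copied script
Unblock-File -Path 'C:\Scripts\MyCopyJob.ps1'

# Or run straight from the share with the execution policy bypassed for this process only
powershell.exe -ExecutionPolicy Bypass -File '\\fileserver\scripts\MyCopyJob.ps1'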

How secure is a powershell runspace/session

If I create a PowerShell runspace, either programmatically with .NET or just by launching the PowerShell console, how secure are the scripts/commands that are run?
I'm not speaking about signing scripts, but the actually memory space that the scripts are run in.
I'm worried that if sensitive information is gathered as part of the script (a sql query into a salary database for example) that someone could hack this data out.
I know most people are thinking SecureString at this point; I know about SecureString. I'm wanting to know specifically about the PowerShell runspace, not how to store strings securely inside a runspace (let's hope that last sentence didn't just answer my own question).
Specifically:
Are other applications/scripts/whatever able to peer into the runspace and see the commands I'm running?
Powershell script security works by controlling whether or not a script is "allowed" to run on your machine. If you have a machine running an execution policy of "AllSigned", that machine will require the Powershell script to be signed by a trusted certificate.
Scott Hanselman has a really good article on it here.
To my knowledge, your command history isn't permanently saved. You can do a "get-history" to see the commands you've entered in your current session, but it's not like linux/unix where "history" will contain all of the commands you've ever run on the system. As far as other applications being able to "peer into" or query your session, I have no idea.

Application Deployment with Powershell

I've developed a Powershell script to deploy updates to a suite of applications; including SQL Server database updates.
Next I need a way to execute these scripts on 100+ servers without manually connecting to each server. "PowerShell v2 with remoting" is not an option as it is still in CTP.
Powershell v1 with WinRM looks the most promising, but I can't get feedback from my scripts. The scripts execute, but I need to know about exceptions. The scripts create a log file, is there a way to send the contents of the log file back to the "client" (the local computer making the remote calls)?
Quick answer is no. The long version: it's possible, but it will involve lots of hacks. I developed a very similar deployment script/system using PowerShell 2 last year. The remoting feature is the primary reason we put up with the CTP status. PowerShell 1 with WinRM is flaky at best and, as you said, gives no real feedback apart from OK or failed.
Alternatives that I considered included using PsExec, which is very much non-standard and may be blocked by firewalls. The other approach involves using system management tools such as MS's System Center, but that's just a big hammer for a tiny nail. So you have to pick your poison...
Just a comment on this: the easiest way to capture PowerShell output is to use the Start-Transcript cmdlet to pipe console output to a file. We have a small snippet at the start of all our scripts that sends a log file with the console output from each script to a central file share, and names the log file with the script name and date executed so that we'll have an idea of what happened. It's not too hard to pipe all those log files into a database for further processing either. It probably won't solve all your problems, but it would definitely help with the "getting data back" part.
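A minimal version of that snippet could look something like this (the central share path is a made-up example):

# At the top of each deployment script: start a transcript named after the script and run time
$scriptName = [IO.Path]::GetFileNameWithoutExtension($MyInvocation.MyCommand.Path)
$logName    = '{0}_{1:yyyyMMdd_HHmmss}.log' -f $scriptName, (Get-Date)
Start-Transcript -Path (Join-Path '\\logserver\deploylogs' $logName)

# ... deployment work ...

Stop-Transcript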
best regards,
Trond