How to keep a PowerShell process open with input/output at runtime? - powershell

I'm researching these days how I can keep a PowerShell process alive so that I can run PS code without opening a new process each time.
The need:
- Running multiple PS scripts dynamically, so they share the same base (custom) modules, as efficiently as possible.
- Being able to communicate with the stdout/stdin/stderr of these scripts' processes while they are still running.
Ideally I'd want one process to open inside a Docker container, import my modules, and then receive the code to run, executing it in the same already-open process so it won't have to spawn another process or import my modules again.
The problem:
- Setting up a PS process in a Docker container takes a tremendous amount of time (roughly 2.5 s before I have even begun to run any code, and that's the PS process alone).
So far I have not found a PS way to run dynamic code in the same process without creating a new process and importing my modules again, nor have I found a way to communicate dynamically with the new process while it is still running.
Possible Solutions:
- Create the initial PS process with -NoProfile so it won't load so slowly. (I have yet to test this, but folks on Reddit seem to approve of this method.)
- Use Start-Process with the -NoNewWindow flag; it will still spawn a new process each time, but I guess the initial setup time will be spared.
- Try Invoke-Expression on big chunks of code, but from what I understand that is not recommended and probably won't let me actively communicate with the running code until it finishes.
Start-Process -NoNewWindow and Invoke-Expression are the only relevant mechanisms I could find so far.
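For illustration, here is a minimal sketch of the in-process loop I'm imagining (the module name is a placeholder, and Invoke-Expression is used despite the caveats above): the process starts once, imports the modules, and then executes whatever arrives on stdin, one command per line.
# host.ps1 - started once; imports modules, then runs commands from stdin
Import-Module MyCustomModule   # placeholder for the expensive imports
while ($null -ne ($line = [Console]::In.ReadLine())) {
    try {
        Invoke-Expression $line | Out-String | Write-Output
    } catch {
        [Console]::Error.WriteLine($_.Exception.Message)
    }
}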
I've been told AWS Lambda offers functionality similar to what I am trying to achieve, but looking at its code I did not make much progress; I figured it might be worth asking for help from people who are smarter than me :) Any help would be much appreciated.
I'm not looking for an already fully working third-party solution; simply being able to mimic that behavior in PS code would be good enough for me.

Related

Restarting an updated powershell script

I'm making a makeshift CI/CD system for my app. The app stops itself when notified of a push to a GitHub repo, and the script automatically runs git pull to bring in changes, plus some more commands depending on what changed. Some of the changes could be to the script itself.
I want the script to restart itself, without infinite nesting where it could hog resources.
While ($true) {
    git pull
    # check for changes...
    If ($runScriptChanged) {
        Break
    }
    node index.js
}
# ???
Error-checking and other updating parts are omitted for brevity.
- Calling itself will probably work, but again, it could hog resources infinitely until stopped.
- Making a new file to run the above script still leaves a file in the repo that cannot be updated automatically.
- Start-Process is the best I've found for this, but I'm not sure about its behavior on Linux.
When does the launching shell close? Is it the same as on Windows with -NoNewWindow (where it will stay open as long as something is using it)? (Currently I'm running it on Windows Server, so compatibility with Linux isn't a big concern, but it is nice to have.)
Which way should I use? Thanks
You may consider using PowerShell jobs and the Start-Job cmdlet. It will start your processes in the background, and you also get some monitoring and management capabilities via the other -Job cmdlets such as Get-Job, Wait-Job, Stop-Job, etc.
See about_Jobs for more information.
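For illustration, a minimal sketch of that pattern (assuming the app's entry point is index.js in the current directory):
# Run the app as a background job so this script stays free to pull updates
$job = Start-Job -ScriptBlock {
    Set-Location $using:PWD   # jobs start in a default directory, so move back to the repo
    node index.js
}
# ... git pull / change checks can happen here ...
Wait-Job $job | Receive-Job   # block until the app exits, then collect its output
Remove-Job $job
Note that background jobs are children of the launching session, so the session has to stay alive for the job to keep running.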

How to run an Event (FileSystemWatcher) through Task Scheduler

I currently have a pretty simple PowerShell script that creates an IO.FileSystemWatcher object and calls an executable when the event is triggered.
I can run this script without issue from an Administrator PowerShell on my Windows Server 2012 machine; however, it seems to run into issues when the script is run from Task Scheduler.
I've attempted running the task while logged on, and on a trigger while logged off, and in both instances the task status reads "Running" when I check. However, interacting with the folder that should be watched produces no results. I've added a log file to document which parts of the code are functioning, and the script DOES create the event; it is the event triggering that seems to be the issue. Has anyone heard of an issue with creating events through Task Scheduler?
I've read some forums that say it might be a domain user issue
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Lsa
Change the REG_DWORD value named 'disabledomaincreds' to 0.
Although this was already the case, and I've tried multiple variations of settings in the Task Properties as per Scripting Guy and SpiceWorks. The general consensus I've found is that it needs to be run with a -NoExit argument in order for the event to fire properly when the user is not logged in.
Extra notes:
The PowerShell script is located on a network location rather than physically on the computer (\\serverName\FTP\Folder\script.ps1).
I came across the same problem. I don't know why this works, but in your Scheduled Task, when referring to the PowerShell script, instead of using
\\serverName\FTP\Folder\script.ps1
use
. \\serverName\FTP\Folder\script.ps1
(note the leading dot).
As I understand it, as a PowerShell novice, the events you register with FileSystemWatcher will only fire while the PowerShell instance is still running. I wouldn't trust Task Scheduler's claim that the task is running, since it is notoriously unreliable (which seems to be the Microsoft standard). I think once your script finishes executing, it kills the PowerShell instance and all event listeners are garbage collected.
I just put my script to sleep forever and it works. At the end of my script, it has
while ($true) {sleep 1}
It probably wouldn't hurt to increase the sleep time, but this works.
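For reference, a minimal sketch of the whole pattern, with the watched path and handler executable as placeholders:
$watcher = New-Object System.IO.FileSystemWatcher 'C:\Watched'
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    # $Event is populated automatically inside the -Action block
    & 'C:\Tools\handler.exe' $Event.SourceEventArgs.FullPath
} | Out-Null
# Keep the process (and its event subscription) alive:
while ($true) { sleep 1 }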

PS1 uninstallation script in SCCM

I'm a newbie scripter trying to write a really simple script to taskkill 2 programs and then uninstall 1 of them.
I wrote it in PowerShell and stuck it in SCCM for deployment; however, every time I deploy it, it doesn't run the last line to uninstall the program.
Here's the code:
# Closing Outlook instance
#
taskkill /IM outlook.exe /F
#
# Closing Linkpoint instance
#
taskkill /IM LinkPointAssist.exe /F
#
# Uninstalling Linkpoint via uninstall string if in Program Files
#
MsiExec.exe /X {DECDCD14-DEF6-49ED-9440-CC5E562FDC41} /qn
#
# Uninstalling Linkpoint via WmiObject if installed manually in AppData
Get-WmiObject -class win32_product -Filter "Name like '%Linkpoint%'" | ForEach-Object { $_.Uninstall()}
#
Exit
Can someone help? SCCM says the script completes with no error and I know it's able to execute it since the taskkills work...but it's not uninstalling the program.
Thanks in advance for any input.
So, SCCM is running this script, and nothing in the script is going to throw an error.
If you want to throw an error which SCCM can return to know how the deployment went, you need to add an extra step.
# Capture the uninstall result so a meaningful exit code can be returned
$result = Get-WmiObject -Class win32_product -Filter "Name like '%Linkpoint%'" | ForEach-Object { $_.Uninstall() }
if ($result.ReturnValue -ne 0) {
    [System.Environment]::Exit(1603)
} else {
    [System.Environment]::Exit(0)
}
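For context, 1603 is the standard Windows Installer exit code for a fatal error, so ConfigMgr will report the deployment as failed, while exiting with 0 signals success.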
I see a lot of these kinds of questions come through on SO and SF: someone struggling with unexpected behavior of an application, script, or ConfigMgr, with very little information about what I can assume about their environment. At that stage, it would typically take days of interaction to narrow the problem to a point where a solution is possible.
I'm hoping this answer can serve as a reference for future such questions. The first question to the OP should be: "Which of these nine principles are you violating?" You could think of it as a sort of Joel Test for ConfigMgr application packaging.
Nine Steps to Better ConfigMgr Application Packages
I have found that installing and uninstalling applications reliably using ConfigMgr requires carefully sticking to a bunch of principles. I learned these principles the hard way. If you're struggling to figure out why an application is not working right under ConfigMgr, odds are that you will answer "no" to one of the following questions.
1. Are you testing the entire lifecycle?
In order to have any hope of reliably managing an application you need to test the entire lifecycle of an application. This is the sequence I test:
Detect: make sure the detection script result is negative
Install: install the application using your installation script
Detect: make sure the detection script result is positive when run
Uninstall: uninstall using your uninstallation script
I run this sequence repeatedly making tweaks to each step until the whole sequence is working.
2. Are you testing independently of ConfigMgr first?
Using ConfigMgr to test your application's lifecycle is slow and has its own ways of failing that can mask problems with your application package. The goal, then, is to be able to test an application's installation, detection, and uninstallation separate from but equivalent to the ConfigMgr client. In order to achieve that goal you end up with three separate scripts for each application:
Install-Application.bat - the entry point for your installation script
Detect-Application.ps1 - the script that detects whether the application is installed
Uninstall-Application.bat - the entry point for your uninstallation script
Each of these three scripts can be invoked directly by either you or the ConfigMgr client. For applications installed as system you need to use psexec -s to invoke scripts in the same context as ConfigMgr (caveat).
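For example, a hypothetical invocation (PsExec is part of Sysinternals; the package path is a placeholder):
psexec -s cmd.exe /c C:\Pkg\Install-Application.bat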
3. Are you aware of context?
Installers can behave rather differently depending on the context they are invoked in. You need to consider whether an application is installed for a user or the system. If it is installed for the system, when you test independently of ConfigMgr, use psexec -s to invoke your script.
4. Are you aware of user interaction?
An installer can also behave rather differently depending on whether a user can interact with it. To test a script as system with user interaction, use psexec -i -s.
5. Did you match ConfigMgr to the tested context and user interaction?
Once you have the full lifecycle working, make sure you select the correct corresponding options for context (installed for user vs. system) and interaction (user can interact with the application, or not). If you don't do this, the ConfigMgr client will be installing the application differently from the way you tested, so you really can't expect success.
6. Are you aware of the possibility of application detection context mismatch?
The context that detection scripts run in depends on whether the application is deployed to users or systems. This means that in some cases the installation and detection contexts won't match. Keep this in mind when you write your detection scripts.
7. Have you structured your scripts so that exit codes work?
ConfigMgr needs to see exit codes from your installation and uninstallation scripts in order to do the right thing. Installers signal failure or the need to reboot using exit codes. In order for exit codes to get to the ConfigMgr client you need to ensure that your install and uninstall scripts are structured correctly.
for batch scripts, use exit /b %errorlevel% to pass the exit code of your executable out to the ConfigMgr client
for PowerShell scripts, I have seen only one approach work reliably; one commonly used pattern is sketched below
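A hedged sketch of that pattern, propagating an installer's exit code from PowerShell out to the ConfigMgr client (setup.exe and its arguments are placeholders for your actual installer):
# Launch the installer, wait for it, and exit with its code
$p = Start-Process -FilePath '.\setup.exe' -ArgumentList '/quiet' -Wait -PassThru
exit $p.ExitCode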
8. Are you using PowerShell scripts for detection?
ConfigMgr has a nice user interface for checking things like the presence of files, registry keys, etc. as a proxy for whether an application is installed. The problem with that scheme is that there is no way to test application detection separately from, but equivalent to, the ConfigMgr client. If you want to test the application lifecycle independent of the ConfigMgr client (trust me, you want that), all your detection must occur using PowerShell scripts.
9. Have you structured your PowerShell detection scripts correctly?
The rules ConfigMgr uses to interpret the output of a PowerShell detection script are arcane. Thankfully, they are documented.
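In short, the documented rule amounts to: exit code 0 plus anything written to STDOUT means "installed"; exit code 0 with no output means "not installed". A minimal sketch, assuming a hypothetical file-based check:
# The file path is a placeholder for whatever marks your app as present
if (Test-Path 'C:\Program Files\MyApp\MyApp.exe') {
    Write-Output 'Installed'   # output + exit code 0 => detected
}
exit 0                         # no output + exit code 0 => not detected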

Powershell for scripting large analysis runs

I'm completely new to PowerShell, and I know that a number of people use it to automate tasks much the way bash and C shell programming is done in *NIX. I've successfully recompiled some ancient analysis software written in FORTRAN that takes individual input files. I now need to somehow run just under 1000 cases with only slightly varied input files. The analysis software writes intermediate files, so for concurrent runs, every run has to be in a different directory. Each case can take up to 40 minutes to solve, so running these individually will take a lot of time and be prone to error.
So now for the question: can PowerShell automate this, and is there a similar script out there that I can modify to do it?
The automation would need to do the following (as I see it):
- Take in an input file listing the various runs that have to be performed
- Create a subdirectory named after the run name/number
- Save a version of the input files, with the variables switched, in the subdirectory
- Run the analysis software in the subdirectory
- Look at the standard/error output of the analysis software to confirm it was successful
- Append the success or failure of a run to a file
- Ideally, run up to some number of analyses concurrently (4-6 for my machine)
- If IT reboots the machine (as they do whenever they choose), be able to restart where it left off, though I expect to lose whatever the analysis software was running during the forced reboot
I've tried recompiling the software with vectorization and automated parallelization, and on the tested cases the convergence time was only minimally reduced, so it is safe to assume this is effectively single-threaded.
PowerShell has lots of familiar aliases for Unix users: ls, cat, cp, etc. are implemented as aliases to native PowerShell commands. The commands are not case sensitive. What's more, you can search help even by an alias's name. That is,
man ls <=> get-help get-childitem
apropos <=> get-help <keyword>
get-help loop
about_Break
about_Continue
about_do
about_For
about_Foreach
about_Language_Keywords
...
This should help with converting an existing script. For the rest, I'll give some hints, as the description is somewhat vague; a sketch combining them follows the list.
Get-Content is used to read file contents into a variable: $myVar = cat c:\some\file.txt.
Directory creation is just md.
Capturing exe output is done by assigning to a variable: $exeOutput = c:\myApp.exe
Adding stuff to a file is Add-Content.
Background jobs are started with Start-Job.
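Putting those pieces together, here is a hedged sketch (runs.txt, solver.exe, and results.log are placeholders for whatever your setup actually uses):
$runs = Get-Content .\runs.txt          # one run name per line
$maxConcurrent = 4

foreach ($run in $runs) {
    # Skip runs already recorded, so a reboot can resume where it left off
    if ((Test-Path .\results.log) -and
        (Select-String -Path .\results.log -Pattern "^$run," -Quiet)) {
        continue
    }
    # Throttle: wait until fewer than $maxConcurrent jobs are running
    while ((Get-Job -State Running).Count -ge $maxConcurrent) {
        Start-Sleep -Seconds 5
    }
    Start-Job -Name $run -ScriptBlock {
        param($run, $root)
        $dir = Join-Path $root $run
        New-Item -ItemType Directory -Path $dir -Force | Out-Null
        # ... copy the input file into $dir with the variables switched ...
        Set-Location $dir
        $output = & (Join-Path $root 'solver.exe') 2>&1   # capture stdout and stderr
        if ($LASTEXITCODE -eq 0) { "$run,success" } else { "$run,failure" }
    } -ArgumentList $run, $PWD
}
Get-Job | Wait-Job | Receive-Job | Add-Content .\results.log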

How can I pause Perl processing without hard-coding the duration?

I have a Perl script that contains this code snippet, which calls the system shell to get some files by SFTP and unzip them with WinZip:
# Run script to get files from remote server
system "exec_SFTP.vbs";
# Unzip any files that were retrieved
foreach $zipFile (<*.zip>) {
    system "wzunzip $zipFile";
}
Even if some files are retrieved, they are never unzipped, because by the time the files are retrieved and the SFTP connection is closed, the Perl script has already completed the unzip step, with the result that it doesn't find anything to unzip.
My short-term fix is to insert
sleep(60);
before the unzip step, but that assumes that the SFTP connection will finish within 60 seconds, which may sometimes be a gross over-estimate, and other times an under-estimate.
Is there a more sound way to cause Perl to pause until the SFTP connection is closed before proceeding with the unzip step?
Edit: Responders have questioned (and reasonably so) the use of a VB script rather than having Perl do the file transfer. It has to do with security -- the VB script is maintained by others and is authorized to do the SFTP.
Check the code in your *.vbs file. The system function waits for the child process to finish before execution continues. It appears that your *.vbs file is forking a background task to do the SFTP and returning immediately.
In a perfect world your script would be rewritten to use Net::SFTP::Foreign and Archive::Extract.
An ugly, quick-hackish kind of way might be to create a touch file before your first system call, alter your SFTP-fetching script to delete the file once it is done, and add a while loop like so:
while (-e 'touch.file') {
    sleep 5;
}
# foreach [...]
Of course, you would need to handle the case where your .vbs fails and leaves the touch file undeleted, among many other bad side effects. This would be a quick solution (if none of the other suggestions work) until you get the time to rewrite without system() calls.
You need a way for Perl to wait until the SFTP transfer is done, but as your script is currently written, Perl has no way of knowing this. (It looks like you're combining at least two scripting languages and a (GUI?) SFTP client; this can work, but it's not exactly reliable or robust. Why use VBScript to start the SFTP transfer?)
I can think of four options:
Your Perl script could do the SFTP transfer itself, using something like CPAN's Net::SFTP module, rather than spawning an external job whose status it cannot track.
Your Perl script could spawn a command-line SFTP utility (like PSFTP) that doesn't return until the transfer is done.
Or change the exec_SFTP.vbs script so it does not return until the transfer is done.
If you're currently using a graphical SFTP client and can't switch for whatever reason, I'd recommend using a scripting language like AutoIt instead of Perl. AutoIt has features to wait for windows to change state and so on, so it could more easily monitor for an activity's completion.
Options 1 or 2 would be the most robust and reliable.
The best I can suggest is modifying exec_SFTP.vbs to exit only after the file transfer is complete. system waits for the program it called to complete, so that should solve your problem:
system LIST
system PROGRAM LIST
    Does exactly the same thing as "exec LIST", except that a fork is done first, and the parent process waits for the child process to complete.
If you can't modify the VBS script to stay alive until the transfer completes, you may be able to track subprocess creation. If you can get the subprocess IDs, you can monitor them and thereby know when the VBS script's various offspring terminate.
Win32::Process::Info lets you get subprocess IDs from a running process.
Maybe this is a dumb question, but why not just use the Net::SFTP and Archive::Extract Perl modules to download and unzip the files?
system will not return until the shell it's running the command in has returned; this may not hold when launching graphical programs or file associations.
See if any of the following help?
system('cscript exec_SFTP.vbs');
use Win32::Process;
use Win32;

Win32::Process::Create(my $proc, 'wscript.exe',
    'wscript exec_SFTP.vbs', 0, NORMAL_PRIORITY_CLASS, '.');
$proc->Wait(INFINITE);
Have a look at IPC::Open3
IPC::Open3 - open a process for reading, writing, and error handling using open3()