I'm aiming to run a PowerShell script at startup on my Windows 10 machine. This tutorial explains that perfectly: https://devblogs.microsoft.com/scripting/use-powershell-to-create-job-that-runs-at-startup/
I expect the job to keep my script running, since my app is listening for some events, but when I check the status of the job using the Get-Job cmdlet it shows as completed. I think it is treating my script as an asynchronous one, and eventually my app doesn't listen to anything. Any idea how I can make it synchronous, i.e. keep my script running forever? This is the content of my ps1 file:
C:\Users\m\Desktop\Scripts\Activate.ps1
python C:\Users\m\Desktop\app.py
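One way to keep the job in the Running state is sketched below. This is not from the tutorial; it assumes the listener is app.py and that Activate.ps1 is meant to modify the current session (hence the dot-sourcing), and it simply blocks until the python process exits:
# Dot-source the activation script so it affects this session (assumption about its purpose).
. C:\Users\m\Desktop\Scripts\Activate.ps1
# Start the listener and block on it so the job is not marked Completed early.
$app = Start-Process -FilePath 'python' -ArgumentList 'C:\Users\m\Desktop\app.py' -NoNewWindow -PassThru
Wait-Process -Id $app.Id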
I am fairly new to writing code in PowerShell. For my job I have to write multiple PowerShell scripts to make changes in hardware and software settings, as well as the Registry and the Group Policy Editor, to get certain applications to run. These applications are a little older, and upgrading them or the hardware they run on is NOT an option. As an example, when Microsoft releases new patches on Patch Tuesday, there is a high probability that something will be changed when those patches are applied, which is where I come in to write a script to fix the issue. I have multiple scripts that I run. When those scripts are run they may end up terminating because of an error code or an exit code. A large part of the time I do not know immediately that a script has failed.
I am trying to figure out a script that I can run in a second PowerShell console window. Its only purpose would be to sit there on the screen, wait, and monitor. Then, when I execute a script or application (the only file extensions I am worried about are EXE, BAT, CMD, and PS1), if the script/application I just ran ends with an exit code or an error code, output that to the screen, in real time.
Below, I have a small piece of code that kind of works, but it is not what I am wanting.
I have researched online and read and read tons of stuff. But I just can't seem to find what I am looking for.
Could someone please help me with a script that does what I am describing?
Thank you for your help!!!!
# Path to whatever should be run (an EXE, CMD, BAT or PS1)
$ExitErrorCode = "C:\ThisFolder\ThatFolder\AnotherFolder\SomeApplication.EXE"

$proc = Start-Process $ExitErrorCode -PassThru
$handle = $proc.Handle          # cache proc.Handle so ExitCode is populated after exit
$proc.WaitForExit()
if ($proc.ExitCode -ne 0) {
    Write-Warning "$ExitErrorCode exited with status code $($proc.ExitCode)"
}
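As a hedged variation on the code above, the blocking WaitForExit() can be swapped for an Exited event subscription, so the second console reports the result the moment the process ends. The path is still the placeholder from the question:
$target = "C:\ThisFolder\ThatFolder\AnotherFolder\SomeApplication.EXE"   # placeholder path

$proc = Start-Process -FilePath $target -PassThru
$null = $proc.Handle                      # cache the handle so ExitCode is readable after exit
$proc.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $proc -EventName Exited -MessageData $target -Action {
    $p = $Event.Sender
    if ($p.ExitCode -ne 0) {
        Write-Warning "$($Event.MessageData) exited with status code $($p.ExitCode)"
    } else {
        Write-Host "$($Event.MessageData) exited cleanly"
    }
} | Out-Null

# Keep this console alive so it can keep reporting in real time.
while ($true) { Start-Sleep -Seconds 1 }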
Possible duplicate of the approaches shown here:
Monitoring jobs in a PowerShell session from another PowerShell session
PowerShell script to monitor a log and output progress to another
We have an application server running as a service; when some configuration is loaded it starts a bat script, which has to run the PowerShell command Stop-ClusterGroup DRMSERVICES and then start the group again.
The bat file works flawlessly when I execute it manually by double-clicking it. But when the service runs the bat, it does not finish or execute the PowerShell command.
Bat file looks as follows
@echo off
powershell -command Stop-ClusterGroup DRMSERVICES
powershell -command Start-ClusterGroup DRMSERVICES
The main difference is that the service runs the bat file in silent mode.
I have tried various switches, including -ExecutionPolicy Unrestricted and START /wait, etc.
I have also tried creating a separate ps1 file and having the bat execute that instead.
All with the same output:
Manually executing the bat works
When the service executes the bat, it does not work.
I know the bat file is executed by the service, as inserting NET STOP servicename works correctly.
In the PowerShell event log I can also see events for the PowerShell commands taking place.
The difference in the event viewer between executing manually and having the service execute the command is event ID 800, which contains information about the execution pipeline; this event is not present when the service executes the bat.
The service does not seem to wait for PowerShell, and thus the script does not have time to stop the cluster group before exiting.
I'm lost as to whether this is a permission issue, a syntax error, or something else.
Hopefully somebody can help
UPDATE:
I have tried all the proposed solutions, all with the same result: the bat file works when double-clicked, but the service does not execute the PowerShell command. Pure cmd commands are executed, as I can pipe output to a txt file. When trying runas, I even got to a point where the output log read "insert administrator password".
I even had our software guy change our software to call a PowerShell script directly instead of a bat, with the same result. PowerShell won't execute the command, which tells me it probably is a permissions issue, but everything has been set to log on as admin and run as admin for the sake of success, and still nothing.
I solved the problem.
Because the service is a 32-bit process, it executes a 32-bit PowerShell.
The FailoverClusters module only exists as a 64-bit module.
By using %SystemRoot%\sysnative\WindowsPowerShell\v1.0\powershell.exe
the service is able to open a 64-bit session and thus use the FailoverClusters module.
As a side note, the sysnative alias is only visible from a 32-bit process, so it cannot be found by browsing in 64-bit Windows.
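If editing the bat file is not convenient, the same idea can be sketched inside the ps1 itself. This is only a sketch and assumes PowerShell 3.0+ (for $PSCommandPath): detect a 32-bit process on 64-bit Windows and relaunch through sysnative.
# Relaunch this script in 64-bit PowerShell if we are running as a 32-bit process on a 64-bit OS.
if (-not [Environment]::Is64BitProcess -and [Environment]::Is64BitOperatingSystem) {
    & "$env:SystemRoot\sysnative\WindowsPowerShell\v1.0\powershell.exe" -NoProfile -File $PSCommandPath
    exit $LASTEXITCODE
}
Import-Module FailoverClusters
Stop-ClusterGroup DRMSERVICES
Start-ClusterGroup DRMSERVICES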
I think I have dealt with this kind of issue before. After the
powershell -command Stop-ClusterGroup DRMSERVICES
you need to have cmd wait for a certain number of seconds and then test whether DRMSERVICES has stopped; if it has, start DRMSERVICES again. This way cmd keeps waiting and then checks whether the cluster group has stopped.
After a certain number of tries, have a way to stop checking and exit the script, for example when it is still trying to stop the group and has clearly run into a problem.
There is a timeout command in cmd for the waiting part.
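For illustration, here is a rough PowerShell equivalent of that wait-and-check loop. It is only a sketch; it assumes the FailoverClusters module is available to the (64-bit) session and caps the retries at 30:
Stop-ClusterGroup DRMSERVICES
$attempts = 0
# Poll the cluster group state until it reports Offline or we give up.
while ((Get-ClusterGroup -Name DRMSERVICES).State -ne 'Offline' -and $attempts -lt 30) {
    Start-Sleep -Seconds 2
    $attempts++
}
if ((Get-ClusterGroup -Name DRMSERVICES).State -eq 'Offline') {
    Start-ClusterGroup DRMSERVICES
} else {
    Write-Error "DRMSERVICES did not stop within the expected time"
    exit 1
}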
I have a PowerShell task in my definition that calls another script file on its own which takes care of running several things on my build agent (starts several different processes) - emulators, node.js applications, etc.
Everything is fine up until the moment this step is done and the run continues. All of the above-mentioned stuff gets closed, with most of the underlying processes killed; thus, any further execution (e.g. the test run) is doomed to fail.
My assumption is that these processes are somehow dependent on the outermost (temporary) script that VSTS generates to process the step.
I tried the -NoExit switch in the arguments list of my script, but to no avail. I've also read a suggestion somewhere to set this by default with a registry key for powershell.exe; still nothing.
The very same workflow was okay in Jenkins. How can I fix this?
These are the tasks I have:
The last PowerShell task calls a specified PowerShell file, which in turn calls several others. They set up the local dependencies and processes needed to start executing the tests, e.g. a running Node.js application (started in a separate console, for example, and running fine).
When that task is done and successful, the last task with the tests would fail because the Node.js application has been shut down, along with everything else that was started in the previous step. It just stops everything. That's why I'm currently running the tests within the same task until I find out how to overcome this behavior.
I am not sure how you call the dependencies and applications in your PowerShell script, but I tried the following command in a PowerShell script task to run a Node.js application:
invoke-expression 'cmd /c start powershell -Command {node main.js}'
The application keeps running after the PowerShell script task has passed and finished, which should meet your requirement. Refer to this question for details: PowerShell launch script in new instance.
But you need to remember to close the process after the test is finished.
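For example, a later step could clean it up by name; this is a guess at the process name, so adjust it to whatever the application actually runs as:
# Stop the Node.js process started earlier (assumes it shows up as 'node')
Stop-Process -Name node -ErrorAction SilentlyContinue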
There is the Continue on error option (Control Options section). The build process will continue if it is checked, but the build result will be "partially succeeded".
You can also output errors or warnings using PowerShell or VSTS task logging commands (uncheck the Fail on Standard Error option in the Advanced section) and terminate the current PowerShell process with the exit keyword, for example:
Write-Warning "warning"
Write-Error "error"
Write-Host "##vso[task.logissue type=warning;]this is the warning"
Write-Host "##vso[task.logissue type=error;sourcepath=consoleapp/main.cs;linenumber=1;columnnumber=1;code=100;]this is an error"
For more information about VSTS task commands, refer to: Logging Commands
I currently have a pretty simple Powershell Script that creates an IO.FileSystemWatcher object, and calls an executable upon that event being triggered.
I can run this script without issue from an administrator PowerShell session on my Windows Server 2012 machine; however, it runs into issues when the script is run from Task Scheduler.
I've attempted running the task while logged on, and on a trigger while I'm logged off, and in both instances the task's status reads "Running" when I check. However, interacting with the folder that should be watched produces no results. I've added a log file to document which parts of the code are functioning, and the script DOES create the event subscription; it is the event firing that seems to be the issue. Has anyone heard of an issue with creating events through Task Scheduler?
I've read some forums that say it might be a domain user issue
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Lsa
Change the REG_DWORD value named 'disabledomaincreds' to 0.
Although this was already the case, and I've tried multiple variations of settings in the Task Properties as per Scripting Guy and SpiceWorks. The general consensus I've found is that it needs to be run with a -NoExit argument in order for the event to keep working when the user is not logged in; a sketch of registering such a task follows the extra notes below.
Extra notes:
The PowerShell script is located on a network share rather than physically on the computer (\\serverName\FTP\Folder\script.ps1).
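For concreteness, a hedged sketch of registering such a task from PowerShell; the task name and account are placeholders, and the -NoExit argument reflects the consensus mentioned above:
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -NoExit -File "\\serverName\FTP\Folder\script.ps1"'
$trigger = New-ScheduledTaskTrigger -AtStartup
# 'FolderWatcher' and 'DOMAIN\user' are placeholders; supplying credentials lets the task run when not logged on.
Register-ScheduledTask -TaskName 'FolderWatcher' -Action $action -Trigger $trigger -User 'DOMAIN\user' -Password 'PlaceholderPassword'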
I came across the same problem. I don't know why this works, but in your scheduled task, when referring to the PowerShell script, instead of using
\\serverName\FTP\Folder\script.ps1
use
. \\serverName\FTP\Folder\script.ps1
(note the leading dot, which dot-sources the script).
As I understand it, as a PowerShell novice, the events you register with FileSystemWatcher will only fire while the PowerShell instance is still running. I wouldn't trust Task Scheduler saying the task is running, since it is notoriously unreliable, which seems to be the Microsoft standard. I think once your script finishes executing, the PowerShell instance exits and all event listeners are garbage collected.
I just put my script to sleep forever and it works. At the end of my script, it has
while ($true) {sleep 1}
It probably wouldn't hurt to increase the sleep time, but this works.
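Putting both pieces together, here is a minimal sketch of the watcher plus keep-alive loop; the folder, log path, and handler.exe are placeholders, not values from the question:
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'C:\Watched'              # placeholder folder to monitor
$watcher.EnableRaisingEvents = $true

Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    # Log the new file and hand it to the executable.
    Add-Content -Path 'C:\Watched\watch.log' -Value "Created: $($Event.SourceEventArgs.FullPath)"
    Start-Process 'C:\Tools\handler.exe' -ArgumentList $Event.SourceEventArgs.FullPath
} | Out-Null

# Keep this PowerShell instance alive so the subscriber keeps firing under Task Scheduler.
while ($true) { Start-Sleep -Seconds 1 }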
I'm running two Perl scripts in parallel in Jenkins
some shell commands
perl script 1 &
perl script 2 &
wait
some more shell commands
If one of the Perl scripts fails in the middle of execution, the job waits until the other script finishes (as they are executed in parallel in the background).
I want the job to stop as soon as one of the scripts fails and not waste time completing the execution of the other script.
Please help.
You set up a signal handler for SIGCHLD, which is a signal that is always delivered to the parent process when a child exits. I'm not aware of a mechanism to see which child process exited, but you can save the subprocess process identifiers and just kill both of them when you receive SIGCHLD:
some shell commands
perl script 1 &                  # start the first script in the background
pid1=$!                          # remember its process id
perl script 2 &                  # start the second script in the background
pid2=$!
trap "kill $pid1 $pid2" CHLD     # when either child exits, kill both
wait
some more shell commands
The script above has the downside that it will kill the other script regardless of the exit status of the subprocess that finished. If you want, you could add a check for the exit status in the trap: the subprocess could, for example, create a temp file if it succeeds, and the trap could check whether that file exists.
Typically with Jenkins you would have the parallel steps running as separate jobs (or projects as they are sometimes known) rather than steps in a job. This would then allow the steps to run in parallel across different slave machines and it would keep the output for the jobs in a separate place.
You would then have a controlling job running the other parts.
I like the Multijob plugin for this sort of thing.
There are alternatives which may suit better, such as the Build Flow Plugin, which uses a DSL to describe the jobs you want to run.