Ok, so I have a problem: I am trying to deploy and run .exe files remotely. I have a Ninite installer for just 7-Zip that I want to install, and so far I have gotten close enough that it even executes the file, but the installation never finishes. I can see the process in Task Manager, but it never makes any progress, nor does it use any CPU, only a small amount of memory.
The code looks like this:
Copy-Item -Path C:\windows\temp\installer.exe -Destination 'C:\Users\Administrator\Desktop\installer' -ToSession $session
Start-Sleep -Seconds 2
Invoke-Command -ComputerName MyPc -ScriptBlock { $(& cmd /c 'c:\users\administrator\desktop\installer\installer.exe' -silent -wait -passthru) | Select-Object ExitCode }
$session | Remove-PSSession
This runs, but it never stops. The script just keeps running without ever finishing. If I look in Task Manager, I can clearly see the installer processes sitting there, but they never make any progress. I have now let it sit for about 15 minutes (the installation takes 2-3 minutes at most) and still nothing.
Any ideas as to what could cause this, or at least how to fix it so it finishes? This is the last thing I need to get done before my project is finished, and I am so tired of it already.
NOTE: I have found out that if I pass ANY kind of parameters (-Wait, -PassThru, etc.) it becomes stuck forever with 0% CPU usage; even an -ArgumentList with zero arguments does the same. But if I have nothing but the command in the script block, it goes through, says "done, master", and turns out to be the biggest liar, because the software has not been installed; the installer just closed instead.
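For comparison, the pattern that usually works for a silent install is Start-Process with -Wait and -PassThru, so PowerShell blocks until the installer exits and you get the exit code back. This is only a sketch: pwsh stands in for the real installer .exe so the exit-code plumbing is visible end to end.

```powershell
# -Wait and -PassThru are Start-Process parameters, not installer switches.
# 'pwsh' stands in here for the real installer executable.
$p = Start-Process -FilePath 'pwsh' -ArgumentList '-NoProfile', '-Command', 'exit 3' -Wait -PassThru
"installer exited with code $($p.ExitCode)"   # 3 for this stand-in
```

Over remoting, the same call would go inside the script block, e.g. Invoke-Command -Session $s { Start-Process -FilePath 'c:\users\administrator\desktop\installer\installer.exe' -ArgumentList '-silent' -Wait -PassThru | Select-Object ExitCode }. Note that -Wait and -PassThru belong to Start-Process rather than to the installer, so feeding them through cmd /c as if they were installer switches is one plausible reason the remote call hangs.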
Related
I have been picking at this issue for a few weeks now with no resolution found anywhere on the web. When I run a Dell BIOS update with /s and /f during a new Windows 10 21H1 build task sequence, the update runs successfully, with the BIOS update log showing error 2: a reboot is needed to perform the BIOS update. So as the next step in the task sequence I perform a reboot, but the BIOS never does the update; it just boots into Windows. I have tried this from the command line, from PowerShell, and as an application with the reboot box checked. Every way I run it, the log says ready to update on reboot, but it never does. I can get the update to work if I manually perform the reboot with the mouse before MDT reboots it; that performs the update at reboot as it should. However, this of course creates a dirty environment, and MDT is grumpy.
This happens on all the different Dell builds I try that are only one or two steps newer. We currently use PDQ to run the updates, and when I call the install from there, it too works fine. We want to move away from PDQ to a free solution, such as straight from MDT. I have found many different ways people have performed this via task sequence, with no mention of this hiccup. What I seem to be running into is that MDT removes whatever the BIOS puts into the boot sequence, so the update never gets performed. I've tried different credentials, Dell's Flash64W.exe, and too much else to list. Everything seems to work until the reboot. I'm stumped.
Sample of simple working PowerShell:
# Get model of system to be updated
$Model = (gwmi Win32_ComputerSystem).Model
Write-Host "Model Found: $($Model)"
# Get root folder where BIOS for model is stored
$BIOSRoot = "Z:\Applications\BIOSUpdates\Dell\$Model"
Write-Host "BIOSRoot: $($BIOSRoot)"
# Get path with BIOS executable and list of arguments
$BIOSFile = Get-Childitem -Path "$BIOSRoot" -Include *.exe -Recurse -ErrorAction SilentlyContinue
#Write-Host "BIOSFile: $($BIOSFile)"
$BIOSArgs = @('/s', '/f')   # note: @( ), not #( ); also avoid $ARGS, which shadows the automatic $args variable
Write-Host "BIOSFile and arguments: $($BIOSFile) $($BIOSArgs)"
# Start BIOS update with completed path
Start-Process "$BIOSFile" -ArgumentList $BIOSArgs -Wait
Is anyone else having this show stopping issue?
So after much testing, the answer is to run the Start-Process at the end of my code twice. Why? I have no idea. I accidentally had it looping and it worked on the second loop. It just needed to run twice. I thought maybe it just needed more time. I put a sleep at end of the code, but time is not what it wanted. Very bizarre.
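If running it twice is what works, a slightly more deterministic version is a short retry loop that re-invokes the flash utility until it reports success. This is a sketch only: pwsh stands in for the Dell updater, and the success code 0 is a placeholder (check Dell's documented exit codes, which include distinct "reboot required" values).

```powershell
# Retry the (stand-in) updater up to 2 times, stopping on exit code 0.
$maxTries = 2
for ($i = 1; $i -le $maxTries; $i++) {
    $p = Start-Process -FilePath 'pwsh' -ArgumentList '-NoProfile', '-Command', 'exit 0' -Wait -PassThru
    if ($p.ExitCode -eq 0) { break }   # success (or a vendor "reboot pending" code)
}
"attempts: $i, last exit code: $($p.ExitCode)"
```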
I am running multiple PowerShell scripts at once. I would like to be able to wait for certain ones to finish before launching new scripts. Basically, I was thinking that if I could find the command line the script was run with, something like "powershell.exe -Path "<script dir>"", that would do it.
I tried doing a Get-Process | gm to find any properties I could use to get that information and didn't see any (which doesn't mean they aren't there). I also looked through Task Manager to see if the GUI exposed something I could key off, but that didn't help either.
I hope I can get something like
Start-Process -FilePath ".\<script>.ps1" -ArgumentList "<args>"
do
{
    sleep 10
}
until (-not (Get-Process -ProcessName "PowerShell" | Where-Object "<parameter>" -EQ ".\<script>"))
I need to wait until that process is done, but I don't want to put -Wait on the Start-Process, because after it kicks off I need some other things to run while my script is going. I just need it to wait before another section of the script kicks off.
Have a look at the "Job" cmdlets: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_jobs?view=powershell-6
And the $PID automatic variable, this will give the process ID of the current PowerShell session.
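A minimal sketch of that job-based approach (the script block here is a stand-in for the real script; you could pass -FilePath with a real path instead): start the long-running work as a background job, carry on with other work, then block with Receive-Job -Wait only when you actually need it finished.

```powershell
# Start the long-running work as a background job.
$job = Start-Job -ScriptBlock {
    Start-Sleep -Seconds 2   # stand-in for the long-running script
    "finished"
}

# ... do other work here while the job runs ...

# Now block until the job completes and collect its output.
$result = Receive-Job -Job $job -Wait -AutoRemoveJob
"Job returned: $result"
```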
This question already has answers here:
How to tell PowerShell to wait for each command to end before starting the next?
When I run an installation from Inno Setup with:
Installer.exe /VERYSILENT
The command immediately returns even though the install takes about 10 minutes. So, if I run:
Installer.exe /VERYSILENT
DoNextThing.exe
DoNextThing.exe runs while the installer.exe is still installing.
I would like to run some configuration after the install succeeds. Right now, in PowerShell, I do the following:
$h = Start-job -name Installer -ScriptBlock {."Installer.exe" /VERYSILENT}
$h # the ps job control commands show this job as complete very quickly
sleep 10
$x = Get-Process -ProcessName Installer
while ($x -and ! $x.HasExited)
{
write-output "waiting ..."
sleep 10
}
# Do some configuration
Although this seems to work, I think I must be missing a better way to do this. I do not want to make the configuration part of the installer, as it is just for the Jenkins test environment.
Any ideas why the PowerShell job management does not work for this? Am I using PowerShell incorrectly, or does the Installer.exe generated by Inno Setup not play well with PowerShell? (Should I be using cmd.exe instead of PowerShell?)
You might have to just add a command to the Run section in Inno Setup that creates a file "IamFinishedInstalling.txt" as the last thing it does.
Your powershell can then block on that file rather than try to figure out process or job statuses.
while (! (Test-Path "IamFinishedInstalling.txt")) { sleep 10 }
If the installer.exe is really returning before the install is finished, this may be the simplest thing you can try.
Why use a job at all? Just run the installer using the installer command. When the executable completes, PowerShell will continue on to the next line of the script.
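A sketch of that direct-call approach, with pwsh standing in for the installer (the real call would be & .\Installer.exe /VERYSILENT): a console executable invoked with & blocks until it exits, and $LASTEXITCODE then holds its exit code. One caveat worth hedging: this only holds for console-subsystem executables. GUI executables, which many installers are, return to the prompt immediately, which matches the behavior described in the question.

```powershell
# '&' runs a console executable and waits for it; $LASTEXITCODE holds its exit code.
# 'pwsh' is a stand-in for the installer here.
& pwsh -NoProfile -Command 'exit 0'
"installer exit code: $LASTEXITCODE"   # 0

# For a GUI installer that detaches immediately, a common way to force a wait is:
#   Start-Process -FilePath '.\Installer.exe' -ArgumentList '/VERYSILENT' -Wait
#   # or: & .\Installer.exe /VERYSILENT | Out-Null
```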
I have a PowerShell script that tails specific logs. If there is no update to the logs within a specific time span, an alert is sent to Nagios (since that indicates the service is no longer running).
The PowerShell script works great when run manually, but my issue is that I want it to load on reboot. I've tried creating a scheduled task that repeats every 5 minutes using the arguments '-noexit -file C:\script.ps1'. The problem is that my script doesn't actually work when run as a scheduled task.
The execution policy is set to Unrestricted, so the script runs, but the code doesn't execute and behave like it does when run manually.
FWIW, the code is:
function Write-EventlogCustom($msg) {
Write-EventLog System -source System -eventid 12345 -message $msg
}
Get-Content -Path C:\test.log -Wait | % {Write-EventlogCustom $_}
So if I update test.log while the PowerShell script runs as a scheduled task, the event log doesn't get updated. However, when I run the script manually and update test.log, the entry does appear in Event Viewer.
I'm hoping that a second set of eyes might find something that I may have missed?
As @Tim Ferrill mentioned, I needed to run the task with Task Scheduler's 'Run with highest privileges' setting. This resolved the issue.
I am copying files from one Windows machine to another using Copy-Item in a PowerShell script.
But I want to wait until the copy completes. PowerShell's Copy-Item is, as far as I can tell, a non-blocking call: it just triggers the copy and returns to the script, whereas I want to wait until the copy completes.
Is there any way to do it?
Maybe "Copy-Item [...] | Out-Null" will help to wait for completion.
"Out-null actually deletes output instead of routing it to the PowerShell console, so the script will sit and wait for the content to be deleted before moving on."
The explanation comes from ITProToday.com:
https://www.itprotoday.com/powershell/forcing-powershell-wait-process-complete
Copy-Item does block. I just copied a 4.2 GB file from a share on our gigabit network to my local machine, and my PowerShell prompt hung and didn't return for several minutes.
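One quick way to convince yourself that Copy-Item blocks: copy a file and check the destination on the very next line. If Copy-Item were fire-and-forget, the size check could run before the copy finished; in practice it matches immediately. (The file names below are just temporary files created for the demonstration.)

```powershell
# Create a 1 MB source file in the temp directory, copy it, and verify immediately.
$src = Join-Path ([IO.Path]::GetTempPath()) 'copyitem-src.bin'
$dst = Join-Path ([IO.Path]::GetTempPath()) 'copyitem-dst.bin'
[IO.File]::WriteAllBytes($src, [byte[]]::new(1MB))

Copy-Item -Path $src -Destination $dst

# This runs on the very next statement, and the sizes already match:
(Get-Item $dst).Length -eq (Get-Item $src).Length   # True
```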
It seems to be non-blocking here for very small files. I'm using:
Start-Sleep -Seconds 3
to wait for the files to be copied. Not an ideal solution, but it's what I've got so far.