Determine when a verysilent install is complete [duplicate] - powershell

This question already has answers here:
How to tell PowerShell to wait for each command to end before starting the next? (10 answers)
When I run an installation from Inno Setup with:
Installer.exe /VERYSILENT
The command immediately returns even though the install takes about 10 minutes. So, if I run:
Installer.exe /VERYSILENT
DoNextThing.exe
DoNextThing.exe runs while Installer.exe is still installing.
I would like to run some configuration after the install succeeds. Right now, in PowerShell, I do the following:
$h = Start-Job -Name Installer -ScriptBlock { & "Installer.exe" /VERYSILENT }
$h # the PS job control commands show this job as complete very quickly
Start-Sleep 10
$x = Get-Process -Name Installer
while ($x -and ! $x.HasExited)
{
    Write-Output "waiting ..."
    Start-Sleep 10
}
# Do some configuration
Although this seems to work, I think I must be missing a better way to do this. I do not want to make it part of the installer as this configuration is just for the Jenkins test environment.
Any ideas why the PowerShell job management does not work here? Am I using PowerShell incorrectly, or is the Installer.exe generated by Inno Setup not working well with PowerShell? (Should I be using cmd.exe instead of PowerShell?)

You might have to just add a command to the [Run] section in Inno Setup to create a file "IamFinishedInstalling.txt" as the last thing it does.
Your PowerShell can then block on that file rather than trying to figure out process or job status.
while (! (Test-Path "IamFinishedInstalling.txt")) { sleep 10 }
If the installer.exe is really returning before the install is finished, this may be the simplest thing you can try.
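If you go that route, a slightly more defensive version of the loop might clear any stale sentinel first and give up after a timeout (the file name and the 30-minute limit here are illustrative):

# Clear any sentinel left over from a previous run
Remove-Item "IamFinishedInstalling.txt" -ErrorAction SilentlyContinue
.\Installer.exe /VERYSILENT
# Poll for the sentinel, giving up after 30 minutes
$deadline = (Get-Date).AddMinutes(30)
while (-not (Test-Path "IamFinishedInstalling.txt")) {
    if ((Get-Date) -gt $deadline) { throw "Installer never signaled completion" }
    Start-Sleep -Seconds 10
}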

Why use a job at all? Just run the installer directly. When the executable completes, PowerShell will continue on to the next line of the script.
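If the installer still returns early when invoked directly, one common way to make the wait explicit is Start-Process -Wait (a sketch):

# -Wait blocks until the launched installer process exits
Start-Process -FilePath .\Installer.exe -ArgumentList '/VERYSILENT' -Wait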

Related

MDT will not allow BIOS update upon reboot

I have been picking at this issue for a few weeks now with no resolution or solution found on the web. When I run a Dell BIOS update with /s and /f during a new Windows 10 21H1 build task sequence, the update runs successfully, with the BIOS update log showing error 2: a reboot is needed to perform the BIOS update. So in the next step of the task sequence I perform a reboot, but the BIOS never does the update; it just boots into Windows. I have tried this from the command line, from PowerShell, and as an application with the reboot box checked. Every way I run it, the log says ready to update on reboot, but the update never happens. I can get the update to work if I manually perform the reboot using the mouse before MDT reboots it. That actually performs the update at reboot as it should! However, this of course creates a dirty environment, and MDT is grumpy.
This happens on all the different Dell builds I try that are only one or two steps newer. We currently use PDQ to run the updates; when I call the install from there, it works fine. We want to move away from PDQ to a free solution, such as running straight from MDT. I have found many different ways people have performed this via task sequence, with no mention of this hiccup. What I seem to be running into is that MDT is removing whatever the BIOS update puts into the boot sequence, so it never gets performed. I've tried different credentials, Dell's Flash64W.exe, and too much to list. Things seem to work until the reboot. I'm stumped.
Sample of simple working PowerShell:
# Get model of system to be updated
$Model = (Get-WmiObject Win32_ComputerSystem).Model
Write-Host "Model Found: $($Model)"
# Get root folder where BIOS for model is stored
$BIOSRoot = "Z:\Applications\BIOSUpdates\Dell\$Model"
Write-Host "BIOSRoot: $($BIOSRoot)"
# Get path with BIOS executable and list of arguments
$BIOSFile = Get-ChildItem -Path "$BIOSRoot" -Include *.exe -Recurse -ErrorAction SilentlyContinue
#Write-Host "BIOSFile: $($BIOSFile)"
$ARGS = @('/s', '/f')
Write-Host "BIOSFile and arguments: $($BIOSFile) $($ARGS)"
# Start BIOS update with completed path
Start-Process "$BIOSFile" -ArgumentList "$ARGS" -Wait
Is anyone else having this show-stopping issue?
After much testing, the answer was to run the Start-Process at the end of my code twice. Why? I have no idea. I accidentally had it looping and it worked on the second loop; it just needed to run twice. I thought maybe it just needed more time, so I put a sleep at the end of the code, but time is not what it wanted. Very bizarre.
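In code, that workaround is just the final Start-Process line from the script above issued back-to-back:

# Per the workaround above: the second invocation is what finally stages the update
Start-Process "$BIOSFile" -ArgumentList "$ARGS" -Wait
Start-Process "$BIOSFile" -ArgumentList "$ARGS" -Wait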

Powershell remote installation of exe files but it never finishes

OK, so I have a problem where I am trying to remotely deploy and run EXE files. I have a Ninite installer of just a simple 7-Zip, which I want to install. For now I have gotten so close that it even executes the file, but it never finishes the installation. I can see it in Task Manager, but it never makes any progress, nor does it use any CPU, only a small bit of memory.
The code looks like this:
$session = New-PSSession -ComputerName MyPc   # session implied by -ToSession below
Copy-Item -Path C:\Windows\Temp\installer.exe -Destination 'C:\Users\Administrator\Desktop\installer' -ToSession $session
Start-Sleep -Seconds 2
Invoke-Command -Session $session -ScriptBlock {
    (Start-Process 'C:\Users\Administrator\Desktop\installer\installer.exe' -ArgumentList '-silent' -Wait -PassThru).ExitCode
}
$session | Remove-PSSession
This runs, but it never stops running; the script just runs on without ever finishing. If I go and look in Task Manager, I can clearly see the installer processes sitting there, but they don't do anything and never finish. I have now let it sit for about 15 minutes (the installation takes 2-3 minutes at most) and still no progress.
Any ideas as to what could cause this, or at least how to fix it and let it finish? This is the last thing I need to get done in my project, and I am so tired of this already.
NOTE: I have found out that if I have ANY kind of parameters (-Wait, -PassThru, etc.), the process becomes stuck forever with 0 CPU usage; even if I just attach -ArgumentList with zero arguments it does the same. But if I have nothing but the command in the script block, it goes through, says "done master", and turns out to be the biggest liar, because the software has not been installed; it just closed the installer instead.

wusa silent install error

I am trying to automate updating PowerShell on Windows 7 using Windows6.1-KB2506143-x64.msu, and having a heck of a time. The following code works fine in a standalone ps1 file, and it works in my main ps1 file, but when run from a module it fails with exit code -2145124341. This is in PS v2, where negative exit codes are handled wrong, so that number is perhaps useless; FWIW, I have a good 40 other installers of various types that work from this module. However, this is my first attempt at automating msu files, so maybe there is a known interaction here that I haven't discovered yet? There are thousands of lines of code between the root ps1 file where this works and the module where it doesn't, so tracking down what is triggering the error is going to be a beast without some sort of trail to follow. So, anyone have an idea where I should start?
$filePath = 'wusa.exe'
$argumentList = '"\\PX_SERVER\Rollouts\Microsoft\Windows6.1-KB2506143-x64.msu" /quiet /norestart'
$exitCode = (Start-Process -FilePath:$filePath -ArgumentList:$argumentList -Wait -ErrorAction:Stop -PassThru).ExitCode
Also, running wusa.exe leaves some detritus in the script folder, but only when it is run from the module. Is this an issue with the msu file, or just a bug in wusa? Or does it point at what is causing the issue perhaps?
I had hoped to get this update to work to enable some new features, but between not being able to automate it and the garbage being left behind, I am very close to abandoning that path and just continuing to target v2. But hopefully someone can point me in the right direction, as that is not my preferred solution at all.
A few thoughts on first reading:
The -ArgumentList parameter for Start-Process needs an ARRAY to work well:
$argumentList = @( "\\PX_SERVER\Rollouts\Microsoft\Windows6.1-KB2506143-x64.msu", "/quiet", "/norestart" )
wusa.exe takes a log parameter: /log:c:\fso\install.log. Can you add it to your script for this particular package to check what happens?
A PowerShell script trying to update PowerShell ... I'm not quite sure that is meant to work ... it's the only case in which I'd fall back on another scripting language (people, please correct me if I'm wrong ...).
Please let me know the result of the wusa.exe /log run, thanks.
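Putting both suggestions together, a sketch of the call (the log path is illustrative):

# Pass the arguments as an array and capture the MSU log for troubleshooting
$argumentList = @(
    '"\\PX_SERVER\Rollouts\Microsoft\Windows6.1-KB2506143-x64.msu"',
    '/quiet',
    '/norestart',
    '/log:c:\fso\install.log'   # illustrative log location
)
$exitCode = (Start-Process -FilePath 'wusa.exe' -ArgumentList $argumentList -Wait -ErrorAction Stop -PassThru).ExitCode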

Enqueuing MSI installs - via Powershell

I'm trying to install both the 32-bit and the 64-bit Visual C++ 2005 redistributables as part of a PowerShell script on our Win2008 instances. When I try to invoke both EXE files without a break, the second EXE (x86) doesn't execute, since the x64 one hasn't finished installing.
So I added a 5-second sleep after each invoke, and that seems to work now. However, I'm not very happy with this solution, as it looks more like a workaround than a proper way to handle the task at hand.
Is there a better way to do this - maybe enqueue the files for install - so they execute one after another?
Here are the specific lines of code:
if ($OSArchitecture -eq "64-bit")
{
    Write-Output "Found 64-bit OS. Installing both VC++ files for compat"
    Start-Process .\vcredist_x64.exe /Q
    Start-Sleep 5
    Start-Process .\vcredist_x86.exe /Q
    Start-Sleep 5
}
You must use the Start-Process -Wait parameter.
-Wait: Waits for the specified process to complete before accepting more input. This parameter suppresses the command prompt or retains the window until the process completes.
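Applied to the snippet from the question, that might look like this (same flags, with -Wait replacing the sleeps):

if ($OSArchitecture -eq "64-bit")
{
    Write-Output "Found 64-bit OS. Installing both VC++ files for compat"
    # -Wait blocks until each redistributable finishes before the next one starts
    Start-Process .\vcredist_x64.exe -ArgumentList '/Q' -Wait
    Start-Process .\vcredist_x86.exe -ArgumentList '/Q' -Wait
}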

Parallelizing powershell script execution

I have 8 PowerShell scripts. A few of them have dependencies, which means they can't be executed in parallel; they have to run one after another.
Some of the scripts have no dependencies and can be executed in parallel.
Here are the dependencies, explained in detail:
Powershell scripts 1, 2, and 3 depend on nothing else
Powershell script 4 depends on Powershell script 1
Powershell script 5 depends on Powershell scripts 1, 2, and 3
Powershell script 6 depends on Powershell scripts 3 and 4
Powershell script 7 depends on Powershell scripts 5 and 6
Powershell script 8 depends on Powershell script 5
I know that manually hard-coding the dependencies is possible. But 10 more PowerShell scripts may be added, and dependencies among them may be added too.
Has anyone achieved parallelism by resolving dependencies like this? If so, please share how to proceed.
You need to look at PowerShell 3.0 workflows. They offer the features you need for this requirement. Something like this:
workflow Install-MyApp {
    param ([string[]]$computername)
    foreach -parallel ($computer in $computername) {
        "Installing MyApp on $computer"
        # Code for invoking installer here
        # This can take as long as 30 mins and may reboot a couple of times
    }
}
workflow Install-MyApp2 {
    param ([string[]]$computername)
    foreach -parallel ($computer in $computername) {
        "Installing MyApp2 on $computer"
        # Code for invoking installer here
        # This can take as long as 30 mins!
    }
}
workflow New-SPFarm {
    param ([string[]]$computername)  # needed by the foreach in the inner Sequence
    Sequence {
        Parallel {
            Install-MyApp2 -computername "Server2","Server3"
            Install-MyApp -computername "Server1","Server4","Server5"
        }
        Sequence {
            # This activity can happen only after the set of activities in the above Parallel block is complete
            "Configuring First Server in the Farm [Server1]"
            # The following foreach should run only after the above activity is complete, which is why it is inside a Sequence
            foreach -parallel ($computer in $computername) {
                "Configuring SharePoint on $computer"
            }
        }
    }
}
How familiar with parallel programming in general are you? Have you heard of and used the concept of mutual exclusion? The concept in general is to use some kind of messaging/locking mechanism to protect a shared resource among different parallel threads.
In your case, you're making the dividing lines be the scripts themselves - which I think may make this much simpler than most of the techniques outlined in that wikipedia article. Would this simple template work for what you're looking for?
1. Define a folder in the local file system. This location will be known to all scripts (default parameter).
2. Before running any of the scripts, make sure any files in that directory are deleted.
3. For each script, as the very last step of its execution, it should write a file in the shared directory with its script name as the name of the file. So script1.ps1 would create the file script1, for example.
4. Any script that has a dependency on another script will define these dependencies in terms of the file names of the scripts. If script3 is dependent on script1 and script2, this will be defined as a dependency parameter in script3.
5. All scripts with dependencies will run a function that checks whether the files exist for the scripts they depend on. If they do, the script proceeds with its execution; otherwise it pauses until they are complete.
6. All scripts get kicked off simultaneously by a master script / batch file. All of the scripts are run as PowerShell jobs so that the OS runs them in parallel. Most of the scripts will start up, see they have dependencies, and then wait patiently for those to be resolved before continuing with the actual execution of the script body.
The good news is that this allows for flexible changes to the dependencies. Every script writes a file, making no assumption about whether someone else is waiting for it or not. Changing the dependency of a particular script is a simple one-line change or a change of input parameter.
This is definitely not a perfect solution, though. For instance, what happens if a script fails (or your script can exit via multiple code paths and you forget to write the file in one of them)? That could cause a deadlock where no dependent scripts get kicked off. The other bad thing is the busy wait of sleeping or spinning while waiting for the right files to get created; this could be corrected by implementing an event-based approach where you have the OS watch the directory for changes.
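A minimal sketch of the wait-then-signal pattern, assuming an illustrative signal directory and dependency list (this would sit at the top and bottom of, say, script3.ps1):

# Shared signal directory known to every script (illustrative default parameter)
param (
    [string]$SignalDir = 'C:\Temp\script-signals',
    [string[]]$Dependencies = @('script1', 'script2')
)
# Block until every dependency has written its completion file
foreach ($dep in $Dependencies) {
    while (-not (Test-Path (Join-Path $SignalDir $dep))) {
        Start-Sleep -Seconds 5   # busy wait, per the caveat above
    }
}
# ... actual body of this script runs here ...
# Signal our own completion as the very last step
New-Item -ItemType File -Path (Join-Path $SignalDir 'script3') -Force | Out-Null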
Hope this helps and isn't all garbage.
You'll just have to order your calls appropriately. There's nothing built-in that will handle the dependencies for you.
Run 1, 2, and 3 at the same time with Start-Job.
Wait for them to finish: Get-Job -State Running | Wait-Job.
Run 4 and 5 at the same time with Start-Job.
Wait for them to finish: Get-Job -State Running | Wait-Job.
Run 6 and wait for it.
Run 7 and 8 at the same time with Start-Job.
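Sketched with Start-Job and placeholder script paths:

# Stage 1: scripts 1-3 have no dependencies
1..3 | ForEach-Object { Start-Job -Name "script$_" -FilePath ".\script$_.ps1" }
Get-Job -State Running | Wait-Job
# Stage 2: script4 needs script1; script5 needs scripts 1, 2, and 3
4..5 | ForEach-Object { Start-Job -Name "script$_" -FilePath ".\script$_.ps1" }
Get-Job -State Running | Wait-Job
# Stage 3: script6 needs scripts 3 and 4
Start-Job -Name script6 -FilePath .\script6.ps1 | Wait-Job
# Stage 4: script7 needs 5 and 6; script8 needs 5
7..8 | ForEach-Object { Start-Job -Name "script$_" -FilePath ".\script$_.ps1" }
Get-Job -State Running | Wait-Job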