Start-Job for each computer in my CSV - PowerShell

I'm looking for a way to start a new job for each computer in my CSV, so I can launch parallel jobs on each of them.
I already have this:
$job1 = Start-Job -Scriptblock {
$Csv = "C:\springfield\Citrix\CitrixComposants.csv"
$myservers = Import-Csv $Csv
Import-module C:\springfield\Citrix\CitrixDeploymentActivlanModule.ps1
Deploy-Citrix -servers $myservers[0].server -component $myservers[0].component
}
How can I get $job[n] for as many servers as there are in my CSV?
I was thinking about something like this, but maybe there is a better way to achieve it.
My idea is to create dynamic $job variables ($job1, $job2, $job3, ... $job[n]):
$Csv = "C:\springfield\Citrix\CitrixComposants.csv"
for ($i=0;$i -lt $csv.count; $i++)
{
$job+"$i" = Start-Job -Scriptblock {
$Csv = "C:\springfield\Citrix\CitrixComposants.csv"
$myservers = Import-Csv $Csv
Import-module C:\springfield\Citrix\CitrixDeploymentActivlanModule.ps1
Deploy-Citrix -servers $myservers[$i].server -component $myservers[$i].component
}
}
I would also like it to show me when a job completes; how can I get that?
I'm using PowerShell v4.
Thanks for your help.

Well, currently you are running one job for all the servers. Or you would be, if you weren't specifying $MyServers[0].
If any results are returned, you can see them by using Receive-Job. The results should contain the output returned from each server, assuming you removed the [0] index and that the Deploy-Citrix cmdlet accepts an array. If you just want to check the status, use Get-Job.
If you instead wanted to start a job for each server, it should look more like this:
$Csv = "C:\springfield\Citrix\CitrixComposants.csv"
$myservers = Import-Csv $Csv
Foreach ($Server in $MyServers)
{
$SrvName = $Server.Server
$Component = $Server.Component
Start-Job -ArgumentList $SrvName,$Component -Scriptblock {
Param ($SrvName,$Component)
Import-module C:\springfield\Citrix\CitrixDeploymentActivlanModule.ps1
Deploy-Citrix -servers $SrvName -component $Component
}
}
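To answer the second part of the question (showing when each job completes), the loop above can be extended to collect the jobs and report each one as it finishes. This is only a sketch, not part of the original answer; Deploy-Citrix and the module path are taken from the question, and Wait-Job -Any returns the first job to reach a finished state:

```powershell
$Csv = "C:\springfield\Citrix\CitrixComposants.csv"
$MyServers = Import-Csv $Csv
$Jobs = @()
Foreach ($Server in $MyServers)
{
    $Jobs += Start-Job -ArgumentList $Server.Server,$Server.Component -ScriptBlock {
        Param ($SrvName,$Component)
        Import-Module C:\springfield\Citrix\CitrixDeploymentActivlanModule.ps1
        Deploy-Citrix -servers $SrvName -component $Component
    }
}
# Report each job as it finishes. Wait-Job -Any blocks until the first
# remaining job completes, then we drop it from the list and repeat.
While ($Jobs)
{
    $Done = Wait-Job -Job $Jobs -Any
    Write-Host "Job $($Done.Id) finished with state $($Done.State)"
    Receive-Job -Job $Done
    $Jobs = @($Jobs | Where-Object {$_.Id -ne $Done.Id})
    Remove-Job -Job $Done
}
```

This works on PowerShell v4, which the question specifies.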

Related

Optimizing Powershell Script to Query Remote Server OS Version

I want to optimize a simple task: pulling server OS versions into a neat table. However, some servers in our environment have PowerShell disabled. Below you will find my script, which works! However, it takes about 20 seconds per server, since it waits for each server to return the results of the invoked command before moving on to the next server in the list. I know there's a way to asynchronously pull the results of a PS command, but is this possible when I need to fall back to cmd-line syntax for the servers that can't handle PS, as shown in the catch statement?
$referencefile = "ps_servers_to_query.csv"
$export_location = "ps_server_os_export.csv"
$Array = @()
$servers = get-content $referencefile
foreach ($server in $servers){
#attempt to query the server with Powershell.
try{
$os_version = invoke-command -ComputerName $server -ScriptBlock {Get-ComputerInfo -Property WindowsProductName} -ErrorAction stop
$os_version = $os_version.WindowsProductName
} # If the server doesn't have PS installed (or it's disabled), we fall back to CMD prompt; this takes longer, and we need to convert a string into an object.
catch {
$os_version = invoke-command -ComputerName $server -ScriptBlock {systeminfo | find "OS Name:"} # returns a string like "OS Name: Microsoft Windows ..."
$os_version = $os_version.replace('OS Name: ', '') # Remove the leading text
$os_version = $os_version.replace(' ','') # Remove leading spaces
$os_version = $os_version.replace('Microsoft ','') # Removes Microsoft for data standardization
}
# Output each iteration of the loop into an array
$Row = "" | Select ServerName, OSVersion
$Row.ServerName = $Server
$Row.OSVersion = $os_version
$Array += $Row
}
# Export results to csv.
$Array | Export-Csv -Path $export_location -Force
Edit: Here's what I'd like to accomplish: send the command out to all the servers (fewer than 30) at once and have them all process it at the same time, rather than doing it one by one. I know I could do this if they all accepted PowerShell commands, but since they can't, I'm struggling. This script takes about 6 minutes to run in total.
Thank you in advance!
If I got it right, something like this should be all you need:
$referencefile = "ps_servers_to_query.csv"
$export_location = "ps_server_os_export.csv"
$ComputerName = Get-Content -Path $referencefile
$Result = Get-CimInstance -ClassName CIM_OperatingSystem -ComputerName $ComputerName |
    Select-Object -Property Caption,PSComputerName
$Result | Export-Csv -Path $export_location -NoTypeInformation
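If some of the servers reject WS-Man (the question mentions PowerShell being disabled on a few), one possible fallback is to retry the failures over DCOM with a CIM session. This is only a sketch, not part of the answer above, and it assumes the failing computer name can be recovered from each error record's OriginInfo:

```powershell
$referencefile = "ps_servers_to_query.csv"
$ComputerName = Get-Content -Path $referencefile

# First pass over WS-Man; collect errors instead of stopping.
$Result = Get-CimInstance -ClassName CIM_OperatingSystem -ComputerName $ComputerName `
    -ErrorAction SilentlyContinue -ErrorVariable CimErrors

# Retry the failed machines over DCOM (assumption: the computer name is
# available on each error record's OriginInfo).
$Failed = $CimErrors | ForEach-Object { $_.OriginInfo.PSComputerName } | Where-Object { $_ }
if ($Failed) {
    $Dcom = New-CimSessionOption -Protocol Dcom
    $Sessions = New-CimSession -ComputerName $Failed -SessionOption $Dcom
    $Result += Get-CimInstance -ClassName CIM_OperatingSystem -CimSession $Sessions
    $Sessions | Remove-CimSession
}
$Result | Select-Object -Property Caption,PSComputerName
```

DCOM is the legacy WMI transport, so this can reach machines where WinRM/PowerShell remoting is unavailable, as long as classic WMI access is still open.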

Get-WindowsUpdateLog stream re-direction

Has anyone noticed that the Get-WindowsUpdateLog cmdlet's output cannot be redirected to any stream?
Furthermore, storing the output in a variable, piping it, or any other type of redirection still just executes the cmdlet with its console output intact.
Any help redirecting/silencing the output of this command would be appreciated.
What I've tried:
Get-WindowsUpdateLog | Out-Null
Get-WindowsUpdateLog > $null
$sink = Get-WindowsUpdateLog
Everything I could find failed to suppress the output of the Get-WindowsUpdateLog cmdlet. As you say, the information displayed in the console does not properly follow the output streams as we know them in PowerShell.
The only workaround I found is using Jobs:
$Job = Start-Job -ScriptBlock {Get-WindowsUpdateLog}
$Job | Wait-Job | Remove-Job
This way all output is handled within the job and we don't retrieve the result. It's also unnecessary to retrieve it, as the result is simply a text file written to the location given by the -LogPath parameter.
As an addendum to DarkLite's answer, we can also use the following code to check whether the Get-WindowsUpdateLog cmdlet worked properly:
# Generate a unique filename.
$o_filename = "C:\temp\" + "WindowsUpdateLog" + "_" + ( Get-Date -UFormat "%H_%M_%S" ) + ".txt"
$o_job = Start-Job -ScriptBlock { Get-WindowsUpdateLog -LogPath "$( $args[0] )" } `
-ArgumentList $o_filename -ErrorAction SilentlyContinue
Start-Sleep -Seconds 5
$o_job | Remove-Job -ErrorAction SilentlyContinue
if( ! ( Test-Path $o_filename ) ) {
# Return an exception
}
else {
# Do some work on the generated log file
}
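As a small variation on the code above, the fixed Start-Sleep can be replaced with Wait-Job -Timeout, so the check runs as soon as the job finishes (or gives up after a deadline). A sketch, reusing the same file-naming scheme:

```powershell
# Generate a unique filename (same scheme as above).
$o_filename = "C:\temp\WindowsUpdateLog_$( Get-Date -UFormat "%H_%M_%S" ).txt"
$o_job = Start-Job -ScriptBlock { Get-WindowsUpdateLog -LogPath $args[0] } `
    -ArgumentList $o_filename

# Wait up to 60 seconds for the job instead of sleeping a fixed 5 seconds.
$o_job | Wait-Job -Timeout 60 | Out-Null
$o_job | Remove-Job -Force -ErrorAction SilentlyContinue

if ( Test-Path $o_filename ) {
    # Do some work on the generated log file
}
```

Wait-Job returns as soon as the job completes, so fast runs aren't penalized by a fixed sleep and slow runs aren't cut off at 5 seconds.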
I think the lesson here is: don't use Out-Default in a script. https://keithga.wordpress.com/2018/04/03/out-default-considered-harmful/ I don't see that call in PowerShell 6, but it's back in PowerShell 7!

Powershell Get-Content with Wait flag and IOErrors

I have a PowerShell script that spawns x number of other PowerShell scripts in a Fire-And-Forget way.
In order to keep track of the progress of all the scripts that I just start, I create a temp file, where I have all of them write log messages in json format to report progress.
In the parent script I then monitor that log file using Get-Content -Wait. Whenever I receive a line in the log file, I parse the json and update an array of objects that I then display using Format-Table. That way I can see how far the different scripts are in their process and if they fail at a specific step. That works well... almost.
I keep running into IOErrors because so many scripts are accessing the log file, and when that happens the script just aborts and I lose all information on what is going on.
I would be able to live with the spawned scripts running into an IOError because they just continue and then I just catch the next message. I can live with some messages getting lost as this is not an audit log, but just a progress log.
But when the script that tails the log crashes then I lose insight.
I have tried to wrap this in a Try/Catch but that doesn't help. I have tried setting -ErrorAction Stop inside the Try/Catch but that still doesn't catch the error.
My script that reads looks like this:
function WatchLogFile($statusFile)
{
Write-Host "Tailing statusfile: $($statusFile)"
Write-Host "Press CTRL-C to end."
Write-Host ""
Try {
Get-Content $statusFile -Force -Wait |
ForEach {
$logMsg = $_ | ConvertFrom-JSON
#Update status on step for specific service
$svc = $services | Where-Object {$_.Service -eq $logMsg.Service}
$svc.psobject.properties[$logMsg.step].value = $logMsg.status
Clear-Host
$services | Format-Table -Property Service,Old,New,CleanRepo,NuGet,Analyzers,CleanImports,Build,Invoke,Done,LastFailure
} -ErrorAction Stop
} Catch {
WatchLogFile $statusFile
}
}
And updates are written like this in the spawned scripts
Add-Content $statusFile $jsonLogMessage
Is there an easy way to add retries or how can I make sure my script survives file locks?
As @ChiliYago pointed out, I should use jobs. So that is what I have done now. I had to figure out how to get the output as it arrived from the many scripts.
So I added all my jobs to an array and monitored them like this. Beware that you can receive multiple lines if your script has produced multiple outputs since you last invoked Receive-Job. Be sure to use Write-Output from the scripts you execute as jobs.
$jobs = @()
foreach ($script in $scripts)
{
$sb = [scriptblock]::create("$script $(&{$args} @jobArgs)")
$jobs += Start-Job -ScriptBlock $sb
}
# Initialize the counter first, otherwise $null -gt 0 is false and the loop never runs.
$hasRunningJobs = $jobs.Count
while ($hasRunningJobs -gt 0)
{
$runningJobs = $jobs | Where-Object {$_.State -eq "Running"} | Measure-Object
$hasRunningJobs = $runningJobs.Count
foreach ($job in $jobs)
{
$outvar = Receive-Job -Job $job
if ($outvar)
{
$outvar -split "`n" | %{ UpdateStatusTable $_}
}
}
}
Write-Host "All scripts done."
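For the original file-locking problem itself, another option (a sketch, not something proposed in the thread) is to serialize the writers with a named system-wide mutex, so concurrent Add-Content calls never collide. Each spawned script would wrap its log write like this; 'Global\StatusLogMutex' is an arbitrary name chosen for illustration:

```powershell
# Writer side: acquire a machine-wide named mutex before appending,
# so only one process touches the status file at a time.
$mutex = New-Object System.Threading.Mutex($false, 'Global\StatusLogMutex')
try {
    [void]$mutex.WaitOne()
    Add-Content -Path $statusFile -Value $jsonLogMessage
}
finally {
    $mutex.ReleaseMutex()
    $mutex.Dispose()
}
```

Because the mutex name is shared across processes, this protects the file even though the writers are independent PowerShell instances; the reader tailing with Get-Content -Wait is far less likely to hit an IOError when writes are serialized.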

Get-Content and foreach in two files

I have two files. The first contains hostnames (Computers.txt) and the second contains SIDs (SID.txt). I want to use Get-Content and foreach to execute a command on each computer with the corresponding SID to modify the registry.
Let's take for example PC 1 (first line of Computers.txt with first line of SID.txt) and PC 2 (second line of Computers.txt with second line of SID.txt).
$Computer = Get-Content D:\Downloads\computers.txt
$SID = Get-Content D:\Downloads\SID.txt
foreach ($pc in $Computer)
{
Invoke-Command -ComputerName $pc {New-Item HKEY_USERS:\$SID -Name -Vaue}
}
A foreach loop doesn't give you the current line number, so it's impossible to get the matching line from the SIDs list. You should use a while or for loop to maintain an index that increments by one on each pass, so you know the "current line".
There's no HKEY_USERS: PSDrive. You need to access it through the Registry provider, like Registry::HKEY_USERS\.
Variables in your local scope (e.g. $currentsid) aren't accessible inside the Invoke-Command script block, since it's executed on the remote computer. You can pass them in using -ArgumentList $yourlocalvariable and access them with $args[0] (or put param ($sid) at the beginning of the script block). With PS 3.0+ this is much simpler, as you can use the using scope ($using:currentsid) in your script.
Example:
$Computers = Get-Content D:\Downloads\computers.txt
$SIDs = Get-Content D:\Downloads\SID.txt
#Runs once for each value in Computers and sets $i to the current index (line number - 1, since arrays start at index 0)
for($i=0; $i -lt $Computers.Length; $i++) {
#Get computer on line i
$currentpc = $Computers[$i]
#Get sid on line i
$currentsid = $SIDs[$i]
#Invoke remote command and pass in currentsid
Invoke-Command -ComputerName $currentpc -ScriptBlock { param($sid) New-Item "REGISTRY::HKEY_USERS\$sid" -Name "SomeKeyName" } -ArgumentList $currentsid
#PS3.0+ with using-scope:
#Invoke-Command -ComputerName $currentpc -ScriptBlock { New-Item "REGISTRY::HKEY_USERS\$using:currentsid" -Name "SomeKeyName" }
}
One-liner:
0..($Computers.Length-1) | ForEach-Object { Invoke-Command -ComputerName $Computers[$_] -ScriptBlock { param($sid) New-Item REGISTRY::HKEY_USERS\$sid -Name "SomeKeyName" } -ArgumentList $SIDs[$_] }
On a side note: using two files with matching line numbers is a bad idea. What if Computers has more lines than SIDs? You should use a CSV file that maps computer to SID, e.g.:
input.csv:
Computer,SID
PC1,S-1-5-21-123123-123213
PC2,S-1-5-21-123123-123214
PC3,S-1-5-21-123123-123215
This is safer, easier to maintain and you can use it like this:
Import-Csv input.csv | ForEach-Object {
Invoke-Command -ComputerName $_.Computer -ScriptBlock { param($sid) New-Item REGISTRY::HKEY_USERS\$sid -Name "SomeKeyName" } -ArgumentList $_.SID
}

PowerShell: Use Get-Content from multiple text files to send a message

There was very little on the topic of using multiple text files with PowerShell; I only found material that takes one list and runs it against a primary list. Anyway...
My question comes from a need to combine two sets of data, equal in the number of rows:
Server.txt & SessionID.txt. Both files are created from another Get-XASession query.
I wanted to combine these in a Send-XAMessage.
Servers.txt = "Server1","Server2","Server3",etc.
SessionIds.txt = "2","41","18",etc.
Here's the code I've tried, unsuccessfully...
BTW, "ServerX" is a static connection server required for XA remote computing.
$Server = Get-Content .\Server.txt
$SessionIds = Get-Content .\SessionIds.txt
ForEach ($s in $Servers -And $i in $SessionIds) {
Send-XASession -ComputerName ServerX -ServerName $s -SessionId $i -MessageTitle "MsgTitle" -MessageBody "MsgBody" }
For normal usability, we can switch Send-XASession for Get-Service and use $s for -ComputerName,
and switch -SessionId for -ServiceName.
That would look something like this...
ForEach ($s in $Servers -And $i in $Services) { Get-Service -ComputerName $s -Name $i } | FT Name,Status
The only thing that matters is that each line of both text files is run through simultaneously. No duplicates. Matching line 1 of Servers.txt to line 1 of SessionIds.txt and using both in each command.
Any help would be greatly appreciated.
You can do something like this:
$Servers = Get-Content .\Server.txt
$SessionIds = Get-Content .\SessionIds.txt
$i=0
ForEach ($s in $Servers)
{
Send-XASession -ComputerName ServerX -ServerName $s -SessionId $SessionIds[$i++] -MessageTitle "MsgTitle" -MessageBody "MsgBody"
}
That will cycle through the $SessionIds elements in sync with the $Servers elements. The post-increment operator in $SessionIds[$i++] increments $i each time through the loop.
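The same pairing can also be written with an explicit for loop, plus a guard in case the two files ever end up with different line counts. A sketch using the cmdlet names from the question:

```powershell
$Servers = Get-Content .\Server.txt
$SessionIds = Get-Content .\SessionIds.txt

# Guard: refuse to run if the two files don't line up one-to-one.
if ($Servers.Count -ne $SessionIds.Count) {
    throw "Server.txt and SessionIds.txt must have the same number of lines."
}

for ($i = 0; $i -lt $Servers.Count; $i++) {
    Send-XASession -ComputerName ServerX -ServerName $Servers[$i] -SessionId $SessionIds[$i] `
        -MessageTitle "MsgTitle" -MessageBody "MsgBody"
}
```

The index makes the line 1 / line 1 pairing explicit, and the count check catches the mismatched-files case that makes parallel text files fragile in the first place.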