I have been struggling for a few days now with running Start-Job with a -ScriptBlock.
If I run this, I get today's date back in Receive-Job:
$sb = {Get-Date}
Start-Job -ScriptBlock $sb
Now, I have a script which does some reporting via 65+ API calls for each of 8 different statuses. In total, it makes 500+ API calls and takes almost 20 minutes to complete (2-5 seconds per API call, x 500), so I am looking to run each foreach block in parallel.
The script works well as a script but not as a Start-Job.
The script block gets a session token, pulls data from the API server in a foreach loop, and then populates the results into a $Global: variable as it goes.
When I run the job, it "Completes" instantly and HasMoreData is False (so I can't see any error messages):
67 Job67 BackgroundJob Completed False localhost foreach ($item....
Any suggestions as to where I am going wrong?
My sample code looks like this:
$sb = {
    # Get count of Status1
    foreach ($item in $Global:GetItems) {
        $item1 = $item.id
        If ($null -eq $TokenExpiresIn) { .\Get-Token.ps1 }
        $Global:TokenExpiresIn = New-TimeSpan -Start $TokenExpiresNow -End $Script:TokenExpiresTime
        $Method = "GET"
        $Fromitems = $item1
        $Status = "foo"
        $Limit = "1"
        $Fields = "blah"
        $URL = "https://$APIServer/apis/v1.1/status?customerId=1&itemId=$Fromitems&status=$Status&fields=$Fields&limit=$Limit"
        $Header = @{ "Accept" = "*/*"; "Authorization" = "Bearer $SessionToken" }
        $itemStatsResponse = Invoke-WebRequest -Method $Method -Uri $URL -Headers $Header
        $itemStatsJSON = ConvertFrom-Json -InputObject $itemStatsResponse
        $itemCount = $itemStatsJSON.metadata.count
        $Global:GetItems | Where-Object -Property id -EQ $item1 | Add-Member -MemberType NoteProperty "foo" -Value $itemCount
    }
}
I am running the Scriptblock as follows:
Start-Job -Name "foo" -ScriptBlock $sb
I am using PowerShell 7.1. The script as a whole runs successfully through all 500+ API calls, however it's a bit slow, so I am also looking at how I can "multi-thread" my API calls moving forward for better performance.
Thanks in advance for your support/input. The other posts on StackOverflow relating to PowerShell and Scriptblocks have not assisted me thus far.
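For what it's worth, a likely culprit: a background job runs in a separate child process, so the caller's $Global:GetItems does not exist there, the foreach has nothing to iterate, and the job "Completes" instantly with no output. A minimal sketch (reusing the names from the question; assumes $GetItems, $APIServer and $SessionToken exist in the calling session) that passes the data in via -ArgumentList and emits results instead of writing to globals:
$sb = {
    param($Items, $APIServer, $SessionToken)
    foreach ($item in $Items) {
        $URL = "https://$APIServer/apis/v1.1/status?customerId=1&itemId=$($item.id)&status=foo&fields=blah&limit=1"
        $Header = @{ "Accept" = "*/*"; "Authorization" = "Bearer $SessionToken" }
        $resp = Invoke-WebRequest -Method GET -Uri $URL -Headers $Header
        # Emit a result object instead of mutating a global; Receive-Job collects the output stream.
        [pscustomobject]@{ id = $item.id; foo = (ConvertFrom-Json $resp.Content).metadata.count }
    }
}
$job = Start-Job -ScriptBlock $sb -ArgumentList $GetItems, $APIServer, $SessionToken
$results = $job | Wait-Job | Receive-Job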
Related
I have a PowerShell GUI that is pulling some values from an SSRS report using a string array input. However, as this would freeze the GUI, I decided to use Start-Job to start a job that pulls the SSRS report while a ProgressBar keeps running in the GUI.
The SSRS report has only one input parameter. When I use Start-Job to Render the report using multiple values, I get the result of only the first record irrespective of the number of input values.
The same function works smoothly when called natively without Start-Job returning the records for all the input values.
This is the code:
$GetSSRSData = {
    param([string[]]$InputArray)
    $reportServerURI = "https://<SERVER>/ReportServer/ReportExecution2005.asmx?wsdl"
    $RS = New-WebServiceProxy -Class 'RS' -NameSpace 'RS' -Uri $reportServerURI -UseDefaultCredential
    $RS.Url = $reportServerURI
    $deviceInfo = "<DeviceInfo><NoHeader>True</NoHeader></DeviceInfo>"
    $extension = ""
    $mimeType = ""
    $encoding = ""
    $warnings = $null
    $streamIDs = $null
    $reportPath = "/Folder/Report"
    $Report = $RS.GetType().GetMethod("LoadReport").Invoke($RS, @($reportPath, $null))
    # Report parameters are handled by creating an array of ParameterValue objects.
    $parameters = @()
    for ($i = 0; $i -lt $InputArray.Count; $i++) {
        $parameters += New-Object RS.ParameterValue
        $parameters[$i].Name = "ParameterName"
        $parameters[$i].Value = "$($InputArray[$i])"
    }
    # Add the parameter array to the service. Note that this returns some
    # information about the report that is about to be executed.
    $RS.SetExecutionParameters($parameters, "en-us") > $null
    # Render the report to a byte array. The first argument is the report format.
    $RenderOutput = $RS.Render('CSV',
        $deviceInfo,
        [ref] $extension,
        [ref] $mimeType,
        [ref] $encoding,
        [ref] $warnings,
        [ref] $streamIDs
    )
    $output = [System.Text.Encoding]::ASCII.GetString($RenderOutput)
    return $output
}
$InputArray = @('XXXXXX', 'YYYYYY', 'ZZZZZZ', 'ABCDEF')
<#
# The below code works perfectly
$Data = GetSSRSData -InputArray $InputArray
ConvertFrom-Csv -InputObject $Data
#>
$job = Start-Job -ScriptBlock $GetSSRSData -ArgumentList $InputArray
do { [System.Windows.Forms.Application]::DoEvents() } until ($job.State -ne "Running")
$Data = Receive-Job -Job $job
Write-Host $Data # returns only the first record
When I change the bottom part as shown below, I am able to verify that the job ends after the first record is output.
$RenderOutput = $RS.Render('CSV',
    $deviceInfo,
    [ref] $extension,
    [ref] $mimeType,
    [ref] $encoding,
    [ref] $warnings,
    [ref] $streamIDs
)
Write-Output $RenderOutput
}
$InputArray = @('XXXXXX', 'YYYYYY', 'ZZZZZZ', 'ABCDEF')
$job = Start-Job -ScriptBlock $GetSSRSData -ArgumentList $InputArray
do {[System.Text.Encoding]::ASCII.GetString($job.ChildJobs[0].Output)} until ($job.State -ne "Running")
I also tried adding a sleep function of 5 seconds after the Render function, but it did not make any difference.
Please note that calling Start-Job repeatedly for each input is not an option, as each function call costs a lot of time; hence the report needs to be pulled in a single call.
Why is the Render function behaving differently when started as a job? Is the function ending prematurely before it can render the other records as well?
Is there any other way such as a Runspace or a Start-ThreadJob that can solve this problem?
Ref: https://stackoverflow.com/a/63253699/4137016
Answer is here: ArgumentList parameter in Invoke-Command don't send all array
Probably better answer here: How do I pass an array as a parameter to another script?
Change -ArgumentList $InputArray to -ArgumentList (,$InputArray)
$InputArray = @('XXXXXX', 'YYYYYY', 'ZZZZZZ', 'ABCDEF')
$job = Start-Job -ScriptBlock $GetSSRSData -ArgumentList (,$InputArray)
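A quick sketch (hypothetical parameter name $a) of why the unary comma matters: Start-Job unrolls an array given to -ArgumentList across the script block's parameters, so only the first element reaches the first parameter and the rest spill into $args:
$sb = { param([string[]]$a) "Bound $($a.Count) value(s)" }
# Array gets unrolled: 'x' binds to $a, 'y' spills into $args -> "Bound 1 value(s)"
Start-Job -ScriptBlock $sb -ArgumentList @('x','y') | Wait-Job | Receive-Job
# The unary comma wraps the array so it arrives intact -> "Bound 2 value(s)"
Start-Job -ScriptBlock $sb -ArgumentList (,@('x','y')) | Wait-Job | Receive-Job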
I have a URL health-checking PowerShell script which correctly gets an HTTP 200 status code on most of my intranet sites, but a '0' status code is returned for a small minority of them. According to my research of questions from others who have written similar URL-checking scripts, the '0' code is an API return rather than coming from the web site itself.
Thinking this must be a timeout issue, where the API returns '0' before the slowly-responding web site returns its 200, I researched yet more questions on this subject on SO and implemented a suggestion to insert a timeout in the script. The timeout setting, though, doesn't help, no matter how high I set the value. I still get the same '0' "response" code from the same web sites, even though those web sites are up and running when checked from any regular web browser.
Any thoughts on how I could tweak the timeout setting in the script below in order to get the correct 200 response code?
The Script:
$URLListFile = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$URLList = Get-Content $URLListFile -ErrorAction SilentlyContinue
#if((test-path $reportpath) -like $false)
#{
#new-item $reportpath -type file
#}
#For every URL in the list
$result = foreach ($uri in $URLList) {
    try {
        #For proxy systems
        [System.Net.WebRequest]::DefaultWebProxy = [System.Net.WebRequest]::GetSystemWebProxy()
        [System.Net.WebRequest]::DefaultWebProxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
        #Web request
        $req = [System.Net.WebRequest]::Create($uri)
        $req.Timeout = 5000
        $res = $req.GetResponse()
    }
    catch {
        #Err handling
        $res = $_.Exception.Response
    }
    $req = $null
    #Getting HTTP status code
    $int = [int]$res.StatusCode
    # output a formatted string to capture in variable $result
    "$int - $uri"
    #Disposing response if available
    if ($res) {
        $res.Dispose()
    }
}
# output on screen
$result
#output to log file
$result | Set-Content -Path "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt" -Force
Current output:
200 - http://192.168.1.1/
200 - http://192.168.1.2/
200 - http://192.168.1.250/config/authentication_page.htm
0 - https://192.168.1.50/
200 - http://app1-vip-http.dev.local/
0 - https://CA/certsrv/Default.asp
Perhaps using the PowerShell cmdlet Invoke-WebRequest works better for you. It has many more parameters and switches to play around with, like ProxyUseDefaultCredentials and DisableKeepAlive.
$pathIn = "C:\Users\Admin1\Documents\Scripts\URL Check\URL_Check.txt"
$pathOut = "C:\Users\Admin1\Documents\Scripts\z_Logs\URL_Check\URL_Check_log.txt"
$URLList = Get-Content -Path $pathIn
$result = foreach ($uri in $URLList) {
    try {
        $res = Invoke-WebRequest -Uri $uri -UseDefaultCredentials -UseBasicParsing -Method Head -TimeoutSec 5 -ErrorAction Stop
        $status = [int]$res.StatusCode
    }
    catch {
        $status = [int]$_.Exception.Response.StatusCode.value__
    }
    # output a formatted string to capture in variable $result
    "$status - $uri"
}
# output on screen
$result
#output to log file
$result | Set-Content -Path $pathOut -Force
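As an aside, if PowerShell 7+ is available, the try/catch can be avoided entirely with -SkipHttpErrorCheck, which makes Invoke-WebRequest return the response instead of throwing on 4xx/5xx status codes (a sketch; this parameter does not exist in Windows PowerShell 5.1):
# PowerShell 7+ only: non-success status codes no longer throw, so the status can be read inline.
$res = Invoke-WebRequest -Uri $uri -UseDefaultCredentials -Method Head -TimeoutSec 5 -SkipHttpErrorCheck
"$([int]$res.StatusCode) - $uri"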
I am calling an API 500 times with 10 parallel threads as part of load testing. I want to capture the result of each API call in a global variable (a counter outside the script block's scope) so that I can process it further for validation.
For example, in the code below I want to check whether all 500 API calls succeeded.
Please find the code snippet below:
$invokeAPI = {
    try {
        $bodyContent = Get-Content $Using:inputFilepath
        $Response = (Invoke-WebRequest -Method 'Post' -Uri $Using:headUri -Headers $Using:blobHeaders -Body $bodyContent).StatusCode
        Write-Host -BackgroundColor Green "status Code :" $Response
    }
    catch [System.Exception] {
        Write-Host -ForegroundColor Red "Exception caught while invoking API :" $_.ErrorDetails.Message
        [int]$_.Exception.Response.StatusCode
    }
}
1..500 | ForEach-Object -Parallel $invokeAPI -ThrottleLimit 10
<# ToDo...Capture API invocation Result to validate results#>
Updated:
Turns out I overcomplicated my initial answer by thinking jobs would be necessary, but it looks like they aren't. It appears to be as simple as just outputting to a variable.
Sample script which will randomly test various HTTP statuses:
$invokeAPI = {
    try {
        $statusCode = 200,200,200,200,200,301,400,404,500 | Get-Random;
        (iwr "http://httpbin.org/status/$statusCode").StatusCode;
    }
    catch {
        [int]$_.Exception.Response.StatusCode;
    };
};
$statuscodes = 1..20 | % -Parallel $invokeAPI -ThrottleLimit 5;
$statuscodes;
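To then validate the load test (e.g. to confirm every call came back 200, as the question asks), something like this sketch works on the captured codes:
$failed = @($statuscodes | Where-Object { $_ -ne 200 })
if ($failed.Count -eq 0) { "All $($statuscodes.Count) calls succeeded" }
else { "$($failed.Count) of $($statuscodes.Count) calls failed" }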
OLD - I thought jobs would be needed; turns out they aren't, see the edit above.
Change this:
1..500 | ForEach-Object -Parallel $invokeAPI -ThrottleLimit 10
To this:
$output = 1..500 | ForEach-Object -Parallel $invokeAPI -ThrottleLimit 10 -AsJob | Wait-Job | Receive-Job
$output
Explanation:
-AsJob - Causes it to run each task as a PowerShell job in the background
Wait-Job - Wait for the jobs to finish
Receive-Job - Get the return data for all the jobs
By running with -AsJob, the results are stored in the background. You can then retrieve the job, which holds the stored results of that job's output.
Thanks to:
https://devblogs.microsoft.com/powershell/powershell-foreach-object-parallel-feature/
In fact, your example is very similar to this example in the documentation:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/foreach-object?view=powershell-7.1#example-13--run-in-parallel-as-a-job
Trying to use the Start-Parallel module but receiving no output. While debugging I could see that execution reaches that line but never enters the function that I have provided.
I have tried it both as a script block and as a function. I went through the examples provided for the Start-Parallel module, but no result. Be it a script block or a function, it shows a progress bar executing two tasks, but nothing gets executed.
Function avolume {
    param ($SCID_server_profile)
    #foreach ($SCID_server_profile in $SCID_JSON.serverProfile)
    #{
    if ($SCID_server_profile.spdefName -like "spdef_workload*")
    {
        #---------------------------------
        # Getting Server Profile Details
        #---------------------------------
        $SCID_frame_number = $SCID_server_profile.synergyFrame
        $SCID_bay_number = $SCID_server_profile.bayNumber
        $OV_server_profile_uri = (Get-HPOVServer -Name "$SCID_frame_number, Bay $SCID_bay_number").serverProfileUri
        if ($OV_server_profile_uri)
        {
            $OV_server_profile = Send-HPOVRequest -uri $OV_server_profile_uri
            $result = Add-SanVolume -ServerProfile $SCID_server_profile -serverprofileobj $OV_server_profile
            Report-Row -reportwriter $reportwriter -category $result.category -status $result.status -description $result.description
            if ($result.status -eq "fail")
            {
                $Script:status = $false
            }
            else { $Successful_Configs += 1 }
        }
        else
        {
            Report-Row -reportwriter $reportwriter -category "Update Server Profile" -status "Fail" -description "Server Profile does not exist for $SCID_frame_number, Bay $SCID_bay_number"
            $Script:status = $false
        }
    }
    #}
}
#$SCID_JSON.serverProfile | Start-Parallel -ScriptBlock $Script
#Start-Parallel -InputObject $SCID_JSON.serverProfile -ScriptBlock $Script
$SCID_JSON.serverProfile | Start-Parallel -Scriptblock ${Function:\avolume}
I am expecting that whatever is in the function should get executed in parallel for the values that I am providing.
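In case the Start-Parallel module itself is the problem, here is a hedged sketch of the same idea using PowerShell 7's ForEach-Object -Parallel instead. Functions defined in the caller are not visible inside the parallel runspaces, so the definition is passed in as text and re-created; any modules the function depends on (such as the HPE OneView cmdlets) would also need to be imported in each runspace:
$funcDef = ${Function:avolume}.ToString()
$SCID_JSON.serverProfile | ForEach-Object -Parallel {
    # Re-create the function inside this runspace from its source text.
    ${Function:avolume} = $using:funcDef
    avolume -SCID_server_profile $_
} -ThrottleLimit 5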
I have a curl command which reports response time, breaking it down by each action involved in invoking a service.
curl -w "#sample.txt" -o /dev/null someservice-call
I want to measure the response time in a similar way using PowerShell's built-in Invoke-WebRequest call. So far I am able to get total response time using Measure-Command. Can someone please help me with this?
Content of sample.txt used in curl:
time_namelookup: %{time_namelookup}\n
time_connect: %{time_connect}\n
time_appconnect: %{time_appconnect}\n
time_pretransfer: %{time_pretransfer}\n
time_redirect: %{time_redirect}\n
time_starttransfer: %{time_starttransfer}\n
----------\n
time_total: %{time_total}\n
time in milliseconds:
$url = "google.com"
(Measure-Command -Expression { $site = Invoke-WebRequest -Uri $url -UseBasicParsing }).TotalMilliseconds
Note: use TotalMilliseconds rather than Milliseconds; the latter returns only the milliseconds component of the TimeSpan (0-999) and underreports any request that takes a second or more.
This seems to do it without any noticeable overhead:
$StartTime = Get-Date
Invoke-WebRequest -Uri "google.com" -UseBasicParsing
Write-Output ("{0}" -f ((Get-Date) - $StartTime))
As the other solutions point out, there is a performance catch when using PowerShell only.
The most efficient solution would probably be to write some C# with the measurement built in. But if it isn't compiled beforehand, the loading time increases dramatically when the C# has to be compiled on first use.
But there is another way.
Since you can use almost all .NET constructs within PowerShell, you can write the same request and measurement logic in PowerShell itself.
I have written a small function which should do the trick:
function Measure-PostRequest {
    param(
        [string] $Url,
        [byte[]] $Bytes,
        [switch] $Block
    )
    $content = [Net.Http.ByteArrayContent]::new($Bytes);
    $client = [Net.Http.HttpClient]::new();
    $stopwatch = [Diagnostics.Stopwatch]::new()
    $result = $null;
    if ($Block) {
        # will block and thus not allow ctrl+c to kill the process
        $stopwatch.Start()
        $result = $client.PostAsync($Url, $content).GetAwaiter().GetResult()
        $stopwatch.Stop()
    } else {
        $stopwatch.Start()
        $task = $client.PostAsync($Url, $content)
        while (-not $task.AsyncWaitHandle.WaitOne(200)) { }
        $result = $task.GetAwaiter().GetResult()
        $stopwatch.Stop()
    }
    [PSCustomObject]@{
        Response = $result
        Milliseconds = $stopwatch.ElapsedMilliseconds
    }
}
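Hypothetical usage, posting a small payload to httpbin.org as a stand-in endpoint (substitute your own service):
$bytes = [System.Text.Encoding]::UTF8.GetBytes('{"ping":true}')
$measured = Measure-PostRequest -Url 'https://httpbin.org/post' -Bytes $bytes
"HTTP $([int]$measured.Response.StatusCode) in $($measured.Milliseconds) ms"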