Adding PowerShell jobs to retrieve in parallel from a paginated REST API

I've spent perhaps 30 hours trying various ways to add parallel processing to my API calls. The most obvious would be to put the code into a job, but I'm having no luck. Any ideas?
Start-Transcript -Path "$PSScriptRoot\Errorlog.txt"
$WondeObjectsArray   = Import-Csv $PSScriptRoot\WondeID.csv
$EndpointObjectArray = Import-Csv $PSScriptRoot\Endpoints.csv
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("REDACTED", "REDACTED")
# School loop
Foreach ($object in $WondeObjectsArray) {
    $SchoolName = $object.School
    $TrustName  = $object.Trust
    $WondeID    = $object."Wonde ID"
    # Create data structure
    If (!(Test-Path "$PSScriptRoot\$TrustName")) {
        New-Item -Path "$PSScriptRoot\$TrustName" -ItemType Directory
    }
    If (!(Test-Path "$PSScriptRoot\$TrustName\$SchoolName")) {
        New-Item -Path "$PSScriptRoot\$TrustName\$SchoolName" -ItemType Directory
    }
    # Endpoint request loop
    foreach ($Eobject in $EndpointObjectArray) {
        $JsonName = $Eobject.JsonName
        $Table    = $Eobject.Table
        $Method   = $Eobject.Url_Method
        # First response (note: $Page and $body are never assigned, so they expand to empty/null here)
        $response = Invoke-RestMethod "https://api.wonde.com/v1.0/schools/$WondeID/$Table$Method&$Page" -Method 'GET' -Headers $headers -Body $body
        $concat = $response.data
        # Pagination loop
        While ($response.meta.pagination.next) {
            $response = Invoke-RestMethod $response.meta.pagination.next -Method 'GET' -Headers $headers -Body $body
            $concat = $concat + $response.data
        } # pagination loop end
        # Write out the concatenated request
        $concat | ConvertTo-Json | Out-File "$PSScriptRoot\$TrustName\$SchoolName\$JsonName.json"
    } # Endpoint request loop end
} # School loop end
Stop-Transcript

Never mind, I figured it out. I turned the code I wanted to run in parallel into a scriptblock, passed it the parameters with param (), and included my variables via -ArgumentList on the Start-Job line.
My actual issue was that starting a new job changes the working location (to C:\), so $PSScriptRoot was no longer pointing at the script's folder inside the job; capturing it in a variable first and passing that in fixed it.
Start-Transcript -Path "$PSScriptRoot\Errorlog.txt"
$PathRoot = $PSScriptRoot  # captured here because $PSScriptRoot is not available inside the job
$WondeObjectsArray   = Import-Csv $PSScriptRoot\WondeID.csv
$EndpointObjectArray = Import-Csv $PSScriptRoot\Endpoints.csv
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("REDACTED", "REDACTED")
# School loop
Foreach ($object in $WondeObjectsArray) {
    $SchoolName = $object.School
    $TrustName  = $object.Trust
    $WondeID    = $object."Wonde ID"
    # Create data structure
    If (!(Test-Path "$PSScriptRoot\$TrustName")) {
        New-Item -Path "$PSScriptRoot\$TrustName" -ItemType Directory
    }
    If (!(Test-Path "$PSScriptRoot\$TrustName\$SchoolName")) {
        New-Item -Path "$PSScriptRoot\$TrustName\$SchoolName" -ItemType Directory
    }
    # Endpoint request loop
    foreach ($Eobject in $EndpointObjectArray) {
        $JsonName = $Eobject.JsonName
        $Table    = $Eobject.Table
        $Method   = $Eobject.Url_Method
        $scriptblock = {
            param ($WondeID, $Table, $Method, $headers, $TrustName, $SchoolName, $JsonName, $PathRoot)
            # First response
            $response = Invoke-RestMethod "https://api.wonde.com/v1.0/schools/$WondeID/$Table$Method" -Method 'GET' -Headers $headers
            $concat = $response.data
            # Pagination loop
            While ($response.meta.pagination.next) {
                $response = Invoke-RestMethod $response.meta.pagination.next -Method 'GET' -Headers $headers
                $concat = $concat + $response.data
            } # pagination loop end
            # Write out the concatenated request
            $concat | ConvertTo-Json | Out-File "$PathRoot\$TrustName\$SchoolName\$JsonName.json"
            Write-Host "$TrustName : $SchoolName : $JsonName is complete!"
        }
        Start-Job $scriptblock -Name "$SchoolName&$JsonName" -ArgumentList $WondeID, $Table, $Method, $headers, $TrustName, $SchoolName, $JsonName, $PathRoot
    } # Endpoint request loop end
    Get-Job | Wait-Job | Receive-Job
    Get-Job | Remove-Job
} # School loop end
Stop-Transcript
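One caveat with this pattern: Start-Job launches a whole child PowerShell process per endpoint, so a long endpoint list can swamp the machine. A minimal throttling sketch (the cap of 8 concurrent jobs is an arbitrary assumption) that could sit just before the Start-Job call:
# Wait until fewer than $maxJobs jobs are running before launching another
$maxJobs = 8  # assumption: tune to your hardware
while ((Get-Job -State Running).Count -ge $maxJobs) {
    Start-Sleep -Milliseconds 250
}
On PowerShell 7+, ForEach-Object -Parallel (or the ThreadJob module on Windows PowerShell) gives the same parallelism with far less per-job overhead.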

Related

Synchronize an on-prem fileshare to a SharePoint Online site collection using PowerShell and the Microsoft REST API

I am trying to work out a PowerShell script that retrieves an access token (MSAL) to read and write a SharePoint Online site with subsites and documents. Preferably the Azure app-registration service principal would be granted access to just that site, without consent to the whole SharePoint environment. I don't know whether that is currently possible, as I can only grant the application permissions Files.ReadWrite.All and Sites.ReadWrite.All; I see nothing like Files.ReadWrite.Shared that would scope access to only the sites/collections the service principal has been given. Has anyone done this? I currently use the MSAL.PS PowerShell module to get a token via an app registration with the admin-consented ReadWrite.All access, but would like to limit that. The code for this is now:
Import-Module MSAL.PS;
$clientId   = "my-appreg-client-id";
$tenantID   = 'my-tenant-id';
$thumbPrint = 'certificate-thumbprint';
$ClientCertificate = Get-Item "Cert:\CurrentUser\My\$thumbPrint";
$myAccessToken = Get-MsalToken -ClientId $clientId -TenantId $tenantID -ClientCertificate $ClientCertificate;
The script will read all files and folders from a UNC share and build a collection of the on-prem files. That part of the code is in place, using a Get-ChildItem call against the UNC file tree.
Then, after getting the token, I need to get the files currently in the SharePoint Online site's document library structure and store them in a variable/hashtable, so I can perform lookups between the on-prem file collection and the presence of those files and (sub)folders in the SharePoint site. If a folder does not yet exist I need to create it, and if a file is not yet present, or the on-prem version is newer, I need to upload that file into SharePoint.
I have a script that does this using the old SharePoint.Client.dll libraries, but those only support basic authentication, which will soon be unavailable for the MS Online environment. So now I am looking for code that does this via the Microsoft Graph API or another REST API. I am already struggling to get the contents of a site file collection, so I hope this generic problem description is enough to get some hints, tips and resources to get going.
Many thanks,
Eric
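On the permission-scoping part of the question: Microsoft Graph has a Sites.Selected application permission that grants access to no sites by default; an admin then grants the app access to individual sites through the sites/{id}/permissions endpoint. A hedged sketch (the $siteId value and display name are placeholders; $headers carries an admin-capable bearer token):
# Grant an app write access to one specific site only
$grant = @{
    roles = @('write')
    grantedToIdentities = @(
        @{ application = @{ id = 'my-appreg-client-id'; displayName = 'MyAppRegistration' } }
    )
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/sites/$siteId/permissions" -Headers $headers -Body $grant -ContentType 'application/json'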
This is what I use. I'm running PowerShell on Linux.
## Get the token
$clientId     = "Application (Client) ID"
$clientSecret = "Client secret"
$tenantName   = "TenantName.onmicrosoft.com"
$tokenBody = @{
    Grant_Type    = 'client_credentials'
    Scope         = 'https://graph.microsoft.com/.default'
    Client_Id     = $clientId
    Client_Secret = $clientSecret
}
$tokenResponse = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$tenantName/oauth2/v2.0/token" -Method POST -Body $tokenBody -ErrorAction Stop
$headers = @{
    "Authorization" = "Bearer $($tokenResponse.access_token)"
    "Content-Type"  = "application/json"
}
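## Optional sanity check (a sketch, not in the original): any successful Graph call confirms the token works, e.g.
# Invoke-RestMethod -Headers $headers -Uri "https://graph.microsoft.com/v1.0/sites/root" -Method Get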
## Use the SharePoint groups ObjectID. From this we'll get the drive ID.
$site_objectid = "Groups ObjectID"
## Create all the folders on the SharePoint site first. I've set microsoft.graph.conflictBehavior below to fail because I never want to rename or replace folders.
# Set the base directory.
$baseDirectory = "/test"
$directories = Get-ChildItem -Path $baseDirectory -Recurse -Directory
foreach ($directory in $directories) {
    $URL = "https://graph.microsoft.com/v1.0/groups/$site_objectid/sites/root"
    $subsite_ID = (Invoke-RestMethod -Headers $headers -Uri $URL -Method Get).ID
    $URL = "https://graph.microsoft.com/v1.0/sites/$subsite_ID/drives"
    $Drives = Invoke-RestMethod -Headers $headers -Uri $URL -Method Get
    $Document_drive_ID = ($Drives.value | Where-Object { $_.name -eq 'Documents' }).id
    $createFolderURL = "https://graph.microsoft.com/v1.0/drives/$Document_drive_ID/items/root:{0}:/children" -f $directory.parent.FullName
    $file = $directory.Name
    $uploadFolderRequestBody = @{
        name   = "$file"
        folder = @{}
        "@microsoft.graph.conflictBehavior" = "fail"
    } | ConvertTo-Json
    Invoke-RestMethod -Headers $headers -Method Post -Body $uploadFolderRequestBody -ContentType "application/json" -Uri $createFolderURL
}
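## (Sketch, not in the original) The site and drive IDs never change between iterations, so they
## could be resolved once, before both loops, saving two Graph calls per folder and per file:
# $subsite_ID = (Invoke-RestMethod -Headers $headers -Uri "https://graph.microsoft.com/v1.0/groups/$site_objectid/sites/root" -Method Get).ID
# $Drives = Invoke-RestMethod -Headers $headers -Uri "https://graph.microsoft.com/v1.0/sites/$subsite_ID/drives" -Method Get
# $Document_drive_ID = ($Drives.value | Where-Object { $_.name -eq 'Documents' }).id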
## Upload the files. I'm only adding files that are 4 days old or less because I run the script every 3 days for backup.
## These are set in the $sharefiles variable. To upload all files just remove everything after the pipe.
$sharefiles = Get-ChildItem $baseDirectory -Recurse | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-4) }
foreach ($sharefile in $sharefiles) {
    $Filepath = $sharefile.FullName
    $URL = "https://graph.microsoft.com/v1.0/groups/$site_objectid/sites/root"
    $subsite_ID = (Invoke-RestMethod -Headers $headers -Uri $URL -Method Get).ID
    $URL = "https://graph.microsoft.com/v1.0/sites/$subsite_ID/drives"
    $Drives = Invoke-RestMethod -Headers $headers -Uri $URL -Method Get
    $Document_drive_ID = ($Drives.value | Where-Object { $_.name -eq 'Documents' }).id
    $Filename = $sharefile.Name
    $upload_session = "https://graph.microsoft.com/v1.0/drives/$Document_drive_ID/root:{0}/$($Filename):/createUploadSession" -f $sharefile.directory.FullName
    $upload_session_url = (Invoke-RestMethod -Uri $upload_session -Headers $headers -Method Post).uploadUrl
    ## We'll upload files in chunks.
    $ChunkSize = 62259200  # 190 x 320 KiB; Graph upload sessions expect chunk sizes in multiples of 320 KiB
    $file = New-Object System.IO.FileInfo($Filepath)
    $reader = [System.IO.File]::OpenRead($Filepath)
    $buffer = New-Object -TypeName Byte[] -ArgumentList $ChunkSize
    $position = 0
    $counter = 0
    Write-Host "ChunkSize: $ChunkSize" -ForegroundColor Cyan
    Write-Host "BufferSize: $($buffer.Length)" -ForegroundColor Cyan
    $moreData = $true
    While ($moreData) {
        # Read a chunk
        $bytesRead = $reader.Read($buffer, 0, $buffer.Length)
        $output = $buffer
        If ($bytesRead -ne $buffer.Length) {
            # No more data to be read
            $moreData = $false
            # Shrink the output array to the number of bytes actually read
            $output = New-Object -TypeName Byte[] -ArgumentList $bytesRead
            [Array]::Copy($buffer, $output, $bytesRead)
            Write-Host "no more data" -ForegroundColor Yellow
        }
        # Upload the chunk
        $Header = @{
            'Content-Range' = "bytes $position-$($position + $output.Length - 1)/$($file.Length)"
        }
        Write-Host "Content-Range = bytes $position-$($position + $output.Length - 1)/$($file.Length)" -ForegroundColor Cyan
        $position = $position + $output.Length
        Invoke-RestMethod -Method Put -Uri $upload_session_url -Body $output -Headers $Header -SkipHeaderValidation
        # Increment counter
        $counter++
    }
    $reader.Close()
}

PowerShell - Loop through a folder, get contents and post to SOAP

I am trying to loop through a folder, grab all files, read their contents, then post each file's content individually to SOAP.
This is how I would do it, but PowerShell returns an error:
Invoke-Webrequest : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
Below is my code:
$dataAPI = Get-ChildItem 'C:\Users\..\Output'
$uri = 'C:\Users\..\Output' -replace '.*', 'http://server-name.com:8080/name/name2'
ForEach ($Item in $dataAPI) {
    Get-Content $Item.FullName | Invoke-WebRequest -Headers @{"Content-Type" = "text/xml;charset=UTF-8"; "SOAPAction" = "http://server-name.com:8080/name/name2"} -Method 'POST' -Body $dataAPI -Uri $uri -UseDefaultCredential
}
I am not really sure where I should place the Invoke-WebRequest...
Any help would be appreciated. Thanks.
Continuing from my comments:
Add switch -Raw to the Get-Content call to receive a single multiline string instead of an array of lines.
Add switch -File to the Get-ChildItem call to ensure you only deal with files in the loop, not directories too.
Try
# if all files you need have a common extension, add `-Filter '*.xml'` to the line below
# '*.xml' is just an example here..
$files = Get-ChildItem -Path 'C:\Users\Gabriel\Output' -File
$uri = 'http://server-name.com:8080/name/name2'
$header = @{ "Content-Type" = "text/xml;charset=UTF-8"; "SOAPAction" = "http://server-name.com:8080/name/name2" }
foreach ($Item in $files) {
    $content = Get-Content $Item.FullName -Raw
    Invoke-WebRequest -Headers $header -Method 'POST' -Body $content -Uri $uri -UseDefaultCredential
}
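If some posts can fail, a small extension of the same loop (a sketch, not part of the original answer) captures each response and logs failures per file:
foreach ($Item in $files) {
    try {
        # Post the raw file content and report the HTTP status per file
        $response = Invoke-WebRequest -Headers $header -Method 'POST' -Body (Get-Content $Item.FullName -Raw) -Uri $uri -UseDefaultCredential
        Write-Host ("{0}: {1}" -f $Item.Name, $response.StatusCode)
    }
    catch {
        Write-Warning ("{0} failed: {1}" -f $Item.Name, $_.Exception.Message)
    }
}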

How to time delay between each url in a PowerShell http server status check script

I am using the following PS script to check the server status of a list of URLs.
$linksFilePath = "C:\Temp\urls.txt"
$outCSVPath = "C:\Temp\results.csv"
Get-Content $linksFilePath |
    Foreach { $uri = $_; try { Invoke-WebRequest -Uri $uri -Method HEAD -MaximumRedirection 0 -ErrorAction SilentlyContinue -UseBasicParsing } catch {
        New-Object -TypeName psobject -Property @{ Error = $_ } } } |
    Select @{Name="RequestURI";Expression={$uri}}, StatusCode, @{Name="RedirectTo";Expression={$_.Headers["Location"]}}, Error |
    Export-Csv $outCSVPath
I want the script to pause for x seconds between each URL in my urls.txt list before moving on to the next one, and to append the results to my results.csv file accordingly.
Is this possible?
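It is. A minimal sketch (the 5-second delay is an arbitrary assumption; adjust as needed): add a Start-Sleep inside the Foreach block, after the request, so each result is emitted before the pause.
$delaySeconds = 5  # assumption: pick whatever spacing you need
Get-Content $linksFilePath |
    Foreach {
        $uri = $_
        try { Invoke-WebRequest -Uri $uri -Method HEAD -MaximumRedirection 0 -ErrorAction SilentlyContinue -UseBasicParsing }
        catch { New-Object -TypeName psobject -Property @{ Error = $_ } }
        Start-Sleep -Seconds $delaySeconds  # pause before the next URL
    } |
    Select @{Name="RequestURI";Expression={$uri}}, StatusCode, @{Name="RedirectTo";Expression={$_.Headers["Location"]}}, Error |
    Export-Csv $outCSVPath -Append -NoTypeInformation  # -Append keeps earlier results across runs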

Nessus IO Powershell API HTML/PDF report

When exporting reports in PDF and HTML format, the reports come back empty. Best I can tell there needs to be a report attribute, but after 5 hours of running through the API and searching every which way I can think of, I am not finding anything referencing it.
For those interested, this is the starting script, before I started optimizing it:
https://github.com/Pwd9000-ML/NessusV7-Report-Export-PowerShell/blob/master/NessusPro_v7_Report_Exporter_Tool.ps1
Add-Type @"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
    public bool CheckValidationResult(
        ServicePoint srvPoint, X509Certificate certificate,
        WebRequest request, int certificateProblem) {
        return true;
    }
}
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
[System.Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$GNR = @{
    OutputDir = "$Env:SystemDrive\Nessus\$(([DateTime]::Now).ToString("yyyy-MM-dd"))"
    StatusUri = [System.Collections.ArrayList]::new()
}
#------------------Input Variables-----------------------------------------------------------------
$Baseurl = "https://$($env:COMPUTERNAME):8834"
$Username = <Removed>
$Password = <Removed>
$ContentType = "application/json"
$POSTMethod = 'POST'
$GETMethod = 'GET'
#------------------Stage props to obtain session token (Parameters)--------------------------------
$session = @{
    Uri = $Baseurl + "/session"
    ContentType = $ContentType
    Method = $POSTMethod
    Body = ConvertTo-Json (New-Object PSObject -Property @{username = $Username; password = $Password})
}
#------------------Commit session props for token header X-Cookie----------------------------------
$TokenResponse = Invoke-RestMethod @session
if ($TokenResponse) {
    $Header = @{"X-Cookie" = "token=" + $TokenResponse.token}
} else {
    # Write-nLog is a custom logging function from the linked starting script
    Write-nLog -Message "Error occurred obtaining session token. Script terminating... Please ensure username and password are correct." -Type Error -TerminatingError
}
IF (![System.IO.Directory]::Exists($GNR.OutputDir)) {
    New-Item -Path $GNR.OutputDir -ItemType Directory -Force | Out-Null
}
#------------------Output completed scans----------------------------------------------------------
$Scans = (Invoke-RestMethod -Uri "$baseurl/scans" -Headers $Header -Method $GETMethod -ContentType "application/json").scans
ForEach ($Format in @("nessus","pdf")) {
    $StatusURI = [System.Collections.ArrayList]::new()
    $StatusArray = [System.Collections.ArrayList]::new()
    ForEach ($Scan in $Scans) {
        Add-Content -Path "$($GNR.OutputDir)\ScanReport.txt" -Value "$($Scan.Name) ($($Scan.status))"
        IF ($Scan.status -eq "Completed") {
            $File = (Invoke-RestMethod -URI "$baseurl/scans/$($Scan.ID)/export" -ContentType $ContentType -Headers $Header -Method $POSTMethod -Body $(ConvertTo-Json (New-Object PSObject -Property @{format = "$Format"}))).file
            [Void]$StatusArray.Add(
                [pscustomobject]@{
                    ScanName = $Scan.name
                    StatusUri = $baseurl + "/scans/" + $Scan.id + "/export/" + "$File" + "/status"
                    DownloadUri = $baseurl + "/scans/" + $Scan.id + "/export/" + "$File" + "/download"
                }
            )
        }
    }
    #------------------Check status of export requests-------------------------------------------------
    While ($StatusArray.StatusUri.Count -GT $StatusURI.Count) {
        ForEach ($ScanStatus in $StatusArray.StatusUri) {
            IF ((Invoke-RestMethod -Uri $ScanStatus -ContentType $ContentType -Headers $Header -Method $GETMethod).status -EQ "Ready") {
                if ($StatusURI -notcontains $ScanStatus) {
                    Write-Host "Adding $ScanStatus"
                    [void]$StatusURI.Add($ScanStatus)
                }
            } Else {
                Write-nLog -Type "Info" -Message "Not all scans complete. ($($StatusURI.Count)/$($StatusArray.StatusUri.Count))"
                Start-Sleep -s 5
            }
        }
    }
    #------------------Download the reports------------------------------------------------------------
    $ExportUri = $StatusArray.DownloadUri
    $outputs = $StatusArray.ScanName
    foreach ($i in 0..($ExportUri.Count - 1)) {
        Write-nLog -Type Info -Message "Exporting Report: $($outputs[$i])"
        Invoke-WebRequest -Uri $ExportUri[$i] -ContentType $ContentType -Headers $Header -Method $GETMethod -OutFile "$($GNR.OutputDir)\$($outputs[$i]).$Format"
    }
}
#------------------Script END----------------------------------------------------------------------
There are several additional parameters you can set on the POST /scans/{id}/export endpoint. The important one missed here is chapters, which accepts a semicolon-delimited list of the desired content sections. It must be set for pdf or html exports, otherwise you get an empty result.
For example, to get the executive summary, set chapters to vuln_hosts_summary in addition to a format of html/pdf/csv etc. The other available options are (a sketch follows the list):
vuln_by_host
compliance_exec
remediations
vuln_by_plugin
compliance
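A minimal sketch of the fix against the script above (only the export request body changes; variable names as in the original):
# Add chapters alongside format in the export request body
$exportBody = ConvertTo-Json (New-Object PSObject -Property @{ format = "$Format"; chapters = "vuln_hosts_summary" })
$File = (Invoke-RestMethod -URI "$baseurl/scans/$($Scan.ID)/export" -ContentType $ContentType -Headers $Header -Method $POSTMethod -Body $exportBody).file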
Hopefully this helps the next person trying to debug empty Nessus API exports too!
For full API docs for your version check out https://{YOUR_NESSUS_INSTALL}/api

Make Invoke-WebRequest loop through each URL it finds

I'm new to PowerShell and I'm trying to make the Invoke-WebRequest cmdlet loop through each URL the web scrape finds. All I have so far is this:
$site = Invoke-WebRequest -UseBasicParsing -Uri www.example.com/examples
$site.Links | Out-GridView
Any help would be appreciated!
Add your links to a comma separated list.
Split the list and loop each item.
Request each item.
As below:
$option = [System.StringSplitOptions]::RemoveEmptyEntries
$urlCollection = "link1,link2,link3"
$separator = ","
$urlList = $urlCollection.Split($separator, $option)
foreach ($url in $urlList) {
    Invoke-WebRequest $url
    # Give feedback on how far we are
    Write-Host ("Initiated request for {0}" -f $url)
}
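And a minimal sketch tying this back to the original scrape (assuming the hrefs found are absolute URLs; relative ones would need resolving against the base URI first):
$site = Invoke-WebRequest -UseBasicParsing -Uri 'www.example.com/examples'
# Request every absolute link the scrape found
foreach ($url in ($site.Links.href | Where-Object { $_ -match '^https?://' })) {
    Invoke-WebRequest -UseBasicParsing -Uri $url
    Write-Host ("Initiated request for {0}" -f $url)
}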