I'm new to PowerShell and I'm trying to make the Invoke-WebRequest cmdlet loop through each URL the web scrape finds. All I have so far is this:
$site = Invoke-WebRequest -UseBasicParsing -Uri www.example.com/examples
$site.Links | Out-GridView
Any help would be appreciated!
Add your links to a comma-separated list.
Split the list and loop over each item.
Request each item.
As below:
$option = [System.StringSplitOptions]::RemoveEmptyEntries
$urlCollection = "link1,link2,link3"
$separator = ","
$urlList = $urlCollection.Split($separator, $option)
foreach ($url in $urlList) {
    Invoke-WebRequest $url
    # Give feedback on how far we are
    Write-Host ("Initiated request for {0}" -f $url)
}
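If you want to loop over the links the scrape itself finds, as in the question, here is a minimal sketch (assuming the href values are absolute URLs; with -UseBasicParsing each entry in $site.Links exposes an href property):
$site = Invoke-WebRequest -UseBasicParsing -Uri www.example.com/examples
foreach ($link in $site.Links) {
    # Only request absolute http/https links, skipping relative or empty hrefs
    if ($link.href -match '^https?://') {
        Invoke-WebRequest -UseBasicParsing -Uri $link.href
        Write-Host ("Requested {0}" -f $link.href)
    }
}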
I am trying to work out a PowerShell script that retrieves an access token (MSAL) to access (read/write) a SharePoint Online site with subsites and documents. Preferably the Azure app registration's service principal would be granted access to just that site, so it can access the site and its files without consent to the whole SharePoint environment. I don't know if that is currently possible, as I can only grant the application permissions Files.ReadWrite.All and Sites.ReadWrite.All; I do not see anything like Files.ReadWrite.Shared that would grant access only to the sites/collections the service principal has access to. Has anyone done this? I currently use the MSAL.PS PowerShell module to get a token using an app registration with the admin-consented ReadWrite.All access, but would like to limit that. The code for this is now:
Import-Module MSAL.PS;
$clientid = "my-appreg-client-id";
$tenantID = 'my-tenant-id';
$thumbPrint = 'certificate-thumbprint';
$ClientCertificate = Get-Item "Cert:\CurrentUser\My\$thumbPrint";
$myAccessToken = Get-MsalToken -ClientId $clientID -TenantId $tenantID -ClientCertificate $ClientCertificate;
The script will read all files and folders from a UNC share and build a collection of the on-prem files. That part of the code is in place, using a Get-ChildItem call against the UNC file tree.
Then, after getting the token, I need to get the files currently in the SharePoint Online site's document library structure and store them in a variable/hashtable that I can use to perform lookups between the on-prem file collection and the presence of those files and (sub)folders in the SharePoint site. If a folder does not yet exist, I need to create that SharePoint folder, and if a file is not yet present or the on-prem version is newer, I need to upload that file to SharePoint.
I have a script that does this using the old SharePoint.Client.dll libraries, but those support only basic authentication, which will soon be unavailable for accessing the MS Online environment. So now I am searching for code that does this using the Microsoft Graph API or another REST API. I am already struggling to get the contents of a site's file collection, so I hope this generic problem description is enough to get some hints, tips, and resources to get going.
Many thanks,
Eric
This is what I use. I'm using PowerShell on Linux.
## Get the Token
$clientId = "Application (Client) ID"
$clientSecret = "Client secret"
$tenantName = "TenantName.onmicrosoft.com"
$tokenBody = @{
    Grant_Type    = 'client_credentials'
    Scope         = 'https://graph.microsoft.com/.default'
    Client_Id     = $clientId
    Client_Secret = $clientSecret
}
$tokenResponse = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$TenantName/oauth2/v2.0/token" -Method POST -Body $tokenBody -ErrorAction Stop
$headers = @{
    "Authorization" = "Bearer $($tokenResponse.access_token)"
    "Content-Type"  = "application/json"
}
## Use the SharePoint group's object ID. From this we'll get the drive ID.
$site_objectid = "Groups ObjectID"
## Create all the folders on the SharePoint site first. I've set microsoft.graph.conflictBehavior below to fail because I never want to rename or replace folders.
# Set the base directory.
$baseDirectory = "/test"
$directories = Get-ChildItem -Path $baseDirectory -Recurse -Directory
foreach ($directory in $directories) {
    $URL = "https://graph.microsoft.com/v1.0/groups/$site_objectid/sites/root"
    $subsite_ID = (Invoke-RestMethod -Headers $headers -Uri $URL -Method Get).ID
    $URL = "https://graph.microsoft.com/v1.0/sites/$subsite_ID/drives"
    $Drives = Invoke-RestMethod -Headers $headers -Uri $URL -Method Get
    $Document_drive_ID = ($Drives.value | Where-Object { $_.name -eq 'Documents' }).id
    $createFolderURL = "https://graph.microsoft.com/v1.0/drives/$Document_drive_ID/items/root:{0}:/children" -f $directory.parent.FullName
    $file = $directory.Name
    $uploadFolderRequestBody = @{
        name = "$file"
        folder = @{}
        "@microsoft.graph.conflictBehavior" = "fail"
    } | ConvertTo-Json
    Invoke-RestMethod -Headers $headers -Method Post -Body $uploadFolderRequestBody -ContentType "application/json" -Uri $createFolderURL
}
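Note that the site ID and drive ID never change between iterations, so the two lookup calls at the top of this loop (and of the upload loop below) could be made once before the loop to save two Graph round trips per item.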
## Upload the files. I'm only adding files that are 4 days old or less because I run the script every 3 days for backup.
## These are set in the $sharefiles variable. To upload all files just remove everything after the pipe.
$sharefiles = Get-ChildItem $baseDirectory -Recurse -File | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-4) }
foreach ($sharefile in $sharefiles) {
    $Filepath = $sharefile.FullName
    $URL = "https://graph.microsoft.com/v1.0/groups/$site_objectid/sites/root"
    $subsite_ID = (Invoke-RestMethod -Headers $headers -Uri $URL -Method Get).ID
    $URL = "https://graph.microsoft.com/v1.0/sites/$subsite_ID/drives"
    $Drives = Invoke-RestMethod -Headers $headers -Uri $URL -Method Get
    $Document_drive_ID = ($Drives.value | Where-Object { $_.name -eq 'Documents' }).id
    $Filename = $sharefile.Name
    $upload_session = "https://graph.microsoft.com/v1.0/drives/$Document_drive_ID/root:{0}/$($Filename):/createUploadSession" -f $sharefile.directory.FullName
    $upload_session_url = (Invoke-RestMethod -Uri $upload_session -Headers $headers -Method Post).uploadUrl
    ## We'll upload files in chunks.
    $ChunkSize = 62259200
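    # Graph upload sessions expect each chunk except the last to be a multiple of 320 KiB (327680 bytes); 62259200 = 190 * 327680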
    $file = New-Object System.IO.FileInfo($Filepath)
    $reader = [System.IO.File]::OpenRead($Filepath)
    $buffer = New-Object -TypeName Byte[] -ArgumentList $ChunkSize
    $position = 0
    $counter = 0
    Write-Host "ChunkSize: $ChunkSize" -ForegroundColor Cyan
    Write-Host "BufferSize: $($buffer.Length)" -ForegroundColor Cyan
    $moreData = $true
    while ($moreData) {
        # Read a chunk
        $bytesRead = $reader.Read($buffer, 0, $buffer.Length)
        $output = $buffer
        if ($bytesRead -ne $buffer.Length) {
            # No more data to be read
            $moreData = $false
            # Shrink the output array to the number of bytes actually read
            $output = New-Object -TypeName Byte[] -ArgumentList $bytesRead
            [Array]::Copy($buffer, $output, $bytesRead)
            Write-Host "no more data" -ForegroundColor Yellow
        }
        # Upload the chunk
        $Header = @{
            'Content-Range' = "bytes $position-$($position + $output.Length - 1)/$($file.Length)"
        }
        Write-Host "Content-Range = bytes $position-$($position + $output.Length - 1)/$($file.Length)" -ForegroundColor Cyan
        $position = $position + $output.Length
        Invoke-RestMethod -Method Put -Uri $upload_session_url -Body $output -Headers $Header -SkipHeaderValidation
        # Increment counter
        $counter++
    }
    $reader.Close()
}
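The question also asks how to read the files already present in the document library to do lookups against the on-prem collection. The script above doesn't cover that part, but here is a minimal sketch along the same lines (reusing $headers and $Document_drive_ID from above, listing only the root level; a large library would also require following @odata.nextLink paging):
$URL = "https://graph.microsoft.com/v1.0/drives/$Document_drive_ID/root/children"
$existingItems = (Invoke-RestMethod -Headers $headers -Uri $URL -Method Get).value
# Build a name -> lastModified lookup to compare against the on-prem file collection
$spoLookup = @{}
foreach ($item in $existingItems) {
    $spoLookup[$item.name] = $item.lastModifiedDateTime
}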
I am trying to loop through a folder, grab all files, read their contents, then post each file's content individually to a SOAP endpoint.
This is how I would do it, but PowerShell returns an error.
Invoke-Webrequest : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
Below is my code:
$dataAPI = Get-ChildItem 'C:\Users\..\Output'
$uri = 'http://server-name.com:8080/name/name2'
ForEach ($Item in $dataAPI) {
    Get-Content $Item.FullName | Invoke-WebRequest -Headers @{"Content-Type" = "text/xml;charset=UTF-8"; "SOAPAction" = "http://server-name.com:8080/name/name2"} -Method 'POST' -Body $dataAPI -Uri $uri -UseDefaultCredential
}
I am not really sure where I should place the Invoke-WebRequest...
Any help would be appreciated. Thanks.
Continuing from my comments,
Add switch -Raw to the Get-Content call to receive a single multiline string instead of an array of lines
Add switch -File to the Get-ChildItem call to ensure you will only deal with files in the loop, not directories too
Try
# If all files you need have a common extension, add `-Filter '*.xml'` to the line below;
# '*.xml' is just an example here.
$files = Get-ChildItem -Path 'C:\Users\Gabriel\Output' -File
$uri = 'http://server-name.com:8080/name/name2'
$header = @{ "Content-Type" = "text/xml;charset=UTF-8"; "SOAPAction" = "http://server-name.com:8080/name/name2" }
foreach ($Item in $files) {
    $content = Get-Content $Item.FullName -Raw
    Invoke-WebRequest -Headers $header -Method 'POST' -Body $content -Uri $uri -UseDefaultCredential
}
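As a side note, Invoke-WebRequest also has an -InFile parameter that sends a file's contents as the request body directly, so the Get-Content call could be dropped entirely:
Invoke-WebRequest -Headers $header -Method 'POST' -InFile $Item.FullName -Uri $uri -UseDefaultCredential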
I'm trying to concatenate my URI in this Invoke-WebRequest call so that I can have each variable on a separate line. That way I can make changes more easily and don't have to search as hard. I was able to do this in a bash script, but am at a loss for how to do it in PowerShell.
Line as follows:
Invoke-WebRequest -Uri (beginning of url)?date=$date"&"time=$time"&"name=$env:computername"&"loginid=$env:username"&"sn=$serialnumber"&"ipaddr=$ipaddr"&"verb=profileclear
Thanks!
I think the easiest way of achieving this would be like this:
$uri = "Https://something.somewhere/?" +
"date=$date&" +
"time=$time&" +
"name=$env:computername&" +
"loginid=$env:username&" +
"sn=$serialnumber&" +
"ipaddr=$ipaddr&" +
"verb=profileclear"
Invoke-WebRequest -Uri $uri
I would create an array of variable strings, join them, and build a uri:
$variables = "date=$date",
"time=$time",
"name=$env:computername",
"loginid=$env:username",
"sn=$serialnumber",
"ipaddr=$ipaddr",
"verb=profileclear"
$uri = [System.UriBuilder]::new('https://contoso.com')
$uri.Query = $variables -join '&'
Invoke-WebRequest -Uri $uri.ToString()
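[System.UriBuilder] also exposes the built result through its Uri property, so -Uri $uri.Uri works as well and saves the explicit ToString() call.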
One option is to store the query parameters in an ordered dictionary and construct the URL from that:
$parameters = [ordered]@{
    date    = $date
    time    = $time
    name    = $env:computername
    loginid = $env:username
    sn      = $serialnumber
    ipaddr  = $ipaddr
    verb    = 'profileclear'
}
$baseURI = 'https://host.fqdn/path'
# Construct full URI string from base URI + parameters
$queryString = @($parameters.GetEnumerator() | ForEach-Object {
    '{0}={1}' -f $_.Key, $_.Value
}) -join '&'
$URI = '{0}?{1}' -f $baseURI,$queryString
Invoke-WebRequest -Uri $URI
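One caveat for either approach: parameter values containing spaces or reserved characters should be URL-encoded. A minimal variant of the join above, using the .NET [uri]::EscapeDataString method:
$queryString = @($parameters.GetEnumerator() | ForEach-Object {
    '{0}={1}' -f $_.Key, [uri]::EscapeDataString([string]$_.Value)
}) -join '&'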
I have a long list of URLs for which I want to check the Last-Modified header.
The script below works well; however, for some pages I get a security popup, which slows down the whole process, as I need to click 'No' every time.
To get around this, I added the -UseBasicParsing parameter (commented out below).
This prevents the security warning, but also prevents the last-modified value from appearing in the response.
Is there any way around this?
$urls = @(
"https://www.google.com/",
"https://www.bassie.com/"
)
function Get-LastModified($url) {
    $WebResponse = Invoke-WebRequest $url -TimeoutSec 10 # -UseBasicParsing
    $members = $WebResponse | Get-Member
    $parsedHtml = $WebResponse.ParsedHtml
    $output = $parsedHtml | Select-Object lastModified
    return $output
}
foreach ($url in $urls) {
    if ($url -match "http://" -or $url -match "https://") {
        $lastModified = Get-LastModified $url
        Write-Host "--------------------------------------"
        Write-Host $url
        Write-Host $lastModified
        Write-Host "--------------------------------------"
    }
}
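One way around this, assuming the target servers actually send a Last-Modified response header: keep -UseBasicParsing and read the HTTP header from the response instead of the ParsedHtml DOM property (which only exists with full parsing). A minimal sketch:
function Get-LastModified($url) {
    $WebResponse = Invoke-WebRequest $url -TimeoutSec 10 -UseBasicParsing
    # Present only if the server sends the header
    return $WebResponse.Headers['Last-Modified']
}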
I need some help with the following PS code:
$site1 = "www.site1.com"
$site2 = "www.site2.com"
$site3 = "www.site3.com"
$sites = $site1,$site2,$site3
$request = foreach ($site in $sites) { Invoke-WebRequest $site -Method Head }
if ($request.StatusCode -ne "200"){write-host "site is not working"}
The actual output of $request returns the headers of all 3 sites, so how do I get the exact site that failed the test?
Thanks in advance
Try something like this:
"www.site1.com","www.site2.com", "www.site3.com" |
ForEach-Object {
$response = Invoke-WebRequest $_ -Method Head
[PsCustomObject]#{
Site = $_
StatusCode = $response.StatusCode
}
}
You can filter the output by appending the following after the last bracket:
| Where-Object StatusCode -ne 200
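Be aware that in Windows PowerShell, Invoke-WebRequest throws a terminating error on non-success status codes, so a site that is down never yields a StatusCode to filter on. Here is a sketch that also captures failures (on PowerShell 7+ you could use -SkipHttpErrorCheck instead):
"www.site1.com", "www.site2.com", "www.site3.com" |
    ForEach-Object {
        try {
            $statusCode = (Invoke-WebRequest $_ -Method Head).StatusCode
        }
        catch {
            # Record the error message instead of stopping the pipeline
            $statusCode = $_.Exception.Message
        }
        [PsCustomObject]@{
            Site       = $_
            StatusCode = $statusCode
        }
    }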