Read Invoke-WebRequest line by line - powershell

I'm trying to keep a central list of log file locations so my log file cleanup script can grab the most up-to-date list.
$logpaths = (Invoke-WebRequest -UseBasicParsing -Uri 'http://10.7.58.99/logpaths.txt').Content
foreach ($logpath in $logpaths)
{
    "line"
    $logpath
}
My script was sort of working, but I was seeing some strange behavior, so when I broke it down I found that the foreach loop runs only once and dumps the entire contents in a single iteration.
If I download the file to a text file on the local machine, I can then use [System.IO.File]::ReadLines and it steps through perfectly. However, I don't want to download the file each time I run it, or store it on the local server at all for that matter. How can I step through the content of Invoke-WebRequest line by line?

The Content property is a single string, not an array of lines, which is why your foreach runs only once. Based on this example from the .NET docs, you could read the response stream line by line instead, which should also have better performance.
$url = 'http://10.7.58.99/logpaths.txt'
& {
    $myHttpWebRequest = [System.Net.WebRequest]::Create($url)
    $myHttpWebResponse = $myHttpWebRequest.GetResponse()
    $receiveStream = $myHttpWebResponse.GetResponseStream()
    $encode = [System.Text.Encoding]::GetEncoding("utf-8")
    $readStream = [System.IO.StreamReader]::new($receiveStream, $encode)
    while (-not $readStream.EndOfStream) {
        $readStream.ReadLine()
    }
    $myHttpWebResponse.Close()
    $readStream.Close()
} | ForEach-Object {
    $logPath = $_
}
You might want to turn this into a nice little function. Let me know if you need help.
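For example, a minimal sketch of such a function (the name Read-RemoteLines is made up here):
function Read-RemoteLines {
    param([Parameter(Mandatory)][string]$Uri)
    $response = [System.Net.WebRequest]::Create($Uri).GetResponse()
    try {
        $reader = [System.IO.StreamReader]::new(
            $response.GetResponseStream(), [System.Text.Encoding]::UTF8)
        try {
            # emit one line at a time down the pipeline
            while (-not $reader.EndOfStream) { $reader.ReadLine() }
        }
        finally { $reader.Close() }
    }
    finally { $response.Close() }
}
Read-RemoteLines 'http://10.7.58.99/logpaths.txt' | ForEach-Object {
    $logPath = $_
}
If the file is small enough that streaming doesn't matter, splitting the downloaded string on line breaks also works: (Invoke-WebRequest -UseBasicParsing $url).Content -split "`r?`n"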

Related

Powershell script to write to file maintaining structure

I am working with PowerShell to read in a file. See the sample content of the file below.
This is my file with content
-- #Start
This is more content
across different lines
etc etc
-- #End
I am using this code to read the file into a variable.
$content = Get-Content "Myfile.txt";
I then use this code to strip a particular section from the file, based on an opening and closing tag.
$stringBuilder = New-Object System.Text.StringBuilder;
$pattern = "-- #Start(.*?)-- #End";
$matched = [regex]::match($content, $pattern).Groups[1].Value;
$stringBuilder.AppendLine($matched.Trim());
$stringBuilder.ToString() | Out-File "Newfile.txt" -Encoding utf8;
The problem I have is that the formatting is not maintained in the file I write to. What I want is:
This is more content
across different lines
etc etc
But what I am getting is:
This is more content across different lines etc etc
Any ideas how I can alter my code so that the structure (multiple lines) is maintained in the output file?
This regex might do what you're looking for; I don't see a point in using a StringBuilder in this case. Note that Get-Content returns an array of lines, and when that array is coerced into a single string for [regex]::match, the lines get joined with spaces, which is why your output collapses onto one line. Since this is a multi-line regex pattern, you need to use the -Raw switch so your file's content is read as one string.
$re = [regex] '(?ms)(?<=^-- #Start\s*\r?\n).+?(?=^-- #End)'
$re.Match((Get-Content path\to\Myfile.txt -Raw)).Value |
Set-Content path\to\newFile.txt -NoNewLine
See https://regex101.com/r/82HJxf/1 for details.
If you want to do line-by-line processing, you could use a switch to read and process the lines of interest. This is particularly useful if the file is very big and doesn't fit in memory.
& {
    $capture = $false
    switch -Regex -File path\to\Myfile.txt {
        '^-- #Start' { $capture = $true }
        '^-- #End'   { $capture = $false }
        Default      { if ($capture) { $_ } }
    }
} | Set-Content path\to\newFile.txt
If there is only one appearance of the opening and closing tag, you could even break the switch as soon as it encounters the closing tag to stop processing:
'^-- #End' { break }
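Putting that together, a sketch of the early-exit version (same hypothetical paths as above):
& {
    $capture = $false
    switch -Regex -File path\to\Myfile.txt {
        '^-- #Start' { $capture = $true }
        '^-- #End'   { break }  # stop reading the file at the closing tag
        Default      { if ($capture) { $_ } }
    }
} | Set-Content path\to\newFile.txt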

How to create powershell script to retry on error

I am writing a PowerShell script to copy unencrypted EBS snapshots in AWS to encrypted snapshots. In AWS the maximum number of concurrent copies is currently 20 at one time, but I have 1400 snapshots to copy. I wrote a script in PowerShell using a foreach loop to loop through the snapshot IDs stored in a text file, and it works as expected until it gets to 20 snapshots being copied. Then it throws the following error and fails:
An error occurred (ResourceLimitExceeded) when calling the CopySnapshot operation: Too many snapshot copies in progress. The limit is 20 for this destination region.
I have tried to use a do/while loop, but I believe I am missing something here. Essentially, if the script gets to 20 concurrent copies, I want it to keep retrying the current snapshot until a free spot opens up and then move on to the next. Ideally I would like to just have this run in the background for a day or so. See the current script below:
function Get-TimeStamp {
    return "[{0:MM/dd/yy} {0:HH:mm:ss}]" -f (Get-Date)
}
$kmsID = "blah"
$region = "us-east-1"
$stoploop = $false
[int]$Retrycount = "0"
foreach ($line in Get-Content C:\snaps4.txt) {
    do {
        $desc = aws ec2 describe-snapshots --snapshot-ids $line | ConvertFrom-Json
        $description = $desc.Snapshots.Description
        Write-Output "$description"
        $snap = aws ec2 copy-snapshot --description "[Copied $line from us-east-1] $description" --source-region $region --source-snapshot-id $line --encrypted --kms-key-id $kmsID | ConvertFrom-Json
        $newsnap = $snap.SnapshotId
        Write-Output "$(Get-TimeStamp) Created copy of $line $description with NEW SnapshotID $newsnap" >> C:\log.txt
        $stoploop = $true
    }
    while ($stoploop -eq $false)
}
Please let me know if you have any questions, and I appreciate any help in advance.
Thanks!
You can put the copy command inside a try/catch block.
Something like this:
try {
    # copy command
    # mark as complete
}
catch {
    # mark as failed
}
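One caveat: aws is a native command, so a failed copy won't raise a terminating error that try/catch can see by default; checking $LASTEXITCODE is more reliable. A minimal retry sketch along those lines (the 60-second delay is an arbitrary choice):
foreach ($line in Get-Content C:\snaps4.txt) {
    do {
        $snap = aws ec2 copy-snapshot --source-region $region `
            --source-snapshot-id $line --encrypted --kms-key-id $kmsID |
            ConvertFrom-Json
        if ($LASTEXITCODE -ne 0) {
            # most likely ResourceLimitExceeded: wait for a copy slot to free up
            Start-Sleep -Seconds 60
        }
    } while ($LASTEXITCODE -ne 0)
    Write-Output "$(Get-TimeStamp) Created copy of $line with NEW SnapshotID $($snap.SnapshotId)" >> C:\log.txt
}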
One approach is to make the content file a CSV with a snapshot column and a complete column. Use Import-Csv to read it, iterate over the imported list where $_.complete -ne "Y", and set complete to "Y" when the copy succeeds. Export the file at the end.
Re-run as needed.
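A sketch of that idea (the file C:\snaps4.csv with SnapshotId and Complete columns is hypothetical, and $region/$kmsID come from the script above):
$rows = Import-Csv C:\snaps4.csv
foreach ($row in $rows | Where-Object { $_.Complete -ne 'Y' }) {
    aws ec2 copy-snapshot --source-region $region --source-snapshot-id $row.SnapshotId `
        --encrypted --kms-key-id $kmsID | Out-Null
    if ($LASTEXITCODE -eq 0) { $row.Complete = 'Y' }  # mark success in memory
}
# persist progress so the next run skips completed rows
$rows | Export-Csv C:\snaps4.csv -NoTypeInformation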

Scraping Multiple Pages and Making a Table

I've been trying to find a way to scrape data from a website easily and have found that using PowerShell gets me the results I need, although I can only figure out how to do it one page at a time.
The URLs go from www.example.com/Item/1 to www.example.com/Item/40 and present data from a form.
I've used the commands:
$WebResponse = Invoke-WebRequest "www.example.com/Item/1"
$WebResponse.Forms.Fields
And the results I get are what I need, but I want to be able to do it for all 40 pages and make a readable table from the output.
I'm really new to anything to do with PowerShell, so I'm assuming there's just something I'm overlooking.
Just chuck it in a loop:
for ( $i = 1; $i -le 40; $i++ ) {
    $WebResponse = Invoke-WebRequest "www.example.com/Item/$i"
    $WebResponse.Forms.Fields
}
Another possible way to write it:
$start = 1
$end = 40
$start..$end | ForEach-Object {
    $WebResponse = Invoke-WebRequest "www.example.com/Item/$_"
    $WebResponse.Forms.Fields
}
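To get a readable table out of it, you could collect each page's fields into objects and format them at the end. A sketch, assuming Forms.Fields behaves like a dictionary of field names to values:
$results = 1..40 | ForEach-Object {
    $page = $_
    $fields = (Invoke-WebRequest "www.example.com/Item/$page").Forms.Fields
    # render the field dictionary as text so it displays cleanly in a column
    [pscustomobject]@{ Page = $page; Fields = ($fields | Out-String).Trim() }
}
$results | Format-Table -AutoSize -Wrap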

Outputting a file with an Azure Function

I'm trying to experiment with Azure Functions. Basically my use case is calling the function with a GUID as a GET parameter, having the function download the WiX toolkit DLL and an MSI file, update a parameter in the MSI file, and then return that file to the caller of the function (as a download prompt, for example).
I'm mostly there; I just need some help getting the download prompt/send to happen. My code so far:
$urlWix = "http://domain/wix.dll"
$outputWix = "$Env:TEMP\wix.dll"
Invoke-WebRequest -Uri $urlWix -OutFile $outputWix
try{Add-Type -Path $outputWix}catch{$Null}
$urlMSI = "http://domain/file.msi"
$outputFile = "$Env:TEMP\file.msi"
Invoke-WebRequest -Uri $urlMSI -OutFile $outputFile
$oDatabase = New-Object Microsoft.Deployment.WindowsInstaller.Database($outputFile,[Microsoft.Deployment.WindowsInstaller.DatabaseOpenMode]::Direct);
$sSQLQuery = "SELECT * FROM Property WHERE Property= 'MYPROPERTY'"
[Microsoft.Deployment.WindowsInstaller.View]$oView = $oDatabase.OpenView($sSQLQuery)
$oView.Execute()
$oRecord = $oView.Fetch()
$oRecord.SetString("Value","MyCustomValue")
$oView.Modify([Microsoft.Deployment.WindowsInstaller.ViewModifyMode]::Update,$oRecord)
$oView.Close();
$oDatabase.Dispose();
$file = get-item $outputFile
write-output $file
Unfortunately, due to content-type issues this is not possible in PowerShell. You can do this via a C#, F#, or Node (isRaw) function. The problem is that you need to specify headers via the JSON response format, which converts any non-text data into a base64 string.
If you want to send a text file via PowerShell, it is possible:
$response = ConvertTo-Json @{
    Body = "your file data";
    Headers = @{
        # unfortunately it seems Functions does not support 'filename=...'
        'Content-Disposition' = 'attachment';
        # you would use application/octet-stream, but because it's converted to JSON you lose binary content
        'Content-Type' = 'text/plain';
    };
}
Out-File -Encoding Ascii -FilePath $res -inputObject $response

Converting byte array to true/false

Through various posts on Stack Overflow and other places I was able to put together a PowerShell script that uploads files over FTP, and it works great. However, I wanted to add a bit more verbosity to it. See the code below:
foreach ($file in $uploadfiles)
{
    # create the full path to the file on remote server (odd but okay!)
    $ftp_command = $ftp + $file
    # for debugging
    #$ftp_command
    # create a new URI object for the full path of the file
    $uri = New-Object System.URI($ftp_command)
    # for debugging
    #$uri
    # finally do our upload to the remote server - URI object, full path to local file
    #$responseArray = $ftpclient.UploadFile($uri,$file.Fullname)
    $result = $ftpclient.UploadFile($uri,$file.Fullname)
    if ($result) {
        $file.Fullname + " uploaded successfully"
    } else {
        $file.Fullname + " not uploaded successfully"
    }
}
Basically, after the file is uploaded I want to check whether or not it was successful. UploadFile is supposed to return a byte array (http://msdn.microsoft.com/en-us/library/36s52zhs(v=vs.80).aspx: "A Byte array containing the body of the response from the resource."). I'm new to PowerShell, so that's probably where my problem is, but for the life of me I can't get anything out of $result to test against. Is it possible the server isn't returning anything, or am I just not accessing/setting the byte array correctly? I've tried a variety of different things, but haven't yet figured anything out.
Thanks,
I would not use $result in your case. Beginning with PowerShell 2.0, you can use try/catch blocks:
try
{
    $ftpclient.UploadFile($uri, $file.Fullname)
}
catch [System.Net.WebException]
{
    # here $_ gives you the exception details
}
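If you still want to look at $result, UploadFile does return the server's response body as a byte array, which you could decode to text to inspect. A small sketch (many FTP servers simply return an empty array here, which would explain why you see nothing):
$result = $ftpclient.UploadFile($uri, $file.FullName)
if ($result.Length -gt 0) {
    # decode the response body as ASCII text for display
    [System.Text.Encoding]::ASCII.GetString($result)
} else {
    "server returned an empty response for " + $file.FullName
}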