PowerShell: Get links from FTP Site

I'm trying to create a list of all the links from an FTP site. The links are to download zip files.
My end goal is to analyze each link as a string and match the beginning to a set phrase. The end of each link contains a date and I have to find the newest one to download.
In this example I want to find ABC_20170323.zip out of this list:
ABC_20170323.zip
ABC_20160102.zip
EFG_20170324.zip
I need to figure out how to acquire the links before analysis. I've tried a variety of methods and the only one that has returned any information from the site is to gather the source code:
Invoke-WebRequest $sourceuri -UseBasicParsing -Credential $user
But then I find it difficult to gather all the links from there. Anyone have a method for easily getting these file download links?

Okay, so I know it's been ages, but I figured out how to do it. Admittedly, it's the hard way. What ended up happening is I gathered the source code and saved it like so:
$r = Invoke-WebRequest $sourceuri -UseBasicParsing -credential $user
Then I converted it to a string and used -split to separate out the links by their HTML tag and by what I expected the beginning to look like (in this case 'ABC'):
$c = $r.ToString()                            # convert the response to a string
$datelist = @()
$f = $c -split 'A HREF="' -split '.zip</A>'   # split on the HTML anchor tag (and .zip)
foreach ($link in $f) {
    if ($link -match 'ABC') {                 # keep only links that begin with 'ABC'
        $datelist += $link.Substring($link.Length - 8)  # isolate the date at the end
    }
}
# more logic for comparing $datelist items...
Then I wrote some logic for comparing the items in $datelist (Omitted from answer) and created a variable that had all the components I needed:
$ExactLink = "ABC_$GreatestDate" + ".zip"
Then went on to download the $ExactLink I needed.
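For completeness, the comparison logic can be short: because the dates are in yyyyMMdd form, they sort correctly as plain strings. Here is a minimal sketch of the step I omitted, plus the download (which assumes the zip sits directly under $sourceuri):
# yyyyMMdd strings sort lexically in date order, so a descending sort puts the newest first
$GreatestDate = ($datelist | Sort-Object -Descending)[0]
$ExactLink = "ABC_$GreatestDate" + ".zip"
# hypothetical download path; adjust to the real FTP layout
Invoke-WebRequest "$sourceuri/$ExactLink" -OutFile "C:\temp\$ExactLink" -Credential $user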

Related

Download File on Webpage via Windows CMD/PowerShell

Just as the title states, I'd like to download a file from the internet, specifically the download on this webpage. I have looked into using Invoke-WebRequest, curl, and certutil. All of these options download the HTML of the site. The specific URL of the download looks like this: https://bluetoothinstaller.com/bluetooth-command-line-tools/BluetoothCLTools-1.2.0.56.exe
Calling things like the following just downloads the HTML:
Invoke-WebRequest -Uri 'https://bluetoothinstaller.com/bluetooth-command-line-tools/BluetoothCLTools-1.2.0.56.exe' -OutFile 'test.exe'
Alternatively, if anyone knows how to download the link via the HTML, please do share.
I'd prefer it if the solution did not require any additional software, but am flexible.
Thanks!
Looking at some code I wrote near the end of last year, I found this line:
(New-Object System.Net.WebClient).DownloadFile($URL, $ZipFile)
In my case I was trying to download the latest SQLite and it worked. In your case you will probably want to rename the $ZipFile variable to something like $ExeFile.
The command to build the file path/name, and define where I wanted the file saved, was this:
$ZipFile = "$PSScriptRoot\$(Split-Path -Path $URL -Leaf)"
As for extracting the file's download path from a webpage, I haven't done that yet. It is something I aim to do, but it will be a while before I get around to figuring it out.
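That said, here is a rough, untested sketch of how it might look; the .exe filter is just an illustration and would need adjusting to the page:
# rough sketch: collect anchor hrefs from the download page and keep .exe links
$page = Invoke-WebRequest -Uri 'https://bluetoothinstaller.com/bluetooth-command-line-tools/download.html' -UseBasicParsing
$exeLinks = $page.Links.href | Where-Object { $_ -like '*.exe' }
$exeLinks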
The following worked for me; note the OutFile comment. You might also find something useful on the Network tab of your browser's dev-tools.
$params = @{
    UseBasicParsing = $true
    Uri = "https://bluetoothinstaller.com/bluetooth-command-line-tools/BluetoothCLTools-1.2.0.56.exe"
    Headers = @{
        Referer = "https://bluetoothinstaller.com/bluetooth-command-line-tools/download.html"
    }
    OutFile = 'path/to/download.exe' # Change this
}
Invoke-RestMethod @params
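The Referer header is presumably what makes the difference here: some download hosts serve their HTML page instead of the binary when the request doesn't appear to come from their own download page. That is an inference about this particular site, not something confirmed.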

Copy File From Teams to File Server Using PowerShell

I am trying to copy an xlsx file from my Teams channel to a location on a file server.
I've seen various articles online that suggest Invoke-WebRequest "https://teams.microsoft.com/l/file/rest of URL here" -OutFile C:\Test\CricketQuiz.xlsx. While this works in terms of a file appearing at the desired location, I can't actually open it, as I get an error.
I get the same error when I try the approach suggested in this article: https://blog.jourdant.me/post/3-ways-to-download-files-with-powershell
$url = "https://teams.microsoft.com/l/file/rest of my URL here"
$output = "C:\Test\SportsQuiz.xlsx"
$start_time = Get-Date
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($url, $output)
I'm guessing this is something relatively straightforward to resolve for those with more experience.
The problem here is that the link you've got (the Teams link) is not a direct link to the file at all. It's a link to an embedded version of the file inside the Teams client (basically a deep link). To actually download the file, try the following:
1. From the URL you've got, parse out the "objectUrl" part of the query string. As an example, I have:
https://teams.microsoft.com/l/file/[guid]?tenantId=[guid2]&fileType=xlsx&objectUrl=https%3A%2F%2F[tenantname].sharepoint.com%2Fsites%2FHR%2FShared%2520Documents%2FEmployee%2520Sentiment%2520Analysis.xlsx&serviceName=recent
you want (in my example): https%3A%2F%2F[tenantname].sharepoint.com%2Fsites%2FHR%2FShared%2520Documents%2FEmployee%2520Sentiment%2520Analysis.xlsx
2. Then you need to querystring-decode this, to get (e.g.) https://[tenantname].sharepoint.com/sites/HR/Shared%20Documents/Employee%20Sentiment%20Analysis.xlsx
3. Finally, you should use the PnP-PowerShell module's Get-PnPFile to download the file. This itself is a few steps though:
3.1 You need to connect the session using Connect-PnPOnline, but you also need to connect to the right "SPWeb". In this case, it would be Connect-PnPOnline https://[tenantname].sharepoint.com/sites/HR
3.2 After that you can download the file, but you need to URL-decode it again to get rid of %20 and the like, something like:
Get-PnPFile -Url "/Shared Documents/Employee Sentiment Analysis.xlsx" -AsFile -Path "c:\temp\"
This will give you a copy of Employee Sentiment Analysis.xlsx (in my example) inside c:\temp
Obviously this can all be automated (the querystring decoding, the Connect-PnPOnline credentials, etc.), but hopefully this gets you on the right path.
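As a sketch of the automation piece (the link below is the example from above with a placeholder tenant name and zeroed GUIDs; ParseQueryString performs the first round of decoding and UrlDecode the second):
# sketch: pull objectUrl out of a Teams deep link and decode it twice
Add-Type -AssemblyName System.Web
$teamsLink = 'https://teams.microsoft.com/l/file/00000000-0000-0000-0000-000000000000?tenantId=11111111-1111-1111-1111-111111111111&fileType=xlsx&objectUrl=https%3A%2F%2Fcontoso.sharepoint.com%2Fsites%2FHR%2FShared%2520Documents%2FEmployee%2520Sentiment%2520Analysis.xlsx&serviceName=recent'
$query = [System.Web.HttpUtility]::ParseQueryString(([uri]$teamsLink).Query.TrimStart('?'))
$objectUrl = $query['objectUrl']                           # decoded once: .../Shared%20Documents/Employee%20Sentiment%20Analysis.xlsx
$fileUrl = [System.Web.HttpUtility]::UrlDecode($objectUrl) # decoded twice: the remaining %20s become spaces
$fileUrl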

Compare-Object API response with text file shows everything is different

I have a Web API that returns an HTML string, which I want to compare with an HTML file on my local machine.
To do so, I have the following code:
$Result = (Invoke-WebRequest `
    -Uri "{uri}" `
    -Headers @{ "some-header" = "some-value" }).Content
$TestContent = Get-Content -Path ($RepositoryLocation + "index.html") -Raw
$Equal = Compare-Object -ReferenceObject $TestContent -DifferenceObject $Result
When I now run Write-Host $Equal, it shows that the whole content is different.
When I run Write-Host $Equal.SideIndicator, it prints => <=, which also indicates that the complete file is different.
Furthermore, running the command with -IncludeEqual -ExcludeDifferent returns an empty result, so, as I said, no lines are the same.
What I tried next was saving the content of $Result to a text file and comparing the files, but it still told me the whole file was different.
I then used diffchecker.com as well as the JetBrains IDE's integrated comparison tool to check for differences. Both tools told me the content is identical. I'm losing my mind: why does PowerShell say they have completely different content?
Sadly, I cannot post the content of the API response or of the index.html.
What I thought might be the reason:
Encoding, however both are UTF-8
Line endings, however there is no diff whether I use CR, CRLF, or LF in the file
Some hidden characters (tabs instead of spaces), but I enabled rendering of those in the JetBrains IDE and they are still identical
How do I know what's causing this issue here?
Not sure if Compare-Object is the right choice here. As the name says, it's for objects. Why not use the equality operator -eq or String.Equals?
$equal = $testContent -eq $result
# or
$equal = $testContent.Equals($result)
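(One caveat: for strings, PowerShell's -eq is case-insensitive by default, while .Equals() without a StringComparison argument is ordinal and case-sensitive, so the two can disagree on mixed-case input.)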
Compare-Object does not do a line-by-line comparison if both inputs are just single strings. So your strings are not equal as long as there is any tiny difference, even just an extra line feed at the end.
You could try several things:
# trim
$equal = $testContent.Trim() -eq $result.Trim()
# case-insensitive comparison
$equal = $testContent.Equals($result, 'OrdinalIgnoreCase')
# or both
$equal = $testContent.Trim().Equals($result.Trim(), 'OrdinalIgnoreCase')
Btw: If you do want a line-by-line comparison, you have to split the strings into lines first, e.g.:
Compare-Object ($testContent -split "`r`n") ($result -split "`r`n")
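And if the strings still compare unequal and you can't see why, a quick way to find the culprit (often a BOM or a stray trailing newline) is to walk the characters; a minimal sketch:
# sketch: report the first index where the two strings differ
$min = [Math]::Min($testContent.Length, $result.Length)
for ($i = 0; $i -lt $min; $i++) {
    if ($testContent[$i] -cne $result[$i]) {
        "First difference at index {0}: 0x{1:X4} vs 0x{2:X4}" -f $i, [int]$testContent[$i], [int]$result[$i]
        break
    }
}
if ($testContent.Length -ne $result.Length) {
    "Lengths differ: $($testContent.Length) vs $($result.Length)"
}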

How to download multiple files with PowerShell

Okay, so, here is what I ended up editing from my original answer. Kudos to @Matt for pointing out that I should be more descriptive and explain my edits clearly so that other users might benefit from this answer in the future. Users like @Matt are a great part of this community and keep the standards high here.
The first thing I edited/added is the ability to delete the previous log from each run. Since this script will be scheduled, it is important to remove the previous logs to avoid using too much disk space. This is noted under the comment "delete log files from prev run".
# delete log files from prev Run
Remove-Item C:\alerts\logs\*.*
The next thing I edited/added is the ability to switch between host names. I did this to prevent the files from overwriting each other. You can see this under the comment "change filename in order to prevent overwriting of log file". I accomplished it by checking the index of $url in the foreach loop and testing whether it was at a position where I needed to change the host name. I suspect there is a much more intuitive way to do this, and I would love it if someone chimed in with a better approach (one candidate is sketched right after the snippet below), as it's driving me crazy that I don't know one. It should be noted that there are a total of 44 urls I'm downloading from, hence the magic numbers (11, 22, 33) where I change the host name.
If ($urls.IndexOf($url) -eq 11) {
    $currentDir = "goxsd1704"
}
ElseIf ($urls.IndexOf($url) -eq 22) {
    $currentDir = "goxsd1705"
}
ElseIf ($urls.IndexOf($url) -eq 33) {
    $currentDir = "goxsd1706"
}
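A more compact alternative (just a sketch, assuming the 44 URLs split evenly into blocks of 11 per host) would derive the host from the index instead of hard-coding the breakpoints:
# sketch: pick the host by integer-dividing the URL's index into blocks of 11
$hostNames = 'goxsd1703', 'goxsd1704', 'goxsd1705', 'goxsd1706'
$currentDir = $hostNames[[math]::Floor($urls.IndexOf($url) / 11)]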
The next thing I edited/added, thanks to @Matt for the recommendation, is the try/catch blocks, which are clearly noted in the code. I should have had these to start with; by not having them, I was assuming the script would always work. Rookie mistake and point taken. With that being said, these are all my edits. The code is working fine, but improvement is always possible. Thank you for your time and answers.
# set date
$date = Get-Date -UFormat "%Y-%m-%d-%H_EST"
# delete log files from prev run
Remove-Item C:\alerts\logs\*.*
# set up download links
$urls = "http://subdomain.domain.com:portnumber/LogA/API_DBG_CS_Logs/dbg_a.$date.log"
function DownloadFMOSLogs()
{
    try
    {
        # assign working dir to currentDir
        $currentDir = "goxsd1703"
        # instantiate web client
        $wc = New-Object System.Net.WebClient
        # loop through each url
        foreach ($url in $urls)
        {
            # change host name to prevent overwriting of log files
            If ($urls.IndexOf($url) -eq 11) {
                $currentDir = "goxsd1704"
            }
            ElseIf ($urls.IndexOf($url) -eq 22) {
                $currentDir = "goxsd1705"
            }
            ElseIf ($urls.IndexOf($url) -eq 33) {
                $currentDir = "goxsd1706"
            }
            # get file name
            $fileName = $url.Substring($url.LastIndexOf('/') + 1)
            # create target file name
            $targetFileName = "C:\alerts\logs\" + $currentDir + "_" + $fileName
            $wc.DownloadFile($url, $targetFileName)
            Write-Host "Downloaded $url to file location $targetFileName"
        }
    }
    catch [System.Net.WebException], [System.IO.IOException]
    {
        "An error occurred. Files were not downloaded."
    }
}
DownloadFMOSLogs
Write-Host ""
Write-Host "Download of application log files has successfully completed!"
Invoke-WebRequest is a good way in PowerShell to download files, and the OutFile parameter writes the result straight to disk; see the Invoke-WebRequest documentation for details.
Have a go with Invoke-WebRequest -Uri $link -OutFile $targetFileName
You have a couple of problems and an issue or two.
$urls is not an array like you think it is. It is actually one whole string. Try something like this instead:
$urls = "http://subdomain.domain.com:port/LogA/API_DBG_CS_Logs/dbg_a.$date.log",
"http://subdomain.domain.com:port/LogA/API_DBG_CS_Logs/dbg_b.$date.log"
The variable will expand in that string just fine. The issue before was that, because of the order of operations, you were concatenating everything onto the first string: when you add an array to a string with the string on the left-hand side, the array gets converted to a space-delimited string. Have a look at a smaller example, which is exactly what you tried to do.
"hello" + 2,2 + "there"
You could have made what you had work if you had wrapped each expression in a set of parentheses first.
("hello" + 2),(2 + "there")
This code might make sense elsewhere, but as others have pointed out, you have a useless loop over the lines of a file: foreach($line in Get-Content .\hosts.txt). If you don't use it, get rid of it.
You don't really use $targetDir to its full potential. If you are going to use the working directory of the script, at least use absolute paths. Side note: the comments don't really match what is happening, which is likely related to point 2 above.
# prepend the host to the file name to keep file names distinct
$targetFilePath = [io.path]::combine($pwd, "test.txt")
# download the file
$wc.DownloadFile($link, $targetFilePath)
You should try to make that unique somehow, since the files will overwrite each other as you have it coded.
I would also wrap that in a try block in case the download fails, so you can report on it properly. As of now you assume it will work every time.
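A minimal sketch of that (the message text is just an illustration):
# sketch: wrap the download in try/catch so failures get reported instead of assumed away
try {
    $wc.DownloadFile($link, $targetFilePath)
    Write-Host "Downloaded $link to $targetFilePath"
}
catch [System.Net.WebException] {
    Write-Warning "Failed to download ${link}: $($_.Exception.Message)"
}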

PowerShell 3: cloning an object has unexpected results - affects other object [duplicate]

This question already has answers here:
PowerShell copy an array completely
(5 answers)
Closed 5 years ago.
I have a problem with a sample of code I am writing. It is a bit of a simple question, but it is an issue that has taken some time, which I do not have. I already tried the relevant Stack Overflow questions and searched, but did not find anything that helped much.
I have the following code:
# Import a csv file into a variable
$output = Import-Csv -LiteralPath ...some file, imports OK

# Copy the layout of the imported csv for further processing
$extraOut = $output.Clone()
$extraOut | ForEach-Object {
    $_.innerlinks1 = ""
    $_.innerlinksurl1 = ""
}
When I then print out the value of $output, I get the empty strings that I previously assigned (which I do not want). Why does this happen?
#Trying to print out $output should have data and not empty strings.
$output | ForEach-Object { Write-Host $_ }
Any help, or code that allows to copy the structure of an Import-csv result (with or without PSObject cloning) would also be great.
NOTE: After finding a helpful answer, I also think the problem needs more careful scripting, since there are a lot of empty strings in my file that might cause additional issues in the future.
I use the answer from the linked question ("PowerShell copy an array completely", above) whenever I need to deep copy an array. To quote the original answer:
# Get original data
$data = Import-Csv ...
# Serialize and Deserialize data using BinaryFormatter
$ms = New-Object System.IO.MemoryStream
$bf = New-Object System.Runtime.Serialization.Formatters.Binary.BinaryFormatter
$bf.Serialize($ms, $data)
$ms.Position = 0
$data2 = $bf.Deserialize($ms)
$ms.Close()
# Use deep copied data
$data2
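Worth knowing about the original problem: Clone() on an array is a shallow copy, so the new array still references the same row objects, which is why editing $extraOut also changed $output. Also, BinaryFormatter is deprecated in newer .NET versions; PowerShell's own CliXml serializer can deep-copy these flat CSV rows too. A sketch:
# sketch: deep copy via PowerShell's CliXml serializer
$clixml = [System.Management.Automation.PSSerializer]::Serialize($data)
$data2 = [System.Management.Automation.PSSerializer]::Deserialize($clixml)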