Use PowerShell to download unknown files from a remote HTTPS URL

I'm trying to download files from a remote URL to a local folder. The filenames that exist on the remote side will never be known. It's similar to how you would just use: copy c:\downloadfiles\*.* c:\dump
For now let's assume the url is open to everyone (no credentials needed):
$localPath = "C:\dump"
$url = "https://remoteserver/folder"
$WebClient = New-Object "System.Net.WebClient"
$WebClient.DownloadFile($url, $localPath)
But I just keep getting:
Exception calling "DownloadFile" with "2" argument(s): "An exception occurred during a WebClient request."
I realize that you probably can't download over HTTP/HTTPS using wildcards, but perhaps we can parse a remote file from "https://remoteserver/folder/FileList.txt" and use wildcards in there somehow, even if it's just by file type, e.g. *.jpg.
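One workable approach, if the server can publish a simple index file, is to download that list first and then fetch each matching entry individually. A minimal sketch, assuming a hypothetical FileList.txt on the server with one filename per line:
# Hypothetical index file: one filename per line on the remote server
$baseUrl   = "https://remoteserver/folder"
$localPath = "C:\dump"
$client    = New-Object System.Net.WebClient
# Download the index, split it into lines and keep only the .jpg entries
$names = $client.DownloadString("$baseUrl/FileList.txt") -split "`r?`n" | Where-Object { $_ -like "*.jpg" }
foreach ($name in $names) {
    # DownloadFile needs a full destination file path, not just a folder
    $client.DownloadFile("$baseUrl/$name", (Join-Path $localPath $name))
}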

Related

Error 403 when trying to access URL, able to access through browser without error

I'm trying to access a public file using a URL from the Australian Bureau of Statistics:
https://stat.data.abs.gov.au/sdmx-json/data/ABS_BLDG_APPROVALS_LGA2020/1.1.1.110.LGA2020.20660+20740+20830+20910+21110+21180+21450+21610+21890+22170+22310+22490+22670+22750+23110+23270+23430+23670+24130+24210+24330+24410+24600+24650+24850+24970+25060+25150+25250+25340+25710+25900+26080+26170+26350+26490+26980+27070+27260+27350+27450.M/all?detail=Full&dimensionAtObservation=AllDimensions&startPeriod=2020-07&endPeriod=2020-09
I can do so without an error using Firefox, but when I try using PowerShell I get:
Exception calling "DownloadFile" with "2" argument(s): "The remote server returned an error: (403) Forbidden."
The code I'm using:
(New-Object System.Net.WebClient).DownloadFile('https://stat.data.abs.gov.au/sdmx-json/data/ABS_BLDG_APPROVALS_LGA2020/1.1.1.110.LGA2020.20660+20740+20830+20910+21110+21180+21450+21610+21890+22170+22310+22490+22670+22750+23110+23270+23430+23670+24130+24210+24330+24410+24600+24650+24850+24970+25060+25150+25250+25340+25710+25900+26080+26170+26350+26490+26980+27070+27260+27350+27450.M/all?detail=Full&dimensionAtObservation=AllDimensions&startPeriod=2020-07&endPeriod=2021-09','C:\temp\test')
PowerShell version 5.1
Edit: I should have mentioned that I have successfully run the PowerShell script on other websites without error.
Just for completeness' sake, this script is now returning the file without error, thanks to Daniel's comment.
$wb = New-Object System.Net.WebClient;
$wb.Headers.Add("User-Agent: Other");
$wb.DownloadFile('https://stat.data.abs.gov.au/sdmx-json/data/ABS_BLDG_APPROVALS_LGA2020/1.1.1.110.LGA2020.20660+20740+20830+20910+21110+21180+21450+21610+21890+22170+22310+22490+22670+22750+23110+23270+23430+23670+24130+24210+24330+24410+24600+24650+24850+24970+25060+25150+25250+25340+25710+25900+26080+26170+26350+26490+26980+27070+27260+27350+27450.M/all?detail=Full&dimensionAtObservation=AllDimensions&startPeriod=2020-07&endPeriod=2021-09','C:\temp\test')
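For reference, Invoke-WebRequest can do the same thing and lets you set the user agent directly; a minimal equivalent sketch (the output filename here is an assumption):
# The custom User-Agent is what avoids the 403; same ABS URL as above
$url = 'https://stat.data.abs.gov.au/sdmx-json/data/ABS_BLDG_APPROVALS_LGA2020/...'
Invoke-WebRequest -Uri $url -UserAgent "Other" -OutFile 'C:\temp\test.json'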

SharePoint Online copy file script

I am having trouble copying a file within a SharePoint Online list using PowerShell. The error I am getting is:
Exception calling "ExecuteQuery" with "0" argument(s): "Server relative urls must start with SPWeb.ServerRelativeUrl"
The path is correct, as I can combine Context.Url with the path variables and access the file using that path. I used similar paths, except with GetFolderByServerRelativeUrl, to set permissions on folders with no issues (same list).
Here is the code:
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
$SourceFile =$context.Web.GetFileByServerRelativeUrl("/$ListName/$sa_man_checklist")
$Context.Load($SourceFile)
$Context.ExecuteQuery()
I am very new to SharePoint Online and any help is much appreciated.
Found the cause; I'm not sure how anything before this worked. The server-relative URL should start right after the host name, rather than after the URL I specified in the context. The odd thing is that when I call a folder by server-relative path based on the URL specified in the context it still works just fine, but when I try to call a file using that same method it breaks...
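For illustration (the tenant and site names below are hypothetical), that means the server-relative URL must include everything after the host name, including the site path, not just the part after the URL passed to ClientContext:
# Hypothetical site; the server-relative URL starts at the host root and includes /sites/MySite
$Context = New-Object Microsoft.SharePoint.Client.ClientContext("https://tenant.sharepoint.com/sites/MySite")
$SourceFile = $Context.Web.GetFileByServerRelativeUrl("/sites/MySite/$ListName/$sa_man_checklist")
$Context.Load($SourceFile)
$Context.ExecuteQuery()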

Error downloading a file using PowerShell

I'm trying to download SQL Server Express 2012 onto my EC2 instance, following the sample given here.
This is my script:
$storageDir = "C:\download"
$webclient = New-Object System.Net.WebClient
$url = "https://www.microsoft.com/en-us/download/confirmation.aspx?id=29062"
$file = "$storageDir"
$webclient.DownloadFile($url,$file)
C:\download is where I want the downloaded file to be saved,
but I keep getting this error:
Exception calling "DownloadFile" with "2" argument(s): "An exception occurred
during a WebClient request."
At line:1 char:1
+ $webclient.DownloadFile($url,$file)
    + CategoryInfo : NotSpecified: (:) [], MethodInvocationException
Can anyone please tell me what I am doing wrong?
I have tried downloading the file on my PC and copying it to AWS via RDC, but that takes hours.
The second parameter to the DownloadFile method is the path to the filename you want to use for saving the file - not the directory. You would need to change your line to:
$file = "$storageDir\SQLEXPRE_x64_ENU.exe"
Reference: https://msdn.microsoft.com/en-us/library/ez801hhe.aspx
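Putting that together with the original script, a corrected sketch (the destination filename is simply the name the download will be saved under):
$storageDir = "C:\download"
$webclient  = New-Object System.Net.WebClient
$url        = "https://www.microsoft.com/en-us/download/confirmation.aspx?id=29062"
# The second argument must be a full file path, not just the folder
$file = "$storageDir\SQLEXPR_x64_ENU.exe"
$webclient.DownloadFile($url, $file)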

How do I query a file on an FTP server in PowerShell to determine if an upload is required?

The project is an MVC website coded and built using VS2017 and (on premises) TFS2017. The Build Definition is currently working and publishing to the staging location upon check-in.
The PowerShell script below, derived from David Kittle's website, is being used but it uploads all files every time. I abbreviated the listing using comments to focus on the part of the script for which I'd like to ask for help/guidance.
# Setup the FTP connection, destination URL and local source directory
# Put the folders and files to upload into $Srcfolders and $SrcFiles
# Create destination folders as required
# start file uploads
foreach($entry in $SrcFiles)
{
    # Create the full destination filename from $entry and put it into $DesFile
    $uri = New-Object System.Uri($DesFile)
    # NEED TO GET THE REMOTE FILE DATA HERE TO TEST AGAINST THE LOCAL FILE
    if ( <# perform a test to see if the file needs to be uploaded #> )
    { $webclient.UploadFile($uri, $SrcFullname) }
}
In the last few lines of the script above, I need to determine whether a source file requires upload. I am assuming I can check the time stamp to determine this. So:
If my assumption is wrong, please advise the best way to check for a required upload.
If my assumption is correct, how do I (1) retrieve the time stamp from the remote server and then (2) make the check against the local file?
You can use the FtpWebRequest class with its GetDateTimestamp FTP "method" and parse the UTC timestamp string it returns. The format is specified by RFC 3659 to be YYYYMMDDHHMMSS[.sss].
That works only if the FTP server supports the MDTM command that the method uses under the hood (most servers do, but not all).
$url = "ftp://ftp.example.com/remote/folder/file.txt"
$ftprequest = [System.Net.FtpWebRequest]::Create($url)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]
if ($code -eq 213)
{
    Write-Host "Timestamp is $($tokens[1])"
}
else
{
    Write-Host "Error $response"
}
It would output something like:
Timestamp is 20171019230712
Now you parse it, and compare against a UTC timestamp of a local file:
(Get-Item "file.txt").LastWriteTimeUtc
Or save yourself some time and use an FTP library/tool that can do this for you.
For example, with the WinSCP .NET assembly, you can synchronize a whole local folder with a remote folder with one call to Session.SynchronizeDirectories. Or you can limit the synchronization to a set of files only (see the sketch after the example below).
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Ftp
$sessionOptions.HostName = "ftpsite.com"
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
# $False = do not delete remote files that are missing locally
$result = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder", $False)
$result.Check()
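If only a subset of files should be synchronized, as mentioned above, a file mask can be passed in through transfer options; a sketch, assuming the TransferOptions.FileMask property and the default time-based comparison criteria:
# Limit the synchronization to *.jpg files only
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*.jpg"
$result = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder",
    $False, $False, [WinSCP.SynchronizationCriteria]::Time, $transferOptions)
$result.Check()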
To use the assembly, just extract the contents of the .NET assembly package into your script folder. No other installation is needed.
The assembly supports not only the MDTM, but also other alternative methods to retrieve the timestamp.
(I'm the author of WinSCP)

PowerShell: how to move a file from a remote computer to a network share

I am using PowerShell to move a file from a network share to a remote computer (which I do not have admin rights on). The source path on the network share is \\share_computer\some_folder\file1.txt. The destination path to the file on the remote computer is \\remote_computer\d$\another_folder.
A simple Move-Item $from $to doesn't work. I get a PermissionDenied message when I try to access the network share. However, I have confirmed that I can access the shared file via something like
$data = Get-Content "\\share_computer\some_folder\file1.txt"
$var = $data[0]
I then tried the following:
$src = "\\share_computer\some_folder\file1.txt"
$dest = "\\remote_computer\d$\another_folder"
$username = "my_username"
$password = "my_password"
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.NetworkCredential($username, $password)
$WebClient.DownloadFile($src, $dest)
PowerShell is throwing the following error:
Exception calling "DownloadFile" with "2" argument(s): "An exception occurred during a WebClient request."
I don't know why PowerShell is throwing this error. Assuming the above is the correct technique to move the file, what do I need to do to correct it? Or, if the above is the incorrect technique, what should I do?
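One thing worth noting from the earlier answer in this thread: WebClient.DownloadFile expects a full destination file path, not a folder, so passing \\remote_computer\d$\another_folder on its own will fail. Beyond that, an alternative sketch (not from the original thread) maps the share with explicit credentials via New-PSDrive and then copies the file; the drive name "Src" is arbitrary:
# Map the source share with explicit credentials ("Src" is an arbitrary drive name)
$securePass = ConvertTo-SecureString $password -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($username, $securePass)
New-PSDrive -Name Src -PSProvider FileSystem -Root "\\share_computer\some_folder" -Credential $cred | Out-Null
# Copy the file to the destination share (use Move-Item instead to move it)
Copy-Item "Src:\file1.txt" "\\remote_computer\d$\another_folder\file1.txt"
# Clean up the temporary drive mapping
Remove-PSDrive Src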