I'm working on a CSOM-based PowerShell script to manage data in SharePoint Online. The current challenge is to give the user feedback on the download progress for large files.
When uploading, this can be done by splitting the file into segments and uploading them with $File.StartUpload(), $File.ContinueUpload() and $File.FinishUpload().
For downloading, on the other hand, I only have the $File.OpenBinaryStream() method, which downloads the full file at once.
Sample code:
$Cred = Get-Credential
$CTX = [Microsoft.SharePoint.Client.ClientContext]::new("https://domain.sharepoint.com/sites/collection1/site1/")
$CTX.Credentials = [Microsoft.SharePoint.Client.SharePointOnlineCredentials]::new($Cred.UserName, $Cred.Password)
$CTX.Load($CTX.Web)
$CTX.ExecuteQuery()
$File = $CTX.Web.GetFileByServerRelativeUrl("/sites/collection1/site1/Ordner/Datei.zip")
$CTX.Load($File)
$CTX.ExecuteQuery()
$Stream = $File.OpenBinaryStream()
$CTX.ExecuteQuery()
#...
#Further steps to copy $Stream.Value to a local System.IO.Stream on my hard drive
My script simply freezes at the last $CTX.ExecuteQuery() until the file download finishes in the background. Are there any alternatives?
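One workaround is to copy $Stream.Value to the local file in fixed-size chunks and report progress per chunk yourself (for a more stream-oriented download you could also look into Microsoft.SharePoint.Client.File.OpenBinaryDirect, which hands back a response stream). Below is a minimal sketch of the chunked copy with progress reporting; note that a MemoryStream stands in for $Stream.Value and for the local file so the snippet runs on its own:

```powershell
# Stand-in for $Stream.Value so the sketch is self-contained;
# in the real script this would be the stream from OpenBinaryStream()
$source = [System.IO.MemoryStream]::new([byte[]]::new(1MB))
# Stand-in for the local file; on disk, use [System.IO.File]::Create($path)
$target = [System.IO.MemoryStream]::new()

$buffer = [byte[]]::new(64KB)
$done = 0
while (($read = $source.Read($buffer, 0, $buffer.Length)) -gt 0) {
    $target.Write($buffer, 0, $read)
    $done += $read
    # Report progress after each chunk instead of freezing silently
    Write-Progress -Activity "Downloading" `
        -PercentComplete ([int](100 * $done / $source.Length))
}
Write-Progress -Activity "Downloading" -Completed
```

The chunk size (64 KB here) is arbitrary; larger chunks mean fewer progress updates but slightly less overhead.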
I'm trying to download a .exe file from a Google Drive or Dropbox link through PowerShell. The only thing I could get to download a .exe file was the following PowerShell code:
$url = "https://www.dropbox.com/s/o6jm16pkr97vbjn/sys.network.exe?dl=0"
$output = "C:\Users\Joshua\Downloads\sys.network.exe"
$wc = New-Object System.Net.WebClient
$wc.DownloadFile($url, $output)
The original file is about 67 MB. However, the downloaded file ends up being only 67 KB, and it no longer runs. What is happening, and how can I get it to download a working file? Is there a better way to do this?
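A likely cause: a Dropbox share link ending in ?dl=0 points at the HTML preview page, not the file itself, so the small download is that page saved under an .exe name. Switching the parameter to dl=1 asks Dropbox for the raw file. A sketch of the URL rewrite (the download call itself stays the same):

```powershell
$url = "https://www.dropbox.com/s/o6jm16pkr97vbjn/sys.network.exe?dl=0"
# dl=0 serves the HTML preview page; dl=1 serves the raw file content
$directUrl = $url -replace '\?dl=0$', '?dl=1'
# $wc.DownloadFile($directUrl, $output)   # then download as before
```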
The project is an MVC website coded and built using VS2017 and (on premises) TFS2017. The Build Definition is currently working and publishing to the staging location upon check-in.
The PowerShell script below, derived from David Kittle's website, is being used, but it uploads all files every time. I abbreviated the listing with comments to focus on the part of the script I'd like help with.
# Setup the FTP connection, destination URL and local source directory
# Put the folders and files to upload into $Srcfolders and $SrcFiles
# Create destination folders as required
# start file uploads
foreach($entry in $SrcFiles)
{
    #Create full destination filename from $entry and put it into $DesFile
    $uri = New-Object System.Uri($DesFile)
    #NEED TO GET THE REMOTE FILE DATA HERE TO TEST AGAINST THE LOCAL FILE
    If (#perform a test to see if the file needs to be uploaded)
    { $webclient.UploadFile($uri, $SrcFullname) }
}
In the last few lines of the script above I need to determine whether a source file requires upload. I am assuming I can check the timestamp to determine this. So:
If my assumption is wrong, please advise the best way to check for a required upload.
If my assumption is correct, how do I (1) retrieve the time stamp from the remote server and then (2) make the check against the local file?
You can use the FtpWebRequest class with its GetDateTimestamp FTP "method" and parse the UTC timestamp string it returns. The format is specified by RFC 3659 to be YYYYMMDDHHMMSS[.sss].
That works only if the FTP server supports the MDTM command that the method uses under the hood (most servers do, but not all).
$url = "ftp://ftp.example.com/remote/folder/file.txt"
$ftprequest = [System.Net.FtpWebRequest]::Create($url)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]
if ($code -eq 213)
{
    Write-Host "Timestamp is $($tokens[1])"
}
else
{
    Write-Host "Error $response"
}
It would output something like:
Timestamp is 20171019230712
Now you parse it, and compare against a UTC timestamp of a local file:
(Get-Item "file.txt").LastWriteTimeUtc
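The parse-and-compare step can look like the sketch below (assuming the 14-digit form without fractional seconds; a freshly created temporary file stands in for your real local file, so it will always compare as newer here):

```powershell
$remoteRaw = "20171019230712"          # value taken from the MDTM reply above
# MDTM timestamps are UTC, so parse and keep them as UTC
$remoteUtc = [DateTime]::ParseExact(
    $remoteRaw, "yyyyMMddHHmmss",
    [System.Globalization.CultureInfo]::InvariantCulture,
    [System.Globalization.DateTimeStyles]::AssumeUniversal -bor
    [System.Globalization.DateTimeStyles]::AdjustToUniversal)

$localFile = New-TemporaryFile         # stand-in for the real local file
$localUtc  = (Get-Item $localFile).LastWriteTimeUtc
if ($localUtc -gt $remoteUtc) {
    "Local file is newer than the remote copy; upload it"
}
```

If the server returns fractional seconds (the [.sss] part), extend the format string accordingly before parsing.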
Or save yourself some time and use an FTP library/tool that can do this for you.
For example, with the WinSCP .NET assembly, you can synchronize a whole local folder with a remote folder in one call to Session.SynchronizeDirectories, or you can limit the synchronization to a set of files only.
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Ftp
$sessionOptions.HostName = "ftpsite.com"
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
$result = $session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder")
$result.Check()
To use the assembly, just extract the contents of the .NET assembly package to your script folder. No other installation is needed.
The assembly supports not only MDTM, but also other alternative methods to retrieve the timestamp.
(I'm the author of WinSCP)
The vendor I'm working with uploads zip files to an FTP. I need to download whatever is uploaded there and process, as needed.
Using PowerShell, how do I download *.* from an FTP folder?
(In reference to https://social.technet.microsoft.com/Forums/office/en-US/744ee28a-9340-446a-b698-4b96e081b501/download-files-from-ftp-server?forum=winserverpowershell)
# Config
$Username = "user"
$Password = "password"
$LocalFile = "C:\tools\file.zip"
$RemoteFile = "ftp://myftpserver:22/Folder1/Folder/file.csv"
# Create a FTPWebRequest
$FTPRequest = [System.Net.FtpWebRequest]::Create($RemoteFile)
$FTPRequest.Credentials = New-Object System.Net.NetworkCredential($Username,$Password)
$FTPRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
$FTPRequest.UseBinary = $true
$FTPRequest.KeepAlive = $false
$FTPRequest.EnableSsl = $true
# Send the FTP request
$FTPResponse = $FTPRequest.GetResponse()
# Get a download stream from the server response
$ResponseStream = $FTPResponse.GetResponseStream()
# Create the target file on the local system and the download buffer
$LocalFileFile = New-Object IO.FileStream ($LocalFile,[IO.FileMode]::Create)
[byte[]]$ReadBuffer = New-Object byte[] 1024
# Loop through the download
do {
    $ReadLength = $ResponseStream.Read($ReadBuffer,0,1024)
    $LocalFileFile.Write($ReadBuffer,0,$ReadLength)
}
while ($ReadLength -ne 0)
# Close the streams so the local file is flushed and the connection released
$LocalFileFile.Close()
$ResponseStream.Close()
Is there any way to make $RemoteFile something like ftp://myftpserver:22/Folder1/Folder/*.zip or ftp://myftpserver:22/Folder1/Folder/*.*
My apologies if there is already a post on this. I saw some similar ones, but none close enough to answer the question.
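FTP URLs do not expand wildcards, so the usual pattern is to issue a ListDirectory request for the folder, filter the returned names client-side, and then run the download loop above once per match. The filtering itself is plain PowerShell; sample names stand in for a real listing in this sketch:

```powershell
# Sample names standing in for a real ListDirectory response, which you
# would read line by line from the FtpWebRequest response stream after
# setting Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$listing  = "jan.zip", "feb.zip", "readme.txt"
$zipFiles = $listing | Where-Object { $_ -like "*.zip" }
foreach ($name in $zipFiles) {
    # Build the per-file URL and reuse the download loop from the question
    $remote = "ftp://myftpserver/Folder1/Folder/$name"
}
```

Use "*" instead of "*.zip" in the -like pattern to fetch everything in the folder.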
I have created a PowerShell SFTP, FTP, and FTPS script. You will have to download it from my GitHub. I do not use any third-party applications, because I do not trust them. What I use is REBEX.dll for the SFTP section and .NET WebRequest for FTP and FTPS.
https://github.com/FallenHoot/ZOTECH-PS/blob/master/SFTP.zip
If you have issues understanding the code, please let me know. If you are not going to use the Get-SFTP function, just comment it out.
I am trying to download a CSV file using PowerShell. If I use a public link, I am able to download the file.
But when I try to download from my account (private link), the file downloads and I get some sort of HTML/CSS content instead. A screenshot is attached. For now, I was testing with Dropbox, although I want to download the file from a Huddle Workspace.
I am using two approaches, as follows:
$url = "url/path/to/file/Notification.csv"
$output = "pathToFolder/Notification.csv"
$wc = New-Object System.net.WebClient
$wc.DownloadFile($url,$output)
$cred = Get-Credential -UserName 'myEmailAddress@hotmail.com'
Invoke-WebRequest $url -OutFile $output -Credential $cred
Again, both methods work with the public link. What would be the best way to add credentials and download the file? Thanks.
I made a basic PowerShell script to download files from two URLs. However, even though it seems to download the files, my computer can't open them. Here are the two specific errors:
When I try to open the first file I get the error: "This installation package could not be opened. Contact the application vendor to verify that this is a valid Windows Installer package." When I try to open the second file it just says "This app can't run on your PC".
Here is the script:
$storageDir = $PSScriptRoot
if (-not $storageDir) { $storageDir = $pwd }   # fall back when run interactively
$webclient = New-Object System.Net.WebClient
$url = "http://www.microsoft.com/en-us/download/confirmation.aspx?id=20914"
$file = "$storageDir\xnafx40_redist.msi"
$webclient.DownloadFile($url,$file)
$url = "http://www.microsoft.com/en-us/download/confirmation.aspx?id=17851"
$file = "$storageDir\dotNetFx40_Full_setup.exe"
$webclient.DownloadFile($url,$file)
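The likely problem is that both URLs above point at Microsoft's download confirmation pages, not at the installers themselves, so WebClient saves an HTML page under an .msi/.exe name and the installers rightly refuse to run it. A quick sanity check is to inspect the first bytes of the downloaded file: a real MSI is an OLE compound file starting with D0 CF 11 E0, while an HTML page starts with text. In this sketch a demo file written locally stands in for the broken download:

```powershell
# Demo: write an HTML page under an .msi name, as the broken download does
$file = Join-Path ([System.IO.Path]::GetTempPath()) "demo.msi"
Set-Content -Path $file -Value "<!DOCTYPE html><html></html>"

# A real MSI is an OLE compound file and starts with D0 CF 11 E0
$head  = [System.IO.File]::ReadAllBytes($file)[0..3]
$isMsi = ($head[0] -eq 0xD0 -and $head[1] -eq 0xCF -and
          $head[2] -eq 0x11 -and $head[3] -eq 0xE0)
if (-not $isMsi) {
    "Not a valid MSI - the server probably sent an HTML page instead"
}
```

The fix is to use the direct file URL (on the confirmation page, the target of the "click here to download manually" style link) rather than the confirmation.aspx URL.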