Set-SFTPFile upload full directory - powershell

I need to upload a full directory (with nested folders, recursively) to the server over SFTP:
#SSH
$secpasswd = ConvertTo-SecureString $Password -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential($User, $secpasswd)
$sftpSession = New-SFTPSession -ComputerName $HostIP -Credential $Credentials
#Folders Paths
$FilePath = Get-ChildItem -Path uploading-folder | Select-Object -ExpandProperty FullName
$SftpPath = "/home/new-folder"
# Upload the file to the SFTP path
Set-SFTPFile -SessionId ($sftpSession).SessionId -LocalFile $FilePath -RemotePath $SftpPath -Overwrite
#Disconnect all SFTP Sessions
Get-SFTPSession | % { Remove-SFTPSession -SessionId ($_.SessionId) }
But I can't upload folders inside folders (and the files in them). How can I upload a full folder, including the files and folders inside it?

Posh-SSH does not seem to support recursive operations. You have to code it on your own.
Quoting Reddit post SFTP - How to upload an entire folder/directory instead of one file at a time?:
# Get a recursive list of files in the target folder ($path).
# Change the -Include filter to your needs,
# and remove -Force if you don't want hidden files included.
$path = "./Recursion test/"
$uploadFiles = Get-ChildItem $path -Recurse -Include "*" -Force
$session = New-SFTPSession (your args here)
$remoteFolder = Get-SFTPCurrentDirectory $session.Index
Set-SFTPDirectoryPath $session.Index -Path $destinationPath
foreach ($item in $uploadFiles) {
    if ($item -is [System.IO.DirectoryInfo]) {
        # Recreate the directory on the remote side
        New-SFTPDirectory $session.Index $item
    }
    else {
        $localFolder = $item.PSPath | Split-Path | Split-Path -Leaf
        if ($localFolder -ne $remoteFolder) {
            Set-SFTPCurrentDirectory $session.Index $localFolder
            $remoteFolder = $localFolder
        }
        Set-SFTPFile $session.Index $item.Name
    }
}
Or use another SFTP library that supports recursive operations.
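For example, the WinSCP .NET assembly can transfer an entire directory tree in a single call. A rough sketch, assuming WinSCPnet.dll is installed; the DLL path, host key fingerprint, and folder names below are placeholders:
# Sketch using the WinSCP .NET assembly; adjust the DLL path, fingerprint and folders.
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol              = [WinSCP.Protocol]::Sftp
    HostName              = $HostIP
    UserName              = $User
    Password              = $Password
    SshHostKeyFingerprint = "ssh-rsa 2048 xx:xx:xx:..."   # replace with the server's real fingerprint
}

$session = New-Object WinSCP.Session
try {
    $session.Open($sessionOptions)
    # PutFiles uploads the whole local tree, subfolders included
    $session.PutFiles("C:\path\to\uploading-folder\*", "/home/new-folder/").Check()
}
finally {
    $session.Dispose()
}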

Code:
$folders = @('D:\Shares\folderone\*.pdf', 'D:\Shares\foldertwo\*.pdf')
foreach ($folder in $folders) {
    $files = Get-ChildItem $folder
    foreach ($file in $files) {
        Set-SFTPItem -SessionId $SFTPSession.SessionId -Path $file.FullName -Destination /upload -Verbose
    }
}
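Newer Posh-SSH versions also let Set-SFTPItem reportedly take a directory path and upload its contents recursively, which avoids the manual loop entirely. A minimal sketch, assuming a recent Posh-SSH release and the session from the first snippet; the local folder path is a placeholder:
# Minimal sketch, assuming a recent Posh-SSH where Set-SFTPItem accepts a directory path.
$localFolder = 'C:\path\to\uploading-folder'   # placeholder local path
$remotePath  = '/home/new-folder'

# Uploads the folder and everything beneath it in one call
Set-SFTPItem -SessionId $sftpSession.SessionId -Path $localFolder -Destination $remotePath -Force

Remove-SFTPSession -SessionId $sftpSession.SessionId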

Related

Sort file in SFTP based on datetime using PowerShell

I have multiple files in the SFTP folder that were created at the same time of day but have different dates in their filenames.
Example file:
REGISTRATION_ELI_20210422_071008.csv
REGISTRATION_ELI_20210421_071303.csv
REGISTRATION_ELI_20210420_071104.csv
I want to copy the one file with today's date in its filename and download it locally. Which property can I use to list the files and copy the one that matches today's date?
#Setting credentials for the user account
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ("gomgom", $password)
$SFTPSession = New-SFTPSession -ComputerName 172.16.xxx.xxx -Credential $creds -AcceptKey
# Set local file path and SFTP path
$LocalPath = "D:\WORK\Task - Script\20221010 - AJK - ITPRODIS380 - upload file csv ke sql server\csvfile"
$SftpPath = '/Home Credit/Upload/REGISTRATION_ELI_*.csv'
Get-SFTPItem -SessionId $SFTPSession.SessionID -Path $SftpPath -Destination $LocalPath | Sort-Object {[datetime] ($_.BaseName -replace '^.+_(\d{4})(\d{2})(\d{2}) (\d{2})(\d{2})', '$1-$2-$3 $4:$5:') } | Select-Object Name
Remove-SFTPSession $SFTPSession -Verbose
You can try changing your $SftpPath like this:
$SftpPath = "/Home Credit/Upload/REGISTRATION_ELI_$([datetime]::Now.ToString('yyyyMMdd'))_*.csv"
This inserts today's date into the path you are looking for:
/Home Credit/Upload/REGISTRATION_ELI_20221221_*.csv
You could also solve this by first listing the remote files and then downloading the one you want by date. I haven't tested it, but it could look something like this:
$SftpPath    = '/Home Credit/Upload'
$SftpPattern = "REGISTRATION_ELI_$([datetime]::Now.ToString('yyyyMMdd'))_*.csv"
$Files = Get-SFTPChildItem -SessionId $SFTPSession.SessionId -Path $SftpPath |
    Where-Object { $_.Name -like $SftpPattern }
foreach ($file in $Files)
{
    Get-SFTPItem -SessionId $SFTPSession.SessionID -Path $file.FullName -Destination $LocalPath
}
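If you also need to sort by the timestamp embedded in the filenames (as in the title) rather than only filter on today's date, you could parse it out of the name. A rough sketch, untested, assuming the REGISTRATION_ELI_yyyyMMdd_HHmmss.csv pattern from the question:
# Sketch: sort remote files by the yyyyMMdd_HHmmss stamp embedded in their names.
$Files = Get-SFTPChildItem -SessionId $SFTPSession.SessionId -Path '/Home Credit/Upload' |
    Where-Object { $_.Name -like 'REGISTRATION_ELI_*.csv' } |
    Sort-Object {
        # REGISTRATION_ELI_20210422_071008.csv -> 20210422_071008
        $stamp = ($_.Name -replace '^REGISTRATION_ELI_', '') -replace '\.csv$', ''
        [datetime]::ParseExact($stamp, 'yyyyMMdd_HHmmss', $null)
    }
$Files | Select-Object Name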

New-PSSession to create a loop and wait for it to finish for each line in a text file

I am trying to get files from the servers in a list using the code below:
$server = Get-Content server.txt
$server | ForEach-Object {
    $session = new-pssession -computername $server -credential (Import-Clixml "mycredentials.xml")
    Invoke-Command -Session $session -ScriptBlock ${function:getfiles}
    Copy-Item -path "C:\some\folder\*" -Destination "C:\localfolder" -recurse -FromSession $session
}
If I explicitly supply a single name to -ComputerName, it works like a charm.
When there are several names in the list, the execution stops after the first one. I suspect that the session closes after the first execution.
Is there a way to make it work like this:
Get-Content -> for each line execute the Copy-Item -> close the session -> open a new session to the next server -> ... etc., meaning that $session only ever refers to the current server.
$function:getfiles
function getfiles {
    New-Item -Force -Path C:\path\trace.txt
    $remoteserver = $env:computername
    $trace = 'C:\path\trace.txt'
    $Include = @('*.keystore', '*.cer', '*.crt', '*.pfx', '*.jks', '*.ks')
    $exclude = '^C:\\(Windows|Program Files|Documents and Settings|Users|ProgramData)|\bBackup\b|\breleases?\b|\bRECYCLE.BIN\b|\bPerfLogs\b|\bold\b|\bBackups\b|\brelease?\b|'
    Get-ChildItem -Path 'C:\','D:\' -File -Include $Include -Recurse -EA 0 |
        Where-Object { $_.DirectoryName -notmatch $exclude } |
        Select-Object -ExpandProperty FullName |
        Set-Content -Path $trace
    $des = "C:\some\folder\$remoteserver"
    $safe = Get-Content $trace
    $safe | ForEach-Object {
        # find the drive delimiter
        $first = $_.IndexOf(":\")
        if ($first -eq 1) {
            # strip it
            $newdes = Join-Path -Path $des -ChildPath @($_.Substring(0,1) + $_.Substring(2))[0]
        }
        else {
            $newdes = Join-Path -Path $des -ChildPath $_
        }
        $folder = Split-Path -Path $newdes -Parent
        $err = 0
        # check if the folder exists
        $void = Get-Item $folder -ErrorVariable err -ErrorAction SilentlyContinue
        if ($err.Count -ne 0) {
            # create it when it doesn't
            $void = New-Item -Path $folder -ItemType Directory -Force -Verbose
        }
        $void = Copy-Item -Path $_ -Destination $newdes -Recurse -Container -Verbose
    }
}
UPDATE
So I have found out that the file the lines should be redirected to from the script is not populated, which explains why the next Copy-Item step fails. I have tried redirecting in different ways and still can't get it populated. The file itself is created without issues.
Made a workaround: placed the function in a script that is copied to the remote server, executed there, and cleaned up afterwards.
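A per-server loop along the lines the question asks for could look like the sketch below (untested); the key differences are passing only the current server name to -ComputerName and removing each session before moving on:
# Sketch: one session per server, closed before moving on to the next one.
$servers = Get-Content server.txt
$cred    = Import-Clixml "mycredentials.xml"

foreach ($server in $servers) {
    $session = New-PSSession -ComputerName $server -Credential $cred
    try {
        Invoke-Command -Session $session -ScriptBlock ${function:getfiles}
        Copy-Item -Path "C:\some\folder\*" -Destination "C:\localfolder" -Recurse -FromSession $session
    }
    finally {
        # Close the session for this server before starting the next one
        Remove-PSSession $session
    }
}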

Powershell 2.0 extract certain files from zip (include subdirectories)

Apologies, this question is scattered across the internet, but I have yet to find a satisfactory answer that uses only PowerShell 2.0 (with .NET 3.5) and no external libraries or programs.
I'm using the following code to extract log.txt from ZipFile.zip (no matter log.txt's location)
$Destination = (new-object -com shell.application).NameSpace('C:\ZipExtractDir')
$ZipFile = (new-object -com shell.application).NameSpace('C:\ZipFile.zip')
$Destination.CopyHere(($Zipfile.Items() | where-object {$_.Name -like '*log.txt'}), 1044)
Works if log.txt is in directory root \log.txt
Fails if log.txt is in a subdirectory \Subfolder\log.txt
Fails if referencing the literal (.zip) path
{$_.Name -Like '*Subfolder\log.txt'} (both double & single quotes fail)
Have tried using -eq -like -contains '' "" $_.FullName
I'm quite certain that I'm filtering incorrectly - can anyone help with this code so that it will parse subdirectories as well?
Similar to what you have already done, you can set up the Shell.Application namespaces like this. Then you can copy the extracted directory to the destination path.
$zipFilePath = "Zipfile.zip"
$destinationPath = "C:\Users\Public\Downloads"
$zipfile = (New-Object -Com Shell.Application).NameSpace($zipFilePath)
$destination = (New-Object -Com Shell.Application).NameSpace($destinationPath)
$destination.CopyHere($zipfile.Items())
Then, to list the log.txt files, we can construct the full extracted path with Join-Path. This basically just appends the zip file name from System.IO.Path.GetFileNameWithoutExtension() to the destination path. Then just use Get-ChildItem to list the files recursively with the -Recurse and -Filter parameters.
$extractedPath = Join-Path -Path $destinationPath -ChildPath ([System.IO.Path]::GetFileNameWithoutExtension($zipFilePath))
Get-ChildItem -Path $extractedPath -Filter log.txt -Recurse
And to test this for PowerShell 2.0 we can use -version 2 with powershell.exe:
powershell.exe -version 2 .\test.ps1
UPDATE
If you want to inspect files before extracting, you'll need to recurse the directories yourself. Below is a demo of how this can be done.
function New-ZipChildRootFolder
{
    param
    (
        [string]$DestinationPath,
        [string]$ZipFileName
    )
    $folderPath = Split-Path -Path $ZipFileName -Leaf
    $destination = (New-Object -ComObject Shell.Application).NameSpace($DestinationPath)
    $destination.NewFolder($folderPath)
}
function Get-ZipChildItems
{
    param
    (
        [string]$ZipFilePath,
        [string]$DestinationPath
    )
    $zipfile = (New-Object -ComObject Shell.Application).NameSpace($ZipFilePath)
    $zipFileName = [System.IO.Path]::GetFileNameWithoutExtension($ZipFilePath)
    Write-Output "Create root zip folder : $zipFileName"
    New-ZipChildRootFolder -DestinationPath $DestinationPath -ZipFileName $zipFileName
    foreach ($item in $zipfile.Items())
    {
        Get-ZipChildItemsRecurse -Items $item -DestinationPath $DestinationPath -ZipFileName $zipFileName
    }
}
function Get-ZipChildItemsRecurse
{
    param
    (
        [object]$Items,
        [string]$DestinationPath,
        [string]$ZipFileName
    )
    foreach ($file in $Items.GetFolder.Items())
    {
        if ($file.IsFolder -eq $true)
        {
            Write-Output "Creating folder : $($file.Path)"
            New-ZipChildFolder -Folder $file -DestinationPath $DestinationPath -ZipFileName $ZipFileName
            Get-ZipChildItemsRecurse -Items $file -DestinationPath $DestinationPath -ZipFileName $ZipFileName
        }
        else
        {
            $filename = Split-Path -Path $file.Path -Leaf
            if ($filename -eq "log.txt")
            {
                Write-Output "Copying file : $($file.Path)"
                New-ZipChildFile -File $file -DestinationPath $DestinationPath -ZipFileName $ZipFileName
            }
        }
    }
}
function New-ZipChildFile
{
    param
    (
        [object]$File,
        [string]$DestinationPath,
        [string]$ZipFileName
    )
    $destination = New-Object -ComObject Shell.Application
    $items = $File.Path.Split("\")
    $zipRootIndex = [array]::IndexOf($items, $ZipFileName)
    $path = $items[$zipRootIndex..($items.Length - 2)] -join "\"
    $fullPath = Join-Path -Path $DestinationPath -ChildPath $path
    $destination.NameSpace($fullPath).CopyHere($File)
}
function New-ZipChildFolder
{
    param
    (
        [object]$Folder,
        [string]$DestinationPath,
        [string]$ZipFileName
    )
    $destination = New-Object -ComObject Shell.Application
    $items = $Folder.Path.Split("\")
    $zipRootIndex = [array]::IndexOf($items, $ZipFileName)
    $folders = $items[$zipRootIndex..($items.Length - 1)]
    $currentFolder = $DestinationPath
    foreach ($folder in $folders)
    {
        $destination.NameSpace($currentFolder).NewFolder($folder)
        $currentFolder = Join-Path -Path $currentFolder -ChildPath $folder
    }
}
Usage:
$zipFilePath = "C:\Zipfile.zip"
$destinationPath = "C:\Users\Public\Downloads"
Get-ZipChildItems -ZipFilePath $zipFilePath -DestinationPath $destinationPath

Unzip a file on multiple remote servers via Powershell

I'm currently writing a simple PowerShell script.
Basically, it should get the list of servers from a text file, then unzip the .zip file on each server and extract it to a new folder.
However, the script is not extracting all of the files in the zip file.
It only extracts one file, and I'm not sure why the foreach loop is not working properly.
Please shed some light on this issue. Thanks.
$servers = Get-Content "C:\tmp\script\new_unzip\servers.txt"
$Date = ((Get-Date).ToString('dd-MM-yyyy_HH-mm-ss'))
foreach ($server in $servers) {
    $shell = new-object -com shell.application
    $target_path = "\\$server\c$\Temp\FFPLUS_Temp"
    $location = $shell.namespace($target_path)
    $ZipFiles = Get-ChildItem -Path $target_path -Filter *.zip
    $ZipFiles | Unblock-File
    foreach ($ZipFile in $ZipFiles) {
        $ZipFile.fullname | out-default
        $NewLocation = "\\$server\c$\Temp\FFPLUS_Temp\$Date"
        New-Item $NewLocation -type Directory -Force -ErrorAction SilentlyContinue
        Move-Item $ZipFile.fullname $NewLocation -Force -ErrorAction SilentlyContinue
        $NewZipFile = Get-ChildItem $NewLocation *.zip
        $NewLocation = $shell.namespace($NewLocation)
        $ZipFolder = $shell.namespace($NewZipFile.fullname)
        $NewLocation.copyhere($ZipFolder.items())
    }
}
$servers = Get-Content "C:\tmp\script\updated\servers.txt"
$Date = ((Get-Date).ToString('dd-MM-yyyy_HH-mm-ss'))
foreach ($server in $servers)
{
    $zipFolder = "\\$server\c$\Temp\FFPLUS_Temp"
    Add-Type -assembly System.IO.Compression.Filesystem
    $zipFiles = Get-ChildItem -Path $zipFolder -Filter *.zip
    foreach ($zip in $zipFiles)
    {
        $destPath = "\\$server\c$\Temp\FFPLUS_Temp\$Date"
        New-Item -ItemType Directory $destPath
        [io.compression.zipfile]::ExtractToDirectory([string]$zip.FullName, "$destPath")
        Move-Item $zip.fullname $destPath -Force -ErrorAction SilentlyContinue
    }
}
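If PowerShell remoting is available, another option is to let each server do the extraction locally instead of driving it over the admin share. A rough sketch, untested, assuming WinRM is enabled and .NET 4.5+ is present on the servers, with the same folder layout as above:
$servers = Get-Content "C:\tmp\script\updated\servers.txt"
$Date = ((Get-Date).ToString('dd-MM-yyyy_HH-mm-ss'))
foreach ($server in $servers)
{
    Invoke-Command -ComputerName $server -ArgumentList $Date -ScriptBlock {
        param($Date)
        Add-Type -AssemblyName System.IO.Compression.FileSystem
        $destPath = "C:\Temp\FFPLUS_Temp\$Date"
        New-Item -ItemType Directory $destPath -Force | Out-Null
        foreach ($zip in Get-ChildItem 'C:\Temp\FFPLUS_Temp' -Filter *.zip)
        {
            # Extract locally on the server, then move the archive into the dated folder
            [IO.Compression.ZipFile]::ExtractToDirectory($zip.FullName, $destPath)
            Move-Item $zip.FullName $destPath -Force
        }
    }
}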

Most elegant way to extract a directory from a zipfile using PowerShell?

I need to unzip a specific directory from a zipfile.
For example, extract the directory 'test\etc\script' from zipfile 'c:\tmp\test.zip' and place it in c:\tmp\output\test\etc\script.
The code below works but has two quirks:
I need to recursively find the directory ('script') in the zip file (function finditem) although I already know the path ('c:\tmp\test.zip\test\etc\script')
With CopyHere I need to determine the targetdirectory, specifically the 'test\etc' part manually
Any better solutions? Thanks.
The code:
function finditem($items, $itemname)
{
    foreach ($item In $items)
    {
        if ($item.GetFolder -ne $Null)
        {
            finditem $item.GetFolder.items() $itemname
        }
        if ($item.name -Like $itemname)
        {
            return $item
        }
    }
}
$source = 'c:\tmp\test.zip'
$target = 'c:\tmp\output'
$shell = new-object -com shell.application
# find script folder e.g. c:\tmp\test.zip\test\etc\script
$item = finditem $shell.NameSpace($source).Items() "script"
# output folder is c:\tmp\output\test\etc
$targetfolder = Join-Path $target ((split-path $item.path -Parent) -replace '^.*zip')
New-Item $targetfolder -ItemType directory -ErrorAction Ignore
# unzip c:\tmp\test.zip\test\etc\script to c:\tmp\output\test\etc
$shell.NameSpace($targetfolder).CopyHere($item)
I don't know about most elegant, but with .Net 4.5 installed you could use the ZipFile class from the System.IO.Compression namespace:
[Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem') | Out-Null
$zipfile = 'C:\path\to\your.zip'
$folder = 'folder\inside\zipfile'
$dst = 'C:\output\folder'
[IO.Compression.ZipFile]::OpenRead($zipfile).Entries | ? {
    $_.FullName -like "$($folder -replace '\\','/')/*"
} | % {
    $file = Join-Path $dst $_.FullName
    $parent = Split-Path -Parent $file
    if (-not (Test-Path -LiteralPath $parent)) {
        New-Item -Path $parent -Type Directory | Out-Null
    }
    [IO.Compression.ZipFileExtensions]::ExtractToFile($_, $file, $true)
}
The 3rd parameter of ExtractToFile() can be omitted. If present it defines whether existing files will be overwritten or not.
As long as the folder location inside the zip is known, the original code can be simplified:
$source = 'c:\tmp\test.zip' # zip file
$target = 'c:\tmp\output' # target root
$folder = 'test\etc\script' # path in the zip
$shell = New-Object -ComObject Shell.Application
# find script folder e.g. c:\tmp\test.zip\test\etc\script
$item = $shell.NameSpace("$source\$folder")
# actual destination directory
$path = Split-Path (Join-Path $target $folder)
if (!(Test-Path $path)) {$null = mkdir $path}
# unzip c:\tmp\test.zip\test\etc\script to c:\tmp\output\test\etc\script
$shell.NameSpace($path).CopyHere($item)
Windows PowerShell 5.0 (included in Windows 10) natively supports extracting ZIP files using Expand-Archive cmdlet:
Expand-Archive -Path Draft.Zip -DestinationPath C:\Reference
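Note that Expand-Archive extracts the whole archive rather than a single folder. If only 'test\etc\script' is needed, one option (a sketch, not part of the original answer) is to expand to a temporary directory and copy just that folder out:
# Sketch: extract everything to a temp folder, then keep only test\etc\script.
$zip    = 'C:\tmp\test.zip'
$target = 'C:\tmp\output'
$folder = 'test\etc\script'
$temp   = Join-Path $env:TEMP ([guid]::NewGuid())

Expand-Archive -Path $zip -DestinationPath $temp
$null = New-Item -ItemType Directory -Path (Split-Path (Join-Path $target $folder)) -Force
Copy-Item -Path (Join-Path $temp $folder) -Destination (Join-Path $target $folder) -Recurse
Remove-Item $temp -Recurse -Force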