I have multiple files in an SFTP folder that were created at roughly the same time of day, but on different dates, as shown in their filenames.
Example files:
REGISTRATION_ELI_20210422_071008.csv
REGISTRATION_ELI_20210421_071303.csv
REGISTRATION_ELI_20210420_071104.csv
I want to copy the one file whose name contains today's date and download it locally. Which property can I use to list the files and copy the one that matches today's date?
#Setting credentials for the user account
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ("gomgom", $password)
$SFTPSession = New-SFTPSession -ComputerName 172.16.xxx.xxx -Credential $creds -AcceptKey
# Set local file path and SFTP path
$LocalPath = "D:\WORK\Task - Script\20221010 - AJK - ITPRODIS380 - upload file csv ke sql server\csvfile"
$SftpPath = '/Home Credit/Upload/REGISTRATION_ELI_*.csv'
Get-SFTPItem -SessionId $SFTPSession.SessionID -Path $SftpPath -Destination $LocalPath | Sort-Object {[datetime] ($_.BaseName -replace '^.+_(\d{4})(\d{2})(\d{2})_(\d{2})(\d{2})(\d{2})$', '$1-$2-$3 $4:$5:$6') } | Select-Object Name
Remove-SFTPSession $SFTPSession -Verbose
You can try changing your $SftpPath like this:
$SftpPath = "/Home Credit/Upload/REGISTRATION_ELI_$([datetime]::Now.ToString('yyyyMMdd'))_*.csv"
This embeds today's date in the path you look for:
/Home Credit/Upload/REGISTRATION_ELI_20221221_*.csv
Alternatively, you can solve your problem by first listing the remote files and then downloading the one that matches today's date. Untested, but it could look something like this:
$SftpPath = '/Home Credit/Upload'
$Pattern = "REGISTRATION_ELI_$([datetime]::Now.ToString('yyyyMMdd'))_*.csv"
$Files = Get-SFTPChildItem -SessionId $SFTPSession.SessionId -Path $SftpPath | Where-Object { $_.Name -like $Pattern }
foreach ($file in $Files)
{
Get-SFTPItem -SessionId $SFTPSession.SessionID -Path $file.FullName -Destination $LocalPath
}
I am trying to add users to Active Directory from CSV files dropped in a particular folder, using a PowerShell script. The script checks AD to see whether the users have been created, then archives the CSV files and deletes them, ready for the next drop of CSV files. The script will be scheduled to process the CSV files dropped into the folder. I also want to use a separate but similar script to delete users from a CSV file.
Below are the CSV file content and my script. When I run it there is no error or failure, but no users are created from the CSV file, no log file is created by the output redirection (2>&1), and the CSV files are not archived either.
CSV File content:
"givenName","displayName","sAMAccountName","EmailAddress","OU",password
"DummyUser","DummyUser","dummy.user.customer1.com","dummy.user#customer1.com","OU=customer1,OU=Customers,DC=customerservice,DC=customerdomain,DC=com","**********"
PowerShell script:
<# This script is used to add customer users in bulk to Active Directory using a CSV file.
This script will be scheduled to run every hour or every day to add new customer users to the AD Customers OU.
A new CSV file of AD user attributes, with a .csv extension and the name format LDAP_Users***####.csv, must be placed in a shared folder called LDAPExport. The folder...
..LDAPExport is located on the C: drive of the server. Permitted servers access the folder through the FTP server service running on the server where this script runs. #>
try {
Set-Location -Path C:\ldapexport <# I just added this line after posting the question #>
$CustomerAddADUserCSV = Get-ChildItem -Path C:\ldapexport\ -Name *adduser*.csv
$CustomerAddADUserLogFolder = "c:\ldapexport\CustomerADUsersLogs"
$LdapExportLog = "c:\ldapexport\LdapExportLog"
<# the next lines check whether an AddUser csv file exists and then process the file to add the customer user account(s) to Active Directory #>
if($CustomerAddADUserCSV){
foreach ($CustomerCSVfile in $CustomeraddADUserCSV){
$NewADUsers = Import-Csv -Path $CustomerCSVfile ;
foreach ($User in $NewADUsers)
{
$Displayname = $User.displayName
$UserFirstname = $User.Firstname
$UserLastname = $User.Lastname
$OU = $User.OU
$SAM = $User.sAMAccountName
$Password = $User.Password
$EmailAddress = $User.EmailAddress
New-ADUser -Name "$Displayname" -DisplayName "$Displayname" -SamAccountName $SAM -AccountPassword (ConvertTo-SecureString $Password -AsPlainText -Force) -Enabled $true -Path "$OU" -ChangePasswordAtLogon $false > "$LdapExportLog\csvdeAdUsers_$(get-date -f ddMMyyyy_HHmmss).log" 2>&1 -ErrorAction stop;
Get-ADUser -Identity $SAM -ErrorAction Stop > "$LdapExportLog\csvdeAdUsersAdded_$(get-date -f ddMMyyyy_HHmmss).log" 2>&1
} ;
<# the next lines archive the AddUser csv files into zip files and store them in the "Archives" folder located in the "ldapexport" directory #>
$CustomerAddADUserCSVFileArchive = Compress-Archive -Path "C:\ldapexport\$CustomerCSVfile" -DestinationPath C:\ldapexport\Archives\$CustomerCSVfile.$(get-date -f ddMMyyyy_HHmmss).zip -force;
Compress-Archive -Path $CustomerAddADUserLogFolder -DestinationPath C:\ldapexport\Archives\CustomerADUsersLogs_$(get-date -f ddMMyyyy_HHmmss).zip -force > $LdapExportLog\csvdeArchive_$(get-date -f ddMMyyyy_HHmmss).log 2>&1;
}
}
else {
Write-Output "No new AddUser csv file exists. Quitting..." > $LdapExportLog\csvdeAdUsers_$(get-date -f ddMMyyyy_HHmmss).log 2>&1
exit
}
}
catch {
write-host "Please check the script or the referenced .csv file for errors: $($_.Exception.Message)"
}
Here's a rewrite of your code.
I have changed some of the variable names because I found yours quite confusing in places.
The code below uses splatting on the New-ADUser cmdlet and collects the newly added user objects in a variable $addedUsers.
This collection is only output at the very end of the code, instead of writing and compressing in each iteration.
$ArchivePath = 'C:\ldapexport\Archives'
$ErrorLog = 'c:\ldapexport\LdapExportLog\csvdeAdUsers_{0:ddMMyyyy_HHmmss}.log' -f (Get-Date)
$NewUsersCsv = 'c:\ldapexport\LdapExportLog\csvdeAdUsersAdded_{0:ddMMyyyy_HHmmss}.csv' -f (Get-Date)
$addedUsers = foreach ($csvFile in (Get-ChildItem -Path 'C:\ldapexport' -Filter '*adduser*.csv')) {
$NewADUsers = Import-Csv -Path $csvFile.FullName
foreach ($User in $NewADUsers) {
$userParams = @{
Name = $User.displayName
SamAccountName = $User.sAMAccountName
DisplayName = $User.displayName
GivenName = $User.Firstname
Surname = $User.Lastname
EmailAddress = $User.EmailAddress
Path = $User.OU
AccountPassword = ConvertTo-SecureString $User.Password -AsPlainText -Force
Enabled = $true
ChangePasswordAtLogon = $false
ErrorAction = 'Stop'
PassThru = $true
}
# try to create the new user and output the properties to be collected in variable $addedUsers
try {
New-ADUser @userParams
}
catch {
$msg = "Error creating user $($User.sAMAccountName): $($_.Exception.Message)"
# write to the error log
$msg | Add-Content -Path $ErrorLog
Write-Warning $msg
}
}
# create the full path for the archive file and compress this current imported csv file
$archiveFile = Join-Path -Path $ArchivePath -ChildPath ('{0}.{1:ddMMyyyy_HHmmss}.zip' -f $csvFile.BaseName, (Get-Date))
Compress-Archive -Path $csvFile.FullName -DestinationPath $archiveFile
# remove the current csv file
$csvFile | Remove-Item
}
# here, you save the newly added users collected in variable $addedUsers
$addedUsers | Export-Csv -Path $NewUsersCsv -NoTypeInformation
# if you want, compress this file and archive
# create the full path for the archive file and compress
$archiveFile = Join-Path -Path $ArchivePath -ChildPath ('{0}.zip' -f [System.IO.Path]::GetFileNameWithoutExtension($NewUsersCsv))
Compress-Archive -Path $NewUsersCsv -DestinationPath $archiveFile
# remove the newusers csv file?
$NewUsersCsv | Remove-Item
I'm new to PowerShell. I have 80 servers that I need to connect to, and run a PowerShell script on each remotely, to recursively find files in one share by last access date and move them to another \\server\share for archiving purposes. I also need the file creation, last access, etc. timestamps to be preserved.
I would welcome any help, please.
Thank you.
You need to test this thoroughly before actually using it on all 80 servers!
If you want to use PowerShell for this, you could run Invoke-Command against the servers with admin credentials, so the script can access both the files to move and the destination Archive folder.
I would suggest using ROBOCOPY to do the heavy lifting:
$servers = 'Server1', 'Server2', 'Server3' # etcetera
$cred = Get-Credential -Message "Please supply admin credentials for archiving"
$scriptBlock = {
$SourcePath = 'D:\StuffToArchive' # this is the LOCAL path on the server
$TargetPath = '\\NewServer\ArchiveShare' # this is the REMOTE path to where the files should be moved
$LogFile = 'D:\ArchivedFiles.txt' # write a textfile with all file fullnames that are archived
$DaysAgo = 130
# from a cmd box, type 'robocopy /?' to see all possible switches you might want to use
# /MINAGE:days specifies the LastWriteTime
# /MINLAD:days specifies the LastAccessDate
robocopy $SourcePath $TargetPath /MOVE /MINLAD:$DaysAgo /COPYALL /E /FP /NP /XJ /XA:H /R:5 /W:5 /LOG+:$logFile
}
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -Credential $cred
If you want to do all using just PowerShell, try something like this:
$servers = 'Server1', 'Server2', 'Server3' # etcetera
$cred = Get-Credential -Message "Please supply admin credentials for archiving"
$scriptBlock = {
$SourcePath = 'D:\StuffToArchive' # this is the LOCAL path on the server
$TargetPath = '\\NewServer\ArchiveShare' # this is the REMOTE path to where the files should be moved
$LogFile = 'D:\ArchivedFiles.txt' # write a textfile with all file fullnames that are archived
$refDate = (Get-Date).AddDays(-130).Date # the reference date set to midnight
# set the ErrorActionPreference to Stop, so exceptions are caught in the catch block
$OldErrorAction = $ErrorActionPreference
$ErrorActionPreference = 'Stop'
# loop through the servers LOCAL path to find old files and move them to the remote archive
Get-ChildItem -Path $SourcePath -File -Recurse |
Where-Object { $_.LastAccessTime -le $refDate } |
ForEach-Object {
try {
$target = Join-Path -Path $TargetPath -ChildPath $_.DirectoryName.Substring($SourcePath.Length)
# create the folder in the archive if not already exists
$null = New-Item -Path $target -ItemType Directory -Force
$_ | Move-Item -Destination $target -Force
Add-Content -Path $LogFile -Value "File '$($_.FullName)' moved to '$target'"
}
catch {
Add-Content -Path $LogFile -Value $_.Exception.Message
}
}
$ErrorActionPreference = $OldErrorAction
}
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -Credential $cred
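Note that Move-Item does not guarantee the original CreationTime survives a move to a remote share. If the timestamps must be preserved exactly, a hedged variation of the move step inside the ForEach-Object could capture and reapply them (this is a sketch reusing $target and $_ from the block above):

```powershell
# Capture the timestamps, move the file, then stamp them back onto the archived copy
$creation   = $_.CreationTimeUtc
$lastWrite  = $_.LastWriteTimeUtc
$lastAccess = $_.LastAccessTimeUtc
$destFile = Join-Path -Path $target -ChildPath $_.Name
$_ | Move-Item -Destination $target -Force
$moved = Get-Item -LiteralPath $destFile
$moved.CreationTimeUtc   = $creation
$moved.LastWriteTimeUtc  = $lastWrite
$moved.LastAccessTimeUtc = $lastAccess
```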
I need to move files on a remote FTP server from Test to Tested. The files are always .csv, but the name changes as it's timestamped. Using the PSFTP module, I have written the following
$FtpServer = "ftp://myftpserver.com/"
$User = "myusername"
$PWD = "mypassword"
$Password = ConvertTo-SecureString $Pwd -AsPlainText -Force
$FtpCredentials = New-Object System.Management.Automation.PSCredential ($User, $Password)
Set-FTPConnection -Credentials $FtpCredentials -Server $FtpServer -Session MyFtpSession -UsePassive
$FtpSession = Get-FTPConnection -Session MyFtpSession
$ServerPath = "ftp://myftpserver.com/Test"
$fileList = Get-FTPChildItem -Session $FtpSession -Path $ServerPath -Filter *.csv
$archivefolder = "ftp://myftpserver.com/Tested"
foreach ($element in $fileList )
{
$filename = $ServerPath + $element.name
$newfilename = $archivefolder + $element.name
Rename-FTPItem -Path $filename -NewName $newfilename -Session $FtpSession
}
The files do exist in the Test folder, but not yet in the archive (Tested) folder. I thought by using a variable to generate what the new file location should be, that would work.
When I try this, I get
Rename-FTPItem : Exception calling "GetResponse" with "0" argument(s): "The remote server returned an error: (550) File unavailable (e.g., file not found, no access)."
Is there a way to move files while using a wildcard, or a better way to achieve what I'm trying to do?
Thanks in advance
The -NewName should be a path only, not a URL:
$archivefolder = "/Tested"
(In -Path, a URL is acceptable, but it's redundant; you already specify the server via $FtpSession.)
You are missing a slash between the folder paths and the file names.
$filename = $ServerPath + "/" + $element.name
$newfilename = $archivefolder + "/" + $element.name
So you should call the Rename-FTPItem like this:
Rename-FTPItem -Path "ftp://myftpserver.com/Test/file.txt" -NewName "/Tested/file.txt" -Session $FtpSession
or like this:
Rename-FTPItem -Path "/Test/file.txt" -NewName "/Tested/file.txt" -Session $FtpSession
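Putting both corrections together, the loop becomes:

```powershell
$ServerPath    = "/Test"    # plain path; the server comes from $FtpSession
$archivefolder = "/Tested"
foreach ($element in $fileList)
{
    $filename    = $ServerPath + "/" + $element.name
    $newfilename = $archivefolder + "/" + $element.name
    Rename-FTPItem -Path $filename -NewName $newfilename -Session $FtpSession
}
```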
I need to upload a full directory (with nested folders) to a server over SFTP.
#SSH
$secpasswd = ConvertTo-SecureString $Password -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential($User, $secpasswd)
$sftpSession = New-SFTPSession -ComputerName $HostIP -Credential $Credentials
#Folders Paths
$FilePath = Get-ChildItem -Path uploading-folder | Select-Object -ExpandProperty FullName
$SftpPath = "/home/new-folder"
# Upload the file to the SFTP path
Set-SFTPFile -SessionId ($sftpSession).SessionId -LocalFile $FilePath -RemotePath $SftpPath -Overwrite
#Disconnect all SFTP Sessions
Get-SFTPSession | % { Remove-SFTPSession -SessionId ($_.SessionId) }
But I can't upload folders inside folders (and the files in them). How can I upload the full folder with all the files and folders inside it?
Posh-SSH does not seem to support recursive operations. You have to code it on your own.
Quoting Reddit post SFTP - How to upload an entire folder/directory instead of one file at a time?:
# Get a recursive list of files in the target folder ($path).
# Change the include to your needs,
# and remove the -Force if you don't want hidden files included
$path = "./Recursion test/"
$uploadFiles = Get-ChildItem $path -Recurse -Include "*" -Force
$session = New-SFTPSession (your args here)
$remoteFolder = Get-SFTPCurrentDirectory $session.Index
Set-SFTPDirectoryPath $session.Index -Path $destinationPath
foreach ($item in $uploadFiles) {
if ($item -is [System.IO.DirectoryInfo]) {
New-SFTPDirectory $session.Index $item
}
else {
$localFolder = $item.PSPath | Split-Path | Split-Path -Leaf
if ($localFolder -ne $remoteFolder) {
Set-SFTPCurrentDirectory $session.Index $localFolder
$remoteFolder = $localFolder
}
Set-SFTPFile $session.Index $item.Name
}
}
Or use another SFTP library that supports recursive operations.
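For example, the WinSCP .NET assembly supports recursive uploads natively (Session.PutFiles descends into subfolders by default). An untested sketch, where the DLL path, host key fingerprint, and folder paths are placeholders you would need to adjust:

```powershell
# Sketch: recursive SFTP upload via the WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"  # adjust to your WinSCP install
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = $HostIP
    UserName = $User
    Password = $Password
    SshHostKeyFingerprint = "ssh-rsa 2048 xxxxxxxx..."  # placeholder fingerprint
}
$session = New-Object WinSCP.Session
try {
    $session.Open($sessionOptions)
    # Trailing \* uploads the folder contents, including subfolders
    $session.PutFiles("C:\uploading-folder\*", "/home/new-folder/").Check()
}
finally {
    $session.Dispose()
}
```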
Code :
$folders = @('D:\Shares\folderone\*.pdf','D:\Shares\foldertwo\*.pdf')
foreach ($folder in $folders) {
$files = Get-ChildItem $folder
foreach ($file in $files) {
Set-SFTPItem -SessionId $SFTPSession.SessionId -Path $file.FullName -Destination /upload -Verbose
}
}
I am busy writing a PowerShell script for uploading database backup files to a different location. See my code below:
$Dir="\\server\COM\"
#ftp server
$ftp = "ftp://ftp.xyz.com/"
$user = "user1"
$pass = "pass1"
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
#Lets upload latest backup file
$latest = Get-ChildItem -Path $dir -Filter *.bak | Sort-Object LastWriteTime -Descending |Select-Object -First 1
Write-Output "The latest db backup file is $latest. Let's start uploading"
"Uploading $latest..."
$uri = New-Object System.Uri($ftp+$latest.Name)
$webclient.UploadFile($uri, $latest.FullName)
I don't see anything wrong with my script, but for some reason it is not working and I can't figure out what the problem is. I am getting the error below:
Exception calling "UploadFile" with "2" argument(s): "The remote server returned an error: (534) 534 Policy requires SSL.
I am still learning PowerShell and not so good at scripting. Can anyone assist?
Thanks in advance!
Use this script:
#$Dir=""
$Dir = ""
$CertificateFingerprint = ""
$newFileName = Get-ChildItem -Path $dir -Filter *.bak | Sort-Object LastWriteTime -Descending |Select-Object -First 1
$Day = Get-Date
$UploadDay= $Day.DayOfWeek
If($UploadDay -eq "Tuesday" ){
Write-Output "The latest db backup file is $newFileName. Let's start uploading"
Add-FTPItem -Path "" -LocalPath "" -Username "" -Password "" -FTPHost ""
Write-Output "Upload completed!"
}else {
"Upload failed!"
}
#$DiffBackupDir="\\"
$DiffBackupDir = ""
$DiffBackup = Get-ChildItem -Path $DiffBackupDir -Filter *.bak | Sort-Object LastWriteTime -Descending |Select-Object -First 1
Write-Output "The latest db backup file is $DiffBackup. Let's start uploading"
#Upload of a differential backup to FTP Folder
Add-FTPItem -Path "" -LocalPath "" -Username "" -Password "" -FTPHost ""
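The 534 response means the server insists on FTPS, and System.Net.WebClient has no switch to enable SSL for FTP. One alternative to the module-based approach above is FtpWebRequest, which does expose an EnableSsl property. A minimal, untested sketch reusing $ftp, $user, $pass, and $latest from the question's script:

```powershell
# Sketch: upload over FTPS using FtpWebRequest with EnableSsl
$request = [System.Net.FtpWebRequest]::Create($ftp + $latest.Name)
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$request.Credentials = New-Object System.Net.NetworkCredential($user, $pass)
$request.EnableSsl = $true   # satisfies "534 Policy requires SSL"
$bytes = [System.IO.File]::ReadAllBytes($latest.FullName)
$stream = $request.GetRequestStream()
$stream.Write($bytes, 0, $bytes.Length)
$stream.Close()
```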