I am busy writing a PowerShell script for uploading database backup files to a different location. See my code below:
$Dir="\\server\COM\"
#ftp server
$ftp = "ftp://ftp.xyz.com/"
$user = "user1"
$pass = "pass1"
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
#Lets upload latest backup file
$latest = Get-ChildItem -Path $dir -Filter *.bak | Sort-Object LastWriteTime -Descending |Select-Object -First 1
Write-Output "The latest db backup file is $latest. Let's start uploading"
"Uploading $latest..."
$uri = New-Object System.Uri($ftp+$latest.Name)
$webclient.UploadFile($uri, $latest.FullName)
I don't see anything wrong with my script, but for some reason it is not working and I can't figure out what the problem is. I am getting the error below:
Exception calling "UploadFile" with "2" argument(s): "The remote server returned an error: (534) 534 Policy requires SSL."
I am still learning PowerShell and not so good at scripting. Can anyone assist?
Thanks in advance!
Use this script (Add-FTPItem comes from the PSFTP module, so install and import that module first):
#$Dir=""
$Dir = ""
$CertificateFingerprint = ""
$newFileName = Get-ChildItem -Path $dir -Filter *.bak | Sort-Object LastWriteTime -Descending |Select-Object -First 1
$Day = Get-Date
$UploadDay= $Day.DayOfWeek
If($UploadDay -eq "Tuesday" ){
Write-Output "The latest db backup file is $newFileName. Let's start uploading"
Add-FTPItem -Path "" -LocalPath "" -Username "" -Password "" -FTPHost ""
Write-Output "Upload completed!"
}else {
"Upload failed!"
}
#$DiffBackupDir="\\"
$DiffBackupDir = ""
$DiffBackup = Get-ChildItem -Path $DiffBackupDir -Filter *.bak | Sort-Object LastWriteTime -Descending |Select-Object -First 1
Write-Output "The latest db backup file is $DiffBackup. Let's start uploading"
#Upload of a differential backup to FTP Folder
Add-FTPItem -Path "" -LocalPath "" -Username "" -Password "" -FTPHost ""
I have multiple files in an SFTP folder that were created at roughly the same time of day, but with different dates in their filenames.
Example file:
REGISTRATION_ELI_20210422_071008.csv
REGISTRATION_ELI_20210421_071303.csv
REGISTRATION_ELI_20210420_071104.csv
I want to copy the one file with today's date in its filename to a local folder. Which property can I use to list the files and copy the one that matches today's date?
#Setting credentials for the user account
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ("gomgom", $password)
$SFTPSession = New-SFTPSession -ComputerName 172.16.xxx.xxx -Credential $creds -AcceptKey
# Set local file path and SFTP path
$LocalPath = "D:\WORK\Task - Script\20221010 - AJK - ITPRODIS380 - upload file csv ke sql server\csvfile"
$SftpPath = '/Home Credit/Upload/REGISTRATION_ELI_*.csv'
Get-SFTPItem -SessionId $SFTPSession.SessionID -Path $SftpPath -Destination $LocalPath | Sort-Object {[datetime] ($_.BaseName -replace '^.+_(\d{4})(\d{2})(\d{2})_(\d{2})(\d{2})', '$1-$2-$3 $4:$5:') } | Select-Object Name
Remove-SFTPSession $SFTPSession -Verbose
You can try changing your $SftpPath like this:
$SftpPath = "/Home Credit/Upload/REGISTRATION_ELI$([datetime]::Now.toString('yyyyMMdd'))_*.csv"
This introduces today's date into the path you look for:
/Home Credit/Upload/REGISTRATION_ELI_20221221_*.csv
Alternatively, you can solve the problem by first listing the remote files and then downloading the ones whose names match today's date. I haven't tested this, but it could look something like the following:
$SftpPath = '/Home Credit/Upload'
$Pattern = "REGISTRATION_ELI_$([datetime]::Now.ToString('yyyyMMdd'))_*.csv"
$Files = Get-SFTPChildItem -SessionId $SFTPSession.SessionId -Path $SftpPath | Where-Object { $_.Name -like $Pattern }
foreach ($file in $Files)
{
    Get-SFTPItem -SessionId $SFTPSession.SessionID -Path $file.FullName -Destination $LocalPath
}
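Note that New-SFTPSession, Get-SFTPChildItem, Get-SFTPItem, and Remove-SFTPSession all come from the Posh-SSH module, so it needs to be installed (Install-Module Posh-SSH) before either snippet will run.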
I am looking for a way to compact and repair all the Access databases in a certain directory using a PowerShell script.
The VBA code below works, but I need a PowerShell equivalent:
Find all Access databases, and Compact and Repair
I am new to PowerShell, so I would be grateful for any assistance.
Thanks
You may try this.
Add-Type -AssemblyName Microsoft.Office.Interop.Access
$rootfolder = 'c:\some\folder'
$createlog = $true # change to false if no log desired
$access = New-Object -ComObject access.application
$access.Visible = $false
$access.AutomationSecurity = 1
Get-ChildItem -Path $rootfolder -File -Filter *.accdb -Recurse -PipelineVariable file | ForEach-Object {
$newname = Join-Path $file.Directory ("{0}_compacted{1}" -f $file.BaseName,$file.Extension)
$message = #"
Current file: {0}
Output file: {1}
"# -f $file.FullName,$newname
Write-Host $message -ForegroundColor Cyan
$access.CompactRepair($file.fullname,$newname,$createlog)
}
$access.Quit()
This will output each compacted database as the name of the original file with _compacted appended (before the extension). I have tested this in every way except actually compacting databases.
Edit
Regarding your comment, a few minor changes should achieve the desired result. Keep in mind that this will put all the new files in the same folder; that may not be an issue in your case, but if there are duplicate file names across subfolders you will have collisions (see the sketch after the script for one way around that).
$rootfolder = 'c:\some\folder'
$destination = 'c:\some\other\folder'
$todaysdate = get-date -format '_dd_MM_yyyy'
Add-Type -AssemblyName Microsoft.Office.Interop.Access
$createlog = $true # change to false if no log desired
$access = New-Object -ComObject access.application
$access.Visible = $false
$access.AutomationSecurity = 1
Get-ChildItem -Path $rootfolder -File -Filter *.accdb -Recurse -PipelineVariable file | ForEach-Object {
$newname = Join-Path $destination ("{0}$todaysdate{1}" -f $file.BaseName,$file.Extension)
$message = #"
Current file: {0}
Output file: {1}
"# -f $file.FullName,$newname
Write-Host $message -ForegroundColor Cyan
$access.CompactRepair($file.fullname,$newname,$createlog)
}
$access.Quit()
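If duplicate base names across subfolders are a realistic risk, one way to sidestep the collisions mentioned above is to fold the source subfolder into the output name. An untested sketch of just the $newname line, reusing $rootfolder, $destination, $todaysdate, and the $file pipeline variable from the script above:
# Build a prefix from the file's location relative to $rootfolder, e.g.
# 'c:\some\folder\sub\db.accdb' becomes 'sub_db_dd_MM_yyyy.accdb'
$relative = $file.DirectoryName.Substring($rootfolder.Length).Trim('\')
$prefix = if ($relative) { ($relative -replace '\\', '_') + '_' } else { '' }
$newname = Join-Path $destination ("{0}{1}$todaysdate{2}" -f $prefix, $file.BaseName, $file.Extension)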
I have been writing a script.
Workflow:
Get a list of all fixed disks (excluding CD-ROM, floppy, and USB drives)
Check whether a path exists in PowerShell
Check whether a Deny permission already exists on a directory in PowerShell
Set a Deny permission for write access for users
My questions are:
1. After the path-existence check below, I also want to check whether a Deny permission already exists on the directory ("$drive\usr\local\ssl").
If(!(test-path $path))
{
New-Item -ItemType Directory -Force -Path $path
}
2. There are about 1000 machines. How can I improve this script?
Thanks in advance,
Script:
$computers = import-csv -path "c:\scripts\machines.csv"
Foreach($computer in $computers){
$drives = Get-WmiObject Win32_Volume -ComputerName $computer.ComputerName | Where { $_.drivetype -eq '3'} |Select-Object -ExpandProperty driveletter | sort-object
foreach ($drive in $drives) {
$path = "$drive\usr\local\ssl"
$principal = "users"
$Right ="Write"
$rule=new-object System.Security.AccessControl.FileSystemAccessRule($Principal,$Right,"Deny")
If(!(test-path $path))
{
New-Item -ItemType Directory -Force -Path $path
}
try
{
$acl = get-acl $folder
$acl.SetAccessRule($rule)
set-acl $folder $acl
}
catch
{
write-host "ACL failed to be set on: " $folder
}
#### Add-NTFSAccess -Path <path> -Account <accountname> -AccessType Deny -AccessRights <rightstodeny>
}
}
The first thing I noticed is that in your code, you suddenly use an undefined variable $folder instead of $path.
Also, you get the drives from the remote computer, but set this $path (and try to add a Deny rule) on folders on your local machine:
$path = "$drive\usr\local\ssl"
where you should set that to the folder on the remote computer:
$path = '\\{0}\{1}$\usr\local\ssl' -f $computer, $drive.Substring(0,1)
Then, instead of Get-WmiObject, I would nowadays use Get-CimInstance, which should give you some speed improvement as well, and I would add some basic logging so you will know later what happened.
Try this on a small set of computers first:
Note: this assumes you have permission to modify the ACLs on the folders of all these machines.
$computers = Import-Csv -Path "c:\scripts\machines.csv"
# assuming your CSV has a column named 'ComputerName'
$log = foreach ($computer in $computers.ComputerName) {
# first try and get the list of harddisks for this computer
try {
$drives = Get-CimInstance -ClassName Win32_Volume -ComputerName $computer -ErrorAction Stop |
Where-Object { $_.drivetype -eq '3'} | Select-Object -ExpandProperty driveletter | Sort-Object
}
catch {
$msg = "ERROR: Could not get Drives on '$computer'"
Write-Host $msg -ForegroundColor Red
# output a line for the log
$msg
continue # skip this one and proceed on to the next computer
}
foreach ($drive in $drives) {
$path = '\\{0}\{1}$\usr\local\ssl' -f $computer, $drive.Substring(0,1)
$principal = "users"
$Right = "Write"
if (!(Test-Path -Path $path -PathType Container)) {
$null = New-Item -Path $path -ItemType Directory -Force
}
# test if the path already has a Deny on write for the principal
$acl = Get-Acl -Path $path -ErrorAction SilentlyContinue
if (!$acl) {
$msg = "ERROR: Could not get ACL on '$path'"
Write-Host $msg -ForegroundColor Red
# output a line for the log
$msg
continue # skip this one and proceed to the next drive
}
if ($acl.Access | Where-Object { $_.AccessControlType -eq 'Deny' -and
$_.FileSystemRights -band $Right -and
$_.IdentityReference -like "*$principal"}) {
$msg = "INFORMATION: Deny rule already exists on '$path'"
Write-Host $msg -ForegroundColor Green
# output a line for the log
$msg
}
else {
$rule = [System.Security.AccessControl.FileSystemAccessRule]::new($Principal, $Right, "Deny")
# older PS versions use:
# $rule = New-Object System.Security.AccessControl.FileSystemAccessRule $Principal, $Right, "Deny"
try {
$acl.AddAccessRule($rule)
Set-Acl -Path $path -AclObject $acl -ErrorAction Stop
$msg = "INFORMATION: ACL set on '$path'"
Write-Host $msg -ForegroundColor Green
# output a line for the log
$msg
}
catch {
$msg = "ERROR: ACL failed to be set on: '$path'"
Write-Host $msg -ForegroundColor Red
# output a line for the log
$msg
}
}
}
}
# write the log
$log | Set-Content -Path "c:\scripts\SetAccessRuleResults.txt" -Force
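As for the second question: with about 1000 machines, the main gain is to stop handling them one at a time. If PowerShell 7+ is available, the per-computer block above can be fanned out with ForEach-Object -Parallel. A rough, untested sketch (the inner body is abbreviated, and the throttle value is just a starting point to tune):
$computers = Import-Csv -Path "c:\scripts\machines.csv"
$log = $computers.ComputerName | ForEach-Object -Parallel {
    $computer = $_
    try {
        # the same per-computer work as in the script above goes here
        $drives = Get-CimInstance -ClassName Win32_Volume -ComputerName $computer -ErrorAction Stop |
            Where-Object { $_.DriveType -eq 3 } | Select-Object -ExpandProperty DriveLetter | Sort-Object
        "INFORMATION: '$computer' drives: $($drives -join ', ')"
    }
    catch {
        "ERROR: Could not get Drives on '$computer'"
    }
} -ThrottleLimit 16
$log | Set-Content -Path "c:\scripts\SetAccessRuleResults.txt" -Force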
I have this type of code in a backup script.
while($true){
$time = Get-Date -Format HH:mm:ss
$dateCheck = (Get-Date).AddDays(-0).ToString('dd-MM-yyyy')
[int]$check = $check
Get-ChildItem 'C:\fleet-integrator-installer-DHL2\work\matilda\14\done' -File |
Sort-Object -Property CreationTime -Descending |
Select-Object -First 1 |
Copy-Item -Destination \\Ict_nas\dhl\$dateCheck -Force
if($time -eq ('23:59:00')){
$check = $check - $check
}
if($time -eq ('23:59:30') -and $check -eq 0){
New-Item -ItemType "directory" -Path "\\Ict_nas\dhl" -Name $dateCheck | Out-Null
$check++
}
if($time -eq ('00:00:00')){
$fileCount = ( Get-ChildItem C:\fleet-integrator-installer-DHL2\work\matilda\14\done ).Count;
$EmailFrom = "..."
$EmailTo = "..."
$Subject = "Backup DHL Integrator -> IT NAS"
$Body = "The backup was succesful, $fileCount files are copied to the NAS"
$SMTPServer = "smtp.gmail.com"
$SMTPClient = New-Object Net.Mail.SmtpClient($SmtpServer, 587)
$SMTPClient.EnableSsl = $true
$SMTPClient.Credentials = New-Object System.Net.NetworkCredential("", "")
$SMTPClient.Send($EmailFrom, $EmailTo, $Subject, $Body)
Get-ChildItem -Path C:\fleet-integrator-installer-DHL2\work\matilda\14\done -Include *.* -File -Recurse | foreach { $_.Delete()}
}
}
It should automatically create a folder named with today's date on the NAS.
When I run the code separately it works fine, but inside the script it just creates a file with today's date...
Any tips?
Thanks
You are copying to the directory before creating it:
Copy-Item -Destination \\Ict_nas\dhl\$dateCheck -Force
Because the directory does not exist (yet), the destination is interpreted as a file name. So the source file is copied to a file of that name.
I am not sure what your $check logic is supposed to do, but basically you have 2 options:
A) Create the directory first, before copying to it
New-Item "\\Ict_nas\dhl\$dateCheck" -Type Directory | Out-Null
Get-ChildItem 'C:\fleet-integrator-installer-DHL2\work\matilda\14\done' -File |
sort CreationTime -Descending | select -First 1 |
Copy-Item \\Ict_nas\dhl\$dateCheck
B) Specify a full file path for copying
Get-ChildItem 'C:\fleet-integrator-installer-DHL2\work\matilda\14\done' -File |
sort CreationTime -Descending | select -First 1 | foreach {
Copy-Item $_.FullName "\\Ict_nas\dhl\$dateCheck\$($_.Name)"
}
Below is a script that monitors a directory and its subfolders for deposited files. Every 10 minutes or so, it looks for new files and matches them against a database table that tells it where they need to be moved; it then copies the files to a local archive, moves them to their destinations, and inserts a record into another database table with each file's attributes and where it came from and went. If there is no match in the database, or there is a script error, it sends me an email.
However, since files are deposited into the directory constantly, it's possible that a file is still being written when the script executes. As a result, I get the error The process cannot access the file because it is being used by another process. emailed to me all the time. In addition, because I'm not dealing with the error up front, the loop continues and a false entry with incorrect file attributes is inserted into my log table. When the file finally frees up, it gets inserted again.
I'm looking for a way to identify files that have processes attached to them and skip them when the script executes, but several days of web searches and some testing haven't yielded an answer yet.
## CLEAR ERROR LOG
$error.clear()
Write-Host "***File Transfer Script***"
## PARAMETERS
$source_path = "D:\Files\In\"
$xferfail_path = "D:\Files\XferFailed\"
$archive_path = "D:\Files\XferArchive\"
$email_from = "SQLMail <SQLMail#bar.com>"
$email_recip = [STRING]"foo#bar.com"
$smtp_server = "email.bar.com"
$secpasswd = ConvertTo-SecureString "Pa$$w0rd" -AsPlainText -Force
$smtp_cred = New-Object System.Management.Automation.PSCredential ("BAR\SQLAdmin", $secpasswd)
## SQL LOG FUNCTION
function Run-SQL ([string]$filename, [string]$filepath, [int]$filesize, [int]$rowcount, [string]$xferpath)
{
$date = get-date -format G
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server=SQLSERVER;Database=DATABASE;Uid=SQLAdmin;Pwd=Pa$$w0rd;"
$SqlConnection.Open()
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = "INSERT INTO DATABASE..Table VALUES ('$date','$filename','$filepath',$filesize,$rowcount,'$xferpath',0)"
$SqlCmd.Connection = $SqlConnection
$SqlCmd.ExecuteNonQuery()
$SqlConnection.Close()
}
## DETERMINE IF THERE ARE ANY FILES TO PROCESS
$file_count = Get-ChildItem -path $source_path |? {$_.PSIsContainer} `
| Get-ChildItem -path {$_.FullName} -Recurse | Where {$_.psIsContainer -eq $false} | Where {$_.Fullname -notlike "D:\Files\In\MCI\*"} `
| Measure-Object | Select Count
If ($file_count.Count -gt 0)
{
Write-Host $file_count.Count "File(s) Found - Processing."
Start-Sleep -s 5
## CREATE LIST OF DIRECTORIES
$dirs = Get-ChildItem -path $source_path -Recurse | Where {$_.psIsContainer -eq $true} | Where {$_.Fullname -ne "D:\Files\In\MCI"} `
| Where {$_.Fullname -notlike "D:\Files\In\MCI\*"}
## CREATE LIST OF FILES IN ALL DIRECTORIES
$files = ForEach ($item in $dirs)
{
Get-ChildItem -path $item.FullName | Where {$_.psIsContainer -eq $false} | Sort-Object -Property lastWriteTime -Descending
}
## START LOOPING THROUGH FILE LIST
ForEach ($item in $files)
{
## QUERY DATABASE FOR FILENAME MATCH, AND RETURN TRANSFER DIRECTORY
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server=SQLSERVER;Database=DATABASE;Uid=SQLAdmin;Pwd=Pa$$w0rd;"
$SqlConnection.Open()
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = "SELECT F.DirTransfer FROM DATABASE..Files F WHERE '$item.Name.Trim()' LIKE F.FileName"
$SqlCmd.Connection = $SqlConnection
$DirTransfer = $SqlCmd.ExecuteScalar()
$SqlConnection.Close()
If ($DirTransfer) # if there is a match
{
Write-Host $item.FullName"`t->`t"$DirTransfer
$filename = $item.Name
$filepath = $item.FullName
$filesize = $item.Length
If (!($filesize))
{
$filesize = 0
}
$rowcount = (Get-Content -Path $item.FullName).Length
If (!($rowcount))
{
$rowcount = 0
}
$xferpath = $DirTransfer
Run-SQL -filename "$filename" -filepath "$filepath" -filesize "$filesize" -rowcount "$rowcount" -xferpath "$DirTransfer"
Copy-Item -path $item.FullName -destination $DirTransfer -force -erroraction "silentlycontinue"
Move-Item -path $item.FullName -destination $archive_path -force -erroraction "silentlycontinue"
#Write-Host "$filename $filepath $filesize $rowcount $xferpath"
}
Else # if there is no match
{
Write-Host $item.FullName "does not have a mapping"
Move-Item -path $item.FullName -destination $xferfail_path -force
$filename = $item.FullName
$email_body = "$filename `r`n`r`n does not have a file transfer mapping setup"
Send-MailMessage -To $email_recip `
-From $email_from `
-SmtpServer $smtp_server `
-Subject "File Transfer Error - $item" `
-Body $email_body `
-Priority "High" `
-Credential $smtp_cred
}
}
}
## IF NO FILES, THEN CLOSE
Else
{
Write-Host "No File(s) Found - Aborting."
Start-Sleep -s 5
}
## SEND EMAIL NOTIFICATION IF SCRIPT ERROR
If ($error.count -gt 0)
{
$email_body = "$error"
Send-MailMessage -To $email_recip `
-From $email_from `
-SmtpServer $smtp_server `
-Subject "File Transfer Error - Script" `
-Body $email_body `
-Priority "High" `
-Credential $smtp_cred
}
You can use the Sysinternals handle.exe utility to find the open handles on a file. The exe can be downloaded from http://live.sysinternals.com/.
$targetfile = "C:\Users\me\Downloads\The-DSC-Book.docx"
$result = Invoke-Expression "C:\Users\me\Downloads\handle.exe $targetfile" | Select-String ([System.IO.Path]::GetFileNameWithoutExtension($targetfile))
$result
Outputs:
WINWORD.EXE pid: 3744 type: File 1A0: C:\Users\me\Downloads\The-DSC-Book.docx
Alternatively, you can check for errors either via try/catch or by looking at the $error collection after the Move-Item attempt then handle the condition appropriately.
$error.Clear()
Move-Item -path $item.FullName -destination $xferfail_path -force -ea 0
if($error.Count -eq 0) {
# do something useful
}
else {
# do something that doesn't involve spamming oneself
}
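For the try/catch option mentioned above, the shape would be roughly this (same $item and $xferfail_path as in the question; -ErrorAction Stop turns the failure into a catchable terminating error):
try {
    Move-Item -Path $item.FullName -Destination $xferfail_path -Force -ErrorAction Stop
    # do something useful
}
catch {
    # file is probably still locked; skip it and let the next run retry it
}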
To expand on Arluin's answer: it fails if there are spaces in either the handle.exe path or the $targetfile path.
This version works with spaces in both, and also formats the result to give you just the program name, e.g. FreeCommander.exe:
$targetfile = "W:\Apps Folder\File.json"
$result = & "W:\Apps (Portable)\handle.exe" "$targetfile" | Select-String ([System.IO.Path]::GetFileNameWithoutExtension($targetfile))
$result = $result -replace '\s+pid\:.+'
$result
# PS> FreeCommander.exe
One way to avoid file locks caused by running the script on a timer is to use an event-driven approach with a file system watcher, which can execute code when an event such as a new file being created occurs in the folder you are monitoring.
To run code when a file has finished copying, you need to listen for the Changed event. There is a slight issue with this event in that it fires once when the file begins copying and again when it is finished. I got an idea for working around this chicken/egg problem after checking out the module Mike linked to in the comments. I've updated the code below so that it only fires off code once a file has been fully written.
To try it, change $folderToMonitor to the folder you want to monitor and add some code to process the file.
$processFile = {
try {
$filePath = $event.sourceEventArgs.FullPath
[IO.File]::OpenRead($filePath).Close()
#A way to prevent false positives for really small files.
if (-not ($newFiles -contains $filePath)) {
$newFiles += $filePath
#Process $filePath here...
}
} catch {
#File is still being created, we wait till next event.
}
}
$folderToMonitor = 'C:\Folder_To_Monitor'
$watcher = New-Object System.IO.FileSystemWatcher -Property @{
Path = $folderToMonitor
Filter = $null
IncludeSubdirectories = $true
EnableRaisingEvents = $true
NotifyFilter = [System.IO.NotifyFilters]'FileName,LastWrite'
}
$script:newFiles = @()
Register-ObjectEvent $watcher -EventName Changed -Action $processFile > $null
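One caveat, assuming this runs as a standalone script rather than in an interactive session: the subscription only fires while the PowerShell process is alive, so you would need to keep it running, for example:
# Keep the process alive so the event subscription keeps firing; Ctrl+C to stop.
try {
    while ($true) { Start-Sleep -Seconds 1 }
}
finally {
    Get-EventSubscriber | Unregister-Event
    $watcher.Dispose()
}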