I have a PowerShell script designed to go through some backup files, archive the month-end backup (if it's the first of the month), and then delete anything older than 9 days. I'm trying to tell it to exclude the archive folder, but it seems to be ignoring that and I'm unsure why. I'm kind of a noob to PowerShell.
I've tried using -notlike $Exclude+"%", I've tried -notmatch and -ne, and I even tried -notcontains even though I knew that wouldn't work. I tried applying $Exclude to every portion of the code where that folder might be accessed, and it's still making a copy of it.
#If first of the month copy latest backups to Monthly Backup Folder - will copy full folder path
PARAM($BackupPath="\\SomeServer\sqltestbackups\",
$Exclude= "\\SomeServer\sqltestbackups\Archive")
#Grab a recursive list of all subfolders
$SubFolders = dir $BackupPath -Recurse | Where-Object {$_.PSIsContainer} | ForEach-Object -Process {$_.FullName}
#Iterate through the list of subfolders and grab the first file in each
$Year = (get-date).Year
$Month = (get-date).Month
$StartOfMonth = Get-Date -Year $Year -Month $Month -Day 1
$Today = (Get-Date).day
$Date = (GET-DATE).AddDays(-9)
$PrevMonth = (GET-DATE).year.ToString()+(Get-Culture).DateTimeFormat.GetMonthName((Get-Date).AddMonths(-1).month)
$Destination=$Exclude + "\" + $Prevmonth +"\"
IF (!(Test-Path -path $destination)) {New-Item $destination -Type Directory}
IF($Today -eq '5')
{
$Path = $BackupPath #Root path to look for files
$DestinationPath = $Destination #Remote destination for file copy
#Grab a recursive list of all subfolders
$SubFolders = dir $Path -Recurse | Where-Object {$_.PSIsContainer -and $_.DirectoryName -notmatch $Exclude} | ForEach-Object -Process {$_.FullName}
#Iterate through the list of subfolders and grab the first file in each
ForEach ($Folder in $SubFolders)
{
$FullFileName = dir $Folder | Where-Object {!$_.PSIsContainer -and $_.DirectoryName -notmatch $Exclude} | Sort-Object {$_.LastWriteTime} -Descending | Select-Object -First 1
#For every file grab it's location and output the robocopy command ready for use
ForEach ($File in $FullFileName)
{
$FilePath = $File.DirectoryName
$ArchFolder = $File.DirectoryName
$ArchFolder=$ArchFolder.Replace($BackupPath,"")+"\"
$FinalPath=$destinationPath+$ArchFolder
$FileName = $File.Name
robocopy $FilePath $FinalPath $FileName /A-:SH /R:6 /W:30 /Z
}
}
}
# Delete files older than 9 days that are not contained within the month end folder
Get-ChildItem $BackupPath -Recurse |
Where-Object { !($_.PSIsContainer) -and
$_.LastWriteTime -lt $Date -and
$_.Directory -notlike $Exclude+"%" } |
Remove-Item
The code works except for the copy-month-end portion. In this portion it is including the archive folder, and I end up with it copying the previous month. The script is designed to put the files in Archive/YM/FullPathOfBackup, but what is happening is it's going Archive/YM/Archive/YM/FullPath, even though I'm trying SO HARD to exclude this path from the folders.
Example of what's going wrong with the robocopy:
Source : \\SomeServer\sqltestbackups\ARCHIVE\2019March\SomeOtherServer\SQLBackups\SomeDatabase\master\
Dest : \\SomeServer\sqltestbackups\Archive\2019March\ARCHIVE\2019March\SomeOtherServer\SQLBackups\SomeDatabase\master\
The type DirectoryInfo does not have a property DirectoryName. Try the property BaseName instead.
# dir, gci are aliases for Get-ChildItem
Get-ChildItem $Path -Recurse `
| Where-Object { $_.PSIsContainer -and $_.BaseName -notmatch $Exclude }
works.
To see which types are returned and which members they have, type:
Get-ChildItem .\ | Where-Object { $_.PSIsContainer } | ForEach-Object { $_.GetType() }
# and
Get-ChildItem .\ | Where-Object { $_.PSIsContainer } | Get-Member
But be careful: You will get a lot of output on directories with many files and subdirectories.
Also have a look at the answer to How can I exclude multiple folders using Get-ChildItem -exclude? on Stack Overflow. It comes with lots of elegant solutions for PowerShell from v1.0 to v5.0.
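For example, one simple way to exclude several folders by name (the folder names here are only placeholders):
# Exclude any directory whose name appears in the list (names are examples only)
$excludes = 'Archive', 'Temp'
Get-ChildItem $BackupPath -Recurse |
Where-Object { $_.PSIsContainer -and $excludes -notcontains $_.Name } |
ForEach-Object { $_.FullName }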
The problem was
$Exclude= "\\SomeServer\sqltestbackups\Archive"
Since -notmatch treats its right-hand side as a regular expression, I needed to double up the backslashes:
$Exclude= "\\\\SomeServer\\sqltestbackups\\Archive"
After doing this the script worked fine.
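An alternative to escaping the backslashes by hand (a minimal sketch, using the same placeholder paths) is to let .NET build the regex pattern:
# [regex]::Escape escapes every regex metacharacter, including backslashes,
# so the literal path can be compared safely with -notmatch.
$Exclude = "\\SomeServer\sqltestbackups\Archive"
$ExcludePattern = [regex]::Escape($Exclude)
$SubFolders = dir $BackupPath -Recurse |
Where-Object { $_.PSIsContainer -and $_.FullName -notmatch $ExcludePattern } |
ForEach-Object { $_.FullName }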
I want to get all files in subfolders of the same root folder, where the subfolder name contains the string "foo". The code below gives me no error, and no output. I don't know what I'm missing.
Get-ChildItem $rootfolder | where {$_.Attributes -eq 'Directory' -and $_.BaseName -contains 'foo'}) | echo $file
Ultimately, I would like to not just echo their names, but move each file to a target folder.
Thank you.
Here is a solution that includes moving the child files of each folder to a new target folder:
$RootFolder = '.'
$TargetFolder = '.\Test'
Get-ChildItem $RootFolder | Where-Object {$_.PSIsContainer -and $_.BaseName -match 'foo'} |
ForEach-Object { Get-ChildItem $_.FullName |
ForEach-Object { Move-Item $_.FullName $TargetFolder -WhatIf } }
Remove -WhatIf when you are happy it's doing what it should be.
You might need to modify the Get-ChildItem $_.FullName part if you (for example) want to exclude sub-directories of the folders, or if you want to include child items in all subfolders of those paths, but not the folders themselves.
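For instance (a rough sketch, assuming PowerShell v3 or later so the -File switch is available), the Get-ChildItem $_.FullName part could become one of:
# Only the files directly inside each matching folder, no sub-directories:
Get-ChildItem $_.FullName -File
# All files anywhere beneath each matching folder, but not the folders themselves:
Get-ChildItem $_.FullName -Recurse -File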
replace
Get-ChildItem $rootfolder | where {$_.Attributes -match 'Directory' -and $_.basename -Match 'foo'}) | echo $file
with
Get-ChildItem $rootfolder | where {($_.Attributes -eq 'Directory') -and ($_.basename -like '*foo*')} | Move-Item -Destination $targetPath
your request:
that all contain the same string ("foo")
you have to use the -like comparison operator. Also, for an exact match I would use -eq (the case-sensitive version is -ceq) instead of -match, since -match is used for matching substrings and patterns.
Workflow:
Get-ChildItem gets all the items in the directory and sends them through the pipeline to the Where-Object cmdlet, where you filter on the Attributes and BaseName properties. When the filtering is done, the results are sent on to the Move-Item cmdlet.
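A quick illustration of how those operators differ (made-up strings, purely to show the behaviour):
'foobar' -like  '*foo*'   # True  - wildcard comparison against the whole string
'foobar' -match 'foo'     # True  - regex match, substring anywhere
'foobar' -eq    'foo'     # False - exact (case-insensitive) comparison
'foo'    -ceq   'Foo'     # False - exact, case-sensitive comparison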
Adapt the first two vars to your environment.
$rootfolder = 'C:\Test'
$target = 'X:\path\to\whereever'
Get-ChildItem $rootfolder -Filter '*foo*' |
Where {$_.PSiscontainer} |
ForEach-Object {
"Processing folder: {0} " -f $_
Move $_\* -Destination $target
}
I wrote a simple script that will run as a scheduled task every weekend. The script cleans up files older than a given number of days, and you can pass the name of a folder to exclude as a parameter; that folder should not be cleaned up by the script. But somehow the script still deletes some files from the excluded folder - and, strangely, not files matching the conditions of the parameter.
For example, I run the script to delete files older than 15 days and exclude the folder NOCLEANUP. Some files still get deleted from that folder, yet files older than 15 days remain in the NOCLEANUP folder.
Code below
Thanks in advance and apologies for the dirty code. Still new to PS.
Function CleanDir ($dir, $days, $exclude, $logpath)
{
$Limit = (Get-Date).AddDays(-$days)
$Path = $dir
#Folder to exclude
$ToExclude = $exclude
#Log location
$Log= $logpath
$Testpath = Test-Path -PathType Container -Path $Log
if ($Testpath -ne $true)
{
New-Item -ItemType Directory -Force -Path $Log
}
#Logs deleted files
cd $Log
Get-ChildItem -Path $Path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastwriteTime -lt $Limit } | Where-Object { $_.fullname -notmatch $ToExclude} | Where-Object { $_.fullname -notmatch '$RECYCLE.BIN'} | Out-File -FilePath CLEANUPLOG.TXT
# Delete files older than the $Limit.
Get-ChildItem -Path $Path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastwriteTime -lt $Limit } | Where-Object { $_.fullname -notlike $ToExclude} | Remove-Item -Force -Recurse
#Goes into every folder separately and deletes all empty subdirectorys without deleting the root folders.
$Folder = Get-ChildItem -Path $Path -Directory
$Folder.fullname | ForEach-Object
{
Get-ChildItem -Path $_ -Recurse -Force | Where-Object {$_.PSIsContainer -eq $True} | Where-Object {$_.GetFiles().Count -eq 0} | Remove-Item -Force -Recurse
}
}
I need a PowerShell script that will move folders and files from one location to another when they are older than x number of days, with some folders exempted.
It also needs to be able to email a list of the files and folders it moved.
I can move the files in a folder; I'm not sure how to move entire folders, though.
Here is some code I have put together so far; any suggestions would be great.
Set-ExecutionPolicy RemoteSigned
#----- define parameters -----#
#----- get current date ----#
$Now = Get-Date
#----- define amount of days ----#
$Days = "7"
#----- define folder where files are located ----#
$TargetFolder = "C:\test"
$TargetPath = "C:\test5"
#----- define extension ----#
$Extension = "*.*"
#----- define LastWriteTime parameter based on $Days ---#
$LastWrite = $Now.AddDays(-$Days)
#----Exclusion List ----#
$exclude =#('test1', 'test2')
#----- get files based on lastwrite filter and specified folder ---#
$Files = Get-Childitem -path $TargetFolder -Include $Extension -Recurse | Where {$_.LastWriteTime -le "$LastWrite"} -and $_Name -ne $exclude | foreach ($_)} #-
foreach ($File in $Files)
{
if ($File -ne $NULL)
{
write-host "Deleting File $File" -ForegroundColor "DarkRed"
Move-Item $File.FullName $TargetPath -force
}
else
{
Write-Host "No more files to delete!" -foregroundcolor "Green"
}
}
A shorthand version that is supported on PowerShell v3 or higher. This would find all the folders where the LastWriteTime is older than 7 days and move them.
$LastWrite = (Get-Date).AddDays(-7)
gci c:\temp -Directory -Recurse | ?{$_.LastWriteTime -le $LastWrite} | select -expand fullname | %{Move-Item $_ $TargetPath}
There would be no point in file exclusions if you are just looking at the folder time, so that logic is omitted. Same code, but easier to read:
$LastWrite = (Get-Date).AddDays(-7)
Get-ChildItem $TargetFolder -Directory -Recurse | Where-Object{$_.LastWriteTime -le $LastWrite} | Select-Object -ExpandProperty FullName | ForEach-Object{
Move-Item $_ $TargetPath
}
Caveat
There could be an issue where you are trying to move a folder and a parent might have been previously moved. Don't really have the test environment to check that right now. Could just use a little test before the copy just in case.
If(Test-Path $_){Move-Item $_ $TargetPath}
Email
A starting point for working with email would be Send-MailMessage. There are other methods as well.
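A rough sketch of how that could tie in (the SMTP server and addresses are placeholders, not anything from your environment):
# Collect the folders first, move them, then mail the list of what was moved.
$LastWrite = (Get-Date).AddDays(-7)
$moved = Get-ChildItem $TargetFolder -Directory -Recurse |
Where-Object { $_.LastWriteTime -le $LastWrite } |
Select-Object -ExpandProperty FullName
$moved | ForEach-Object { If(Test-Path $_){ Move-Item $_ $TargetPath } }
# Send-MailMessage is built in; swap in your own server and mailbox details.
Send-MailMessage -From 'cleanup@example.com' -To 'admin@example.com' `
-Subject 'Folders moved by cleanup script' `
-Body ($moved -join "`r`n") `
-SmtpServer 'smtp.example.com'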
Folder Exclusion
If you wanted to omit certain folders there are a couple of ways to accomplish that. If you know the whole folder name you want to omit, you could add this $exclude = @('test1', 'test2') like you already have and change the Where clause.
Where-Object{$_.LastWriteTime -le $LastWrite -and $exclude -notcontains $_.Name}
If you didn't know the whole name, and $exclude only contained partial names, you could do this as well, using a little regex:
$exclude = @('test1', 'test2')
$exclude = "({0})" -f ($exclude -join "|")
#..... other stuff happens
Where-Object{$_.LastWriteTime -le $LastWrite -and $_.Name -notmatch $exclude}
I need some help here: the following script needs to be changed so that it deletes only folders in the subdirectories, not files. Can anyone help me?
$path = "C:\test\1"
$keep = 3
$strLogFileName = "c:\test\yourlogfile.log";
function Log-Message
{
Param ([string]$logtext)
Add-content $strLogFileName -value $logtext
}
$dirs = Get-ChildItem -Path $path -Recurse | Where-Object {$_.PsIsContainer}
foreach ($dir in $dirs) {
$files = Get-ChildItem -Path $dir.FullName | Where-Object {-not $_.PsIsContainer -and $_.name -like "*.zip"}
if ($files.Count -gt $keep) {
$files | Sort-Object CreationTime -desc| Select-Object -First ($files.Count - $keep) |
% { $dt=get-date;(Log-Message "Deleting File $_ on $dt");$_ }| Remove-Item -Force
}
}
My original answer was -WAY- off base, and having RTFQ I've got something that should work for you.
function Remove-LargeFolders
{
Param([string]$RootPath)
$keep = 5
#Get a list of the dirs in the first level of the folder
$dirs = Get-ChildItem -Path $RootPath | Where-Object {$_.PsIsContainer}
foreach ($dir in $dirs) {
#Call function on the new folder to check all the sublevels before deleting
#the top-level folder.
Remove-LargeFolders $dir.FullName
$files = Get-ChildItem -Path $dir.FullName | Where-Object {-not $_.PsIsContainer `
-and $_.name -like "*.zip"}
if ($files.Count -gt $keep) {
$dt=get-date
Log-Message "Deleting Folder $dir on $dt"
$dir
Remove-Item $dir.FullName -Force -Recurse
}
}
}
This should do what you're looking for. The answer was recursion! Essentially, the script only looks at one level of folders at a time. It digs down to the bottom of each folder tree before working its way back up to the top.
This script will work as written, but there is a proviso. Right now, if the script is targeting a structure like this:
Target Folder (4 files)
    SubFolder (10 files)
        SubSubFolder (5 files)
It will check SubSubFolder first and not delete it, since it has 5 or fewer files. However, once it hops back up to SubFolder, it will see a folder that is too big and kill it off, getting rid of SubSubFolder in the process. If you want a way around that, you'll need to build in a check that looks at whether each folder still contains another folder before deleting it, along the lines of the sketch below.
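Something like this could work (an untested sketch of the idea, replacing the if test inside the function):
# Only treat the folder as deletable when it has no sub-folders left, so
# SubSubFolder gets evaluated on its own instead of being swept away with SubFolder.
$subDirs = Get-ChildItem -Path $dir.FullName | Where-Object { $_.PsIsContainer }
if ($files.Count -gt $keep -and @($subDirs).Count -eq 0) {
$dt = Get-Date
Log-Message "Deleting Folder $dir on $dt"
Remove-Item $dir.FullName -Force -Recurse
}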
Hope this helps more!
I would like to delete only the files that were created more than 15 days ago in a particular folder. How could I do this using PowerShell?
The given answers will only delete files (which admittedly is what is in the title of this post), but here's some code that will first delete all of the files older than 15 days, and then recursively delete any empty directories that may have been left behind. My code also uses the -Force option to delete hidden and read-only files as well. Also, I chose to not use aliases as the OP is new to PowerShell and may not understand what gci, ?, %, etc. are.
$limit = (Get-Date).AddDays(-15)
$path = "C:\Some\Path"
# Delete files older than the $limit.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force
# Delete any empty directories left behind after deleting the old files.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { $_.PSIsContainer -and (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null } | Remove-Item -Force -Recurse
And of course if you want to see what files/folders will be deleted before actually deleting them, you can just add the -WhatIf switch to the Remove-Item cmdlet call at the end of both lines.
If you only want to delete files that haven't been updated in 15 days, vs. created 15 days ago, then you can use $_.LastWriteTime instead of $_.CreationTime.
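For example, the file-deletion line then becomes (same command as above, with only the property swapped):
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $limit } | Remove-Item -Force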
The code shown here is PowerShell v2.0 compatible, but I also show this code and the faster PowerShell v3.0 code as handy reusable functions on my blog.
Just simply (PowerShell v5):
Get-ChildItem "C:\temp" -Recurse -File | Where CreationTime -lt (Get-Date).AddDays(-15) | Remove-Item -Force
Another way is to subtract 15 days from the current date and compare CreationTime against that value:
$root = 'C:\root\folder'
$limit = (Get-Date).AddDays(-15)
Get-ChildItem $root -Recurse | ? {
-not $_.PSIsContainer -and $_.CreationTime -lt $limit
} | Remove-Item
Basically, you iterate over files under the given path, subtract the CreationTime of each file found from the current time, and compare against the Days property of the result. The -WhatIf switch will tell you what will happen without actually deleting the files (it lists which files would be deleted); remove the switch to actually delete the files:
$old = 15
$now = Get-Date
Get-ChildItem $path -Recurse |
Where-Object {-not $_.PSIsContainer -and $now.Subtract($_.CreationTime).Days -gt $old } |
Remove-Item -WhatIf
Try this:
dir C:\PURGE -recurse |
where { ((get-date)-$_.creationTime).days -gt 15 } |
remove-item -force
Esperento57's script doesn't work in older PowerShell versions. This example does:
Get-ChildItem -Path "C:\temp" -Recurse -Force -ErrorAction SilentlyContinue | Where-Object { ($_.LastWriteTime -lt (Get-Date).AddDays(-15)) -and (!$_.PSIsContainer) } | Remove-Item -Verbose -Force -Recurse -ErrorAction SilentlyContinue
If you are having problems with the above examples on a Windows 10 box, try replacing .CreationTime with .LastWriteTime. This worked for me.
dir C:\locationOfFiles -ErrorAction SilentlyContinue | Where { ((Get-Date)-$_.LastWriteTime).days -gt 15 } | Remove-Item -Force
Another alternative (15. gets typed to [timespan] automatically):
ls -file | where { (get-date) - $_.creationtime -gt 15. } | Remove-Item -Verbose
#----- Define parameters -----#
#----- Get current date ----#
$Now = Get-Date
$Days = "15" #----- define amount of days ----#
$Targetfolder = "C:\Logs" #----- define folder where files are located ----#
$Extension = "*.log" #----- define extension ----#
$Lastwrite = $Now.AddDays(-$Days)
#----- Get files based on lastwrite filter and specified folder ---#
$Files = Get-Childitem $Targetfolder -Include $Extension -Recurse | where {$_.LastWriteTime -le $Lastwrite}
foreach ($File in $Files)
{
if ($File -ne $Null)
{
write-host "Deleting File $File" -BackgroundColor "DarkRed"
Remove-item $File.Fullname | out-null
}
else {
write-host "No more files to delete" -ForegroundColor "Green"
}
}
$limit = (Get-Date).AddDays(-15)
$path = "C:\Some\Path"
# Delete files older than the $limit.
Get-ChildItem -Path $path -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force -Recurse
This will delete old folders and their content.
The following code will delete files older than 15 days in a folder.
$Path = 'C:\Temp'
$Daysback = "-15"
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.AddDays($Daysback)
Get-ChildItem $Path -Recurse | Where-Object { $_.LastWriteTime -lt $DatetoDelete } | Remove-Item