We have a small script that creates folders named after the daily date, and I have a script that deletes folders which are older than 30 days.
dir "\\nas\Backup_old\*" -ErrorAction SilentlyContinue |
Where { ((Get-Date) - $_.LastWriteTime).days -gt 30} |
Get-ChildItem -Recurse | Remove-Item -Recurse -Force
In principle it works fine: the subfolders and their content are deleted.
But the main folder still exists, its LastWriteTime is changed to the time the script ran, and the folder is left empty. Does anyone have an idea how to solve this problem?
You probably just need to remove the second instance of Get-ChildItem (noting that dir is just an alias for Get-ChildItem), as that is causing it to remove the children of each of the directories returned by the first:
Get-ChildItem "\\nas\Backup_old\*" -ErrorAction SilentlyContinue |
Where-Object { ((Get-Date) - $_.LastWriteTime).days -gt 30} |
Remove-Item -Recurse -Force -WhatIf
Have a look at the WhatIf output and if it looks like it will now remove what you expect, remove -WhatIf.
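If you want to double-check the alias relationship mentioned above, a quick look-up (just an illustration, not part of the fix) is:
# dir resolves to Get-ChildItem, so the original pipeline was effectively calling Get-ChildItem twice
Get-Alias dir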
Related
I need to write a script that deletes folders with a last write time older than ~7 days, but keeps 2 "special" folders with their content.
Here's my script so far:
$source = "D:\TestOrdner"
$time = (Get-Date)#.AddDays(-7)
Start-Transcript "C:\log_files\log.txt"
gci $source -Recurse | ?{$_.LastWriteTime -lt $time} | del -Force -Verbose
Stop-Transcript
My only problem is how to EXCLUDE the folders with content?
My folder to keep: D:\TestOrdner\Test.
Be careful when deleting user profile folders: keep the -WhatIf switch until you are absolutely sure the script below will not delete folders that should not be deleted.
It might be a good idea to move these folders instead of deleting them? A sketch of that approach follows the script below.
Since this concerns a user profile directory where every user has his/her own folder directly under the root, there is no need for the -Recurse switch on Get-ChildItem.
Anyhow, this should do it:
$source = "D:\TestOrdner"
$time = (Get-Date).AddDays(-7)
Start-Transcript "C:\log_files\log.txt"
Get-ChildItem $source -Directory -Exclude 'Administrator','Default','Public' |
Where-Object {$_.LastWriteTime -lt $time} |
Remove-Item -Recurse -Force -confirm:$false -WhatIf
Stop-Transcript
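If the move-instead-of-delete idea appeals to you, here is a minimal sketch; the archive path D:\TestOrdner_Archive is only an assumption, so substitute whatever target you prefer:
$source  = "D:\TestOrdner"
$archive = "D:\TestOrdner_Archive"   # assumed archive location - adjust as needed
$time    = (Get-Date).AddDays(-7)
# Create the archive folder if it does not exist yet
if (-not (Test-Path $archive)) { New-Item -Path $archive -ItemType Directory | Out-Null }
Get-ChildItem $source -Directory -Exclude 'Administrator','Default','Public' |
    Where-Object { $_.LastWriteTime -lt $time } |
    Move-Item -Destination $archive -WhatIf
As before, keep -WhatIf until you have verified the list of folders it would move.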
I am new to PowerShell, and based on information gathered on the net I have created a script that performs a delete operation on files found within a folder whose LastWriteTime is more than 1 day old.
Currently the script is as follows:
$timeLimit = (Get-Date).AddDays(-1)
$oldBackups = Get-ChildItem -Path $dest -Recurse -Force -Filter "backup_cap_*" |
Where-Object {$_.PSIsContainer -and $_.LastWriteTime -lt $timeLimit}
foreach($backup in $oldBackups)
{
Remove-Item $dest\$backup -Recurse -Force -WhatIf
}
As far as I know, the -WhatIf switch outputs to the console what the command "should" do in a real run. The problem is that -WhatIf does not output anything, and even if I remove it the files are not deleted as expected.
The server is Windows 2012 R2 and the command is being run within PowerShell ISE V3.
Once the command works, it will be "translated" into a scheduled task that runs each night after another task has finished backing up some stuff.
I did it in the pipeline:
Get-ChildItem C:\temp | ? { $_.PSIsContainer -and $_.LastWriteTime -lt $timeLimit } | Remove-Item -WhatIf
This worked for me, and this way you don't have to take care of building the right path to each file.
Another solution:
$timeLimit = (Get-Date).AddDays(-1)
Get-ChildItem C:\temp2 -Directory | where LastWriteTime -lt $timeLimit | Remove-Item -Force -Recurse
The original issue is that $dest\$backup assumes every item sits directly in the root folder. By using the FullName property of $backup, you don't need to build the path yourself.
One other note: Remove-Item accepts an array of paths, so you can also get rid of the foreach loop.
Here's the fix to your script, without using the pipeline. Note that since it uses the .where() method, this requires at least PowerShell version 4:
$timeLimit = (Get-Date).AddDays(-1)
$Backups = Get-ChildItem -Path $dest -Directory -Recurse -Force -Filter "backup_cap_*"
$oldBackups = $backups.where{$_.LastWriteTime -lt $timeLimit}
Remove-Item $oldBackups.fullname -Recurse -Force -WhatIf
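Since you mentioned PowerShell ISE V3, here is a rough equivalent for version 3 that swaps the .where() method for Where-Object (a sketch, assuming $dest is defined as in your script):
$timeLimit  = (Get-Date).AddDays(-1)
# Where-Object works on PowerShell 3, unlike the .where() method used above
$oldBackups = Get-ChildItem -Path $dest -Directory -Recurse -Force -Filter "backup_cap_*" |
    Where-Object { $_.LastWriteTime -lt $timeLimit }
Remove-Item $oldBackups.FullName -Recurse -Force -WhatIf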
Using PowerShell I'd like to search a directory tree which will have a subset of folders. If a file called NOW is present within those folders and is 3 days old I'd like to delete the parent directory.
I think I have the search syntax right and am piping to a foreach loop, but I can't figure out how to remove the parent directory.
Get-ChildItem -Path C:\tools\test1 -Filter NOW -Recurse |
foreach ($_) ???
Any help would be much appreciated, thanks
Get-ChildItem returns System.IO.FileInfo objects for files, and one of their properties is Directory. So what you want to remove is that directory. The Directory property is itself an object, and we need the full path from it:
Remove-Item $_.Directory.FullName -Force -Recurse
The above removes the folder where NOW resides, along with its contents. But you have another condition for age. There are a couple of ways to handle this; one is to use New-TimeSpan to compare the creation time to now, using the Days property of the TimeSpan:
(New-TimeSpan -start $_.CreationTime -end ([datetime]::Now)).Days -gt 3
Putting that together with what you already have; -File will ensure we don't get folder matches.
$refdate = (Get-Date).Date.AddDays(-3)
Get-ChildItem -Path "C:\tools\test1" -Filter "NOW" -Recurse -File |
Where-Object{$_.CreationTime -lt $refdate} |
ForEach-Object{ Remove-Item $_.Directory.FullName -Force -Recurse -WhatIf }
The -WhatIf will help you identify the folders this process would attempt to remove. If you don't have at least PowerShell version 3, you could do this instead:
$refdate = (Get-Date).Date.AddDays(-3)
Get-ChildItem -Path "C:\tools\test1" -Filter "NOW" -Recurse |
Where-Object{(!$_.PSIsContainer) -and ($_.CreationTime -lt $refdate)} |
ForEach-Object{ Remove-Item $_.Directory.FullName -Force -Recurse -WhatIf }
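If you would rather keep the New-TimeSpan form shown earlier, the same filter can be written like this (just a sketch of the alternative, with the same -WhatIf safety net):
# Age check done with New-TimeSpan instead of a pre-computed reference date
Get-ChildItem -Path "C:\tools\test1" -Filter "NOW" -Recurse -File |
    Where-Object { (New-TimeSpan -Start $_.CreationTime -End ([datetime]::Now)).Days -gt 3 } |
    ForEach-Object { Remove-Item $_.Directory.FullName -Force -Recurse -WhatIf }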
I am trying to delete all files within a folder, but there is one folder called pictures which I would like to keep, and I don't know how to do that. I am using the following script; it deletes everything in the folder:
if ($message -eq 'y')
{
get-childitem "C:\test" -recurse | % {
remove-item $_.FullName -recurse
}
}
One solution is to use something like:
Get-ChildItem -Path "c:\test" -Recurse | Where-Object { $_.FullName -cnotmatch "\\Pictures($|\\)" -and (Get-ChildItem $_.FullName -Include "Pictures" -Recurse).Length -eq 0 } | Remove-Item -Recurse -ErrorAction SilentlyContinue;
I suspect there must be a far more elegant way to do this. Here's what it does: it enumerates everything in the C:\test folder recursively (Get-ChildItem), then Where-Object drops from the result list every item whose path contains the directory to be excluded (specified using regex syntax), as well as every item that has a child matching the excluded file or directory name. The resulting list is fed to Remove-Item for removal. The -ErrorAction SilentlyContinue switch prevents errors from being logged during the recursive removal.
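To see what the "\\Pictures($|\\)" pattern keeps and drops, here is a small illustration (the sample paths are made up):
$pattern = "\\Pictures($|\\)"
# -cnotmatch is case-sensitive; True means the path clears the regex part of the filter
"C:\test\Pictures"           -cnotmatch $pattern   # False - the excluded folder itself
"C:\test\Pictures\img01.jpg" -cnotmatch $pattern   # False - anything inside it
"C:\test\PicturesOld\a.txt"  -cnotmatch $pattern   # True  - similar name, still removable
"C:\test\docs\readme.txt"    -cnotmatch $pattern   # True  - everything else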
Get-ChildItem $PSScriptRoot -Force | Where-Object {$_.Name -ne "Pictures"} | Remove-Item -Recurse
I just tried this, and it worked for me. If you want to change what is excluded from deletion, just change "Pictures". This uses $PSScriptRoot for the path, which is the directory the PowerShell script runs from; you can replace it with the path you want to delete from.
I'm running the following command in a directory that is the root of a Mercurial repository. I want to delete any files and folders beginning with ".hg":
gci -rec -filter ".hg*" | remove-item -recurse -force
The strange thing is that it does actually work, but it still produces the following exception:
Get-ChildItem : Could not find a part of the path 'C:\temp\WSCopyTest\MyCompany.Services.DaftPunk\.hg\'.
At line:1 char:4
+ gci <<<< -rec -filter ".hg*" | remove-item -recurse -force
+ CategoryInfo : ReadError: (C:\temp\WSCopyT...es.DaftPunk\.hg:String) [Get-ChildItem], DirectoryNotFoundException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
Because the exception is thrown by Get-ChildItem, I suspect my understanding of pipelining in PowerShell is flawed. It's almost as if Get-ChildItem finds an item, passes it to the next cmdlet in the pipeline, and then looks for the next one. Is that how it works?
The following was supposed to be a script that would replicate the problem, but, on the same machine, it works flawlessly:
$repoRoot = "C:\Temp\HgDeleteTest\MyRepo"
remove-item $repoRoot -recurse -force
md $repoRoot
pushd $repoRoot
hg.exe init
add-content ShouldBeLeftBehind.txt "This is some text in a file that should be left behind once the various .hg files/directories are removed."
add-content .hgignore syntax:glob
add-content .hgignore *.dll
add-content .hgignore *.exe
hg.exe commit --addremove --message "Building test repository with one commit."
gci -rec -filter ".hg*" | remove-item -recurse -force
popd
Could it be that you're removing a folder starting with .hg, which in turn contains a file starting with .hg, but that file no longer exists?
I expect Antony is correct: using -Recurse on the gci enumerates both the files and the directories that match ".hg*", and the directory is returned first. Remove-Item then deletes the directory, and the -Force switch deletes all the files in it. Next it tries to delete the files inside that directory that match ".hg*" (which were there when the gci ran), but they're gone now. This should stop the errors:
gci ".hg*" -recurse | select -expand fullname | sort length -desc | remove-item -force
Sorting the full names in descending order of length ensures that no matched parent directory is deleted before all the matched files in that directory are deleted.
The following will restrict its deletions to files only, and exclude folders.
gci -rec -filter ".hg*" | Where {$_.psIsContainer -eq $false} | remove-item -recurse -force
Then run it again to delete the folders:
gci -rec -filter ".hg*" | Where {$_.psIsContainer -eq $true} | remove-item -recurse -force
I ended up separating the search from the removal. I believe my issue was to do with the way piping works by default (i.e. one item at a time) but I've not been able to build a test to check it.
In any case, the following works fine:
$hgObjects = gci -rec | ?{ $_.name -like ".hg*" -and $_.name -ne ".hgignore" }
remove-item $hgObjects -force -recurse
Using this style, the remove-item cmdlet gets an array of items found and works through them.
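As a rough way to build that test (only a sketch, using the throwaway repo created by the test script above): delete each match as it streams through the pipeline and watch the interleaving.
Get-ChildItem "C:\Temp\HgDeleteTest\MyRepo" -Recurse -Filter ".hg*" |
    ForEach-Object {
        # Write-Host fires as soon as Get-ChildItem emits each item, which shows
        # the one-at-a-time streaming; if the .hg directory is removed here while
        # Get-ChildItem still has children of it left to enumerate, the same
        # "Could not find a part of the path" error should surface.
        Write-Host "pipeline received: $($_.FullName)"
        Remove-Item $_.FullName -Recurse -Force
    }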
The .hg folder in my original example didn't have anything in it called .hg*, so I don't see what was going on. It felt more like it was trying to delete the folder twice, which I find very strange. I wonder if it's actually a PowerShell manifestation of this standard .NET behaviour:
An enumerator remains valid as long as the collection remains
unchanged. If changes are made to the collection, such as adding,
modifying or deleting elements, the enumerator can be invalidated and
the next call to MoveNext or Reset can throw an
InvalidOperationException. If the collection is modified between
MoveNext and Current, Current returns the element that it is set to,
even if the enumerator is already invalidated.
(Taken from http://msdn.microsoft.com/en-us/library/system.collections.ienumerable.getenumerator(v=vs.71).aspx).