My ultimate goal is to get a list of top level folders (for a given path) where a file has been modified in the last day.
There are probably a lot of ways to do this. The place where I am having a problem is getting the top level folder only.
Here is what I have so far:
Get-ChildItem -Path "c:\data\*" -recurse |
where-object {$_.lastwritetime -gt (get-date).addDays(-1)} |
where-object {-not $_.PSIsContainer} |
Foreach-Object { $_.DirectoryName} |
sort -unique
It gets all the directories though, not just the top level.
Here's how I would do it
$dirs = dir "sometoplevelpath" |?{ $_.PsIsContainer }
$oneDayAgo = (Get-Date).AddDays(-1)
$dirs |?{ dir $_ -Recurse |?{!$_.PsIsContainer -and $_.LastWriteTime -gt $oneDayAgo } | select -first 1 }
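On PowerShell 3.0 and later you can also let Get-ChildItem do the container filtering with its -Directory and -File switches. A minimal sketch of the same idea, using the question's c:\data path:
$oneDayAgo = (Get-Date).AddDays(-1)
# Keep only top-level folders that contain at least one file written in the last day.
Get-ChildItem "c:\data" -Directory | Where-Object {
    Get-ChildItem $_.FullName -File -Recurse |
        Where-Object { $_.LastWriteTime -gt $oneDayAgo } |
        Select-Object -First 1
}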
You could take the list of folders that you end up with and compare their full path without their name and see if it matches the directory that contains the folders you're interested in:
$folders | Where-Object {$_.FullName.Replace($_.Name,"") -eq $superDirectory}
Where $superDirectory is the path of the directory that contains the "top level directories". With the question's path, that would be "C:\data\" (note that the Replace above leaves a trailing backslash).
You could also investigate the PSParentPath property.
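A rough sketch of that, reusing $folders and $superDirectory from above:
# Assumes $folders holds DirectoryInfo objects (e.g. from Get-Item or Get-ChildItem),
# not plain path strings. PSParentPath is provider-qualified
# (e.g. Microsoft.PowerShell.Core\FileSystem::C:\data), so Convert-Path strips the prefix.
$folders | Where-Object { (Convert-Path $_.PSParentPath) -eq $superDirectory.TrimEnd('\') }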
Another method would be to make a list of potential backup folders first:
$targetFolders = Get-Item -Path "C:\data\*" | Where-Object {$_.PSIsContainer}
And then go through that list to see if they have any items that need backing up, taking action if they do.
$targetFolders | % {
$folderItems = Get-ChildItem $_.FullName | ? { $_.LastWriteTime -gt (Get-Date).AddDays(-1) }  # use your filter here
if (($folderItems | Measure-Object).Count -gt 0){
#Backup the folder, or add $_.FullName to the list of folders that should be backed up.
}
}
Try removing the -recurse
Get-ChildItem -Path "c:\data\*" | where-object {$_.lastwritetime -gt (get-date).addDays(-1)} | where-object {-not $_.PSIsContainer} | Foreach-Object {$_.DirectoryName} | sort -unique
I also changed the $. to $_. See if this works. I got it to give me only the top-level directory names, but I don't have anything I can test against a pattern like "c:\data\*".
My company recently moved to Outlook 365. We are entirely VDI-based, so our user profiles are stored on a single server. As a result, our users all now have 2+ .ost files taking up storage space on the server. I'd like to write a script to find and delete the extraneous .ost files. In addition, I'd like to schedule the script to run on a monthly basis to clean up any orphaned .ost files that occur for any other reason.
I've tried a few different solutions, but can't seem to find the right syntax to identify just the oldest/original .ost in each subdirectory; all attempts have identified either the oldest file in the whole directory tree or all of the .ost files in a directory.
$Path = "<path>"
$SubFolders = dir $Path -Recurse | Where-Object {$_.PSIsContainer} | ForEach-Object -Process {$_.FullName}
ForEach ($Folder in $SubFolders)
{
$FullFileName = dir $Folder | Where-Object {!$_.PSIsContainer} | Sort-Object {$_.LastWriteTime} -Descending | Select-Object -First 1
}
Inside your loop, you could use the following to list the .ost file that has the oldest LastWriteTime value in each folder. Just add the -Descending flag to Sort-Object to list the newest file instead.
$FullFileName = foreach ($folder in $Subfolders) {
Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
Sort-Object -Property LastWriteTime |
Select-Object -Property FullName -First 1
}
$FullFileName
If there is only one .ost file found in the $folder path, it will still find that file, so you will need logic to not delete anything when there is only one file. Also, sorting by LastWriteTime does not guarantee you get the oldest file; you probably want a combination of the oldest CreationTime and the newest LastWriteTime. The following will list the oldest .ost file based on CreationTime.
$FullFileName = foreach ($folder in $Subfolders) {
Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
Sort-Object -Property CreationTime |
Select-Object -Property FullName -First 1
}
$FullFileName
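One way to combine the two properties, as a rough and untested sketch: treat the file with the newest LastWriteTime as the OST that is still in use, and report everything else in the folder (oldest CreationTime first) as deletion candidates:
$deleteCandidates = foreach ($folder in $Subfolders) {
    $ostFiles = Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost"
    if ($ostFiles.Count -ge 2) {
        # Skip the most recently written file, report the rest oldest-first.
        $ostFiles | Sort-Object -Property LastWriteTime -Descending |
            Select-Object -Skip 1 |
            Sort-Object -Property CreationTime
    }
}
$deleteCandidates | Select-Object -Property FullName, CreationTime, LastWriteTime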
Another issue is setting the $FullFileName variable inside of the foreach loop. This means it will be overwritten through each loop iteration. Therefore, if you retrieve the value after the loop completes, it will only have the last value found. Setting the variable to be the result of the foreach loop output will create an array with multiple values.
To only output an OST file path when there are multiple OST files, you can do something like the following:
$FullFileName = foreach ($folder in $Subfolders) {
$files = Get-ChildItem -Path $folder -Recurse -File -Filter "*.ost" |
Sort-Object -Property LastWriteTime -Descending
if ($files.count -ge 2) {
$files | Select-Object -Property FullName -First 1
}
}
$FullFileName
This one-liner should do the job, keeping the .ost file with the newest LastWriteTime:
gci -Path $Path -Directory | where {(gci -Path $_\*.ost).Count -gt 1} | % {gci -Path $_\*.ost | Sort-Object LastWriteTime -Descending | Select-Object -Skip 1 | Remove-Item -WhatIf}
Longer variant follows.
$Path = '<path>'
$Ext = '*.ost'
Get-ChildItem -Path $Path -directory -Recurse |
Where-Object {(Get-ChildItem -Path "$_\$Ext" -File -EA 0).Count -gt 1} |
ForEach-Object {
Get-ChildItem -Path "$_\$Ext" -File -EA 0| Sort-Object LastWriteTime -Descending |
Select-Object -Skip 1 | Remove-Item -WhatIf
}
The Get-ChildItem / Where-Object stage selects the folders that contain more than one .ost file.
The ForEach-Object block then sorts each folder's .ost files by LastWriteTime descending, skips the first (newest) one, and pipes the rest to Remove-Item with the -WhatIf parameter, so while testing it only shows what would be deleted.
You can of course also move them to a backup location instead.
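For example, swapping Remove-Item for Move-Item; $BackupRoot below is a placeholder path:
$BackupRoot = '<backup path>'   # placeholder, point this at your backup location
Get-ChildItem -Path $Path -Directory -Recurse |
    Where-Object { (Get-ChildItem -Path "$_\$Ext" -File -EA 0).Count -gt 1 } |
    ForEach-Object {
        # Keep the newest .ost, move the rest; same -WhatIf safety net as above.
        # Note that same-named files from different folders would collide in a single backup folder.
        Get-ChildItem -Path "$_\$Ext" -File -EA 0 | Sort-Object LastWriteTime -Descending |
            Select-Object -Skip 1 | Move-Item -Destination $BackupRoot -WhatIf
    }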
I have a folder structure with, for example, 100 folders. Each folder has 200 files in it.
I would like to delete (via scheduled task) all files in each folder but keep the last 10 versions of it.
I am trying to upskill in PowerShell, so I am guessing that this should be pretty simple. I have created this script:
#Delete all files, keep last 10 versions#
$Directory = "D:\Octopus\Packages"
$Keep = "10"
Get-ChildItem $Directory| ?{ $_.PSIsContainer } | Select-Object FullName | Export-Csv $Directory\FolderList.csv
$FolderList = import-csv $Directory\FolderList.csv
ForEach ($row in $FolderList)
{
Get-ChildItem -Recurse | where{-not $_.PsIsContainer}| sort CreationTime -desc | select -Skip $Keep | Remove-Item -Force
}
It appears to be looping through each folder, but keeping the last 10 files for the entire folder structure, not per folder. So some folders have 0 files, some may have 2 files, some may have 8 files.
Any pointers would be appreciated
Thanks !
If you actually need to have that CSV, then just change Get-ChildItem -Recurse to Get-ChildItem $row.FullName -Recurse (a sketch of that follows the code below). However, if you don't need to create the CSV, you can remove all of that and just pipe the results of your first Get-ChildItem into the next action.
$Directory = "D:\Octopus\Packages"
$Keep = "10"
Get-ChildItem $Directory| ?{ $_.PSIsContainer } | Select-Object FullName |
ForEach-object {Get-ChildItem $_.fullname -Recurse |
where{-not $_.PsIsContainer}| sort CreationTime -desc |
select -Skip $Keep | Remove-Item -Force }
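If you do want to keep the CSV, a minimal sketch of that first option; the only change is that the loop points Get-ChildItem at each row's FullName:
$Directory = "D:\Octopus\Packages"
$Keep = 10
Get-ChildItem $Directory | ?{ $_.PSIsContainer } | Select-Object FullName |
    Export-Csv $Directory\FolderList.csv -NoTypeInformation
$FolderList = Import-Csv $Directory\FolderList.csv
ForEach ($row in $FolderList)
{
    # Each $row is a CSV record, so use its FullName property as the path.
    Get-ChildItem $row.FullName -Recurse | where { -not $_.PsIsContainer } |
        sort CreationTime -desc | select -Skip $Keep | Remove-Item -Force
}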
I'm trying to run the script below in multiple paths using an array. My goal is to delete folders keeping the last 7 versions, but it is not working as expected. The action is only taking into account the first path D:\Test1.
I believe that I should add something like ($folders in $folders) after ForEach-Object but I don't know how.
Any idea what I'm missing here?
$path = @("D:\Test1","D:\Test2","D:\Test3")
$folders = Get-ChildItem -Path $path -Recurse |
Where-Object { $_.PSIsContainer } |
Group-Object { $_.Name.Split('_')[0] } |
ForEach-Object $Folders {
$_.Group |
sort CreationTime -Descending |
Select -Skip 7 |
foreach { Remove-Item $_.FullName -Force -WhatIf }
}
This should do your job.
$path= @("D:\Test1","D:\Test2","D:\Test3")
$folders= Get-ChildItem -path $path -Recurse | Where-Object {$_.PsIsContainer} |Group-Object {$_.FullName.Split('_')[0] }
ForEach($folder in $folders)
{
$folder.Group | sort CreationTime -Descending | Select -Skip 7|% { Remove-Item $_.fullname -Force -whatIf}
}
I tested this locally and it works fine. I didn't get any errors from your code apart from a few formatting issues; I assigned the grouped results to a variable first and sorted things out from there, because I was getting tangled up in too many nested pipeline objects.
If you use ForEach-Object after a pipeline, it receives the pipeline objects one at a time. If you use a separate foreach statement instead, you have to put each item into a loop variable yourself.
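A quick illustration of the difference, using the grouped $folders from above:
# Pipeline style: ForEach-Object gets each group one at a time as $_
$folders | ForEach-Object { $_.Group | sort CreationTime -Descending | Select -Skip 7 }
# Statement style: foreach needs its own loop variable for each group
foreach ($folder in $folders) {
    $folder.Group | sort CreationTime -Descending | Select -Skip 7
}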
Hope it helps you.
I got the answer from @Robert Israelsson!
" If you change your group-object to not group by name but instead fullname you will get the desired result."
From:
$folders= Get-ChildItem -path $path -Recurse | Where-Object {$_.PsIsContainer} |Group-Object {$_.Name.Split('_')[0] }
To:
$folders= Get-ChildItem -path $path -Recurse | Where-Object {$_.PsIsContainer} |Group-Object {$_.FullName.Split('_')[0] }
And this works perfectly!
I have this PowerShell code that compares 2 directories and removes files if the files no longer exist in the source directory.
For example say I have Folder 1 & Folder 2. I want to compare Folder 1 with Folder 2, If a file doesn't exist anymore in Folder 1 it will remove it from Folder 2.
This code works OK, but I have a problem where it also picks up differences in the files' date/time. I only want it to pick up a difference if the file no longer exists in Folder 1.
Compare-Object $source $destination -Property Name -PassThru | Where-Object {$_.SideIndicator -eq "=>"} | % {
if(-not $_.FullName.PSIsContainer) {
UPDATE-LOG "File: $($_.FullName) has been removed from source"
Remove-Item -Path $_.FullName -Force -ErrorAction SilentlyContinue
}
}
Is there an extra Where-Object {$file1 <> $file2} or something like that?
I am not sure how you are getting the information for $source and $destination; I am assuming you are using Get-ChildItem.
What I would do to eliminate the issue with date/time would be to not capture it in these variables. For example:
$source = Get-ChildItem C:\temp\Folder1 -Recurse | select -ExpandProperty FullName
$destination = Get-ChildItem C:\temp\Folder2 -Recurse | select -ExpandProperty FullName
By doing this you only get the FullName property for each child item, not the date/time.
You would need to change some of the script after doing this for it to still work.
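For example, one way to adjust the rest of the script would be to compare paths relative to each root instead, so files in subfolders still line up. A rough, untested sketch using the example folders from above:
$sourceRoot      = 'C:\temp\Folder1'
$destinationRoot = 'C:\temp\Folder2'
# Build lists of paths relative to each root so the same file in both trees compares equal.
$source      = Get-ChildItem $sourceRoot -Recurse | Where-Object { -not $_.PSIsContainer } |
                   ForEach-Object { $_.FullName.Substring($sourceRoot.Length).TrimStart('\') }
$destination = Get-ChildItem $destinationRoot -Recurse | Where-Object { -not $_.PSIsContainer } |
                   ForEach-Object { $_.FullName.Substring($destinationRoot.Length).TrimStart('\') }
# Anything that only exists on the destination side gets removed from Folder2.
Compare-Object $source $destination | Where-Object { $_.SideIndicator -eq '=>' } | ForEach-Object {
    Remove-Item -Path (Join-Path $destinationRoot $_.InputObject) -Force -WhatIf
}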
If I am not getting it wrong, the issue is that your code is deleting files whose timestamp differs from the source.
Did you try -ExcludeProperty?
$source = Get-ChildItem "E:\New folder" -Recurse | select -ExcludeProperty Date
The following script can serve your purpose
$Item1=Get-ChildItem 'SourcePath'
$Item2=Get-ChildItem 'DestinationPath'
$DifferenceItem=Compare-Object $Item1 $Item2
$ItemToBeDeleted=$DifferenceItem | where {$_.SideIndicator -eq "=>" }
foreach ($item in $ItemToBeDeleted)
{
$FullPath=$item.InputObject.FullName
Remove-Item $FullPath -Force
}
Try something like this
In PowerShell V5:
$yourdir1="c:\temp"
$yourdir2="c:\temp2"
$filesnamedir1=(gci $yourdir1 -file).Name
gci $yourdir2 -file | where Name -notin $filesnamedir1| remove-item
In old PowerShell:
$yourdir1="c:\temp"
$yourdir2="c:\temp2"
$filesnamedir1=(gci $yourdir1 | where {$_.psiscontainer -eq $false}).Name
gci $yourdir2 | where {$_.psiscontainer -eq $false -and $_.Name -notin $filesnamedir1} | remove-item
If you want to compare files in multiple directories, add the -Recurse option to every gci command.
I'm having trouble getting the following PowerShell statement to work. The objective is to get a list of folders which are in the ..\archive folder sorted by oldest to youngest.
I would like to copy folders amounting to $ClosedJobsSize or less from ..\Archive to the ..\MoveToTape folder. This is so the size of the ..\Archive folder on the hard drive never changes.
get-childitem -path "\\srv02\d$\Prepress\Archive" |
    sort-object -property @{Expression={$_.CreationTime};Ascending=$false} |
    % {
        if (((get-childitem -path "\\srv02\d$\prepress\archive" -recurse -force | measure-object -Property Length -Sum).Sum + $_.Length) -lt $closedjobssize) {
            move-item -destination "\\srv02\d$\prepress\archive\MoveToTape\"
        }
    }
What might I be doing wrong? I don't get any errors. It just sits and hangs when I execute it.
Try this. It's a long one-liner (remove -whatIf to perform the move):
dir "\\srv02\d$\Prepress\Archive" | sort CreationTime -desc | where { $_.psiscontainer -AND (dir $_.fullname -recurse -force | measure-object -Property Length -Sum).Sum -lt $closedjobssize} | Move-Item -dest "\\srv02\d$\prepress\archive\MoveToTape\" -whatIf
I'm not quite sure I understand. But I think you want to move folders in \archive to \archive\movetotape to fill up \movetotape until it is $ClosedJobsSize or less in size. Right?
A couple of things: You are adding up the size of everything in \archive, so the result of your comparison will never change. Second, one of the folders checked is MoveToTape itself, which could cause you to move it into itself (this should give an exception).
Given that, I think this code will work, but I haven't tested it.
## Get all the directories in \arcive that need to be moved
$Directories = Get-ChildItem "\\srv02\d$\Prepress\Archive" |
Where-Object {$_.PSIsContainer -and ($_.Name -ne "MoveToTape")} | Sort-Object CreationTime -Descending
foreach ($Directory in $Directories)
{
$SumOfMoveToTape = (Get-ChildItem "\\srv02\d$\prepress\archive\MoveToTape\" -Recurse | Measure-Object -Property Length -Sum).Sum
$SumOfItem = (Get-ChildItem $Directory.FullName -Recurse | Measure-Object -Property Length -Sum).Sum
if(($SumOfMoveToTape + $SumOfItem) -lt $ClosedJobsSize)
{
## If we can fit on MoveToTape, then move this directory
Move-Item -Path $Directory.FullName -Destination "\\srv02\d$\prepress\archive\MoveToTape\"
}
## If you want to keep folders in order (and not try to squeeze whatever fits onto the tape),
## then put an 'else {break}' here
}