Excluding Parent Directory if Any File is New - powershell

My company has individual folders on a share for each project they are working on, and if no files inside one of those folders or its subfolders have been touched in the last six months, I want to move them to an archive location. If any one file within the folder or any of its subfolders has been modified in the last six months, I want to skip the entire parent directory. I'm most of the way there now, but my current iteration only skips the individual files, and I'm not sure how to specify skipping the entire parent. Here is my current script:
$Date = (Get-Date).AddMonths(-6)
$Source = 'C:\Scripts\Source'
$Dest = 'C:\Scripts\Test Target'
Get-ChildItem $Source -File -Recurse | Where { $_.LastWriteTime -lt $Date } | ForEach {
    $actualSource = Split-Path $_.FullName
    $actualDest = Split-Path $_.FullName.Replace($Source, $Dest)
    robocopy $actualSource $actualDest $_.Name /SEC
}
When using my test directories, I have a folder C:\Scripts\Source\Drivers. The script copies that Drivers folder like I want it to, but if I put a newer file anywhere within that Drivers folder, I want the entire folder to be skipped. Currently, the folder and anything older than six months within the folder are still being copied, and it is just skipping the individual files which are newer.
Please let me know if any more information is needed.

Simply pull your copy-and-recurse logic up one level. First, iterate through all the parent folders. Then, for each parent folder, recurse and check whether any files have been modified within the last six months; if none have, copy the folder:
$Date = (Get-Date).AddMonths(-6)
$Source = 'C:\Scripts\Source'
$Dest = 'C:\Scripts\Test Target'
$ParentFolders = Get-ChildItem $Source -Directory
foreach ($Folder in $ParentFolders) {
    # Find files modified *since* the cutoff; -gt is what identifies the new ones
    $NewFiles = Get-ChildItem $Folder.FullName -File -Recurse |
        Where-Object { $_.LastWriteTime -gt $Date }
    if (@($NewFiles).Count -eq 0) {
        # No recent activity anywhere in this tree, so archive the whole folder
        robocopy $Folder.FullName (Join-Path $Dest $Folder.Name) /E /SEC
    }
}
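
A variation on the same idea is to compare just the newest write time in each folder. This is a minimal sketch of that approach, reusing the $Source, $Dest and $Date variables above (the /E and /SEC switches are carried over as assumptions from the question):
foreach ($Folder in Get-ChildItem $Source -Directory) {
    # Find the single most recently written file anywhere under this folder
    $newest = Get-ChildItem $Folder.FullName -File -Recurse |
        Sort-Object LastWriteTime -Descending | Select-Object -First 1
    # If even the newest file predates the cutoff, the whole tree is stale
    if ($newest -and $newest.LastWriteTime -lt $Date) {
        robocopy $Folder.FullName (Join-Path $Dest $Folder.Name) /E /SEC
    }
}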

Powershell: Find Folders and Run Command in Those Folders

I'm trying to find a way to combine a couple of things the Stack Overflow crowd has helped me do in the past. I know how to find folders with a specific name and move them where I want them to go:
$source_regex = [regex]::Escape($sourceDir)
(gci $sourceDir -Recurse | where { -not $_.PSIsContainer } | select -expand FullName) -match "\\$search\\" |
    foreach {
        $file_dest = ($_ | Split-Path -Parent) -replace $source_regex, $targetDir
        if (-not (Test-Path $file_dest)) { mkdir $file_dest }
        Move-Item $_ -Destination $file_dest -Force -Verbose
    }
And I also know how to find and delete files of a specific file extension within a preset directory:
Get-ChildItem $source -Include $searchfile -Recurse -Force | foreach{ "Removing file $($_.FullName)"; Remove-Item -force -recurse $_}
What I'm trying to do now is combine the two. Basically, I'm looking for a way to tell Powershell:
"Look for all folders named 'Draft Materials.' When you find a folder with that name, get its full path ($source), then run a command to delete files of a given file extension ($searchfile) from that folder."
What I'm trying to do is create a script I can use to clean up an archive drive when and if space starts to get tight. The idea is that as I develop things, a lot of times I go through a ton of incremental non-final drafts (hence the folder name "Draft Materials"), and I want to get rid of the exported products (the PDFs, the BMPs, the AVIs, the MOVs, etc.) and just leave the master files that created them (the INDDs, the PRPROJs, the AEPs, etc.) so I can reconstruct them down the line if I ever need to. I can tell the script what drive and folder to search (and I'd assign that to a variable, since the backup location may change and I'd like to just change it once), but I need help with the rest.
I'm stuck because I'm not quite sure how to combine the two pieces of code that I have to get Powershell to do this.
If what you want is to
"Look for all folders named 'Draft Materials.' When you find a folder with that name, get its full path ($source), then run a command to delete files of a given file extension ($searchfile) from that folder."
then you could do something like:
$rootPath = 'X:\Path\To\Start\Searching\From'      # the starting point for the search
$searchFolder = 'Draft Materials'                  # the folder name to search for
$deleteThese = '*.PDF', '*.BMP', '*.AVI', '*.MOV'  # an array of file patterns to delete

# get a list of all folders called 'Draft Materials'
Get-ChildItem -Path $rootPath -Directory -Filter $searchFolder -Recurse | ForEach-Object {
    # inside each of these folders, get the files you want to delete and remove them
    Get-ChildItem -Path $_.FullName -File -Recurse -Include $deleteThese |
        Remove-Item -WhatIf
}
Or use Get-ChildItem only once, having it search for files, and then test whether their full paths contain the folder called 'Draft Materials':
$rootPath = 'X:\Path\To\Start\Searching\From'
$searchFolder = 'Draft Materials'
$deleteThese = '*.PDF', '*.BMP', '*.AVI', '*.MOV'

# get a list of all files with extensions from the $deleteThese array
Get-ChildItem -Path $rootPath -File -Recurse -Include $deleteThese |
    # if the folder 'Draft Materials' appears in their full path, delete them
    Where-Object { $_.FullName -match "\\$searchFolder\\" } |
    Remove-Item -WhatIf
In both cases I have added the safety switch -WhatIf, so when you run this nothing actually gets deleted; the console just shows what would happen.
If that output shows the correct files being removed, take off (or comment out) -WhatIf and run the code again.
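One extra precaution for the second variant: the folder name is interpolated straight into a regular expression, so if it might ever contain regex metacharacters, escape it first, using the same [regex]::Escape trick as in the question's first snippet:
Where-Object { $_.FullName -match ('\\' + [regex]::Escape($searchFolder) + '\\') }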

Copy-Item with overwrite?

Here is a section of code from a larger script. The goal is to recurse through a source directory, then copy all the files it finds into a destination directory, sorted into subdirectories by file extension. It works great the first time I run it. If I run it again, instead of overwriting existing files, it fails with this error on each file that already exists in the destination:
Copy-Item : Cannot overwrite the item with itself
I try, whenever possible, to write scripts that are idempotent, but I haven't been able to figure this one out. I would prefer not to add a timestamp to the destination file's name; I'd hate to end up with thirty versions of the exact same file. Is there a way to do this without extra logic to check for a file's existence and delete it if it's already there?
## Parameters for source and destination directories.
$Source = "C:\Temp"
$Destination = "C:\Temp\Sorted"

# Build list of files to sort.
$Files = Get-ChildItem -Path $Source -Recurse | Where-Object { !$_.PSIsContainer }

# Copy the files in the list to destination folder, sorted in subfolders by extension.
foreach ($File in $Files) {
    $Extension = $File.Extension.Replace(".", "")
    $ExtDestDir = "$Destination\$Extension"

    # Check to see if the folder exists, if not create it
    $Exists = Test-Path $ExtDestDir
    if (!$Exists) {
        # Create the directory because it doesn't exist
        New-Item -Path $ExtDestDir -ItemType "Directory" | Out-Null
    }

    # Copy the file
    Write-Host "Copying $File to $ExtDestDir"
    Copy-Item -Path $File.FullName -Destination $ExtDestDir -Force
}
$Source = "C:\Temp"
$Destination = "C:\Temp\Sorted"
You are trying to copy files from a source directory into a subdirectory of that same source directory. The first time it works because that subdirectory is empty. The second time it doesn't, because you are enumerating the files of that subdirectory too, and thus attempt to copy those files over themselves.
If you really need to copy the files into a subdirectory of the source directory, you have to exclude the destination directory from enumeration, like this (the extra first line also picks up files sitting directly in $Source, which a directory-only enumeration would miss):
$Files = @(Get-ChildItem -Path $Source -File) +
         @(Get-ChildItem -Path $Source -Directory |
             Where-Object { $_.FullName -ne $Destination } |
             Get-ChildItem -File -Recurse)
Using a second Get-ChildItem call at the beginning, which only enumerates first-level directories, is much faster than filtering the output of the Get-ChildItem -Recurse call, which would needlessly process each file of the destination directory.
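For comparison, this is roughly what that slower filtering approach would look like; a sketch of the trade-off, not a recommendation:
# Recurse over everything, then discard files under the destination.
# Simpler to read, but it still walks the entire destination tree first.
$Files = Get-ChildItem -Path $Source -Recurse -File |
    Where-Object { $_.FullName -notlike "$Destination\*" }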

Copy files from one folder to many via Powershell

I need to copy the files from one folder to many. Here's an example of my directory structure:
\\files\CA1\Files\Files
CA = state code
1 = office in that state
I want to copy all files from a source folder into the last Files folder of the directory structure above; that folder is the destination. The script just needs to cycle through all of the directories with a given state code and copy the new files into each \\files\CA*\Files\Files\ folder. For instance, I want to copy all files from C:\Documents into all folders that are for CA, regardless of the office number. Here's what I have so far:
$source = 'C:\Documents'
$destination = (Get-ChildItem -Path \\files\CA*\Files\Files -Recurse -Directory)
foreach ($dir in $destination) {
    Get-ChildItem $dir.FullName | ForEach-Object {
        $_.FullName
        #Copy-Item -Path $Source -Destination $_ -Force -Recurse -WhatIf
    }
}
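
There is a subtlety in that attempt: given a wildcard path, Get-ChildItem lists the contents of each matching folder rather than the folders themselves. A minimal sketch of one way to finish the script, assuming each \\files\CA*\Files\Files path is itself a destination folder, with -WhatIf left on so the copies are only previewed:
$source = 'C:\Documents'
# Get-Item resolves the wildcard to the Files folders themselves
# (Get-ChildItem would list their *contents* instead)
Get-Item -Path '\\files\CA*\Files\Files' | ForEach-Object {
    Copy-Item -Path "$source\*" -Destination $_.FullName -Force -Recurse -WhatIf
}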

Moving files/folders based on creation date and keeping folder structure

Windows 7 Pro environment.
I'm looking to create a batch or PowerShell script to move folders based on creation or last modified date.
The source folder is "D:\Video". The destination is "F:\DVRBackups\".
The requirement is that the folder structure be maintained, and that only folders older than a certain creation date are moved, whilst the others are left untouched.
The folder structure for the source folder looks like this:
D:\Video\Cam01\XXXX
D:\Video\Cam02\XXXX
D:\Video\Cam03\XXXX
..etc..
(XXXX = hundreds of folders within the Cam01/02/03 folders, spanning months)
The number of camera folders changes based on the DVR box I'm working on. Some locations have 5 cameras, some have as many as 35 (i.e., Cam01, Cam02, ..., Cam35).
Folders from within the Cam01/02/03 folders need to be moved over to F:\DVRBackups, whilst maintaining the original folder structure.
i.e.
D:\Video\Cam01\0318 --> F:\DVRBackups\Video\Cam01\0318
D:\Video\Cam01\0319 --> F:\DVRBackups\Video\Cam01\0319
D:\Video\Cam02\0501 --> F:\DVRBackups\Video\Cam02\0501
..etc..
Can someone assist?
If it's just the folders within each CamXX folder whose last-modified time you need to check before moving, you can use the following in PowerShell.
$source = "D:\Video\"
$destination = "F:\DVRBackups\"
$date = Get-Date "25/03/2015 12:00"
dir $source | %{ dir $_.FullName | ?{ $_.LastWriteTime -gt $date } | Copy-Item -Destination $destination -Recurse -Force }
If you want to test which folders will be moved beforehand, run this command against the source first:
dir $source | % { dir $_.FullName | ? { $_.LastWriteTime -lt $date } | select LastWriteTime, FullName }
This will list all the folders whose contents will be copied over.
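
One caveat: the question asks about moving, while the snippet above copies. On Windows 7-era Windows PowerShell, Move-Item generally cannot move a directory across volumes (D: to F:), so a copy-then-delete sketch like the following, using the same variables as above, is one way to get a true move:
dir $source | % {
    $camDest = (New-Item -ItemType Directory -Force -Path (Join-Path $destination "Video\$($_.Name)")).FullName
    dir $_.FullName | ? { $_.LastWriteTime -lt $date } | % {
        # -ErrorAction Stop makes a failed copy throw before the source is removed
        Copy-Item $_.FullName -Destination $camDest -Recurse -Force -ErrorAction Stop
        Remove-Item $_.FullName -Recurse -Force
    }
}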

Powershell Copy files and folders

I have a PS script which zips up the previous month's logs and names the zip file FILENAME-YYYY-MM.zip.
This works.
What I now want to do is copy these zip files off to a network share, keeping some of the folder structure. I currently have a folder structure similar to the following:
C:\Folder1\
C:\Folder1\Folder2\
C:\Folder1\Folder3\
C:\Folder1\Folder4\Folder5\
There are .zip files in every folder below c:\Folder1
What I want is for the script to copy files from c:\folder1 to \\networkshare but keeping the folder structure, so I should have 3 folders and another subfolder in folder4.
Currently I can only get it to copy the whole structure so I get c:\folder1\... in my \\networkshare
I keep running into issues, such as the new folder structure not existing at the destination, not being able to use the -Recurse switch within the Get-ChildItem command, and so on.
The script I have so far is;
# This returns the date and formats it for you; set the value after AddMonths to set the archive date (-1 = last month)
$LastWriteMonth = (Get-Date).AddMonths(-3).ToString('MM')

# Set destination for zip files
$DestinationLoc = "\\networkshare\LogArchive\$env:computername"

# Source files
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | Where-Object { $_.LastWriteTime.Month -le $LastWriteMonth }
Copy-Item $SourceFiles -Destination $DestinationLoc\ZipFiles\
Remove-Item $SourceFiles
Sometimes, you just can't (easily) use a "pure PowerShell" solution. This is one of those times, and that's OK.
Robocopy will mirror directory structures, including any empty directories, and select your files (likely faster than filtering with Get-ChildItem will be). You can copy anything older than 90 days (about 3 months) like this:
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" /E /IS /MINAGE:90 *.zip
You can specify an actual date with /MINAGE too, if you have to be that precise.
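For instance, /MINAGE also accepts a date in YYYYMMDD form (the date below is only illustrative):
robocopy C:\SourceFiles "\\networkshare\LogArchive\$($env:computername)\ZipFiles" /E /IS /MINAGE:20150301 *.zip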
How about Copy-Item "C:\SourceFiles\" -Destination $DestinationLoc\ZipFiles -Container -Recurse? I have tested this and found that it copies the folder structure intact. If you only need the *.zip files, first get them, then for each one call Resolve-Path with the -Relative flag set and append the resulting path to the Destination parameter.
$oldloc = Get-Location
Set-Location "C:\SourceFiles\"   # required so Resolve-Path -Relative works against the source root
$SourceFiles = Get-ChildItem C:\Sourcefiles\*.zip -Recurse | Where-Object { $_.LastWriteTime.Month -le $LastWriteMonth }
$SourceFiles | ForEach-Object {
    $p = Resolve-Path $_.FullName -Relative
    $target = Join-Path "$DestinationLoc\ZipFiles" $p
    # Copy-Item won't create missing intermediate folders, so create them first
    New-Item -ItemType Directory -Force -Path (Split-Path $target) | Out-Null
    Copy-Item $_.FullName -Destination $target
}
Set-Location $oldloc   # return to where we started
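As with the earlier snippets, appending -WhatIf to the Copy-Item call is a cheap way to preview which zip files would land where before running the script for real.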