Search for folders created after date X and copy them - PowerShell

I need to go through a large folder (let's call it D:\Files), which contains about 50,000 folders and about 1 million files, and copy every folder that was created (not modified) after August 31st 2012 to another drive (call it E:\Space). The copy must include everything inside those folders: all subfolders (second/third/fourth level deep and beyond) and all files, including files in subfolders, no matter when those files were created or changed, as long as they sit under a folder that was created after August. The original folder structure must be kept, so a file that was at D:\Files\Folder1\Subfolder3\hello.txt should end up at E:\Space\Folder1\Subfolder3\hello.txt.
Background: some folders were created in the structure after August, but the files in them have older modified dates. I need to include everything that was added to D:\Files after 31st August 2012, and since the file dates don't tell me which files were recently added and which were already there before August, I have to go by the folder creation date instead.
So, if the folder D:\Files\Folder1 was created in September, then all files and folders in it, whether one, two, three, four or more levels deep, no matter when they were changed/modified, should be copied to E:\Space with PowerShell.
I've tried in the past two days to get through, but so far failed miserably.
I know this probably isn't easy with Powershell as it requires several steps, but any help would be greatly appreciated.
Thanks so much.

Try something like this:
$srcRoot = "D:\Files"
$dstRoot = "E:\Space"
$cutoff  = Get-Date "2012-09-01"

# Track the most recent top-level match so folders already covered
# by a previously copied ancestor are skipped
$top = $null
Get-ChildItem -Recurse $srcRoot |
    Where-Object { $_.PSIsContainer -and $_.CreationTime -ge $cutoff } |
    Sort-Object FullName |
    ForEach-Object {
        $name = $_.FullName
        if ($top -and $name.StartsWith($top + '\')) { return }
        $top = $name
        $dst = $name -replace [regex]::Escape($srcRoot), $dstRoot
        # Make sure the destination parent exists so the structure is preserved
        $parent = Split-Path $dst -Parent
        if (-not (Test-Path $parent)) {
            New-Item -ItemType Directory -Path $parent -Force | Out-Null
        }
        Copy-Item $name $dst -Recurse -Force
    }

You should consider using robocopy.exe to solve the task.
If you're looking for a PowerShell solution, have a look at the Scripting Games 2013 Event 1 submissions.
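As a sketch of the robocopy idea: robocopy itself filters on file dates, not folder creation dates, so one hedged approach (assuming the same D:\Files / E:\Space paths as the question) is to find the qualifying folders in PowerShell first and let robocopy mirror each one:

```powershell
$srcRoot = "D:\Files"
$dstRoot = "E:\Space"
$cutoff  = Get-Date "2012-09-01"

Get-ChildItem -Recurse $srcRoot |
    Where-Object { $_.PSIsContainer -and $_.CreationTime -ge $cutoff } |
    ForEach-Object {
        # Mirror the source path onto the destination root
        $dst = $_.FullName -replace [regex]::Escape($srcRoot), $dstRoot
        # /E copies all subfolders, including empty ones;
        # robocopy creates the destination path itself
        robocopy $_.FullName $dst /E
    }
```

Copying nested matches twice is harmless here, since robocopy skips files that are already identical at the destination.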

Related

How to rename folders to put the year first using Powershell

I'm trying to organize some old photos that are split into many different folders. All of the folder names do contain the year, but almost always at the end of the folder name. This doesn't work very well when I'm trying to sort all of my photos from the past 20 years. I'm trying to write a script that would loop through all of the folder names and move the year (YYYY) to the beginning of the folder name.
Current Folders:
The best trip ever 2012
Visiting relatives 2010
2017 trip
Old photos from 2001
Convert to:
2012 The best trip ever
2010 Visiting relatives
2017 trip
2001 Old photos from
I am not very familiar with powershell so I've spent a few hours fiddling with regex and the necessary scripts to filter to the right subset of folder names (that start with a letter and contain a 4 digit year) but I'm struggling to actually rename these successfully.
Here's what I have:
$folders = Get-ChildItem -Path C:\Users\User\pictures\ | Where-Object { $_.Name -match '^[a-zA-Z].+[0-9]{4}' }
foreach ($folder in $folders)
{ $folder.Name.Split() | Where {$_ -match "[0-9]{4}"}
Rename-Item -Path $folder-NewName "$($Matches[0])_$folder.Name"
}
Any help is appreciated!
If you use the -match operator with a regex that captures the name parts of interest via capture groups ((...)), you can rearrange these name parts, as reflected in the automatic $Matches variable, in a delay-bind script block passed to the Rename-Item call:
Get-ChildItem -Directory C:\Users\User\pictures |
Where-Object Name -match '^(.+?) ?\b(\d{4})\b(.*?)$' |
Rename-Item -WhatIf -NewName {
'{0} {1}{2}' -f $Matches.2, $Matches.1, $Matches.3
}
Note: The -WhatIf common parameter in the command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.
For an explanation of the regex and the ability to interact with it, see this regex101.com page.
Note: The following simplification, which uses the -replace operator, works in principle, but, unfortunately, reports spurious Source and destination path must be different errors as of PowerShell 7.2.1, for all directories whose name either doesn't contain a 4-digit year or already has it at the start of the name:
# !! Works, but reports spurious errors as of PowerShell 7.2.1
Get-ChildItem -Directory C:\Users\User\pictures |
Rename-Item -WhatIf -NewName { $_.Name -replace '^(.+?) ?\b(\d{4})\b(.*?)$', '$2 $1$3' }
The reason is Rename-Item's unfortunate behavior of complaining when trying to rename a directory to its existing name (which happens when the -replace operation doesn't find a regex match and therefore returns the input string as-is), which should be a quiet no-op, as it already is for files - see GitHub issue #14903.

Move subfolder elsewhere and rename based on parent using PowerShell

I have many report folders under different parent folders that has the following structure:
C:\Users\USER\Downloads\LTFT01\Report
C:\Users\USER\Downloads\LTFT02\Report
C:\Users\USER\Downloads\LTFT03\Report
What I want to do is, if any of the report folders are non-empty, then move that report folder elsewhere and rename the folder with the original parent folder in the name. Such as LTFT01Report and LTFT02Report.
I have the 'test if it's non-empty' bit ready, but I have no idea what to do from here. I don't know really how foreach works so I haven't been able to implement that (even after searching!)
If (Test-Path -Path "C:\Users\USER\Downloads\*\Report\*")
Edit: It seems I need to clarify for some the following:
I'm new to coding, and new to PowerShell as of this week
I've googled a ton and found a bunch of answers, but nothing pertinent to my question (directly) or it's left me confused :(
I would really appreciate a nudge in the right direction rather than a git gud.
I think I need a foreach, hence my last line of the question, but not sure. Again - newbie here!
OP here!
So I've been able to create an answer based on some research:
#Get report folder path
$ReportPath = "C:\Users\USER\Downloads\*\Report"
$MasterReportPath = "C:\Users\USER\Downloads\MasterReports"
#Rename report folder to {currentparentname}report
Get-Item -Path $ReportPath | ForEach-Object {
    $parentName = $_.FullName | Split-Path -Parent | Split-Path -Leaf
    Rename-Item -Path $_.FullName -NewName "${parentName}Report"
}
#Move report folder
$AnyNamedReportFolder = Get-Item "C:\Users\USER\Downloads\*\*Report*" -Exclude *.jmx, *.csv
Move-Item -Path $AnyNamedReportFolder -Destination $MasterReportPath
Definitely isn't elegant, but it does the job. Since I have the main answer to this question, I'll mark it as answered. However, there is an issue with this: if it is run multiple times, folders with the same name will not move (since it doesn't append a unique number or character). I've highlighted this in a new question here, should you be interested. If all you need is a one-time working script, the above worked for me.
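To address the re-run issue mentioned above, here is a hedged sketch (assuming the same Downloads paths) that skips empty report folders, renames and moves in one step, and appends a numeric suffix when the destination name is already taken:

```powershell
$MasterReportPath = "C:\Users\USER\Downloads\MasterReports"

Get-Item -Path "C:\Users\USER\Downloads\*\Report" | ForEach-Object {
    # Skip empty report folders
    if (-not (Get-ChildItem $_.FullName)) { return }
    # Build the new name from the parent folder, e.g. LTFT01Report
    $parentName = Split-Path (Split-Path $_.FullName -Parent) -Leaf
    $newName = "${parentName}Report"
    $target  = Join-Path $MasterReportPath $newName
    # Append a counter until the destination name is free
    $n = 1
    while (Test-Path $target) {
        $target = Join-Path $MasterReportPath "$newName$n"
        $n++
    }
    Move-Item -Path $_.FullName -Destination $target
}
```

Moving straight to the final path avoids the intermediate rename inside Downloads, so the script can be run repeatedly without name collisions in the source tree.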
Moving Same Name Folders into Another Folder

Powershell: Compare Last Modified to Specific Date and replace with correct Date

I'm still fairly new to powershell, so please bear with me.
I have 2 almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified date. The files and folders in the new directory have incorrect last modified dates (ex: today).
Rather than re-doing the transfer process, which would take a long time, I'd like to write something in powershell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first if file/folder has been modified since the file transfer. There would be no reason to change the date on those files.
What I found from looking around and googling:
Link1 Link2 Link 3 Link 4
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the file directory.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information regarding files that were modified between 2 dates. This, I can tweak to make it so the "less than (-lt)" can be used to check files that were not modified past a certain date.
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals. I have a means to check if a file has been modified past a certain or not.
I saw this for changing the value lastwritetime
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is kind of putting it all together. I know how to recurse through files/folders for copy-item or even Get-Childitem by adding the "recurse" parameter. But I'm having difficulties wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo 10 days ago
# set the paths for the rootfolder of the originals and the rootfolder to where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
# create the full path where the copied file or folder is to be found
$copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
# test if this object can be found
if (Test-Path -Path $copy) {
$item = Get-Item -Path $copy
# test if the item has not been updated since the last transfer date
if ($item.LastWriteTime -le $lastTransferDate) {
# set the timestamp the same as the original
$item.LastWriteTime = $_.LastWriteTime
}
}
}
Great job with what you've done so far.
Just put what you have into a foreach statement.
Foreach($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -recurse)){
(Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach loop to traverse. $item is the current item in the loop. We want the .FullName property to get the full path of the current item, so you will use $item.FullName for the files you are going to set the date on.

How do I show progress percentage when copying files in powershell form?

I'm building a powershell GUI to back up profile folders. All works well, except I can't figure out how to accurately show a progress percentage for each folder.
Here's what I currently have...
$dir = "Drive letter of choice" (ex. "C:")
$user = "C:\Users\username"
$userfolder = "username"
$Files = Get-ChildItem -Path "$User\Documents\*" -Recurse
$i=0
ForEach ($File in $Files) {
$i++
[int]$pct = ($i/$Files.count)*100
Copy-Item "$User\Documents\$File" "$dir\Backups\$UserFolder\Documents\" -Recurse
$Status_Label.Text = "Backing Up Documents... $($pct)%"
$GUI.Refresh()
start-sleep -Milliseconds 100
} # End ForEach
Based on a side-by-side compare it looks to copy all the folders/files just fine, but the percentage is way off. In the case of the Documents folder example above the file copy is done by the time it hits 15-20%. The rest are "source file can't be found" errors. It's looking for files in the main folder (Documents, Pictures, etc) that are actually in sub-folders.
I can see why it would be looking in the main Documents folder since the source of Copy-Item is the root folder, but why does it copy everything correctly into their sub-folders and then go back and complain about all those sub-files?
(This also needs to work in Windows 7)
Thanks!
I think you can use a gauge in dialog boxes (something like: 1, 2, 3 (this one is in Portuguese, but has a good code sample)).
But I'm not sure about Windows.
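One way to make the percentage track the files actually being copied (a sketch, assuming the same $User, $dir, $UserFolder, $Status_Label and $GUI variables as above, and avoiding the PowerShell 3+ -File switch for Windows 7 compatibility) is to enumerate only files up front and rebuild each file's relative path, instead of re-copying with -Recurse:

```powershell
$src  = "$User\Documents"
$dest = "$dir\Backups\$UserFolder\Documents"
# Enumerate files once; folders are recreated from each file's relative path
$Files = Get-ChildItem -Path $src -Recurse | Where-Object { -not $_.PSIsContainer }
$i = 0
foreach ($File in $Files) {
    $i++
    [int]$pct = ($i / $Files.Count) * 100
    # Relative path under Documents, e.g. \Subfolder\report.docx
    $rel    = $File.FullName.Substring($src.Length)
    $target = Join-Path $dest $rel
    # Create the destination subfolder before copying the file into it
    New-Item -ItemType Directory -Path (Split-Path $target -Parent) -Force | Out-Null
    Copy-Item $File.FullName $target
    $Status_Label.Text = "Backing Up Documents... $pct%"
    $GUI.Refresh()
}
```

Because each file is copied exactly once, the counter matches the real progress and there are no second attempts on files that already went over inside a -Recurse folder copy.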

Check if data coming continuously

Every hour, data comes into every folder of my directory tree. I need to check whether it really does come in every hour, or whether there was any interruption (for example, no data coming in for 2–3 hours).
I am trying to write a PowerShell script that will check LastWriteTime for every folder, but that wouldn't solve the problem with gaps. If I checked the logs in the morning, everything would look OK as long as some data came into the folder an hour ago, even if nothing arrived for a few hours before that.
So IMHO LastWriteTime alone is not suitable for this.
Also there is a problem with subfolders. I need to check only the last folder in every dir tree. I do not know how to drop any previous folders like:
Z:\temp #need to drop
Z:\temp\foo #need to drop
Z:\temp\foo\1 #need to check
I had tried to write a script that checks the LastAccessTime, but it throws an error:
Expressions are only allowed as the first element of a pipeline.
The script is as follows:
$rootdir = "Z:\SatData\SatImages\nonprojected\"
$timedec1 = (Get-date).AddHours(-1)
$timedec2 = (Get-date).AddHours(-2)
$timedec3 = (Get-date).AddHours(-3)
$timedec4 = (Get-date).AddHours(-4)
$dir1 = get-childitem $rootdir -recurse | ?{ $_.PSIsContainer } | Select-Object FullName | where-object {$_.lastwritetime -lt $timedec1} | $_.LastWriteTime -gt $timedec4 -and $_.LastWriteTime -lt $timedec3
$dir1
But as I said before it does not solve the problem.
--
The main question is exactly about checking that data is collected continuously. I could build the directory tree by hand, but I need a way to check whether data came into each folder every hour, or whether there were any hours without data...
You can try to set up the PowerShell script to run from Windows Task Scheduler every hour. This way, the script only has to check whether any data arrived within the past hour.
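As a hedged sketch of that idea (assuming the Z:\SatData\SatImages\nonprojected root from the question), the scheduled script could check only leaf folders, i.e. those with no subdirectories, and report any whose newest file is older than an hour:

```powershell
$rootdir = "Z:\SatData\SatImages\nonprojected"
$cutoff  = (Get-Date).AddHours(-1)

Get-ChildItem $rootdir -Recurse |
    Where-Object { $_.PSIsContainer } |
    # Keep only leaf folders: folders with no child directories
    Where-Object { -not (Get-ChildItem $_.FullName | Where-Object { $_.PSIsContainer }) } |
    ForEach-Object {
        # Newest file in this leaf folder, if any
        $newest = Get-ChildItem $_.FullName |
            Sort-Object LastWriteTime -Descending |
            Select-Object -First 1
        if (-not $newest -or $newest.LastWriteTime -lt $cutoff) {
            "No data in the last hour: $($_.FullName)"
        }
    }
```

Run hourly, each execution only answers "did data arrive since the last run?", so gaps are flagged as they happen instead of being hidden by a later delivery, and the intermediate folders like Z:\temp and Z:\temp\foo are dropped automatically because they contain subdirectories.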