PowerShell: Compare Last Modified to Specific Date and replace with correct Date

I'm still fairly new to powershell, so please bear with me.
I have 2 almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified date. The files and folders in the new directory have incorrect last modified dates (ex: today).
Rather than re-doing the transfer process, which would take a long time, I'd like to write something in powershell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first whether a file/folder has been modified since the file transfer. There would be no reason to change the date on those files.
Here's what I found from looking around and googling.
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the path to the file.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information about files that were modified between two dates. I can tweak this so that only the "less than" (-lt) comparison is used, to find files that were not modified after a certain date.
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals: I have a means to check whether or not a file has been modified after a certain date.
I saw this for changing the LastWriteTime value:
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is putting it all together. I know how to recurse through files/folders with Copy-Item or Get-ChildItem by adding the -Recurse parameter, but I'm having difficulty wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.

You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo 10 days ago
# set the paths for the root folder of the originals and the root folder to where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    # create the full path where the copied file or folder is to be found
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    # test if this object can be found
    if (Test-Path -Path $copy) {
        $item = Get-Item -Path $copy
        # test if the item has not been updated since the last transfer date
        if ($item.LastWriteTime -le $lastTransferDate) {
            # set the timestamp the same as the original
            $item.LastWriteTime = $_.LastWriteTime
        }
    }
}

Great job with what you've done so far.
Just put what you have into a foreach statement.
Foreach($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -recurse)){
    (Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach loop to traverse. $item is the current item in the loop, and its .FullName property gives the full path to that file. So you will use $item.FullName for the files you are going to set the date on.
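If the two folders mirror each other, you can also derive the matching original for each copy instead of hard-coding a single file. Here is a minimal sketch of that idea, assuming hypothetical $oldRoot and $newRoot paths (this is essentially what the script in the previous answer does):
$oldRoot = 'C:\Users\usernamehere\Desktop\folderabc'   # originals with the correct dates (hypothetical path)
$newRoot = 'C:\Users\usernamehere\Desktop\folder123'   # copies with the wrong dates (hypothetical path)
Foreach($item in (gci $newRoot -Recurse)){
    # build the path of the matching original by swapping the root part of the path
    $original = Join-Path $oldRoot $item.FullName.Substring($newRoot.Length)
    if (Test-Path $original) {
        (Get-Item $item.FullName).LastWriteTime = (Get-Item $original).LastWriteTime
    }
}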

Related

How to handle copy file with infinity loop using Powershell?

I want to check for .jpg files in the 2nd folder. The 2nd folder has some subfolders. If a .jpg exists in a subfolder of the 2nd folder, I copy a file from the 1st folder to that subfolder of the 2nd folder, based on the base name. I can do this part thanks to this great answer: How to copy file based on matching file name using PowerShell?
https://stackoverflow.com/a/58182359/11066255
But I want to run this process in an infinite loop. Once I used an infinite loop, I found that I end up with a lot of duplicate files. How do I add a limitation so that a file I have already copied is not copied again in the next loop iteration?
Can anyone help me, please? Thank you.
for(;;)
{
    $Job_Path = "D:\Initial"
    $JobError = "D:\Process"
    Get-ChildItem -Path "$OpJob_Path\*\*.jpg" | ForEach-Object {
        $basename = $_.BaseName.Substring(15)
        $job = "$Job_Path\${basename}.png"
        if (Test-Path $job) {
            $timestamp = Get-Date -Format 'yyyyMMddhhmmss'
            $dst = Join-Path $_.DirectoryName "${timestamp}_${basename}.gif"
            $Get = (Get-ChildItem -Name "$OpJob_Path\*\*$basename.jpg*" | Measure-Object).Count
            Copy-Item $job $dst -Force
        }
    }
}
File management 101: Windows will not allow duplicate file names in the same location. You can only have duplicate files if each file's name is unique but the content is the same. So just check for the filename (it must be exactly the same filename) and do your work only when there is no match; otherwise do nothing.
Also, personally, I'd suggest using a PowerShell FileSystemWatcher instead of an infinite loop (see the sketch at the end of this answer). Just saying...
This line …
$timestamp = Get-Date -Format 'yyyyMMddhhmmss'
… will always generate a unique filename by design; the content of the file plays no part in it, unless you are also using file hashing for the comparison.
Either remove/change that line to something else, or use file hashes (they identify the content regardless of the name used) ...
Get-FileHash -Path 'D:\Temp\input.txt'
Algorithm Hash Path
--------- ---- ----
SHA256 1C5B508DED35A28B9CCD815D47ECF500ECF8DDC2EDD028FE72AB5505C0EC748B D:\Temp\input.txt
... for the comparison, prior to the copy, in another if/then.
something like...
$jobHash = Get-FileHash -Path $job
$dstHash = Get-FileHash -Path $dst   # only works once $dst actually exists
If ($jobHash.Hash -ne $dstHash.Hash)
{
    Copy-Item -Path $jobHash.Path -Destination $dstHash.Path
}
Else
{
    # Do nothing - identical content is already there
}
There are of course other ways to do this as well, this is just one idea.
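For reference, here is a minimal sketch of the FileSystemWatcher idea mentioned above; the watched path, filter, and copy logic are placeholder assumptions, not taken from the question:
# watch a folder (and its subfolders) for newly created .jpg files instead of polling in an endless loop
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'D:\Process'              # hypothetical folder to watch
$watcher.Filter = '*.jpg'                 # hypothetical filter
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Created -SourceIdentifier JpgCreated -Action {
    $newFile = $Event.SourceEventArgs.FullPath
    Write-Host "New file detected: $newFile"
    # the existing copy/rename logic for the matching file would go here
}
# tidy up later with: Unregister-Event -SourceIdentifier JpgCreated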

Copying files defined in a list from network location

I'm trying to teach myself enough PowerShell or batch programming to figure out how to achieve the following (I've had a search and looked through a couple of hours of YouTube tutorials, but can't quite piece it all together to figure out what I need - I don't get tokens, for example, but they seem necessary in the for loop). Also, I'm not sure if the below is best achieved with robocopy or xcopy.
Task:
Define a list of files to retrieve in a CSV (each file name will be a 13-digit number; the extension will be UNKNOWN, usually .jpg but occasionally .png - could this be handled with a wildcard?)
list would read something like:
9780761189931
9780761189988
9781579657159
For each line in this text file, do:
Search a network folder and all subfolders
If exact filename is found, copy to an arbitrary target (say a new folder created on desktop)
(Not 100% necessary, but nice to have) Once the For loop has completed, output a list of files copied into a text file in the newly created destination folder
I gather that I'll maybe need to do a couple of things first, like define variables for the source and destination folders? I found the below elsewhere but couldn't quite get my head around it.
set src_folder=O:\2017\By_Month\Covers
set dst_folder=c:\Users\%USERNAME%\Desktop\GetCovers
for /f "tokens=*" %%i in (ISBN.txt) DO (
    xcopy /K "%src_folder%\%%i" "%dst_folder%"
)
Thanks in advance!
This solution is in powershell, by the way.
To get all files under a folder, use Get-ChildItem and the pipeline; you can then compare each name to the contents of your CSV (which you can read using Import-Csv, by the way).
Get-ChildItem -path $src_folder -recurse | foreach{$_.fullname}
I'd personally then use a function to edit the name as a string, but I know this probably isn't the best way to do it. Create a function outside of the pipeline, and have it return a modified path in such a way that you can continue the previous line like this:
Get-ChildItem -path $src_folder -recurse | foreach{$_.CopyTo((edit-path $_.fullname))}
Where "edit-directory" is your function that takes in the path, and modifies it to return your destination path. Also, you can alternatively use robocopy or xcopy instead of CopyTo, but Copy-Item is a powershell native and doesn't require much string manipulation (which in my experience, the less, the better).
Edit: Here's a function that could do the trick:
function edit-path{
    Param([string] $path)
    # swap the source root for the destination root, keeping the rest of the path
    $modified_path = $dst_folder + $path.Substring($src_folder.Length)
    return $modified_path
}
Edit: Here's how to integrate the importing from CSV, so that the copy only happens to files that are written in the CSV (which I had left out, oops):
$csv = import-csv $CSV_path
Get-ChildItem -path $src_folder -recurse | where-object{$csv -contains $_.name} | foreach{$_.CopyTo((edit-path $_.fullname))}
Note that you have to put the whole CSV path in the $CSV_path variable, and depending on how the contents of that file are written, you may have to use $_.fullname, or other parameters.
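Putting those pieces together, here is a minimal sketch that reads the list, matches names with a wildcard extension, copies, and logs what was copied. The paths, the plain-text list file, and the log file name are illustrative assumptions, not from the original post:
$src_folder = 'O:\2017\By_Month\Covers'
$dst_folder = "$env:USERPROFILE\Desktop\GetCovers"
$list_path  = 'C:\Temp\ISBN.txt'          # assumed: one 13-digit number per line, no extension
New-Item -ItemType Directory -Path $dst_folder -Force | Out-Null
$copied = foreach ($name in Get-Content -Path $list_path) {
    # wildcard extension: matches 9780761189931.jpg, 9780761189931.png, etc.
    Get-ChildItem -Path $src_folder -Recurse -File -Filter "$name.*" | ForEach-Object {
        Copy-Item -Path $_.FullName -Destination $dst_folder
        $_.Name    # collected into $copied
    }
}
# write the list of copied files into the destination folder
$copied | Out-File -FilePath (Join-Path $dst_folder 'copied_files.txt')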
This seems like an average enough problem:
$Arr = Import-CSV -Path $CSVPath
Get-ChildItem -Path $Folder -Recurse |
    Where-Object -FilterScript { $Arr -contains $PSItem.Name.Substring(0, $PSItem.Name.Length - 4) } |
    ForEach-Object -Process {
        Copy-Item -Path $PSItem.FullName -Destination $env:UserProfile\Desktop
        $PSItem.Name | Out-File -FilePath $env:UserProfile\Desktop\Results.txt -Append
    }
I'm not great with string manipulation so the string bit is a bit confusing, but here's everything spelled out.

Windows file search within a search, how? App, script, GREP, powershell, notepad hack?

I am trying to search for folders created within a certain date range, then search for files with certain attributes in only those folders. I thought with Windows 8's "advanced query system" this would be a 2 minute job...it isn't!
Can anyone recommend an easy way to do this? I'm thinking along the lines of regular expressions i can input into AstroGrep, or a Notepad++ hack, as it's easy to copy folder paths from windows search into a text document.
Thanks!
EDIT: To clarify, I am trying to find files which were added to the system during a certain date range. Searching by file created/modified attributes does not help as these attributes are carried over when the file is moved. However a folder's date attributes do change when files are moved in and out. Therefore I need to search for folders by date, then (because of the huge number of files and subfolders) search within the resulting folders for my files.
You could use the Get-ChildItem cmdlet to retrieve all directories within a certain date range (for example, between now and a month ago):
$dateNow = Get-Date
$dateaMonthAgo = $dateNow.AddMonths(-1)
$directories = Get-ChildItem -Path 'C:\' -Directory -Recurse |
    Where { $_.LastAccessTime -le $dateNow -and $_.LastAccessTime -ge $dateaMonthAgo }
Now you have all directories that match the date range. You can iterate over them and search for your files:
$directories | Get-ChildItem -Filter 'yourFile.txt'
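If you want the folders filtered by when they were created rather than last accessed, the same pattern works with the CreationTime property. A small sketch (the '*.jpg' filter is just an example):
$dateNow = Get-Date
$dateaMonthAgo = $dateNow.AddMonths(-1)
# folders created within the range
$directories = Get-ChildItem -Path 'C:\' -Directory -Recurse |
    Where-Object { $_.CreationTime -le $dateNow -and $_.CreationTime -ge $dateaMonthAgo }
# then search only inside those folders, e.g. for all .jpg files
$directories | Get-ChildItem -Filter '*.jpg' | Select-Object FullName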

Search for folders created after date X and copy them

I need to go through a large folder (let's call it D:\Files) which contains about 50,000 folders and about 1 million files, and copy all folders that were created (not modified) after August 31st 2012 to another drive (call it E:\Space), keeping the original folder structure. The copy should include everything inside those folders: all subfolders (second/third/fourth level deep, etc.) and all files in them, regardless of when those files and subfolders were created or changed, as long as their "higher up" folder was created after August. So a file that was at D:\Files\Folder1\Subfolder3\hello.txt should end up at E:\Space\Folder1\Subfolder3\hello.txt.
Background: some folders were created in the structure after August, but the files in them have an older modified date. I need to include everything that was added to D:\Files after 31st August 2012, and since I can't tell from the files themselves which were recently added and which were already there before August, I need to search by the folder creation date.
So, if the folder D:\Files\Folder1 was created in September, then the files and folders in it, whether one, two, three, four etc. levels deep and no matter when they were changed/modified, should be moved to E:\Space using PowerShell.
I've tried in the past two days to get through, but so far failed miserably.
I know this probably isn't easy with Powershell as it requires several steps, but any help would be greatly appreciated.
Thanks so much.
Try something like this:
$srcRoot = "D:\Files"
$dstRoot = "D:\Space"
$ct = Get-Date "2012-09-01"
Get-ChildItem -Recurse $srcRoot | ? {
$_.PSIsContainer -and $_.CreationTime -ge $ct
} | sort FullName | % {
$name = $_.FullName
if ($top -eq $null) {
$top = $name
} elseif ( $name.substring(0, [math]::Min($top.Length, $name.Length)) -ne $top ) {
$top = $name
$dst = $_ -replace [regex]::Escape($srcRoot), $dstRoot
Copy-Item $name $dst -Recurse -Force
}
}
You should consider using robocopy.exe to solve the task.
If you're looking for a PowerShell solution, have a look at the Scripting Games 2013 Event 1 submissions.
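Robocopy on its own cannot filter by folder creation date, but you could let PowerShell find the new folders and hand the actual copying to robocopy, which creates the destination structure for you. A hedged sketch of that combination (not from the original answer):
$srcRoot = 'D:\Files'
$dstRoot = 'E:\Space'
$ct = Get-Date '2012-09-01'
Get-ChildItem -Recurse $srcRoot | Where-Object { $_.PSIsContainer -and $_.CreationTime -ge $ct } |
    ForEach-Object {
        # keep the original structure under E:\Space
        $dst = $_.FullName -replace [regex]::Escape($srcRoot), $dstRoot
        # /E copies all subfolders, including empty ones
        robocopy $_.FullName $dst /E | Out-Null
    }
Folders nested inside an already-copied new folder will be visited again, but robocopy skips files that are already identical at the destination, so the rerun is redundant rather than harmful.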

Check if data coming continuously

Every hour, data comes into every folder of my directory tree. I need to check that it really does come in every hour, or whether there was any interruption. (For example, no data coming in for 2–3 hours.)
I am trying to write a PowerShell script that will check LastWriteTime for every folder, but that would not solve the problem of gaps. If I checked the logs in the morning, I would see that all is OK because some data came into the folder an hour ago, even though nothing arrived for a few hours before that.
So IMHO LastWriteTime is not suitable for it.
Also there is a problem with subfolders. I need to check only the last (deepest) folder in every directory tree. I do not know how to drop the parent folders, like:
Z:\temp #need to drop
Z:\temp\foo #need to drop
Z:\temp\foo\1 #need to check
I tried to write a script that checks the LastAccessTime, but it throws an error:
Expressions are only allowed as the first element of a pipeline.
The script is as follows:
$rootdir = "Z:\SatData\SatImages\nonprojected\"
$timedec1 = (Get-date).AddHours(-1)
$timedec2 = (Get-date).AddHours(-2)
$timedec3 = (Get-date).AddHours(-3)
$timedec4 = (Get-date).AddHours(-4)
$dir1 = get-childitem $rootdir –recurse | ?{ $_.PSIsContainer } | Select-Object FullName | where-object {$_.lastwritetime -lt $timedec1} | $_.LastWriteTime -gt $timedec4 -and $_.LastWriteTime -lt $timedec3
$dir1
But as I said before it does not solve the problem.
--
The main question is exactly about checking that data arrives continuously. I could build the directory tree by hand, but I need a way to check whether data has come into each folder every hour, or whether there were any hours without data...
You can try setting up the PowerShell script to run from the Windows Task Scheduler (which will run it every hour). That way, the script only has to check whether any data arrived within the past hour.
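As a starting point, here is a minimal sketch of such an hourly check: it looks only at the deepest folders (ones without subfolders) and reports any that have not received a file in the past hour. The root path comes from the question; the log file name is an assumption for illustration:
$rootdir = 'Z:\SatData\SatImages\nonprojected'
$cutoff  = (Get-Date).AddHours(-1)
# only the deepest folders: directories that contain no further subdirectories
$leafDirs = Get-ChildItem -Path $rootdir -Recurse -Directory |
    Where-Object { -not (Get-ChildItem -Path $_.FullName -Directory) }
foreach ($dir in $leafDirs) {
    # newest file in this folder, if any
    $newest = Get-ChildItem -Path $dir.FullName -File |
        Sort-Object LastWriteTime -Descending | Select-Object -First 1
    if (-not $newest -or $newest.LastWriteTime -lt $cutoff) {
        # no data in the last hour - log it so gaps still show up even if data arrives later
        "$(Get-Date -Format 'yyyy-MM-dd HH:mm') no new data in $($dir.FullName)" |
            Out-File -FilePath 'Z:\datacheck.log' -Append
    }
}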