PowerShell/batch file scripting - created time from file name - PowerShell

I want to use a file name to set the metadata created time for over 6,000 images in one folder.
The file name usually starts with the letters "IMG", then the date, then the letters "WA" with 4 digits representing the number of the picture on that date,
i.e. IMG-YYYYMMDD-WA????
I've tried with Advanced Renamer and got it to work on my PC, but the minute I move the files back over to my device (Galaxy S9) the created time changes to the time I move them.
Is there a way I can script this in PowerShell or batch?

I would recommend doing the following:
robocopy SourcePath DestinationPath /COPY:DT /R:1 /W:1 /DCOPY:DT /LOG:%TEMP%\PhotoCopy.log /NP
This will COPY all of the files from SourcePath to DestinationPath (add /E if subfolders should be included), maintaining file timestamps (/COPY:DT), retrying only once on failure with a one-second wait between retries (/R:1 /W:1), copying directory timestamps (/DCOPY:DT), and writing a log file called PhotoCopy.log to your system's Temp folder while displaying no per-file progress (/NP).
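To spot-check afterwards that the created times survived the copy, you can compare a file from both sides in PowerShell (a small sketch; the paths and the file name are placeholders matching the pattern above):
# sketch: compare the created time of one image in both locations (placeholder paths)
(Get-Item 'SourcePath\IMG-20190614-WA0001.jpg').CreationTime
(Get-Item 'DestinationPath\IMG-20190614-WA0001.jpg').CreationTime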

To change the CreationTime with PowerShell in place (adjust X:\Path) without copying again:
Get-ChildItem X:\Path -File |
    Where-Object BaseName -match '^IMG-(\d{8})-WA\d{4}$' |
    ForEach-Object {
        $_.CreationTime = [datetime]::ParseExact($Matches[1], 'yyyyMMdd', $null)
    }
The script uses a RegEx to extract the date from the name and converts it to a [datetime].
Since the file name carries no hour, minute or second, these default to 00:00:00.
The script doesn't care about the extension as long as the base name matches the pattern IMG-8digits-WA4digits.
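For example, parsing a bare date this way yields midnight of that day:
# quick demo: a format string without time components defaults the time to 00:00:00
[datetime]::ParseExact('20190614', 'yyyyMMdd', $null)
# -> 14 June 2019 00:00:00 (display format depends on your culture)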
PS: If the images haven't been edited,
LastWriteTime may still reflect the date the images were taken. In that case you could copy that over instead with
$_.CreationTime = $_.LastWriteTime
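Put together, that variant looks like this (a minimal sketch, using the same X:\Path placeholder and name pattern as above):
# sketch: copy LastWriteTime to CreationTime for the matching images (adjust X:\Path)
Get-ChildItem X:\Path -File |
    Where-Object BaseName -match '^IMG-(\d{8})-WA\d{4}$' |
    ForEach-Object { $_.CreationTime = $_.LastWriteTime }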

Related

PowerShell: Compare Last Modified to Specific Date and replace with correct Date

I'm still fairly new to PowerShell, so please bear with me.
I have 2 almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified date. The files and folders in the new directory have incorrect last modified dates (ex: today).
Rather than re-doing the transfer process, which would take a long time, I'd like to write something in powershell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first if file/folder has been modified since the file transfer. There would be no reason to change the date on those files.
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the path to the file.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information about files that were modified between 2 dates. I can tweak this so that "less than" (-lt) checks for files that were not modified past a certain date:
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals: I have a means to check whether or not a file has been modified past a certain date.
I saw this for changing the LastWriteTime value:
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is putting it all together. I know how to recurse through files/folders with Copy-Item or Get-ChildItem by adding the -Recurse parameter. But I'm having difficulty wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date, to determine if a file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10)  # just for demo: 10 days ago
# set the paths for the root folder of the originals and the root folder where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    # create the full path where the copied file or folder is to be found
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    # test if this object can be found
    if (Test-Path -Path $copy) {
        $item = Get-Item -Path $copy
        # test if the item has not been updated since the last transfer date
        if ($item.LastWriteTime -le $lastTransferDate) {
            # set the timestamp the same as the original
            $item.LastWriteTime = $_.LastWriteTime
        }
    }
}
Great job with what you've done so far.
Just put what you have into a foreach statement.
Foreach ($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -Recurse)) {
    (Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach statement to traverse. $item is the current item in the loop. We want to use the .FullName property to know the full path to the current item's file, so $item.FullName identifies each file you are going to set the date on.
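If the goal is to pull each file's timestamp from its counterpart in the old directory rather than from one fixed file, the same loop can build the matching path from $item.FullName (a minimal sketch; both folder paths are the placeholders from above):
$newRoot = 'C:\Users\usernamehere\Desktop\folder123'   # copies with the wrong dates
$oldRoot = 'C:\Users\usernamehere\Desktop\folderabc'   # originals with the correct dates
foreach ($item in (gci $newRoot -Recurse)) {
    # build the path of the matching original from the relative path
    $original = Join-Path $oldRoot $item.FullName.Substring($newRoot.Length)
    if (Test-Path $original) {
        $item.LastWriteTime = (Get-Item $original).LastWriteTime
    }
}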

File transfer from one folder to another

I was wondering if anyone can help me with this problem of moving files from the C: drive to a network drive.
So at work we have a machine that outputs .txt files. For example these files include data about pets, so in the folder I have hundreds of files named similar to dogs_123456_10062019.txt and cats_123457_10062019.txt.
The first number is a reference number that changes per .txt file that is created, and the other is a date. As said, I can have hundreds of these per day; the reference and date are not important to the transfer, as the file includes all this information anyway.
Now I have a network folder structure of Y:dogs & Y:cats and wanted an automated script that transfers all dog & cat text files to the corresponding network folder.
The network drive name cannot be changed as it's used by a monitoring software that outputs graphs based on the information in the text file.
Is this possible? Hopefully I've explained myself
Cheers
If the folder names match the file name prefixes then you can do something like this:
$SourceFolderPath = "C:\Source\"
$DestinationFolderPath = "Y:"
$FileList = Get-ChildItem -Path $SourceFolderPath
foreach ($File in $FileList) {
    # everything before the first underscore is the folder name, e.g. "dogs"
    $FolderName = ($File.Name | Select-String -Pattern ".+?(?=_)").Matches.Value
    $File | Move-Item -Destination "$DestinationFolderPath\$FolderName"
}
If the folder names do not match the file name prefixes, you would need to manually create a dictionary of what should go where and then translate the names through it; see the sketch below.
The code above is obviously not enterprise-level stuff :)
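A minimal sketch of that dictionary approach, with made-up destination folders to illustrate the mismatch:
# hypothetical mapping from file-name prefix to destination folder
$folderMap = @{
    'dogs' = 'Y:\canines'
    'cats' = 'Y:\felines'
}
foreach ($File in (Get-ChildItem -Path $SourceFolderPath -Filter '*.txt')) {
    # everything before the first underscore is the lookup key
    $prefix = ($File.Name -split '_')[0]
    if ($folderMap.ContainsKey($prefix)) {
        $File | Move-Item -Destination $folderMap[$prefix]
    }
}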

Windows file search within a search, how? App, script, grep, PowerShell, Notepad hack?

I am trying to search for folders created within a certain date range, then search for files with certain attributes in only those folders. I thought that with Windows 8's "advanced query system" this would be a 2 minute job... it isn't!
Can anyone recommend an easy way to do this? I'm thinking along the lines of regular expressions I can input into AstroGrep, or a Notepad++ hack, as it's easy to copy folder paths from Windows Search into a text document.
Thanks!
EDIT: To clarify, I am trying to find files which were added to the system during a certain date range. Searching by file created/modified attributes does not help as these attributes are carried over when the file is moved. However a folder's date attributes do change when files are moved in and out. Therefore I need to search for folders by date, then (because of the huge number of files and subfolders) search within the resulting folders for my files.
You could use the Get-ChildItem cmdlet to retrieve all directories within a certain date range (for example: between now and a month ago):
$dateNow = Get-Date
$dateAMonthAgo = $dateNow.AddMonths(-1)
$directories = Get-ChildItem -Path 'C:\' -Directory -Recurse |
    Where-Object { $_.LastAccessTime -le $dateNow -and $_.LastAccessTime -ge $dateAMonthAgo }
Now you have all directories that match the date range. You can iterate over them and search for your files:
$directories | Get-ChildItem -Filter 'yourFile.txt'
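At that stage you can also filter on the files' own attributes, as the question mentions; for example by extension and size (a sketch, where the pattern and the size threshold are made up):
# sketch: narrow the search to JPEGs over 1 MB inside the matched folders
$directories |
    Get-ChildItem -Filter '*.jpg' |
    Where-Object { $_.Length -gt 1MB }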

Amending timestamps using PowerShell or creating files in specified order from a csv

I have n number of files that have been created from a csv list by copying an original file using PowerShell:
Import-Csv C:\TEST\test.csv | % { Copy-Item -Path $_.oldfilepath -Destination "C:\TEST\$($_.newfilename)" }
where .oldfilepath is the original file and .newfilename is in the format xxxxx02012014.filetype.
This takes the original file and creates as many copies as required using the names within the csv. However, in doing so I lose the original/expected order of the files when viewing in Explorer, as they all appear to have been created at the same time. (They are actually in ascending date order of the format xxxxx02012014, xxxxx02082014 etc., where the number is a week in American date format.)
Because of the required naming convention I cannot sort on name, and due to how the files are produced (at the same timestamp) I cannot sort on time created within Windows.
Is there either a way to ensure the files are created and listed in the required order, or something else I can run in PowerShell on the subsequently created files to modify the timestamps so they appear in Windows Explorer in the order required, i.e. in ascending American date format?
Thanks for any advice.
You can simply update the creation time of the copy with the creation timestamp from the original:
Import-Csv C:\TEST\test.csv | % {
    $target = "C:\TEST\$($_.newfilename)"
    Copy-Item -Path $_.oldfilepath -Destination $target
    (Get-Item $target).CreationTime = (Get-Item $_.oldfilepath).CreationTime
}
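If the copies all stem from one original and therefore share its timestamp, another option is to derive the creation time from the date embedded in the new name, so Explorer sorts them in that order. A sketch, assuming names ending in eight MMddyyyy digits before the extension, as in xxxxx02012014.filetype:
# sketch: set CreationTime from the MMddyyyy digits at the end of the base name
Get-ChildItem C:\TEST -File | ForEach-Object {
    if ($_.BaseName -match '(\d{8})$') {
        $_.CreationTime = [datetime]::ParseExact($Matches[1], 'MMddyyyy', $null)
    }
}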

PowerShell script to move files based on a source list (.txt)

I have thousands of files in a directory (.pdf, .xls, .doc) and they all have a similar naming convention (the "type" is always a constant string, i.e. billing or invoice):
accountname_accountnumber_type.pdf
accountname_accountnumber_type.doc
accountname_accountnumber_type.xls
The task at hand is to receive a random list of account names and account numbers (the "type" is always a constant, i.e. billing, invoice, shipping or order, and the formats vary) and move the matching files from directory A into directory B. I can get the list into a .csv file to match the accountname_accountnumber_type.
I have been trying to create a PowerShell script to reference the accountname_accountnumber and move those items from directory A to directory B, with no luck.
SAMPLE: I found something a bit simpler, but I wanted to be able to edit this to create a new destination and not halt if a file from the list is not found. Also, if I could have this pick from a .txt list, I think that would be easier than pasting everything.
$src_dir = "C:\DirA\"
# this code requires the destination dir to already exist; ideally the script
# would create a new directory based on the date it is run
$dst_dir = "D:\DirB-mm-dd-yyyy\"
# ideally imported from a csv here instead of pasted in
$file_list = "accountname1_accountnumber001_type",
    "accountname2_accountnumber002_type",
    "accountname3_accountnumber003_type",
    "accountname4_accountnumber004_type",
    "accountname5_accountnumber005_type",
    "accountname6_accountnumber006_type"
# this errors out and stops the script if a file is not located in the source;
# I need it to continue and move on, ideally with an error output
foreach ($file in $file_list)
{
    Move-Item $src_dir$file $dst_dir
}
The files can be in any format. I am trying to get the code to match ONLY the accountname and accountnumber, since those two define the exact customer. Whether it is invoice, billing or shipping doesn't matter, since they want all files associated with that customer moved.
For example, there could be 4 of each for every account, and the type format may vary between pdf, doc and xls; I need to move all files based on their first two indicators (accountname, accountnumber).
alice_001_invoice.pdf
alice_001_billing.doc
alice_001_shipping.pdf
alice_001_order.xls
George_245_invoice.pdf
George_245_billing.doc
George_245_shipping.pdf
George_245_order.xls
Bob_876_invoice.pdf
Bob_876_billing.doc
Bob_876_shipping.pdf
Bob_876_order.xls
Horman_482_invoice.pdf
Horman_482_billing.doc
Horman_482_shipping.pdf
Horman_482_order.xls
CSV:
accountname,accountnumber
Alice,001
George,245
Bob,876
Horman,482
How about this:
$CurrentDate = [DateTime]::Now.ToString("MM-dd-yyyy")
$DestinationDir = "D:\DirB-$CurrentDate"
New-Item $DestinationDir -ItemType Directory -ErrorAction SilentlyContinue
$AccountToMove = Import-Csv $CSVPath
Foreach ($Account in $AccountToMove) {
    $FilePattern = "*$($Account.AccountName)*$($Account.AccountNumber)*"
    ls $SourceDir | Where Name -like $FilePattern | Move-Item -Destination $DestinationDir
}
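To get the error output the question asks for when an account has no matching files, the loop could be extended like this (a sketch; $CSVPath, $SourceDir and $DestinationDir are the same placeholders as above):
Foreach ($Account in (Import-Csv $CSVPath)) {
    $FilePattern = "*$($Account.AccountName)*$($Account.AccountNumber)*"
    $Files = Get-ChildItem $SourceDir | Where-Object Name -like $FilePattern
    if ($Files) {
        $Files | Move-Item -Destination $DestinationDir
    }
    else {
        # report accounts with no files instead of stopping
        Write-Warning "No files found for $($Account.AccountName)_$($Account.AccountNumber)"
    }
}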
The code part about moving files to subdirectories, which you have since edited out of the post, doesn't make much sense with your business rules. Since you never showed the sample CSV file contents, it's all guessing.
For easier processing, assume you've got the following source files. Edit your post to show the CSV file contents and where you would like the files moved.
C:\some\path\A\Alice_001_bill.doc
C:\some\path\A\Alice_001_invoice.xls
C:\some\path\A\Bob_002_invoice.pdf
C:\some\path\A\Bob_002_invoice.doc
C:\some\path\A\Eve_003_bill.xls
C:\some\path\A\Eve_003_invoice.doc