I have a stack load of images and videos on my Samsung phone. I copied these images to a USB then onto my PC.
I want to use PowerShell to rename these files based on their Date Taken attribute.
Format required = yyyy-MM-dd HH.mm.ss ddd
I have been using a PowerShell script (see below) that does this beautifully using the Date Modified attribute, but the copy above somehow changed the Date Modified value on me (WTH!), so I can't use that now (as it's not accurate).
Get-ChildItem | Rename-Item -NewName {$_.LastWriteTime.ToString("yyyy-MM-dd HH.mm.ss ddd") + ($_.Extension)}
In summary - is there a way to change the file name based on the Date Taken file attribute? Suggestions I have seen online require use of the .NET System.Drawing.dll and convoluted code (I'm sure it works, but damn it's ugly).
GG
Please check out the Set-PhotographNameAsDateTimeTaken PowerShell module. It extracts the date and time taken from the picture and renames the picture to match.
It supports the -Recurse, -Replace and -Verbose parameters. By default it will create a result folder at the same level as your working directory.
If you need to change the format of the target names, the code can be found here.
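For illustration, a usage sketch based only on the parameters named above (the cmdlet name and exact syntax are assumptions - check the module's own documentation):
Import-Module Set-PhotographNameAsDateTimeTaken   # assumed module/cmdlet name
Set-PhotographNameAsDateTimeTaken -Recurse -Verbose   # recurse into subfolders, show progress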
I 'glued' together a bunch of other answers to make a bulk script. Credit to those, but Chrome crashed and I lost those other webpages on Stack. This works on photo files only and will rename all files to YYYYMMDD_HHMMSS.jpg format.
Here it is:
$nocomment = [reflection.assembly]::LoadWithPartialName("System.Drawing")
Get-ChildItem *.jpg | ForEach-Object {
    # use the full path so the bitmap loads regardless of the process working directory
    $pic = New-Object System.Drawing.Bitmap($_.FullName)
    # EXIF property 36867 (0x9003) is Date Taken, a null-terminated ASCII string
    $bitearr = $pic.GetPropertyItem(36867).Value
    $string = [System.Text.Encoding]::ASCII.GetString($bitearr)
    $date = [datetime]::ParseExact($string, "yyyy:MM:dd HH:mm:ss`0", $Null)
    [string]$newfilename = Get-Date $date -Format yyyyMMdd_HHmmss
    $newfilename += ".jpg"
    $pic.Dispose()
    Rename-Item $_ $newfilename -Force
    $newfilename
}
In order to avoid this error:
New-Object : Cannot find type [System.Drawing.Bitmap]: verify that the assembly containing this type is
loaded.
...
Make sure the required assembly is loaded before executing the code above:
add-type -AssemblyName System.Drawing
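If you would rather avoid System.Drawing entirely (as the original question hints), the Windows Shell exposes "Date taken" as an extended file property. A rough sketch of that approach - the folder path is a placeholder, the "Date taken" column index varies by Windows version and locale (hence the lookup by name), and the shell pads the value with invisible bidi marks that must be stripped before parsing:
$shell = New-Object -ComObject Shell.Application
$folder = $shell.Namespace('C:\Photos')   # placeholder - point this at your image folder
# find the "Date taken" column index by name, since it differs between systems
$idx = 0..320 | Where-Object { $folder.GetDetailsOf($null, $_) -eq 'Date taken' } | Select-Object -First 1
foreach ($file in $folder.Items()) {
    $raw = $folder.GetDetailsOf($file, $idx)
    if ($raw) {
        # strip the invisible left-to-right/right-to-left marks the shell embeds
        $taken = [datetime]::Parse(($raw -replace '[\u200e\u200f]', ''))
        Rename-Item -LiteralPath $file.Path -NewName ($taken.ToString('yyyy-MM-dd HH.mm.ss ddd') + [IO.Path]::GetExtension($file.Path))
    }
}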
I get a CSV every week that our finance team puts in a shared drive. I have a script for that CSV that I run once I get it.
The first command of the script is of course Import-Csv.
The problem is, the finance team insists on naming the file differently each time plus they don't always put it in the same location within the drive.
As a result, I have to first hunt for the file, put it into the directory that the script points to and then rename the file.
I've tried talking to the team about putting it in the same location and making sure the filename is the same but they only follow the instructions for a couple of weeks before just doing whatever.
Ideally, I'd like it so that when I run the script, there would be a popup that would ask me to pick a CSV (similar to how it looks when you do "Save As" on an Office document).
Is there any way for this to be done within PowerShell?
You can access .Net classes and interface with the forms library to instantiate and take input from the standard FileOpen dialog. Something like below:
Using Namespace System.Windows.Forms

# 'using namespace' only shortens type names; the assembly itself still has to be loaded
Add-Type -AssemblyName System.Windows.Forms
$FileBrowser = [OpenFileDialog]::new()
$FileBrowser.InitialDirectory = 'c:\temp'
$FileBrowser.Filter = 'Comma Separated Values (*.csv)|*.csv'
[Void]$FileBrowser.ShowDialog()
$CsvFile = $FileBrowser.FileName
Then use $CsvFile in the Import-Csv command.
You can change the .InitialDirectory property to make navigating a little more convenient.
Use the .Filter property to limit the file open display to CSV files, to make things that much more convenient.
Also, cast to [Void] to prevent the status return (usually 'OK' or 'Cancel') from echoing to the screen.
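If you'd rather act on the user's choice than discard it, you can test the dialog's return value instead - a small variation on the code above:
# proceed only when the user clicks OK rather than Cancel
if ($FileBrowser.ShowDialog() -eq [DialogResult]::OK) {
    $CsvFile = $FileBrowser.FileName
    $data = Import-Csv -Path $CsvFile
}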
Note: A simple Google search will turn up many examples. I refined some of the work from here. That will also document some of the other properties if you want to explore etc.
If you are willing to settle for a selection box that doesn't look as nice as the Save As dialog, you can use Out-Gridview. Something along these lines might help.
$filenames =
@(Get-ChildItem -Path C:\temp -Recurse -Filter *.csv |
Sort-Object LastWriteTime -Descending |
Out-GridView -Title 'Choose a file' -PassThru)
$csvfile = $filenames[0].FullName
Import-Csv $csvfile | More
The -Path specifies a directory that contains all the locations where your csv file might be delivered. The sort is just to put the recently written files at the top of the grid. This supposedly makes selection easier. The @() wrapper merely makes sure the result stored in $filenames is an array.
You would do something else with the results of Import-Csv.
Steven's response certainly satisfies your original question, but an alternative would be to let PowerShell do the work. If you know the drive and you know the name of the file this week, you can pass the name to your script and let it search the drive, filtering on the specific CSV file you need. Make it recursive, and open the only file that matches. Sorry, I didn't have time yesterday to include code. Here's a function that returns the full file path when provided with a top-level search path and a filename with possible wildcards.
function gfp {
    $result = gci $args[0] -Recurse -Include $args[1]
    return ($result.DirectoryName + "\" + $result.Name)
}
Example: gfp "d:\rootfolder" "thisweeksfilename.csv"
I'm still fairly new to PowerShell, so please bear with me.
I have 2 almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified date. The files and folders in the new directory have incorrect last modified dates (ex: today).
Rather than re-doing the transfer process, which would take a long time, I'd like to write something in PowerShell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first if file/folder has been modified since the file transfer. There would be no reason to change the date on those files.
What I found from looking around and googling:
Link1 Link2 Link 3 Link 4
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the path to the file.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information on files that were modified between two dates. This I can tweak so that "less than" (-lt) checks for files that were not modified past a certain date.
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals: I have a means to check whether a file has been modified past a certain date or not.
I saw this for changing the value lastwritetime
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is kind of putting it all together. I know how to recurse through files/folders for copy-item or even Get-Childitem by adding the "recurse" parameter. But I'm having difficulties wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo: 10 days ago

# set the paths for the root folder of the originals and the root folder where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'

# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    # create the full path where the copied file or folder is to be found
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    # test if this object can be found
    if (Test-Path -Path $copy) {
        $item = Get-Item -Path $copy
        # test if the item has not been updated since the last transfer date
        if ($item.LastWriteTime -le $lastTransferDate) {
            # set the timestamp the same as the original
            $item.LastWriteTime = $_.LastWriteTime
        }
    }
}
Great job with what you've done so far.
Just put what you have into a foreach statement.
Foreach($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -recurse)){
    (Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach command to traverse. $item is the current item in the loop. We will want to use the .FullName property to know the full path to the file for the current item. With that said, you will use $item.FullName for the files you are going to set the date on.
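If each new file should instead take its timestamp from its counterpart in the old directory (rather than from one fixed reference file), the same loop can derive the matching path. A sketch assuming the two trees mirror each other, with the example folder names above standing in for your real ones:
$oldRoot = 'C:\Users\usernamehere\Desktop\folderabc'   # originals with the correct dates
$newRoot = 'C:\Users\usernamehere\Desktop\folder123'   # copies whose dates need fixing
Foreach($item in (gci $newRoot -recurse)){
    # build the path of the counterpart item in the old tree
    $oldPath = Join-Path $oldRoot $item.FullName.Substring($newRoot.Length)
    if (Test-Path $oldPath) {
        (Get-Item $item.FullName).LastWriteTime = (Get-Item $oldPath).LastWriteTime
    }
}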
Newbie to Stack Overflow, so apologies if this request is in the wrong location.
I'm also new to PowerShell and have been researching a solution that appears to be offered by PowerShell. My lack of experience has left me unable to modify an example script so I'm here looking for help.
My objective is to rename many outlook .msg files, located in folders and sub-folders, using data extracted from each .msg file. In my case, I require the date sent (Senton) (and topic, but less important). This is currently, and painfully, being done manually and takes a long time so I thought we needed something to semi-automate a solution.
I've searched around on various forums and blogs and have found a script that seems to be heading towards what I want (details below), but I just don't have the skills to do the last bit of taking the Sent Date and changing the .msg filename. I've attempted various piping solutions referring to variable $msg.Senton. The PowerShell debugger indicates that the line with the comment "My code changes" has good values but produces an error (relating to parameter NewName) for each object iteration (see below).
Any help in making progress will be appreciated. In the mean time, I'll continue experimenting with the debugger.
The script was sourced from this (http://jon.glass/blog/reads-e-mail-with-powershell/) site and the script looks like...
Get-ChildItem "C:\Users\higginsr4\Test\StudyExamples\SmallMsgFolder" -Filter *.msg|
ForEach-Object{
$outlook = New-Object -comobject outlook.application
$msg = $outlook.CreateItemFromTemplate($_.FullName)
$msg | Select senderemailaddress,to,subject,Senton,body|ft -AutoSize
$msg | Rename-Item -NewName { $msg.Senton + $_.name} ## My code changes
}
PowerShell debugger error...
Rename-Item : The input to the script block for parameter 'NewName' failed. Cannot convert argument "1", with value: "", for "op_Addition" to type "System.TimeSpan": "Cannot
convert null to type "System.TimeSpan"."
At My Documents\WindowsPowerShell\Modules\GetEmailDetails\RobsGetEmailDetails.ps1:6 char:33
+ $msg | Rename-Item -NewName { $msg.Senton + $_.name}
+ ~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (System.__ComObject:__ComObject) [Rename-Item], ParameterBindingException
+ FullyQualifiedErrorId : ScriptBlockArgumentInvocationFailed,Microsoft.PowerShell.Commands.RenameItemCommand
I'm using Windows 7 Enterprise but if further details are required then please let me know.
A couple of things to be highlighted in what you have tried:
$msg is a ComObject backed by the base type System.MarshalByRefObject, not an actual physical file. Hence, piping it to Rename-Item deviates from your objective there.
You are passing the -NewName parameter a script block. Note that you are already inside one from piping the output of Get-ChildItem to ForEach-Object, so the extra script block is redundant.
You can do something like this -
Get-ChildItem "C:\Users\higginsr4\Test\StudyExamples\SmallMsgFolder" -Filter *.msg |
ForEach-Object{
$outlook = New-Object -comobject outlook.application
$msg = $outlook.CreateItemFromTemplate($_.FullName)
$msg | Select senderemailaddress,to,subject,Senton,body|ft -AutoSize
Rename-Item -LiteralPath $_.FullName -NewName "$($_.Basename)_$($msg.Senton.ToString('ddMMyy')$($_.Extension)"
}
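One more caveat: CreateItemFromTemplate opens the message through Outlook, and the handle on the .msg file is not always released immediately, which can make the rename fail with a file-in-use error. If that happens, closing the item without saving and releasing the COM wrapper before the rename is worth trying:
$msg.Close(1)   # 1 = olDiscard: close without saving so Outlook lets go of the file
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($msg) | Out-Null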
I'm trying to teach myself enough PowerShell or batch programming to figure out how to achieve the following. I've had a search and looked through a couple of hours of YouTube tutorials, but I can't quite piece it all together to figure out what I need (I don't get tokens, for example, but they seem necessary in the for loop). Also, I'm not sure if the below is best achieved by robocopy or xcopy.
Task:
Define a list of files to retrieve in a CSV (the file name will be listed as a 13-digit number; the extension will be UNKNOWN, but will usually be .jpg and might occasionally be .png - could this be achieved with a wildcard?)
The list would read something like:
9780761189931
9780761189988
9781579657159
For each line in this text file, do:
Search a network folder and all subfolders
If exact filename is found, copy to an arbitrary target (say a new folder created on desktop)
(Not 100% necessary, but nice to have) Once the For loop has completed, output a list of files copied into a text file in the newly created destination folder
I gather that I'll maybe need to do a couple of things first, like define variables for the source and destination folders? I found the below elsewhere but couldn't quite get my head around it.
set src_folder=O:\2017\By_Month\Covers
set dst_folder=c:\Users\%USERNAME%\Desktop\GetCovers
for /f "tokens=*" %%i in (ISBN.txt) DO (
xcopy /K "%src_folder%\%%i" "%dst_folder%"
)
Thanks in advance!
This solution is in PowerShell, by the way.
To get all subfiles of a folder, use Get-ChildItem and the pipeline; you can then compare each name to the contents of your CSV (which you can read using Import-Csv, by the way).
Get-ChildItem -path $src_folder -recurse | foreach{$_.fullname}
I'd personally then use a function to edit the name as a string, but I know this probably isn't the best way to do it. Create a function outside of the pipeline, and have it return a modified path in such a way that you can continue the previous line like this:
Get-ChildItem -path $src_folder -recurse | foreach{$_.CopyTo((edit-path $_.fullname))}
Where "edit-directory" is your function that takes in the path, and modifies it to return your destination path. Also, you can alternatively use robocopy or xcopy instead of CopyTo, but Copy-Item is a powershell native and doesn't require much string manipulation (which in my experience, the less, the better).
Edit: Here's a function that could do the trick:
function edit-path{
    Param([string] $path)
    # graft the path's relative part onto the destination folder
    $modified_path = $dst_folder + $path.Substring($src_folder.Length)
    return $modified_path
}
Edit: Here's how to integrate the importing from CSV, so that the copy only happens to files that are written in the CSV (which I had left out, oops):
$csv = import-csv $CSV_path
Get-ChildItem -path $src_folder -recurse | where-object{$csv -contains $_.name} | foreach{$_.CopyTo((edit-path $_.fullname))}
Note that you have to put the whole CSV path in the $CSV_path variable, and depending on how the contents of that file are written, you may have to use $_.fullname, or other parameters.
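One wrinkle worth spelling out: a bare list of ISBNs like the one in the question has no header row, so Import-Csv would swallow the first ISBN as a column name. A sketch of two ways around that (the 'ISBN' header name is arbitrary, and .BaseName is compared because the list entries carry no extension):
# option 1: supply a header so every row is parsed as data
$csv = import-csv $CSV_path -Header 'ISBN'
Get-ChildItem -path $src_folder -recurse | where-object{$csv.ISBN -contains $_.BaseName} | foreach{$_.CopyTo((edit-path $_.fullname))}
# option 2: for a single-column list, plain Get-Content is simpler still
$list = Get-Content $CSV_path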
This seems like an average enough problem:
$Arr = Import-CSV -Path $CSVPath
Get-ChildItem -Path $Folder -Recurse |
    Where-Object -FilterScript { $Arr -contains $PSItem.Name.Substring(0, ($PSItem.Name.Length - 4)) } |
    ForEach-Object -Process {
        Copy-Item -Path $PSItem.FullName -Destination $env:UserProfile\Desktop
        $PSItem.Name | Out-File -FilePath $env:UserProfile\Desktop\Results.txt -Append
    }
I'm not great with string manipulation so the string bit is a bit confusing, but here's everything spelled out.
I have a folder which contains thousands of PDF files. I need to filter through these files based on file name (which will group these into 2 or more PDFs) and then merge these 2 or more PDFs into 1 PDF.
I'm OK with grouping the files, but not sure of the best way to then merge these into 1 PDF. I have researched iTextSharp but have been unable to get it to work in PowerShell.
Is iTextSharp the best way of doing this? Any help with the code for this would be much appreciated.
Many thanks
Paul
Have seen a few of these PowerShell-tagged questions that are also tagged with itextsharp, and always wondered why answers are given in .NET, which can be very confusing unless the person asking the question is proficient in PowerShell to begin with. Anyway, here's a simple working PowerShell script to get you started:
$workingDirectory = Split-Path -Parent $MyInvocation.MyCommand.Path;
$pdfs = ls $workingDirectory -recurse | where {-not $_.PSIsContainer -and $_.Extension -imatch "^\.pdf$"};
[void] [System.Reflection.Assembly]::LoadFrom(
[System.IO.Path]::Combine($workingDirectory, 'itextsharp.dll')
);
$output = [System.IO.Path]::Combine($workingDirectory, 'output.pdf');
$fileStream = New-Object System.IO.FileStream($output, [System.IO.FileMode]::OpenOrCreate);
$document = New-Object iTextSharp.text.Document;
$pdfCopy = New-Object iTextSharp.text.pdf.PdfCopy($document, $fileStream);
$document.Open();
foreach ($pdf in $pdfs) {
    $reader = New-Object iTextSharp.text.pdf.PdfReader($pdf.FullName);
    $pdfCopy.AddDocument($reader);
    $reader.Dispose();
}
$pdfCopy.Dispose();
$document.Dispose();
$fileStream.Dispose();
To test:
Create an empty directory.
Copy code above into a Powershell script file in the directory.
Copy the itextsharp.dll to the directory.
Put some PDF files in the directory.
Not sure how you intend to group and filter the PDFs based on file name, or if that's your intention (couldn't tell if you meant to just pick out PDFs by extension), but that shouldn't be too hard to add.
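If the grouping is by a shared file-name prefix - an assumption here, so adjust the key to your real naming scheme - Group-Object can batch the files per group, and the merge loop above can then run once per batch:
# hypothetical grouping key: everything before the last underscore in the base name
$groups = $pdfs | Group-Object { $_.BaseName -replace '_[^_]*$', '' };
foreach ($group in $groups) {
    # $group.Name is the key; $group.Group holds the FileInfo objects to merge into one PDF
    Write-Host ("{0}: {1} file(s) to merge" -f $group.Name, $group.Count);
}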
Good luck. :)