20000 images batch process in Photoshop

Hello, I have around 100000 photos and 20000 of them need optimising.
I can run a batch process in Photoshop, however the images are spread across 100000 folders and they all have the same file name. Is there a way to target the file name in a Photoshop batch process that you know of, or should I be looking at alternatives?
Thanks

What's the problem with using the FOLDER > INCLUDE ALL SUBFOLDERS option?

Write a script to create symlinks to your files in a single folder.
Then run the Photoshop batch on that folder. Your original files will be updated as well.
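A minimal PowerShell sketch of that idea, assuming the images are all named image.jpg (hypothetical) and sit somewhere under C:\photos (also hypothetical); each link gets a numeric prefix so the identical names can coexist in one folder:

# Find every copy of the identically named file under the source tree
$sources = Get-ChildItem -Path 'C:\photos' -Recurse -Filter 'image.jpg' -File
$linkDir = 'C:\photo_links'
New-Item -ItemType Directory -Path $linkDir -Force | Out-Null
$i = 0
foreach ($file in $sources) {
    # Prefix each link with a counter so thousands of same-named files don't collide
    $linkPath = Join-Path $linkDir ('{0:D5}_{1}' -f $i, $file.Name)
    New-Item -ItemType SymbolicLink -Path $linkPath -Target $file.FullName | Out-Null
    $i++
}

Note that creating symbolic links on Windows normally requires an elevated PowerShell session (or Developer Mode on Windows 10/11), and since Photoshop writes through the link, the edits land on the original files.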

You have to select "Override Action 'Open' Commands" and "Override Action 'Save As' Commands" in the Batch dialog.

Related

Using PowerShell to read only the last month of folders, then copying in 7 days' worth of files with robocopy

Apologies to start with: I'm new to PowerShell and robocopy.
I have a robocopy command that pulls in any files within its many subfolders that fall within a maxage of 7 days. However, the main folder has a huge number of folders dating back years (and I only need the last 7 days each week it runs), so robocopy is slow because it reads every file in every folder before it even starts copying.
It looks like PowerShell commands may be a way for me to limit the search of files for my robocopy; would this be possible? Currently robocopy searches every file in every folder in my main folder; ideally I would want it to be smart enough to only search, say, a month's worth of folders and then copy over the last 7 days. This would speed up the run time hugely.
If possible I'd go even further: I only want the CSV files in each of the folders in my main folder, but currently robocopy searches the other folders and their files as well, which takes time. All the CSV files are in a folder called "run" in each parent folder (each parent folder is a unique number within the "mainfolder").
My robocopy command:
robocopy \\server\mainfolder \\server\new_main_folder /S /maxage:7 /r:0 /w:0
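For what it's worth, a rough PowerShell sketch of the idea described above: pre-filter the parent folders by last-write time so robocopy only ever visits recently touched ones, and point it straight at each "run" subfolder with a *.csv filter. The 30-day window and folder layout are assumptions to adjust, and note that a folder's LastWriteTime only changes when its direct contents change, so check that this holds for your data:

# Only consider parent folders touched within roughly the last month
$recent = Get-ChildItem -Path '\\server\mainfolder' -Directory |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }
foreach ($parent in $recent) {
    $src = Join-Path $parent.FullName 'run'
    if (Test-Path $src) {
        $dst = Join-Path '\\server\new_main_folder' (Join-Path $parent.Name 'run')
        # Copy only the last 7 days of CSV files, keeping the original switches
        robocopy $src $dst *.csv /S /maxage:7 /r:0 /w:0
    }
}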
I was going to point you to either FastCopy or FreeFileSync; both handle long file name paths and work well for me. But I found problems running FastCopy when trying to filter folders the way you described - I wasn't getting the results I expected - so that leaves FreeFileSync. There is a little bit of a learning curve with FreeFileSync, but really the only problem/complaint I've had with it is that the XML-based batch script you can use to automate the program kept changing formats, and they weren't providing a way to read the old XML batch scripts with the new version of the software. Maybe that has changed; I haven't looked into it lately.
Maybe other people have had better experiences with RoboCopy, but I found it to take many times longer to do the same job as other copy programs. I don't think FreeFileSync is as fast as FastCopy, but I've never seen it behave as badly as what I experienced with RoboCopy.
The way FreeFileSync works is:
You define 1 or more source/destination pairs.
There is a global setting at the top to set the defaults for all copy pairs.
There are individual settings for each copy pair that, when set, override the global settings.
In the filter tab of the settings you can set "Time span:" to "Last x days:" and set it to the 7 days that you want.
You can change include from * to something like \run\*.csv. I didn't try that exact pattern, but the patterns I did try worked as expected (Unlike FastCopy).
The Synchronization tab is the tricky/fun one. You can do logs, versioning, tell the system to shut down or restart when done, maintain a database for tracking moved files (the "Detect moved files" checkbox), and make all kinds of adjustments to how it behaves when files don't match.
When done, there are I believe at least two options for saving the configuration, though I've always just created the XML-based batch script and called that from another scripting language or an icon on the desktop.
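If it helps, the saved .ffs_batch job can itself be started from a one-line PowerShell (or .bat) call, assuming a default install path and an example job file name:

# Launch a saved FreeFileSync batch job unattended
& 'C:\Program Files\FreeFileSync\FreeFileSync.exe' 'C:\Jobs\weekly_csv_copy.ffs_batch'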

How to create a script that uses a path list as a reference for copying files in PowerShell / a .bat script

I'm looking for a way to automate archiving so that after I plug in my two external drives I can copy all my resources. The problem is that I have different file structures on my laptop and on both external drives, so I need to select specific folders to be copied. That means I can't select one root folder and copy it straight over. I tried to find a way to declare more than one path in the cp command and in the copy command, without success. An example path:
/my_programming_stuff
/folder1
/folder2
/folder3
/folder4
I want to select only the first 3 folders and copy them to external drive 1 and external drive 2. The idea is to create a .bat file that will copy everything at once (in the best-case scenario it would copy to both external drives simultaneously, which would be much faster). Another problem is that there needs to be a way to bypass the NTFS long path limitation (max. 260 characters).
Flags that I want to use:
Copy the files and directories and all of their attributes, including ownership and permissions.
Recursively copy directories and their contents.
When copying files from one directory to another, only copy files that either don't exist or are newer than the existing corresponding files in the destination directory.
Data verification (so it's certain that the copy was verified).
A progress bar with an ETA.
Until now I have been using Total Commander to do this, but every day I need to pick only a few folders to be copied, which takes time and is inefficient.
I have experience with Bash and PowerShell, but I am not sure how to approach this.
Create a static batch file with robocopy commands. I think /copyall is the only switch you need to specify for all of this; the other defaults should satisfy your requirements.
https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
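As a sketch of that static script (written here in PowerShell, which a .bat file can call just as easily), with the folder names from the question and hypothetical drive letters:

# Hypothetical source root, folder list and destination drives - adjust to your layout
$source  = 'C:\my_programming_stuff'
$folders = 'folder1', 'folder2', 'folder3'
$targets = 'E:\backup', 'F:\backup'
foreach ($drive in $targets) {
    foreach ($name in $folders) {
        # /E copies subdirectories (including empty ones); /COPYALL keeps attributes,
        # ownership and permissions (may need an elevated prompt for the auditing part).
        # Robocopy skips unchanged files by default and copes with paths over 260 characters.
        robocopy (Join-Path $source $name) (Join-Path $drive $name) /E /COPYALL /R:1 /W:1
    }
}

One caveat: robocopy has no built-in post-copy hash verification, so if that requirement is strict, a dedicated copy/sync tool may be a better fit.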
I think your time will be better spent learning how to use either FastCopy or FreeFileSync. I used FreeFileSync some years ago but got disgusted with the constantly changing format of the XML file it uses for starting a backup, so I switched to FastCopy. But it looks like FreeFileSync may be getting their act together, and I aim to do some experiments over the summer to see if I want to switch back to it.
Both can handle the long-filename issues, both can be executed from a batch file, and both seem to have a lot of quality, but FreeFileSync has more features - and is more bloated because of them. Speed-wise, I think FastCopy is probably one of the better products out there and very streamlined in use and design.

set custom path for script in MATLAB

I have a lot of folders, each containing a lot of .txt files, which I need to process with MATLAB.
I have a finished script which does this: it reads all the .txt files in one folder and does some computing.
So if I want to process those files, I move the script into the desired folder and run it, which is kind of lame to do every time.
Is it possible to start the script in a standard directory and have it ask me to browse for the desired directory, then run against that directory? It doesn't write any files, it just reads. After it finishes it should reset, so I can browse to a different folder every time - like saving the path in a variable and clearing it at the end.
I use "fopen" in that script to open the .txt files, so it should be possible to pass the full path to that function, but it's important to me to be able to browse to the right folder every time I run the script.
I use
path = uigetdir;                            % let the user browse for the folder (note: this shadows MATLAB's built-in path function)
fileid = fopen([path filesep 'file.txt']);  % build the full path to the file in that folder
and it works just the way I wanted.
Thanks @excaza and @Laure for the quick replies.

Code to perform multiple copies of AutoCAD Electrical template drawings and rename them

I am trying to create a script that will make copies of a template.dwg from AutoCAD Electrical multiple times (10 - 100 times) and rename them at the same time. I have seen programs like this online for Excel workbooks but am unsure whether it will work outside of Excel files.
My problem is that I am not sure which language would be best suited for such a task. I have experience with C++, Java, MATLAB, and just recently started playing around with VB and macros.
I can make copies of this file through Windows Explorer, but am hoping to streamline this process by using Excel to make copies of files that are not Excel files.
Right now I am making one copy at a time and it takes up a lot of my time at work.
Is this possible? And if so, what language would be best suited to do it?
BATCH FILE TO PERFORM MULTIPLE COPIES
The following is a batch (.bat) script that makes multiple copies of a file; it works for copying AutoCAD .dwg files through the operating system rather than through AutoCAD. It was written and tested on a PC platform.
To run it, copy and paste the script into Notepad++, change the Language option to Batch, and update the following line with the path of the file you wish to copy:
set Pathname="C:\<enter file path to dwg to copy>"
Then save. When run, it will prompt the user for the filename (template name) and the number of copies desired.
@echo off
title File Duplicator
color 0a
:start
set /P TemplateName=Please enter the template name you wish to copy:
set /P NumberOfCopies=Please enter how many copies you wish to make:
set Pathname="C:\<enter file path to dwg to copy>"
cd /d %Pathname%
:init
for /L %%f in (1,1,%NumberOfCopies%) do copy "%TemplateName%.dwg" "C:\Temp\%%f%TemplateName%.dwg"
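Since the question asks which language fits best: the same job is only a few lines of PowerShell, which sidesteps the batch quoting quirks. The paths below are examples to adjust, and the naming pattern matches the batch script above (1template.dwg, 2template.dwg, ...):

# Copy a template drawing N times with a numbered name
$template = 'C:\Drawings\template.dwg'   # example source path
$destDir  = 'C:\Temp'                    # example destination folder
$copies   = 25
for ($i = 1; $i -le $copies; $i++) {
    $name = '{0}{1}' -f $i, (Split-Path $template -Leaf)
    Copy-Item -Path $template -Destination (Join-Path $destDir $name)
}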

Talend tWaitForFile insufficiency

We have a producer process that writes files into a specific folder and runs continuously. We have to read the files one by one using Talend, and there are 2 issues:
The 1st: tWaitForFile only reads files which existed before it started, so files created after the component starts are not visible to it.
The 2nd: There is no way to know whether a file has been released by the producer process; it may be read while it is not yet completely written. The wait_release parameter of tWaitForFile does not work on Linux systems!
So how can I make Talend read completely written files from a directory whose number of files keeps increasing?
I'm not sure what you mean by your first issue. tWaitForFile has options to trigger when files are created, modified or deleted in a folder.
As for the second issue, your best bet here is for the file producer to create an OK or control file, a 0-byte touch file written once it has finished writing the file you actually want.
In this case you simply look for the appearance of the OK file and then pick up the relevant completed file. If you name the two files the same but with a different file extension (the OK file is typically called ".OK"), then this should be easy enough to look for. So you would set your tWaitForFile to look for "*.OK" files, connect this with an Iterate link to a tFileInputDelimited (in the case you want to pick up a delimited text file), and then declare the file name as:
((String)globalMap.get("tWaitForFile_1_CREATED_FILE")).substring(0, ((String)globalMap.get("tWaitForFile_1_CREATED_FILE")).length() - 3) + ".txt"