Go through subdirectories using a for loop in batch

I am trying to make a script which goes through a directory; for each file it should create a zip archive, copy the archive somewhere else, and then extract it there. In the end, the whole directory tree I copy from should be mirrored at the new location, but with each file having been transported as an archive.
I had a problem going through all the files in the directory, because it has a lot of subdirectories, so I could not cover all of them with a simple for loop.
My plan was to first copy all the directories and subdirectories with "xcopy /t /e" and then go through the files, archiving, copying, and extracting each one individually, but as I said earlier, I could not do it.
If someone can show me how to go through files like that, or how to accomplish this some other way, that would be perfect.
Thank you.

You could use recursion for that. Either zip -r or rsync could do the job in a very simple way. You do not have to loop.
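If you do want the per-file archive/copy/extract behaviour described in the question, a minimal POSIX sh sketch could look like this. The names src_dir and dest_dir are placeholders, and tar+gzip stand in here for zip (swap in zip/unzip if you prefer); find does the recursion, so no nested loops are needed:

```shell
#!/bin/sh
# src_dir / dest_dir are placeholder names; the demo input below just
# makes the sketch self-contained.
SRC=src_dir
DEST=dest_dir
mkdir -p "$SRC/sub"
printf 'hello\n' > "$SRC/sub/a.txt"

# find recurses into every subdirectory for us.
(cd "$SRC" && find . -type f) | while read -r f; do
    d=$(dirname "$f")
    mkdir -p "$DEST/$d"                              # mirror the tree
    tar -czf "$DEST/$f.tgz" -C "$SRC/$d" "${f##*/}"  # archive one file
    tar -xzf "$DEST/$f.tgz" -C "$DEST/$d"            # extract next to it
done
```

At the end, dest_dir mirrors src_dir, and each file also has an archive copy sitting next to it, which matches the goal described above.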

Related

Filter files out of a folder, and copy them to another folder

I have asked a few questions here and every time I have been very pleased with the answers, so here we go again.
I have three folders; inside them the files have the same names, except for the file type:
1 folder with .zip
1 folder with .7zip
1 folder with different folders
I really would like to get all the files matching the ones in the .zip folder.
So the .zip folder is leading, and I am looking for a batch file that will copy the matching files from the other folders.
I really hope this makes sense :) English isn't my first language.
Use the find command.
find . -regex '.*\.zip' -exec cp {} target_dir \;
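If the goal is specifically to copy, from the other two folders, only the files whose base names match those in the leading .zip folder, a sketch along these lines may help. The folder names zips/, sevenzips/, misc/ and target_dir are placeholders for your actual folders:

```shell
#!/bin/sh
# Placeholder folder names; the touch lines create demo data so the
# sketch is self-contained.
mkdir -p zips sevenzips misc target_dir
touch zips/a.zip sevenzips/a.7z misc/a.txt sevenzips/b.7z

# The .zip folder is leading: for each of its base names, copy the
# files with the same name from the other folders.
for z in zips/*.zip; do
    base=${z##*/}; base=${base%.zip}
    cp sevenzips/"$base".* target_dir/ 2>/dev/null
    cp misc/"$base".* target_dir/ 2>/dev/null
done
```

Here b.7z is left behind because there is no b.zip in the leading folder.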

Get the latest external files automatically with a project

I created a Project folder and some subfolders, each holding a solution.
User A uses solution A under subfolder A, user B uses solution B under subfolder B (meaning he maps only folder B and works on it), and so on.
We have some common files used across all the projects (A, B, ...); we can put them under one folder, for example "Common", under the Project folder.
When user A wants to work on his project he gets folder A, but he needs the Common folder files too. What should he do?
Likewise, user B may want to change these files and use them in his project (B).
I want an automatic way to get the external files together with the solution.
To get the common files under the Common folder together with the solution, you can consider using a batch script that contains two tf get commands:
tf.exe get /recursive $/SolutionA
tf.exe get /recursive $/Common
When running this batch script, both the SolutionA and Common folders are downloaded.

Matlab: accessing files in subfolders

When I am trying to access files in subfolders of a main folder, how do I come back to the main folder?
Ex: C:\Users\User1\Documents\MATLAB\folder1\folder2\folder3
How do I come back to folder2 to access another subfolder of folder2?
You either save your current directory in a variable and cd back there later, or you go up the directory hierarchy by doing cd ..
To expand on the previous answer, you could use the ..\ notation.
For example, in order to read an image in folder4 (which is a subdirectory of folder2), you can use this:
ex_img = imread('..\folder4\ex_img.jpg');

Install4j "Copy files and directories" installer action does not copy the file directory path

I do a backup of the files that need to be replaced as part of the installation. For this I use the above action. However, it just copies the files in the child directory to the destination directory, not the directory hierarchy of the source. Example: I have some files in this directory structure: \dir1\dir2\dir3\files. It copies only the files under dir3, but not \dir1\dir2\dir3. I need to preserve the directory structure in the backup. Can anyone help with this, please?
I am using Windows 7 Enterprise.
Thanks very much.
You have to select the root directory in the action, i.e. the one containing "dir1". Then, it will copy the whole directory tree to the destination.
If you want to copy only parts of the directory tree, use the "Directory filter" and "File filter" properties.

How can I set a temp directory for uncompleted downloads in Wget?

I'm trying to mirror files on FTP server.
Those files can be very large so downloads might be interrupted.
I'd like to keep the original files while downloading partial files to a temporary folder, and once a download completes, overwrite the older local version.
Can I do this? How?
Is there another easy to use (command line) tool that I can use?
First, download the files to a temp directory. Use -c so you can resume.
After the download has completed, use copy, rename or rsync to copy the files to the final place.
Note: Consider using rsync for the whole process because it was designed for just this use case and it will cause much less strain on the server and the Internet. Most site admins are happy if you ask them for an rsync access just for this reason.
Looking at the wget manual I can't see this functionality; however, you could write a bash script to do what you want, which would essentially run an individual wget for each file and then move it with a normal mv command.
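A sketch of such a script, assuming a POSIX shell. The host and file names are placeholders for your FTP mirror, and -t 1 -T 5 are just there to bound the retries and timeout in this demo:

```shell
#!/bin/sh
# Placeholder URL and file list -- adjust for your FTP mirror.
BASE_URL=ftp://your.server.example/pub
TMP=./incoming
DEST=./mirror
mkdir -p "$TMP" "$DEST"

for f in file1.iso file2.iso; do
    # -c resumes a partial file already sitting in the temp directory;
    # mv only runs once the download completed successfully, so the old
    # copy in $DEST is kept until then.
    wget -c -t 1 -T 5 -P "$TMP" "$BASE_URL/$f" && mv -f "$TMP/$f" "$DEST/$f" \
        || echo "left $f in $TMP for the next run"
done
```

Because an interrupted download stays in the temp directory, rerunning the script with -c picks up where it left off.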
Alternatively, have a look at rsync; according to the manual there is a parameter that sets a temp dir:
-T --temp-dir=DIR create temporary files in directory DIR
I am not 100% sure whether this is where it puts the files during downloads, as I have not had a chance to try it out.