I want to recursively copy a directory (with sub-directories and files). This can be done easily with xcopy. For the files, though, I only want their names (something "touch" could do on Linux) and not their content: the files are large, and I am processing only the filenames, not their content. Any suggestions for a program or script for this task?
Use the GNU utilities and find -exec touch?
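If the GNU tools aren't handy, robocopy (built into Vista/Server 2008 and later, and in the Resource Kit before that) can do the whole job in one pass. A minimal sketch, with the source and destination paths as placeholders:
robocopy C:\big_files C:\names_only /E /CREATE
/E walks the sub-directories (including empty ones) and /CREATE writes the directory tree plus zero-length copies of every file, so you end up with the names but none of the content.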
I am looking for a way to count the number of files in a folder path without caring about the names of the files. The dir function extracts all the names, which is unnecessary for my specific application.
Since I'm looking at 100 folders, each containing almost 35,000 files, it is very time-consuming if I use the dir function.
Any help is greatly appreciated.
Do
someDir = 'c:\Users\You\somePath\'; % whatever directory you want to do it for
[status, cmdout] = system(['dir ' someDir '*.* /s']);
and you can parse out the number of files from cmdout
This should be faster because it's just running a system command, so you lose all the overhead of MATLAB.
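If all you need is the number, you can let the shell do the counting as well. A rough sketch of the command (the path is just an example): /b prints one bare name per line, /a-d skips directory entries, and find /c /v "" counts the lines:
dir "c:\Users\You\somePath\*.*" /s /b /a-d | find /c /v ""
Run that through the same system() call and cmdout holds nothing but the count, so there is no filename parsing left to do.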
My question is how to use MATLAB to search for a certain type of file in a folder. Here is an example to illustrate my question:
Suppose we have the following folder as well as files in it:
My_folder
    Sub_folder1
        Sub_sub_folder1
            a.txt
        1.txt
        2.txt
    Sub_folder2
        3.txt
    abc.txt
In this example, I want to find all the .txt files in My_folder as well as its sub-folders. I was wondering what I could do with MATLAB. Thanks!
To my knowledge MATLAB doesn't have a built-in function to do recursive directory searches; however, there are a couple available for download on MATLAB Central: here and here.
Alternatively, you could write your own recursive function and use the dir function to search at each level for files matching your criteria, or for other directories to recurse into.
I agree with the MATLAB Central options -- another method I've used when MLC is not an option (no network, a customer's computer, etc.) is the quick-and-dirty DOS command:
dos(['dir /s/b ' mywildcard])
The /s will do a recursive directory search for whatever wildcards you specify, and /b will make it so you only get filenames (complete with full path, but no headers, file sizes, etc.).
This is obviously platform-dependent, so it is mostly used when you are forced to work without your "standard" set of utilities you've accumulated.
Even though an answer has been accepted, I would like to point out Matlab's dir function.
This built-in function returns the contents of the folder in question. Furthermore, it indicates which entries are themselves folders. Therefore, with a little loop one could use this function to search sub-directories as well.
This is a new one for me so I'm pretty much flying blind.
I have a folder at \\192.168.1.2\mainFolder that contains folder1, folder2, and folder3. Inside each of those folders are a handful of different file extensions, and a couple of files of each type. I need to take all files of the .dep type that exist inside mainFolder and copy them to \\192.168.1.2\copyFolder.
copyFolder will not have any folders inside it, but just many many files.
What is the best way to go about doing this? I have been told by TPTB that robocopy would be helpful; however, I have never used robocopy and thought you guys may know of something better.
So you don't want the .dep files inside folder1, folder2, etc.? Robocopy / xcopy is usually a good choice; PowerShell is slow for such a simple operation. If you just want the .dep files in mainFolder but not those inside the subfolders, try:
robocopy \\192.168.1.2\mainFolder \\192.168.1.2\copyFolder *.dep
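If it turns out you do want the .dep files from folder1, folder2, and folder3 as well, note that robocopy's /S switch would recreate the sub-folder structure inside copyFolder, which is not what you described. A for /r loop flattens everything instead; a rough, untested sketch using the paths from the question (double the % signs if you put it in a batch file):
pushd \\192.168.1.2\mainFolder
for /r %f in (*.dep) do copy "%f" "\\192.168.1.2\copyFolder"
popd
pushd maps the UNC share to a temporary drive letter (for /r does not like UNC roots), and the copy drops every match straight into copyFolder, so files with the same name in different sub-folders will collide.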
I have a situation where we have several thousand image files that have become corrupted on our server (Windows 2008 R2 x64). I have a working image file that I want to replace the corrupt files with. The files must retain the same name and path (size, timestamps, etc. do not matter).
So the basic idea would be to replace each corrupt image file with the working file.
I do not write code, only the occasional windows batch file.
Should I use VB or PowerShell (or something else) for this? What will the script look like for this?
I apologize in advance if this question is too basic for Stack Overflow.
You don't really need a batch file; try looking at the for command, e.g.
FOR /R %f in (*.jpg) DO copy newfile.jpg "%f"
This should do a recursive search and copy newfile.jpg over the JPGs it finds.
It all boils down to how you are identifying the broken jpgs.
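For example, if you can get the broken names into a plain text file with one full path per line (broken_list.txt here is just a hypothetical name), you could drive the copy from that list instead of hitting every jpg. Untested sketch:
for /f "usebackq delims=" %f in ("broken_list.txt") do copy /y newfile.jpg "%f"
for /f reads the list line by line, delims= keeps paths containing spaces intact, and /y suppresses the overwrite prompt.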
When I don't use a wildcard, for example
FOR /R %f in (broken.jpg) DO copy newfile.jpg "%f"
Then newfile.jpg gets copied to every subdirectory. If I use a wildcard (* or ?), the command works as expected. Is there a way to have this command work with a (set) that does not contain wildcards?
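That is the expected behaviour: when the set contains no wildcard, FOR /R simply appends the name to every directory it walks without checking whether the file is actually there. A guard on the same command should work around it (untested):
FOR /R %f in (broken.jpg) DO if exist "%f" copy newfile.jpg "%f"
The if exist filters the enumeration down to the directories that really contain a broken.jpg.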
I'm doing some testing to ensure that the all-in-one zip file I created using a script produces the same output as the contents of a few zip files that I must manually click and create via a web interface. As a result, the zips have different folder structures.
Of course I can manually extract them and use my powerful eyeball technique to scan them, or, even lazier, I can write a script to do that, but before I invest more time and get accused by my boss of company time robbery, I'm asking if there's a better way to do this.
I'm using a Perl LAMP stack, by the way.
Thanks.
You can use Perl's Archive::Zip or Python's zipfile to extract the filenames, sizes, and CRC checksums of the files in the archives. Create a file which contains the results sorted by file name (ignore the path).
For your smaller ZIPs, merge the results of the script (cat list1 list2 list3 | sort).
Now, you can use diff to compare the results.
I can wholeheartedly recommend Beyond Compare. Unless you're really getting underpaid, it's the biggest bang for your (boss's) buck.
[Edit] I seem to have skimmed over the different folder structure, sorry about that. Beyond Compare can compare all files in folders with the same folder structure. It does not have (I believe) the intelligence to go searching for matches in files in different folders.
Regards,
Lieven
Create a CRC checksum for your files.
If the checksum is the same for the original files and the unzipped files, you can be sure the files are the same. And it even works for non-text data.
A checksum can easily be created with an external program such as "SFV Checker" or programmatically (.NET and Java, for example, include libraries to do this).
Taking a cue from Carra's answer... if A.zip is your single big archive and B.zip is the archive generated through the web, then use the following algorithm:
Extract all files from A.zip and recursively (with respect to folders) compute the checksum of each file in the folder where the contents were extracted (using cksum, md5sum, etc.); sort this information (pipe it through sort) and save it to a file, say A.txt.
Do the same for B.zip and generate B.txt.
Compare A.txt with B.txt; they should be exactly the same.
OR
Use unzip -l to get file/directory lists for both of the (zip) archives, then flatten the hierarchy of the user-generated zip file and compare it with the contents of your script-generated zip file using something like diff. By flattening the hierarchy I mean you may need to do some kind of pre-processing on one or both lists before you can do a meaningful comparison with diff.
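As a rough illustration of that flattening step, assuming Info-ZIP's unzip is installed (the file names here are untested placeholders):
unzip -Z1 A.zip | awk -F/ '$NF != "" {print $NF}' | sort > a.lst
unzip -Z1 B.zip | awk -F/ '$NF != "" {print $NF}' | sort > b.lst
diff a.lst b.lst
-Z1 prints just the entry names, one per line; the awk strips the directory part and drops pure directory entries, so only the bare, sorted file names reach diff.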