diff of two files from two different directories - perl

I have two or more directories, each containing a number of files.
I want to diff pairs of files that have the same name (which is guaranteed) but live in different directories. Please help me with how I can do this in Perl scripting. Thanks.

You don't need Perl to accomplish it.
Try:
diff -r -N folder1/ folder2/
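Here -r recurses into subdirectories and -N treats a file that exists in only one of the directories as empty, so additions and deletions show up in the output too. If you would rather drive the comparison yourself and only diff files that exist in both directories, a minimal shell sketch (the directory names are placeholders):
#!/bin/sh
# Diff same-named files that exist in both directories.
for f in folder1/*; do
    name=${f#folder1/}
    [ -f "folder2/$name" ] && diff -u "$f" "folder2/$name"
done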

Related

I want to rename multiple files in a directory in Linux based on the delimiter

I need help renaming multiple files in a directory based on a delimiter.
Sample:
From
R01235-XYZ-TRAIL.PDF
TO
R01235-TRAIL.PDF
and
From
XYZ-C12345-TRAIL.PDF
TO
C12345-TRAIL.PDF
Is it possible to delete based on the - delimiter?
I am not specifically removing XYZ, but rather removing anything before the first - or the middle occurrence between two -'s. XYZ is just a representation of the characters in that field.
Thanks!
I tried sed, ls, mv, and also rename, but I couldn't get them to work.
This might work for you:
rename -n 's/XYZ-//' file
This removes XYZ- from the file name.
If this meets your requirements, remove the -n option for the renaming to take place.
In retrospect, perhaps:
rename -n 's/([A-Z][0-9]{5}-).*-/$1/;s/^.*-([A-Z][0-9]{5}-)/$1/' file
With sed, turning a list of the file names (one per line) into mv commands:
sed -nE 's/^([A-Z][0-9]{5}-).*-(.*)$|^.*-([A-Z][0-9]{5}-.*)$/mv & \1\2\3/p' file
Each alternative now matches the whole line, so & expands to the full original name; -n with the p flag keeps non-matching lines out of the output. Check the results and then:
sed -nE 's/^([A-Z][0-9]{5}-).*-(.*)$|^.*-([A-Z][0-9]{5}-.*)$/mv & \1\2\3/p' file | sh
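If the Perl rename utility isn't available, the same two substitutions can be applied in a plain shell loop (a sketch assuming the PDFs are in the current directory; echo the mv first to test):
for f in *.PDF; do
    # Compute the new name with the same substitutions as above.
    new=$(printf '%s\n' "$f" | sed -E 's/^([A-Z][0-9]{5}-).*-/\1/; s/^.*-([A-Z][0-9]{5}-)/\1/')
    [ "$new" != "$f" ] && mv -- "$f" "$new"
done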

Git Bash find exec recursively on folders and files containing spaces

Question: In Git Bash on Windows, how would you run the following so that it also searches folders with spaces in the name and executes on files with spaces in the name?
$ find ./ -type f -name '*.png' -exec sh -c 'cwebp -q 75 $1 -o "${1%.png}.webp"' _ {} \;
Context: I'm running Git Bash on Windows, trying to execute a command on all found .png files to convert them to .webp format. It works for all files without spaces in the path, but it fails to find files with spaces in the filename or files within folders that have spaces in the folder name. A few considerations:
I have many, many levels of folders to iterate through, and I can't run this command separately for each. I really need the recursion to work.
I cannot change the folder names; that would break other dependencies (nor did I create the folder or file names originally, so cut me some slack!).
I arrived here by following the suggestions in this article: https://www.smashingmagazine.com/2018/07/converting-images-to-webp/
The program, to my knowledge, doesn't ship with any built-in recursive command... golly, that'd be handy.
Any help you can provide will be appreciated. Thanks!
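A likely fix, offered as a sketch: find already passes paths with spaces through intact; it is the unquoted $1 inside the sh -c body that the inner shell splits on whitespace. Quoting it should be enough:
find ./ -type f -name '*.png' -exec sh -c 'cwebp -q 75 "$1" -o "${1%.png}.webp"' _ {} \;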

How do I diff only certain files?

I have a list of files (a subset of the files in a directory) and I want to generate a patch that includes only the differences in those files.
From the diff manual, it looks like I can exclude (-x), but would need to specify that for every file that I don't want to include, which seems cumbersome and difficult to script cleanly.
Is there a way to just give diff a list of files? I've already isolated the files with the changes into a separate directory, and I also have a file with the list of filenames, so I can present it to diff in whichever way works best.
What I've tried:
cd filestodiff/
for i in *; do diff /fileswithchanges/$i /fileswithoutchanges/$i >> mypatch.diff; done
However, patch doesn't see this as valid input because there's no filename header included.
patchutils provides filterdiff that can do this:
diff -ur old/ new/ | filterdiff -I filelist > patchfile
It is packaged for several Linux distributions.
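Alternatively, without patchutils, a plain loop works as long as each file's diff carries the unified ---/+++ headers, which diff -u provides (a sketch; the directory names follow the question, and filelist is assumed to hold one file name per line):
# Build one patch with a proper header per file (old tree first, new tree second).
while IFS= read -r f; do
    diff -u "/fileswithoutchanges/$f" "/fileswithchanges/$f"
done < filelist > mypatch.diff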

Copy multiple files to different directories in Makefile

I have a Makefile with two files that should be copied to different directories. So far I've tested
echo ${dirs} | xargs -n 1 cp ${sources}
So I understand that this will not work, since it will try to copy both source files to each directory in turn. But is there a way to execute a copy command for each source file and its own destination directory?
Best regards,
Simon
I think it is possible to deduce what you want from what you wrote, but as others have pointed out, you should be clearer, so we don't have to spend time deducing it.
Anyway, since you want to not copy all files to all directories, you must somehow tell Make where you want to copy which files. The easiest way is to list the full paths of the copies you want in a variable such as $(COPIES), and not just ${dirs}. In this answer I am going to assume the destination directories already exist.
.PHONY: all
all: $(COPIES)
PERCENT := %
.SECONDEXPANSION:
$(COPIES): %: $$(filter $$(PERCENT)/$$(notdir $$*), $(sources)) Makefile
	cp $< $@
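For illustration, hypothetical values (the names here are made up) might look like:
sources := src/foo.txt lib/bar.txt
COPIES  := build/a/foo.txt build/b/bar.txt
At second expansion, each copy target then picks out the one source whose file name matches its own. Note that the $(PERCENT)/ prefix in the filter means every source must live inside some directory; a bare file name in the top level would not match.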

Creating a script that compares multiple files in multiple servers

I have several different Linux servers, all of which are essentially mirrors of each other. However, some of them have gone out of sync (file A on machine 1 differs from file A on machine 2).
I'm in the process of designing a script (shell or Perl only) that will systematically walk through certain directories and diff the corresponding files in the different machines against each other, and generate a meaningful report. Later on, I will try to sync up the files.
These are my thoughts so far on how to approach this:
sftp files to /tmp and diff locally
using ssh and diff
using rsync
My question is: what is the best way to systematically compare two files that are in different machines (but similar directory structure), and are there any built-in Perl utilities that may be helpful?
rsync will figure out the differences and sync your files by sending only the delta. Once two folders are in sync, subsequent runs are pretty quick (but the first sync will take some time).
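If you only want the report first, rsync's dry-run mode will itemize differences without transferring anything; a sketch with a placeholder host and path:
# -n: dry run, -c: compare by checksum rather than size/mtime, -i: itemize differences.
rsync -avnci /data/mirror/ user@machine2:/data/mirror/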
You can also use git here. One possible workflow: just check in all files you want to compare (or complete directories, using git add -A). Then create an empty git repository on your local workstation, which is used to fetch all the other repositories and to do the comparisons:
git init
git remote add firstmachine ssh://user@firstmachine/path/to/directory
git remote add othermachine ssh://user@othermachine/path/to/directory
git fetch --all
Now the contents of two machines may be compared:
git diff remotes/firstmachine/master remotes/othermachine/master
Or just compare the contents of a specific file:
git diff remotes/firstmachine/master remotes/othermachine/master -- file/to/compare
It's not strictly necessary to use a third machine for the comparisons. You can also git-fetch the contents from othermachine to firstmachine.
I worked on a similar tool (in Python). It ran a cron job at a given time of night that brought the tar-bzipped files to one server, extracted the directories, and ran a recursive diff on them. The diff output was then run through some Python scripts that analysed the diff hunks (+ lines, ! lines, etc.) to gauge the amount of change.
I'm not sure whether there are pre-built modules for this in Perl or Python, but some helper utilities are likely available in one of them.
If you need to know the difference between some local and remote file systems, the following method minimizes the network load:
make a local copy ($C) of the local directory ($D) you want to compare. I.e.:
cp -R $D $C
use rsync to copy the remote directory ($R) you want to compare over $C:
rsync -av --delete $remote_host:$R/ $C
compare $D to $C:
diff -ru $D $C
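Put together, a sketch ($D, $R, and $remote_host are placeholders you would set yourself):
#!/bin/sh
C=$(mktemp -d)                              # scratch copy
cp -R "$D/." "$C"                           # seed it with the local contents
rsync -av --delete "$remote_host:$R/" "$C"  # overlay the remote contents cheaply
diff -ru "$D" "$C"                          # report local vs. remote differences
rm -rf "$C"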