Shell Script to update the contents of a folder - perl

I'm a beginner in Unix shell scripting and Perl scripting.
I would like an example program that shows me how to update the contents of files in a directory.
The scenario: a directory contains some n files.
Among those n files, m have been modified.
I need to update the contents of those modified files in the directory.
Could you give me a simple shell script to do this?
Thanks and Regards,
Vijay

I would do it with find like this:
find your_directory -newermt time_of_last_check -exec modify_script.sh {} \;
where:
your_directory is the directory where you have the files.
time_of_last_check is the time you last ran this command, in a format GNU find's -newermt understands (e.g. '2024-01-01 12:00')
modify_script.sh is the program you will run to modify the files; it should take one argument, the name of the file to modify.

In Perl
To update a file's contents, see perlfaq5; it contains plenty of examples of file manipulation.
To get file or directory statistics, see Perl's built-in stat function.
To traverse a directory tree, see File::Find.
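Putting those pieces together, here is a minimal sketch that walks a directory, uses stat's mtime to find recently modified files, and edits them in place. The directory name, the cutoff, and the s/old/new/ substitution are all placeholder assumptions; swap in your real update step:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my $dir    = 'your_directory';       # assumption: the directory to scan
my $cutoff = time() - 24 * 60 * 60;  # assumption: "modified" = within the last day

find(sub {
    return unless -f $_;             # files only
    my $mtime = (stat($_))[9];       # element 9 of stat() is the modification time
    return unless $mtime > $cutoff;
    local @ARGV = ($_);              # feed this file to the <> operator
    local $^I   = '.bak';            # edit in place, keeping a .bak backup
    while (<>) {
        s/old/new/g;                 # placeholder edit
        print;
    }
}, $dir);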

Related

Git Bash find exec recursively on folders and files containing spaces

Question: In Git Bash on Windows, how would you run the following so that it also searches folders with spaces in their names, and executes on files with spaces in their names?
$ find ./ -type f -name '*.png' -exec sh -c 'cwebp -q 75 $1 -o "${1%.png}.webp"' _ {} \;
Context: I'm running Git Bash on Windows, trying to execute a command on all found .png files to convert them to .webp format. It works for all files without spaces in the path, but it fails to find files with spaces in the filename, or files within folders that have spaces in their names.
A few considerations:
I have many, many levels of folders to iterate through, and I can't run this command separately for each; I really need the recursion to work.
I cannot change the folder names; that would break other dependencies (nor did I create the folder or file names originally, so cut me some slack!).
I arrived here by following the suggestions from this article: https://www.smashingmagazine.com/2018/07/converting-images-to-webp/
The program, to my knowledge, doesn't ship with any built-in recursive command... golly, that'd be handy.
Any help you can provide will be appreciated. Thanks!
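No answer is included here, but for what it's worth, one space-safe approach is to sidestep the shell entirely. In Perl, File::Find hands each path over intact, and the list form of system() bypasses shell word-splitting, so spaces never get a chance to break anything. The cwebp flags mirror the question; everything else, including cwebp being on PATH, is an assumption:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

find(sub {
    return unless -f $_ && /\.png\z/i;    # only .png files
    (my $out = $_) =~ s/\.png\z/.webp/i;  # same name, .webp extension
    # list form of system(): no shell involved, so spaces in names are safe
    system('cwebp', '-q', '75', $_, '-o', $out) == 0
        or warn "cwebp failed for $File::Find::name\n";
}, '.');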

Copy multiple files to different directories in Makefile

I have a Makefile where I currently have two files that should be copied to different directories. Currently, I've tested
echo ${dirs} | xargs -n 1 cp ${sources}
I understand that this will not work, since it tries to copy both source files to one of the directories every time. But is there a way to run the copy command once per source file and its corresponding directory?
Best regards,
Simon
I think it is possible to deduce what you want from what you wrote, but as others pointed out, you should be more clear, so we don't have to spend time deducing it.
Anyway, since you want to not copy all files to all directories, you must somehow tell Make where you want to copy which files. The easiest way is to list the full paths of the copies you want in a variable such as $(COPIES), and not just ${dirs}. In this answer I am going to assume the destination directories already exist.
# Example (adjust to your tree): sources := src/foo.txt src/bar.txt
#                                COPIES  := dest1/foo.txt dest2/bar.txt
.PHONY: all
all: $(COPIES)

PERCENT := %
.SECONDEXPANSION:
# For each requested copy, pull in the source whose basename matches the target's
$(COPIES): %: $$(filter $$(PERCENT)/$$(notdir $$*), $(sources)) Makefile
	cp $< $@

Copy all contents of all files in a directory with a certain suffix

I have a bunch of directories named project1, project2, etc.
In those folders are a bunch of perl files (extension ".pl").
Basically, I want to just copy the contents of those .pl files into a new file, let's call it "everything.txt".
Can someone help me out with this? I really don't care which programming language it's done in, although I'd prefer something command-line. Perl, Python, and Java would work too.
Edit: Also, there are some duplicate names, which shouldn't be a problem given I just want to write their contents out to a file, but just thought I'd let you know.
In bash:
cat project*/*.pl > everything.txt
In Unix-y systems:
find project1 project2 ... -name \*.pl -exec cat {} \; > everything.txt
To make, say, a proper .tar archive file that will let you recover the original file names and permissions:
tar cf everything.txt.tar $(find project1 project2 ... -name \*.pl)
(The $(...) command-substitution syntax works in bash and other POSIX-style shells.)
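If the .pl files can sit more than one level deep, the glob in the first answer won't see them; a short Perl script with File::Find will. A sketch, assuming the project directories all match project* under the current directory:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

open my $out, '>', 'everything.txt' or die "everything.txt: $!";
find(sub {
    return unless -f $_ && /\.pl\z/;        # only .pl files
    open my $in, '<', $_ or die "$File::Find::name: $!";
    print {$out} do { local $/; <$in> };    # slurp and append the whole file
    close $in;
}, glob 'project*');
close $out;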

matlab, textfile

I have a bunch of text files which contain both strings and numbers, but the strings are only in the first few rows.
I'm trying to write a script that goes into my folder, finds every file there, deletes the text from each file, and writes the rest, as-is, to a new text file.
Does anybody know how?
I don't think this is a good use of MATLAB.
You'd be better off scripting this in Python or the shell. Here is one way to do it with tr in the shell, if you're on *nix or macOS and your files are all in the same directory and all have the .txt extension:
#!/bin/sh
for i in *.txt   # glob directly; parsing `ls` breaks on unusual filenames
do
    tr -d '[:alpha:]' < "$i" > "$i.tr.txt"   # strip all alphabetic characters
done
To run it, save the code above as a file, make it executable (chmod a+x filename), and run it in the directory with your text files.
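One caveat: tr -d '[:alpha:]' removes every letter, including the e in numbers like 1.5e3. If the real goal is just to drop the header rows, a Perl one-liner that skips any line containing a letter may be closer to what you want (the *.txt glob and the .orig backup suffix are assumptions):

# keep only lines with no alphabetic characters; originals saved as *.orig
perl -i.orig -ne 'print unless /[A-Za-z]/' *.txt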
If the number of header lines is always the same, you can use textread() with the 'headerlines' option to skip over them, then write the remaining data out.

How can I copy a directory but ignore some files in Perl?

In my Perl code, I need to copy a directory from one location to another on the same host excluding some files/patterns (e.g. *.log, ./myDir/abc.cl).
What would be the optimum way of doing this in Perl across all the platforms?
On Windows, xcopy is one such solution. On Unix platforms, is there a way to do this in Perl?
I think you're looking for rsync. It's not Perl, but it's going to work a lot better than anything you make in Perl:
% rsync --exclude='*.log' --exclude='./myDir/abc.cl' SOURCE DEST
If you have a bunch of patterns, you can put those all in a file:
*.log
./myDir/abc.cl
and tell rsync to ignore all of the patterns listed in that file:
% rsync --exclude-from=do_not_sync.txt SOURCE DEST
If it has to be Perl, I'd use File::Find and step over each file, but before calling File::Copy's copy() on each one, first test whether it matches one of the exclude patterns, and skip it (next) if it does.
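A minimal sketch of that approach, using only core modules. SOURCE, DEST, and the two exclude patterns are placeholders taken from the question:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Copy qw(copy);
use File::Path qw(make_path);
use File::Basename qw(dirname);
use File::Spec;

my $src = 'SOURCE';                                 # assumption: source tree
my $dst = 'DEST';                                   # assumption: destination tree
my @exclude = (qr/\.log\z/, qr{^myDir/abc\.cl\z});  # patterns from the question

find({ no_chdir => 1, wanted => sub {
    return unless -f $File::Find::name;
    my $rel = File::Spec->abs2rel($File::Find::name, $src);
    return if grep { $rel =~ $_ } @exclude;         # skip excluded files
    my $target = "$dst/$rel";
    make_path(dirname($target));                    # create destination subdirs
    copy($File::Find::name, $target)
        or warn "copy $rel failed: $!";
}}, $src);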
On *nix, you can also use the native tar command with its --exclude option. After creating the tar file, bring it over to your destination and untar it there.