* not expanding in quotes - sh

I wrote a small shell script to package a Java project I'm working on into a jar file and copy it to my desktop and my flash drive, if plugged in. It looks like this:
#!/bin/sh
NAME="ExeFinder.jar"
SRC_DIR="$HOME/Documents/Programs/Java/NetBeansProjects/ExeFinder/src"
javac $SRC_DIR/*.java # compile them in place
cp $SRC_DIR/*.class . # copy them here
rm $SRC_DIR/*.class # remove the ones there
mkdir src # make the dir for the source code
cp $SRC_DIR/*.java src # copy the source files to src
jar cfe $NAME ExeFinder ./*.class src # create the jar file
rm -r src # remove the temporary src; already in the jar
rm ./*.class # ditto for the class files
if [ -d /media/moses/Moses\ 8GB ]; then # if my flash drive is inserted
cp ExeFinder.jar /media/moses/Moses\ 8GB # copy the jar file to it
fi
cp ExeFinder.jar ~/Desktop # copy it to my desktop
rm ExeFinder.jar # remove the old file
My problem is: When I run shellcheck on the file, it says "Double quote to prevent globbing and word splitting", and highlights a few lines with * in them. When I surround these with double quotes, the * doesn't expand properly. Although it isn't necessary to surround these with quotes, I want to know if, and if so how, I can expand * in quotes.
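For what it's worth, shellcheck's suggestion can be satisfied by quoting only the variable portion and leaving the glob outside the quotes, so the `*` still expands. A minimal sketch (the demo path is a throwaway placeholder, not from the script above):

```shell
# Quote the variable part, leave the glob outside the quotes:
SRC_DIR="/tmp/glob-demo"
mkdir -p "$SRC_DIR"
touch "$SRC_DIR/a.java" "$SRC_DIR/b.java"

# "$SRC_DIR"/*.java: the quoted part is protected from word splitting,
# while the unquoted *.java is still expanded by the shell.
ls "$SRC_DIR"/*.java
```

So in the script above, lines like `cp $SRC_DIR/*.java src` would become `cp "$SRC_DIR"/*.java src`, which silences shellcheck without breaking the glob.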

Related

access failed error - no such file while trying to move files

I am trying to move all the *.csv files to another folder on the server, but every time I get an "access failed" error. I am able to get all the files to the local server using mget, but mv fails every time. I can see the files on the server and have full permissions on them; the sh script is not working with wildcard characters. I'm stuck here with this simple command.
Download to local directory
localDir="/home/toor/UCDownloads/"
[ ! -d $localDir ] && mkdir -p $localDir
#sftp in the file directory to be downloaded
remoteDir="/share/CACHEDEV1_DATA/Lanein1/Unicard/"
#The file to be downloaded is fileName
lftp -u ${sftp_user},${password} sftp://${host}:${port}<<EOF
PS4='$LINENO: '
set xfer:log true
set xfer:log-file "$logfileUCARC"
set xfer:clobber true
set xfer:auto-rename true
debug 9
cd ${remoteDir}
lcd ${localDir}
#mget *.CSV
ls -l
mv "/share/CACHEDEV1_DATA/Lanein1/Unicard/"*.csv "/share/CACHEDEV1_DATA/Lanein1/Unicard/Archives/"
#rm /share/CACHEDEV1_DATA/Lanein1/Unicard/!(*.pdf)
bye
EOF
This is not a shell or Bash problem. It is an LFTP problem.
From the manual of LFTP:
mv file1 file2
Rename file1 to file2. No wildcard expansion is performed.
LFTP just does not support what you are asking for. It will treat *.csv as part of the file name.
See here for an alternative.
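One such alternative is lftp's own glob command, which expands wildcards before running the command given to it. A sketch (untested here): replace the mv line inside the heredoc with

```
glob mv *.csv Archives/
```

where Archives/ is resolved relative to the remote working directory set by the earlier cd.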

Talend multiple build jobs

We are using the open source Talend Studio and we have more than 50 jobs.
Each build generates a zip file containing all of its artifacts (.bat, .sh, context and jar files).
Is there a way to generate multiple builds from the studio or the command line (Talend open source tool)?
In the "Build Job" window, there is a double arrow on the left.
Click on it and you get the job tree; select all jobs (or just the ones you want), and you will get a single zip file containing all your jobs, each one in a separate folder.
Not an ideal solution but you can use a small script to split the whole zip into separate job zips:
#!/bin/bash
ZIP=test.zip # path to your all-in-one zip file
ROOT=$(basename "$ZIP" .zip)
DEST=./dest
rm -rf "$DEST" # be careful with this one!
mkdir -p "$DEST"
unzip "$ZIP"
find "$ROOT" -mindepth 1 -maxdepth 1 -type d ! -name lib | while read -r JOBPATH
do
  JOB=$(basename "$JOBPATH")
  echo "job: $JOB"
  DJOB="$DEST/$JOB"
  mkdir -p "$DJOB"
  cp -R "$JOBPATH" "$DJOB/$JOB"
  cp "$ROOT/jobInfo.properties" "$DJOB" # here you should replace job=<proper job name> and jobId, but not sure you really need it
  mkdir -p "$DJOB/lib"
  RUNFILE="${JOBPATH}/${JOB}_run.sh"
  LIBS=$(grep "^java" "$RUNFILE" | cut -d' ' -f 5)
  IFS=':' read -ra ALIB <<< "$LIBS"
  for LIB in "${ALIB[@]}"; do
    if [ "$LIB" = "." ] || [ "$LIB" = "\$ROOT_PATH" ]; then continue; fi
    echo "$LIB"
  done | grep "\$ROOT_PATH/../lib" | cut -b 19- | while read -r DEP
  do
    cp "$ROOT/lib/$DEP" "$DJOB/lib/"
  done
  (cd "$DJOB" && zip -r -m "../$JOB.zip" .)
  rmdir "$DJOB"
done

Shell Script (Linux): Copy and overwrite only files that have changed in destination?

I want to do the following in Shell Script:
I have two directories and I want to copy over, from the source directory, only those files that have changed or are missing in the destination directory.
Note 0 (in case the above isn't clear): I want the destination directory to end up matching the source directory. Changed files in the destination should be restored to their original state from the source directory; unchanged files should not be touched (to save time).
Note 1: I looked at rsync but I cannot figure out if it can do what I want. There is only -u, but that doesn't seem to do what I want.
EDIT: This is my script: (fixed)
#!/bin/sh
################################################################################
# Copies resources to build output directory (Ubuntu)
################################################################################
directoryWtSource="/usr/share/Wt"
directoryBuildOutput="bin__output__"
# Get the absolute path to the script
scriptPath=$(readlink -e "$0")
directoryDestination=$(dirname "$scriptPath")
################################################################################
# Start
################################################################################
cd ..
directoryWtSourceResources="$directoryWtSource/resources"
directoryBuildOutputResources="${PWD}/$directoryBuildOutput"
if [ ! -d "${PWD}/$directoryBuildOutput" ]
then
mkdir "${PWD}/$directoryBuildOutput"
fi
#echo $directoryWtSourceResources
#echo $directoryBuildOutputResources
# BEGIN - REMOVE THIS: No need for cp
#if [ ! -d $directoryBuildOutputResources ]
#then
# echo " --> No destination directory found: copying resources..."
# cp -Rp $directoryWtSourceResources $directoryBuildOutputResources
#else
# echo " --> Destination directory found: won't copy."
#fi
# END - REMOVE THIS
rsync -arv "$directoryWtSourceResources" "$directoryBuildOutputResources"
################################################################################
# End
################################################################################
rsync can do this:
rsync -av source/ destination/
By default rsync copies only files whose size or modification time differ, and this command also prints out the list of files it transferred.

how to print the progress of the files being copied in bash [duplicate]

I suppose I could compare the number of files in the source directory to the number of files in the target directory as cp progresses, or perhaps do it with folder size instead? I tried to find examples, but all bash progress bars seem to be written for copying single files. I want to copy a bunch of files (or a directory, if the former is not possible).
You can also use rsync instead of cp like this:
rsync -Pa source destination
This will give you a progress bar and an estimated time of completion. Very handy.
To show a progress bar while doing a recursive copy of files & folders & subfolders (including links and file attributes), you can use gcp (easily installed in Ubuntu and Debian by running "sudo apt-get install gcp"):
gcp -rf SRC DEST
Here is the typical output while copying a large folder of files:
Copying 1.33 GiB 73% |##################### | 230.19 M/s ETA: 00:00:07
Notice that it shows just one progress bar for the whole operation, whereas if you want a single progress bar per file, you can use rsync:
rsync -ah --progress SRC DEST
You may have a look at the tool vcp. That's a simple copy tool with two progress bars: one for the current file, and one for overall progress.
EDIT
Here is the link to the sources: http://members.iinet.net.au/~lynx/vcp/
Manpage can be found here: http://linux.die.net/man/1/vcp
Most distributions have a package for it.
Here another solution: Use the tool bar
You could invoke it like this:
#!/bin/bash
filesize=$(du -sb "${1}" | awk '{ print $1 }')
tar -cf - -C "${1}" ./ | bar --size "${filesize}" | tar -xf - -C "${2}"
This routes the data through tar, and the estimate will be inaccurate for small files. You must also make sure that the target directory exists. But it is one way to do it.
My preferred option is Advanced Copy, as it uses the original cp source files.
$ wget http://ftp.gnu.org/gnu/coreutils/coreutils-8.32.tar.xz
$ tar xvJf coreutils-8.32.tar.xz
$ cd coreutils-8.32/
$ wget --no-check-certificate https://raw.githubusercontent.com/jarun/advcpmv/master/advcpmv-0.8-8.32.patch
$ patch -p1 -i advcpmv-0.8-8.32.patch
$ ./configure
$ make
The new programs are now located in src/cp and src/mv. You may choose to replace your existing commands:
$ sudo cp src/cp /usr/local/bin/cp
$ sudo cp src/mv /usr/local/bin/mv
Then you can use cp as usual, or specify -g to show the progress bar:
$ cp -g src dest
A simple Unix way is to go to the destination directory and do watch -n 5 du -s . Perhaps make it prettier by rendering it as a bar. This can help in environments where you have just the standard Unix utilities and no scope for installing additional tools. du -s is the key; watch just reruns it every 5 seconds.
Pros: works on any Unix system. Cons: no progress bar.
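The same idea can be turned into a rough percentage readout. A sketch, assuming GNU du (for the -b apparent-size flag); the paths are throwaway placeholders:

```shell
# Throwaway demo directories; replace with your real SRC and DEST:
mkdir -p /tmp/du-demo/src /tmp/du-demo/dest
head -c 1000 /dev/zero > /tmp/du-demo/src/file.bin
head -c 500  /dev/zero > /tmp/du-demo/dest/file.bin  # partially copied so far

total=$(du -sb /tmp/du-demo/src | cut -f1)    # bytes expected in total
copied=$(du -sb /tmp/du-demo/dest | cut -f1)  # bytes present so far
echo "$(( 100 * copied / total ))% copied"
```

Wrap the last three lines in `watch -n 5` for a continuously updating readout. The figure is approximate, since du also counts directory entries.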
To add another option, you can use cpv. It uses pv to imitate the usage of cp.
It works like pv but you can use it to recursively copy directories
You can get it here
There's a tool pv to do this exact thing: http://www.ivarch.com/programs/pv.shtml
There's an Ubuntu version in apt.
How about something like
find . -type f | pv -s $(find . -type f | wc -c) | xargs -i cp {} --parents /DEST/$(dirname {})
It finds all the files in the current directory, pipes the list through pv (giving pv an estimated size so the progress meter works), and then pipes it to a cp command with the --parents flag so the destination path matches the source path.
One problem I have yet to overcome: if you issue this command
find /home/user/test -type f | pv -s $(find . -type f | wc -c) | xargs -i cp {} --parents /www/test/$(dirname {})
the destination path becomes /www/test/home/user/test/....FILES... and I am unsure how to tell the command to drop the '/home/user/test' part. That's why I have to run it from inside the source directory.
Check the source code for progress_bar in the below git repository of mine
https://github.com/Kiran-Bose/supreme
Also try the custom bash script package supreme to verify how the progress bar works with the cp and mv commands.
Functionality overview
(1)Open Apps
----Firefox
----Calculator
----Settings
(2)Manage Files
----Search
----Navigate
----Quick access
|----Select File(s)
|----Inverse Selection
|----Make directory
|----Make file
|----Open
|----Copy
|----Move
|----Delete
|----Rename
|----Send to Device
|----Properties
(3)Manage Phone
----Move/Copy from phone
----Move/Copy to phone
----Sync folders
(4)Manage USB
----Move/Copy from USB
----Move/Copy to USB
There is the command progress, https://github.com/Xfennec/progress, a coreutils progress viewer.
Just run progress in another terminal to see the copy/move progress. For continuous monitoring use -M flag.

how to create zip file without including all of the folders the files came from

How can I create a zip file without including all of the folders the files came from?
So I have
file1.xml,
file2.xml,
file3.xml,
...
and these are all in folder desktop/data/xmlFiles/
When I zip using following command:
zip -9 -m -q C:\Users\Desktop\data\xmlFiles\XML.zip C:\Users\Desktop\data\xmlFiles\*.xml
This stores the entire path into the XML.zip file.
How do I just include my file1.xml, file2.xml, file3.xml ... inside of my XML.zip?
How about:
cd C:\Users\Desktop\data\xmlFiles
zip -9 -m -q XML.zip *.xml
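Alternatively, zip's -j (junk paths) flag stores just the file names with no directory part, so no cd is needed. A sketch using a throwaway Unix-style path (the flag works the same in the Windows build):

```shell
# Create some demo files in a nested directory:
mkdir -p /tmp/zipdemo/data/xmlFiles
echo '<a/>' > /tmp/zipdemo/data/xmlFiles/file1.xml
echo '<b/>' > /tmp/zipdemo/data/xmlFiles/file2.xml

# -j strips the directory components, storing only file1.xml, file2.xml:
zip -9 -q -j /tmp/zipdemo/XML.zip /tmp/zipdemo/data/xmlFiles/*.xml

# The listing shows bare file names, no data/xmlFiles/ prefix:
unzip -l /tmp/zipdemo/XML.zip
```

(The -m flag from the original command, which deletes the input files after zipping, was left out of this sketch.)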