Talend multiple build jobs

We are using the open source Talend Studio and we have more than 50 jobs.
Each build generates a zip file containing all of the job's artifacts (.bat, .sh, context, jar files).
Is there a way to generate multiple builds from the studio or the command line (Talend open source tool)?

In the "build job" window, there is a double arrow in the left,
Click on it, and you get the job tree, select all jobs or what you want, and you will get a single zip file containing all your jobs each one in a separate folder.

Not an ideal solution, but you can use a small script to split the all-in-one zip into separate per-job zips:
ZIP=test.zip # path to your all-in-one zip file
ROOT=$(basename "$ZIP" .zip)
DEST=./dest
rm -rf "$DEST" # be careful with this one!
mkdir -p "$DEST"
unzip "$ZIP"
# every top-level directory except lib/ is one job
find "$ROOT" -mindepth 1 -maxdepth 1 -type d ! -name lib | while read -r JOBPATH
do
  JOB=$(basename "$JOBPATH")
  echo "job: $JOB"
  DJOB="$DEST/$JOB"
  mkdir -p "$DJOB"
  cp -R "$JOBPATH" "$DJOB/$JOB"
  cp "$ROOT/jobInfo.properties" "$DJOB" # here you should replace job=<proper job name> and jobId, but not sure you really need it
  mkdir -p "$DJOB/lib"
  # pick the jars off the classpath in the job's generated run script
  RUNFILE="${JOBPATH}/${JOB}_run.sh"
  LIBS=$(grep "^java" "$RUNFILE" | cut -d' ' -f 5)
  IFS=':' read -ra ALIB <<< "$LIBS"
  for LIB in "${ALIB[@]}"; do
    if [ "$LIB" = "." -o "$LIB" = "\$ROOT_PATH" ]; then continue; fi
    echo "$LIB"
  done | grep "\$ROOT_PATH/../lib" | cut -b 19- | while read -r DEP
  do
    cp "$ROOT/lib/$DEP" "$DJOB/lib/"
  done
  (cd "$DJOB" ; zip -r -m "../$JOB.zip" .)
  rmdir "$DJOB"
done
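For reference, a hypothetical invocation: assuming the script above is saved as split_jobs.sh next to the exported all-in-one archive, and the ZIP/DEST variables at the top have been adjusted, run:
bash split_jobs.sh
ls dest/ # one <job>.zip per job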

Related

access failed error - no such file while trying to move files

I am trying to move all the *.csv files to another folder on the server, but every time I get an "access failed" error. I am able to get all the files to the local server using mget, but mv fails every time. I can see the files on the server and have full permissions on them; the sh script is not working with wildcard characters. I'm stuck here with this simple command.
#Download to local directory
localDir="/home/toor/UCDownloads/"
[ ! -d $localDir ] && mkdir -p $localDir
#sftp in the file directory to be downloaded
remoteDir="/share/CACHEDEV1_DATA/Lanein1/Unicard/"
#The file to be downloaded is fileName
lftp -u ${sftp_user},${password} sftp://${host}:${port}<<EOF
PS4='$LINENO: '
set xfer:log true
set xfer:log-file "$logfileUCARC"
set xfer:clobber true
set xfer:auto-rename true
debug 9
cd ${remoteDir}
lcd ${localDir}
#mget *.CSV
ls -l
mv "/share/CACHEDEV1_DATA/Lanein1/Unicard/"*.csv "/share/CACHEDEV1_DATA/Lanein1/Unicard/Archives/"
#rm /share/CACHEDEV1_DATA/Lanein1/Unicard/!(*.pdf)
bye
EOF
This is not a shell or Bash problem. It is an LFTP problem.
From the manual of LFTP:
mv file1 file2
Rename file1 to file2. No wildcard expansion is performed.
LFTP simply does not support what you are asking for. It will treat *.csv as part of the file name.
See here for an alternative.
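For the script in the question, one workaround is to expand the wildcard yourself: list the matching remote files first, then issue one mv per file. A sketch, untested, reusing the variables from the script above (and assuming file names without spaces):
# cls does glob expansion; -1 prints one name per line
files=$(lftp -u "${sftp_user},${password}" "sftp://${host}:${port}" \
  -e "cls -1 ${remoteDir}*.csv; bye")
# build one mv command per file and run them all in a single session
cmds=""
for f in $files; do
  cmds+="mv \"$f\" \"${remoteDir}Archives/${f##*/}\"; "
done
lftp -u "${sftp_user},${password}" "sftp://${host}:${port}" -e "${cmds}bye"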

Move files and copy subtree

source="/somedir/dir-a"
dest="/somedir2/dir-z"
I need to find all files recursively within the $source directory which contain the string 720p and move them to $dest
Just two things to take care of:
For each file to be moved, first create that file's outer two directories in $dest and then move the matched file inside them.
I have to do this for hundreds of thousands of files, so a bit of parallelization would be helpful.
Example
For a file like - "$source/dir-b/dir-c/file-720p.mp4" , it should do as follows :
mkdir -p "$dest/dir-b/dir-c"
mv "$source/dir-b/dir-c/file-720p.mp4" "$dest/dir-b/dir-c/file-720p.mp4"
You're looking for something like this:
src=foo
dst=bar
export dst
find "${src}" -name '*720p*' -type f -exec sh -c '
for p; do
np=${dst}${p#"${p%/*/*/*}"}
echo mkdir -p "${np%/*}" &&
echo mv "$p" "$np"
done' sh {} +
This can be parallelized using GNU find's -print0 primary in conjunction with GNU xargs, but I don't think that'd make much of a difference performance-wise, as this is rather an IO-intensive task.
Remove the echos once the output looks right.
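If you do want to try the parallel route, here is a sketch; it assumes GNU find and GNU xargs, and the -P 4 worker count is an arbitrary choice:
export dst
find "${src}" -name '*720p*' -type f -print0 |
xargs -0 -n 100 -P 4 sh -c '
  for p; do
    np=${dst}${p#"${p%/*/*/*}"}
    mkdir -p "${np%/*}" && mv "$p" "$np"
  done' sh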

mv: "Directory not Empty" - how do you merge directories with `mv`?

I tried to deploy my personal blog website to my remote server recently. When I tried to move a few files and directories to another place by executing mv, some unexpected errors happened: the command line echoed "Directory not Empty". After doing some googling, I tried again with the '-f' and '-v' switches; the same result showed.
I logged in on the root account, and the session is here:
root@danielpan:~# shopt -s dotglob
root@danielpan:~# mv /var/www/html/wordpress/* /var/www/html
mv: cannot move `/var/www/html/wordpress/wp-content` to `/var/www/html/wp-content`:
Directory not empty
root@danielpan:~# mv -f /var/www/html/wordpress/* /var/www/html
mv: cannot move `/var/www/html/wordpress/wp-content` to `/var/www/html/wp-content`:
Directory not empty
Anybody know why?
(I'm running Ubuntu 14.04)
If you have sub-directories and mv is not working:
cp -R source/* destination/
rm -R source/
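A gentler variant of the same idea (a sketch; it assumes rsync is installed): the trailing slash on the source makes rsync merge the directory's contents into the destination, and only then is the source removed.
rsync -a source/ destination/
rm -r source/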
I found the solution finally. Because /var/www/html/wp-content already exists, moving /var/www/html/wordpress/wp-content there fails with Directory not Empty. So you need to move /var/www/html/wordpress/wp-content/* into /var/www/html/wp-content instead.
Just execute this:
mv /var/www/html/wordpress/wp-content/* /var/www/html/wp-content
rmdir /var/www/html/wordpress/wp-content
rmdir /var/www/html/wordpress
Instead of copying directories with cp or rsync, I prefer
cd "${source_path}"
find . -type d -exec mkdir -p "${destination_path}/{}" \;
find . -type f -exec mv {} "${destination_path}/{}" \;
cd -
This moves the files (actually renames them) and overwrites existing ones, so it's fast enough.
But since ${source_path} still contains the now-empty subfolders afterwards, you can clean up with rm -rf ${source_path}

how to create zip file without including all of the folders the files came from

how can I create a zip file without including all of the folders the files came from?
So I have
file1.xml,
file2.xml,
file3.xml,
...
and these are all in the folder desktop/data/xmlFiles/.
When I zip using following command:
zip -9 -m -q C:\Users\Desktop\data\xmlFiles\XML.zip C:\Users\Desktop\data\xmlFiles\*.xml
This stores the entire path into the XML.zip file.
How do I just include my file1.xml, file2.xml, file3.xml ... inside of my XML.zip?
How about:
cd C:\Users\Desktop\data\xmlFiles
zip -9 -m -q XML.zip *.xml
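Alternatively, if you would rather not change directories, zip's -j (--junk-paths) option stores just the file names instead of the full paths; a sketch with the paths from the question:
zip -9 -m -q -j C:\Users\Desktop\data\xmlFiles\XML.zip C:\Users\Desktop\data\xmlFiles\*.xml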

How to compare the content of a tarball with a folder

How can I compare a tar file (already compressed) of the original folder with the original folder?
First I created archive file using
tar -kzcvf directory_name.zip directory_name
Then I tried to compare using
tar -diff -vf directory_name.zip directory_name
But it didn't work.
--compare (-d) is handier for that.
tar --compare --file=archive-file.tar
works if archive-file.tar is in the directory it was created in. To compare archive-file.tar against a remote target (e.g. if you have moved archive-file.tar to /some/where/), use the -C parameter:
tar --compare --file=archive-file.tar -C /some/where/
If you want to see tar working, use -v; without -v only errors (missing files/folders) are reported.
Tip: this works with compressed tar.bz/tar.gz archives, too.
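Applied to the archive from the question (a gzip tar despite its .zip name), that would be something like this, run from the directory where the archive was created:
tar --compare -z --verbose --file=directory_name.zip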
It should be --diff
Try this (without the last directory_name):
tar --diff -vf directory_name.zip
The problem is that --diff only looks for differences in files that exist in both the tar file and the folder. So if a new file is added to the folder, --diff does not report it.
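To catch such added files too, one option (a sketch; it assumes GNU tools and that the archive was created from the parent directory, as in the question) is to compare sorted name lists:
# prints files present in the folder but missing from the archive
comm -13 <(tar tf directory_name.zip | sort) \
         <(find directory_name -type f | sort)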
The method of pix is very slow for large compressed tar files, because it extracts each file individually. I use the tar --diff method to look for files with a different modification time, and extract and diff only those. The files are extracted into a folder base.orig, where base is either the top-level folder of the tar file or the given comparison folder. This produces diffs that include the date of the original file.
Here is the script:
#!/bin/bash
set -o nounset

# Print usage
if [ "$#" -lt 1 ] ; then
  echo 'Diff a tar (or compressed tar) file with a folder'
  echo 'difftar-folder.sh <tarfile> [<folder>] [strip]'
  echo 'default for folder is .'
  echo 'default for strip is 0.'
  echo 'strip must be 0 or 1.'
  exit 1
fi

# Parse parameters
tarfile=$1
if [ "$#" -ge 2 ] ; then
  folder=$2
else
  folder=.
fi
if [ "$#" -ge 3 ] ; then
  strip=$3
else
  strip=0
fi

# Get path prefix if --strip is used
if [ "$strip" -gt 0 ] ; then
  prefix=$(tar -t -f "$tarfile" | head -1)
else
  prefix=
fi

# Original folder
if [ "$strip" -gt 0 ] ; then
  orig=${prefix%/}.orig
elif [ "$folder" = "." ] ; then
  orig=${tarfile##*/}
  orig=./${orig%%.tar*}.orig
elif [ "$folder" = "" ] ; then
  orig=${tarfile##*/}
  orig=${orig%%.tar*}.orig
else
  orig=$folder.orig
fi
echo "$orig"
mkdir -p "$orig"

# Make sure tar uses English output (for "Mod time differs")
export LC_ALL=C

# Search all files with a deviating modification time using tar --diff
tar --diff -a -f "$tarfile" --strip $strip --directory "$folder" | grep "Mod time differs" | while read -r file ; do
  # Substitute ': Mod time differs' with nothing
  file=${file/: Mod time differs/}
  # Check if file exists
  if [ -f "$folder/$file" ] ; then
    # Extract original file
    tar -x -a -f "$tarfile" --strip $strip --directory "$orig" "$prefix$file"
    # Compute diff
    diff -u "$orig/$file" "$folder/$file"
  fi
done
To ignore differences in some or all of the metadata (user, time, permissions), you can pipe the result to awk:
tar --compare --file=archive-file.tar -C /some/where/ | awk '!/Mode/ && !/Uid/ && !/Gid/ && !/time/'
That should output only the true differences between the tar file and the directory /some/where/.
I recently needed a better comparison than what tar --diff produced, so I made this short script:
#!/bin/bash
tar tf "$1" | while read -r ; do
  # skip directory entries (names ending in /)
  if [ "${REPLY%/}" = "$REPLY" ] ; then
    tar xOf "$1" "$REPLY" | diff -u - "$REPLY"
  fi
done
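A hypothetical invocation, assuming the script is saved as tardiff.sh and run from the directory the archive was created in:
bash tardiff.sh archive-file.tar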
The easy way is to write:
tar df file
This compares the archive with the current working directory and also tells you if any of the files have been removed.
tar df file -C path/folder
This compares the archive with the given folder.