I have a ./build folder (inside a project on a Windows machine) that I need to compress and send to a remote Linux server where I need to unzip it.
Here's the cmdlet function that I wrote (it doesn't work as intended):
function scpFileFromBuildFolder {
    param (
        [string]$userAndHost,
        [string]$targetDirectory
    )
    Compress-Archive -Force -Path "./build/" -DestinationPath "./build/main.zip" -CompressionLevel Optimal
    Write-Host -fore Cyan "`r`nCopying archive from ./build/ to ${targetDirectory}"
    . scp "./build/main.zip" "${userAndHost}:${targetDirectory}"
    Write-Host -fore Cyan "`r`nDecompressing archive on a remote directory:"
    . ssh $userAndHost ". cd ${targetDirectory}"
    . ssh $userAndHost ". unzip main"
    . ssh $userAndHost ". cd build/"
    . ssh $userAndHost ". find . -maxdepth 1 -exec mv {} .. \" # move all files to a parent dir
    . ssh $userAndHost ". cd .."
    . ssh $userAndHost ". rm -rf build build.zip main"
    . ssh $userAndHost ". exit"
}
As you can see, I tried to perform multiple commands on a remote server via ssh. All of that works if I do things manually through PowerShell: I compress the file, scp it, and ssh to the server where I unzip the file and do all the other things. But it doesn't work this way through a .ps1 script — it only copies the file but doesn't do anything after the print statement ("Decompressing archive on a remote directory:").
Please, help me finish this function.
You should only run ssh once, feeding it all of the remote commands in that single invocation, up through the final exit. Otherwise ssh will just pause and wait for more input. For example:
ssh $UserAndHost "cd /tmp; hostname; pwd; exit"
You can put them on separate lines like a bash script, BUT be careful of CRLF line endings (or strip them):
Write-Host -fore Cyan "`r`nDecompressing archive on a remote directory:"
ssh $userAndHost "cd ${targetDirectory}
unzip main
cd build/
find . -mindepth 1 -maxdepth 1 -exec mv {} .. \; # move all files to the parent dir
cd ..
rm -rf build main.zip
exit"
Otherwise, with CRLF line endings, the remote shell fails with errors like:
bash: line 1: $'\r': command not found
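To see that fix in isolation, here is a local sketch of stripping carriage returns before the commands reach a shell (the paths and commands are demo assumptions, not taken from the original function):

```shell
#!/bin/sh
# A command string that picked up Windows CRLF line endings:
cmds=$(printf 'cd /tmp\r\npwd\r\n')

# Fed to a shell as-is, the stray \r characters cause errors like
#   bash: line 1: $'\r': command not found
# Deleting the carriage returns first makes the same string safe:
clean=$(printf '%s' "$cmds" | tr -d '\r')
sh -c "$clean" # prints /tmp
```

On the PowerShell side, the same stripping can be done with something like `$cmds -replace "`r", ""` before handing the string to ssh.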
I have a situation where I want to speed up deployment time by caching the git resources in a shared PVC.
This is the bash script I use to check out the resource and save it into the shared PVC folder:
#!/bin/bash
src="$1"
dir="$2"

echo "Check for existence of directory: $dir"
if [ -d "$dir" ]
then
    echo "$dir found, no need to clone the git repo"
else
    echo "$dir not found, cloning $src into $dir"
    mkdir -p "$dir"
    chmod -R 777 "$dir"
    git clone "$src" "$dir"
    echo "cloned $dir"
fi
Given I have a Deployment with more than one pod, each with an initContainer, the problem with this approach is that all the initContainers start at almost the same time.
They all check for the existence of the git resource directory. Say that on the first deployment the git directory doesn't exist yet: the first initContainer creates it and starts cloning, while the second and third initContainers see that the directory is already there and finish immediately, before the clone is complete.
Is there a way to make the other initContainers wait for the first one to finish?
After reading the Kubernetes documentation, I don't think this is supported by default.
Edit 1:
The second solution I can think of is to deploy with one pod only and, after a successful deployment, scale it out automatically. However, I still don't know how to do this.
I have found a workaround. The idea is to create a lock file, and write a script that waits while the lock file exists. In the initContainer, I prepare the script like this:
#!/bin/bash
src="$1"
dir="$2"

echo "Check for existence of directory: $dir/src"
if [ -d "$dir/src" ]
then
    echo "$dir/src found, check if .lock exists"
    until [ ! -f "$dir/.lock" ]
    do
        sleep 5
        echo 'After 5 seconds, .lock is still there, I will check again'
    done
    echo "Clone finished in another init container, I can die now"
    exit
else
    echo "$dir not found, cloning $src into $dir"
    mkdir -p "$dir/src"
    echo "create .lock, make my friends wait for me"
    touch "$dir/.lock"
    ls -la "$dir"
    chmod -R 777 "$dir"
    git clone "$src" "$dir/src"
    echo "cloned $dir"
    echo "remove .lock now"
    rm "$dir/.lock"
fi
This is kind of a cheat, but it works. The script makes the other initContainers wait until the .lock file is removed; by then, the project has already been cloned.
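One caveat: the check-then-touch sequence still has a small window in which two initContainers can both see no lock and both start cloning. A sketch of the same wait-on-lock idea using mkdir, whose create-or-fail behavior is atomic (the path is a demo assumption; the git clone is left as a comment):

```shell
#!/bin/sh
# Sketch, not the poster's script: mkdir either creates the lock directory
# or fails, atomically, so only one initContainer wins the race to clone.
dir="/tmp/pvc-demo"
mkdir -p "$dir"

if mkdir "$dir/.lock" 2>/dev/null; then
  echo "acquired lock, cloning"
  # git clone "$src" "$dir/src" would run here
  rmdir "$dir/.lock"
else
  until [ ! -d "$dir/.lock" ]; do
    sleep 5
    echo 'after 5 seconds, .lock is still there, checking again'
  done
  echo "clone finished in another init container"
fi
```

Because the lock is a directory rather than a file, acquiring and releasing it are both single atomic filesystem operations.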
I tried to use this command line in Windows:
My goal is to find all the directories in root#xx.xx.xx.xx:/var/log/ for a specific port and download them locally.
In the find command, I am excluding all folders named "logFolder".
set outputPath=C:\MyDestination
scp -P 22 -r 'find root#xx.xx.xx.xx:/var/log/ -type d \( ! -name logFolder \)` %outputPath%
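The usual shape of this task is two steps, since scp cannot execute a remote find by itself: run the find over ssh, then feed each resulting directory to scp. A hedged sketch, exercising the exclusion against a local stand-in tree (host, port, and the logFolder name come from the question; the standard user@host form is assumed):

```shell
#!/bin/sh
# Local stand-in for /var/log on the server:
root="/tmp/log-demo"
mkdir -p "$root/port8080" "$root/port9090" "$root/logFolder"

# The exclusion itself: every subdirectory except those named logFolder.
# On the real server this would run remotely, e.g.:
#   ssh -p 22 root@xx.xx.xx.xx "find /var/log/ -mindepth 1 -type d ! -name logFolder"
find "$root" -mindepth 1 -type d ! -name logFolder

# Each resulting path could then be downloaded, e.g.:
#   ... | while IFS= read -r d; do
#     scp -P 22 -r "root@xx.xx.xx.xx:$d" "C:\MyDestination"
#   done
```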
xx#xx-PC ~/xampp/htdocs/sites
$ rmdir /s "yo-2"
rmdir: `/s': No such file or directory
rmdir: `yo-2': Directory not empty
xx#xx-PC ~/xampp/htdocs/sites
$ rmdir "yo-2"
rmdir: `yo-2': Directory not empty
I can't seem to get rmdir to work in Git Bash. It's not in a git repo, and I've tried the above. mkdir works as expected, so why doesn't this?
rmdir will not work if the directory is not empty.
Try
rm -rf yo-2
Git Bash is a Linux-like shell.
If you are trying to remove an entire directory regardless of contents, you could use:
rm <dirname> -rf
Just use the command below:
rm -rfv mydirectory
After trying out a couple of other commands, this worked for me:
rm dirname -rf
A bit late, but I believe it can still help someone with performance problems on Windows. Deleting this way in Git Bash is really fast compared with a plain rm -rf. The trick is to move the file or directory to a random name in a temporary directory on the same drive (on Windows) or the same partition (on *nix systems), then invoke rm -rf on it in the background. You don't have to wait on the blocking IO task, and the OS performs the deletion as soon as it is idle.
Depending on the system you are using, you may need to install the realpath program (e.g. on macOS). Another alternative is to write a portable bash function as in this post: bash/fish command to print absolute path to a file.
fast_rm() {
    path=$(realpath "$1") # get the absolute path
    echo "$path"
    if [ -e "$path" ]; then
        export TMPDIR="$(dirname $(mktemp -u))"
        kernel=$(uname | awk '{print tolower($0)}')
        # on Windows, make sure to use the same drive
        if [[ "${kernel}" == "mingw"* ]]; then # git bash
            export TMPDIR=$(echo "${path}" | awk '{ print substr($0, 1, 2)"/temp"}')
            if [ ! -e "$TMPDIR" ]; then mkdir -p "$TMPDIR"; fi
        fi
        if [ "${kernel}" == "darwin" ]; then MD5=md5; else MD5=md5sum; fi
        rnd=$(echo $RANDOM | $MD5 | awk '{print $1}') # keep only the hash field
        to_remove="${TMPDIR}/$(basename "${path}")-${rnd}"
        mv "${path}" "${to_remove}"
        # delete in the background; no waiting on the blocking IO
        nohup rm -rf "${to_remove}" > /dev/null 2>&1 &
    fi
}
# invoking the function
directory_or_file=./yo-2
fast_rm "$directory_or_file"
I faced the same issue, and this worked for me:
rimraf is a Node.js package, which is the UNIX command rm -rf for node, so you will need to install Node.js which includes npm. Then you can run:
npm install -g rimraf
Then you can run rimraf from the command line.
rimraf directoryname
visit https://superuser.com/questions/78434/how-to-delete-directories-with-path-names-too-long-for-normal-delete
I found this solution because npm itself was causing this problem due to the way it nests dependencies.
A late reply, but for those searching for a solution: for me,
rm <dirname> -rf
wasn't good enough; I always got "directory not empty" or "path too long" errors on node directories.
A really simple solution:
Move the directory you want to delete to the root of your disk (to shorten its path), then delete it normally.
With this command, I'm able to ZIP all files from the folders:
wzzip.exe -a -p -r C:\DestinationPath\DataFiles_20130903.zip C:\SourcePath\*.*
But my folder has .dat, .bat, .txt, and .xls files, and I want to ZIP only the .dat and .bat files. How do I do this?
Thanks.
Use this command (for the particular scenario in the question):
wzzip.exe -a -p -r C:\DestinationPath\DataFiles_20130903.zip C:\SourcePath\*.dat C:\SourcePath\*.bat
For more command-line options for WinZip, refer to the following links:
winZip command line Reference 1
winZip command line Reference 2
To provide multiple file names, you can also use #filename, where filename is a file containing the list of files you want to include in the zip.
If you are making the command configurable, you can ask the user (or the other program calling your command) to select the file extensions, then write the selected extensions into that list file using Java or any other language you prefer.
For example, if the user selects bat and dat, write "C:\SourcePath\*.bat" and "C:\SourcePath\*.dat" into the file (assume the filename is fileExtensions.txt) and call the command:
wzzip.exe -a -p -r "C:\DestinationPath\DataFiles_20130903.zip" #"C:\SourcePath\fileExtensions.txt"
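If a script generates that list file, a minimal sketch of writing one wildcard per selected extension (the /tmp path and the extension choices are demo assumptions; the wzzip call itself runs on Windows):

```shell
#!/bin/sh
# Write one wildcard per selected extension into the list file that the
# answer above passes to wzzip as #filename.
exts="dat bat"
listfile="/tmp/fileExtensions.txt"
: > "$listfile" # truncate or create
for ext in $exts; do
  printf 'C:\\SourcePath\\*.%s\n' "$ext" >> "$listfile"
done
cat "$listfile"
# C:\SourcePath\*.dat
# C:\SourcePath\*.bat
```

The generated file is then referenced exactly as in the command above, with the # prefix.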
You can use D7zip,
an excellent zipper for files and folders: D7zip.exe.
Link to download:
https://drive.google.com/file/d/0B4bu9X3c-WZqdlVlZFV4Wl9QWDA/edit?usp=sharing
How to use
compressing files
D7Zip.exe -z "c:\fileout.zip" -f "C:\filein.txt"
compressing files with a password
D7Zip.exe -z "c:\fileout.zip" -f "C:\filein.txt" -s "123"
compressing folders
D7Zip.exe -z "c:\folderout.zip" -f "C:\folderin\"
unzipping files
D7Zip.exe -u "c:\fileout.zip" -f "c:\folderout\"
unzipping files that have a password
D7Zip.exe -u "c:\fileout.zip" -f "c:\folderout\" -s "123"
decompressing files by extension
D7Zip.exe -u "c:\fileout.zip" -f "c:\folderout\*.txt"
decompressing files without asking for confirmation to replace
D7Zip.exe -u "c:\fileout.zip" -f "c:\folderout\" -r
help
D7Zip.exe -?
D7Zip.exe by Delmar Grande.
If the command line given above is right, then give this a go (but check the paths):
@echo off
pushd "C:\SourcePath"
"c:\program files\winzip\wzzip.exe" -a -p -r "C:\DestinationPath\DataFiles_20130903.zip" *.dat *.bat
popd
I am not sure if this belongs on Super User; please excuse me if not.
Here is what I am trying to do: I need to create a ksh script that establishes an ssh connection to a remote machine, finds all ".tar" files in a particular path for a particular date, and lists them. Next, I need to run scp to copy all those .tar files to the server I am executing the ksh script on.
Here is what I have so far and it is far from complete... (please bear with me.. I am very new to ksh scripting).
Can someone please advise if I am going in the right direction and provide some pointers as to how I can improve and achieve what I am trying to do?
Many thanks in advance.
SSERVER=server1
SOURCEPATH=/tmp/test
sudo ssh $SSERVER \
find $SOURCEPATH -name "*.tar" -mtime +7 -exec ls {} \;
#will the above two statements work?
#I then need to output the ls results to a temp variable (i believe) and issue an scp on each of the files
#Copy files from SOURCEPATH to PATH
sudo scp "$SSERVER:$SOURCEPATH/$file1" /tftpboot
sudo scp "$SSERVER:$SOURCEPATH/$file2" /tftpboot
SSERVER=server1
SOURCEPATH=/tmp/test

sudo ssh "$SSERVER" "find $SOURCEPATH -name '*.tar' -mtime +7" |
while IFS= read -r; do
    sudo scp "$SSERVER:'$REPLY'" /tftpboot
done
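When many files match, one scp per file means one connection per file. An alternative sketch streams everything over a single connection with tar (assuming GNU tar on both ends); the local half of the pipeline is shown runnable against a demo directory:

```shell
#!/bin/sh
# On the real server, the producing half would run remotely, e.g.:
#   sudo ssh "$SSERVER" "cd $SOURCEPATH && find . -name '*.tar' -mtime +7 -print0 \
#       | tar -cf - --null -T -" | tar -xf - -C /tftpboot
# Local demo of the same pipeline (no -mtime filter, since the demo files are new):
mkdir -p /tmp/src-demo /tmp/dst-demo
touch /tmp/src-demo/a.tar /tmp/src-demo/b.tar /tmp/src-demo/skip.log
( cd /tmp/src-demo && find . -name '*.tar' -print0 | tar -cf - --null -T - ) \
  | tar -xf - -C /tmp/dst-demo
ls /tmp/dst-demo
```

The -print0/--null pairing keeps filenames with spaces intact, which the line-by-line read loop above handles less robustly.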