I have multiple files in my directory that start with split4_, I want to add a .csv extension to all these files.
I am trying the following command.
mv split4_* *.csv
It doesn't work. What am I missing?
Try:
for file in split4_*; do echo mv -- "$file" "$file.csv"; done
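The original command fails because the shell expands both patterns before mv runs, so mv just sees a flat list of names and expects the last argument to be an existing directory. The loop above only prints the commands as a dry run; once the output looks right, drop the echo to actually rename the files:
for file in split4_*; do
    mv -- "$file" "$file.csv"
done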
I am trying to move all the *.csv files to another folder on the server, but every time I get an access failed error. I am able to get all the files to the local server using mget, but mv fails every time. I can see the files on the server and have full permissions on them. The sh script is not working with wildcards; I am stuck on this simple command.
Download to local directory
localDir="/home/toor/UCDownloads/"
[ ! -d "$localDir" ] && mkdir -p "$localDir"
# sftp remote directory containing the files to be downloaded
remoteDir="/share/CACHEDEV1_DATA/Lanein1/Unicard/"
#The file to be downloaded is fileName
lftp -u ${sftp_user},${password} sftp://${host}:${port}<<EOF
PS4='$LINENO: '
set xfer:log true
set xfer:log-file "$logfileUCARC"
set xfer:clobber true
set xfer:auto-rename true
debug 9
cd ${remoteDir}
lcd ${localDir}
#mget *.CSV
ls -l
mv "/share/CACHEDEV1_DATA/Lanein1/Unicard/"*.csv "/share/CACHEDEV1_DATA/Lanein1/Unicard/Archives/"
#rm /share/CACHEDEV1_DATA/Lanein1/Unicard/!(*.pdf)
bye
EOF
This is not a shell or Bash problem; it is an lftp problem.
From the lftp manual:
mv file1 file2
Rename file1 to file2. No wildcard expansion is performed.
lftp simply does not support what you are asking for: it treats *.csv as part of the file name.
See here for an alternative.
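For illustration, a rough sketch of one such workaround, reusing the placeholder connection variables from the question and assuming the remote .csv names contain no spaces: list the matching files first, then issue one single-file mv per name, since lftp's mv takes exactly one source and one destination. (Newer lftp releases also ship an mmv command that does expand wildcards; if your version has it, that is the simpler fix.)
remoteDir="/share/CACHEDEV1_DATA/Lanein1/Unicard/"
archiveDir="/share/CACHEDEV1_DATA/Lanein1/Unicard/Archives/"
# List the bare .csv file names on the server (cls -1 prints one name per line).
files=$(lftp -u "${sftp_user},${password}" "sftp://${host}:${port}" \
             -e "cd ${remoteDir}; cls -1 *.csv; bye")
# Build one single-file mv command per name and send them in a second session.
{
    echo "cd ${remoteDir}"
    for f in $files; do
        printf 'mv "%s" "%s%s"\n' "$f" "$archiveDir" "$f"
    done
    echo "bye"
} | lftp -u "${sftp_user},${password}" "sftp://${host}:${port}"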
How do I prepend a special character to the front of all the lines in all .txt files in my directory? I'm new to writing Bash scripts and am having trouble doing this. I only know of using grep, but that's only for searching for a keyword.
For now, I have this:
sed -i 's/^/#/' Machine1.txt
However, this only covers that specific .txt file. I want to do this for all files with a .txt extension in my directory. There are other files with extensions like .tar, .rpm, and .sh which I want to ignore. Thank you!
Just give a wildcard filename argument.
sed -i 's/^/#/' *.txt
You can use a for loop. For example:
cd your_folder
for f in *.txt; do
sed -i 's/^/#/' "$f";
done
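One caveat: -i with no argument is GNU sed syntax. On BSD/macOS sed, -i requires an explicit backup suffix, so the equivalent there is:
# Empty suffix means edit in place without keeping a backup copy:
sed -i '' 's/^/#/' *.txt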
My source folders are on an external hard drive, but I want my thumbnails local. The following works, but it puts all the extracted files in the same folder as the source files, which requires another step to collect them and move them to a folder on my local machine.
exiftool -b -ThumbnailImage -w _thumb.jpg -ext CR2 -r source_folder_path\ > _extraction_results.txt
Is there any way to write them to a different folder in the same call to ExifTool?
Thanks!
Add the directory path to the name given in the -w (textout) option (see the examples at that link).
Example:
exiftool -b -ThumbnailImage -w /path/to/thumbdir/%f_thumb.jpg -ext CR2 -r source_folder_path\ > _extraction_results.txt
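For context, %f in the -w value expands to each source file's base name, so every thumbnail lands in the target directory named after its original. A small hypothetical example, with CR2 files under ./raw/ and thumbnails written to a local ./thumbs/ directory:
mkdir -p ./thumbs
# ./raw/IMG_0001.CR2 -> ./thumbs/IMG_0001_thumb.jpg, and so on
exiftool -b -ThumbnailImage -w ./thumbs/%f_thumb.jpg -ext CR2 -r ./raw/ > extraction_results.txt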
I want to download an entire website using the wget -r command and change the name of the file.
I have tried with:
wget -r -o doc.txt "http....
hoping that it would automatically create files named in order, like doc1.txt, doc2.txt, but it actually saves wget's log output to that file.
Is there any way to do this with just one command?
Thanks!
-r tells wget to recursively get resources from a host.
-o file saves log messages to file instead of standard error. I think that is not what you are looking for; I think you want -O file.
-O file stores the resource(s) in the given file, instead of creating a file in the current directory named after the resource. Used together with -r, it causes wget to store all resources concatenated into that one file.
Since wget -r downloads and stores more than one file, recreating the server's file tree on the local system, it makes no sense to give the name of a single file to store everything in.
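To make the difference concrete (example.com as a stand-in URL):
# Recreates the site's directory tree locally, one file per resource:
wget -r http://example.com/
# Concatenates every fetched resource into the single file all.html:
wget -r -O all.html http://example.com/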
If what you want is to rename all the downloaded files to match the pattern docX.txt, you can do it with a separate command after wget has finished:
wget -r http....
i=1
while read -r file; do
    mv "$file" "$(dirname "$file")/doc$i.txt"
    i=$((i + 1))
done < <(find . -type f)
How can I create a zip file without including all of the folders the files came from?
So I have
file1.xml,
file2.xml,
file3.xml,
...
and these are all in folder desktop/data/xmlFiles/
When I zip using the following command:
zip -9 -m -q C:\Users\Desktop\data\xmlFiles\XML.zip C:\Users\Desktop\data\xmlFiles\*.xml
This stores the entire path into the XML.zip file.
How do I just include my file1.xml, file2.xml, file3.xml ... inside of my XML.zip?
How about:
cd C:\Users\Desktop\data\xmlFiles
zip -9 -m -q XML.zip *.xml
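Alternatively, if you would rather not change directories first, zip's -j ("junk paths") option stores just the file names without their directory paths:
zip -9 -m -q -j C:\Users\Desktop\data\xmlFiles\XML.zip C:\Users\Desktop\data\xmlFiles\*.xml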