How to exclude snapshots while running tar in Solaris

I'm trying to take a tar of the contents of the /home/store/ directory:
tar cvf store.tar /home/store/
While doing so, I can see that the .snapshot directories are also getting included. My understanding is that snapshots are a kind of backup. Can I skip them? If so, how? I tried excluding a test directory using the command below, run from /home/store/:
tar cvfX store.tar <(echo /home/store/test) /home/store/
But this does not exclude the test directory from the created tar.
I also tried this:
tar cvf store.tar /home/store/ --exclude-file=exclude.txt
Output:
a /home/store// 0K
a /home/store//.profile 1K
a /home/store//local.profile 1K
a /home/store//.vas_logon_server 1K
a /home/store//.vas_disauthcc_611400381 1K
a /home/store//.bash_history 7K
a /home/store//test/ 0K
a /home/store//test/1.txt 1K
a /home/store//test/migrate-perf3.txt 3958K
a /home/store//test.txt 1K
a /home/store//exclude.txt 1K
a /home/store//.snapshot/hourly.0/d2/dd/d5d/f82-1 59K
a /home/store//.snapshot/hourly.0/d2/dd/d5d/f83-1 58K
.....
tar: --exclude-file=exclude.txt: No such file or directory
/home/store/exclude.txt has the entry 'test'. I tried entering the following as well and got the same error:
/home/store/test/
/home/store/test/1.txt
When I gave the full path to exclude.txt, like this:
tar cvf store.tar /home/store/ --exclude-file=/home/store/exclude.txt
it gives the error below:
tar: can't change directories to --exclude-file=/home/store: No such file or directory
tar -h
Usage: tar {c|r|t|u|x}[BDeEFhilmnopPqTvw#[0-7]][bfk][X...] [blocksize] [tarfile] [size] [exclude-file...] {file | -I include-file | -C directory file}...
Thanks in advance!
Van Peer

Try this:
tar cvfX /var/tmp/src.tar /var/tmp/excl.txt /var/tmp/src/
Your exclude file should contain the path:
/home/store//.snapshot
It is best practice not to use the full path of the directory you are archiving, because in the future you could overwrite your /etc when extracting a tar archive from /var/tmp, for example.
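A minimal sketch applying this to the original question (the target paths and exclude entries are assumptions; with cvfX the trailing arguments follow the flag order, so the tarfile comes before the exclude-file, and the entries should match the names as tar records them):
cd /home
echo 'store/.snapshot' > /var/tmp/excl.txt
echo 'store/test' >> /var/tmp/excl.txt
tar cvfX /var/tmp/store.tar /var/tmp/excl.txt store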

For example:
sudo tar -zcvpf /backup/farm-backup-$(date +%d-%m-%Y).tar.gz --exclude ".snapshots" --exclude ".cache" farm
Note that the command uses the relative path farm (no leading slash) for the directory. Execute the tar command from the /home directory to back up the "farm" user's home, creating the backup in the /backup directory at the root.
OS: OpenSuse 15.1
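If GNU tar is available on the Solaris machine (the SUNWgtar package provides it, typically at /usr/sfw/bin/gtar on Solaris 10; the exact path is an assumption), the same pattern-based --exclude approach could be applied to the original question, e.g.:
cd /home
/usr/sfw/bin/gtar -cvzf /var/tmp/store.tar.gz --exclude='.snapshot' --exclude='test' store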

Related

Retrieve file path downloaded via wget

I am downloading files with the wget command:
wget abc.com -nH -r -l1 --no-parent
This stores files in different subfolders. I want the path of each downloaded file. How do I get it?
Example:
wget is downloading file to:
c:/test/com/test/pacakage/filename1.text
c:/test/com1/test/package1/filename2.text
So, how do I retrieve the complete file path, i.e. com/test/package/filename1.text?
Thanks
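One possible approach (a sketch, not from the original thread; the marker file name before.stamp is arbitrary) is to record a timestamp before the download and then list every file that is newer than it:
touch before.stamp
wget abc.com -nH -r -l1 --no-parent
find . -type f -newer before.stamp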

tar: Error opening archive: Failed to open 'wekaUT.tar.gz' in Command Line Windows 10

I want to extract a tar archive that I obtained as a tar.gz file. However, when I try: tar -xzvf wekaUT.tar.gz
I get the following error:
tar: Error opening archive: Failed to open 'wekaUT.tar.gz'
I see the file in my directory as wekaUT.tar.gz.
Any help would be much appreciated.
Just commenting here as I ran into the same issue. On Windows, "tar xfv .tar.gz" should work if you open the command prompt as administrator.
In my case, I was building the path of the file to untar dynamically, using a variable like
tar -xzvf "$SOME_TEMP_DIR/a_cool_file.tar.gz"
and encountered this error because of the quotes. When I updated it to:
tar -xzvf $SOME_TEMP_DIR/a_cool_file.tar.gz
It worked :)
In the Windows command prompt, use quotation marks ("") when specifying the path. It will work properly.
Example: tar -xvzf "C:/PATH/TO/FILE/FILE-NAME.tar.gz" -C "C:/PATH/TO/FOLDER/EXTRACTION"
tar -xvzf "C:/PATH/TO/FILE/FILE-NAME.tar.gz"

How Can I make gsutil cp skip false symlinks?

I am using gsutil to upload a folder which contains symlinks; the problem is that some of these are broken ("false") symlinks (unfortunately, that's the case).
Here is an example of the command I am using:
gsutil -m cp -c -n -e -L output-upload.log -r output gs://my-storage
and I get the following:
[Errno 2] No such file or directory: 'output/1231/file.mp4'
CommandException: 1 file/object could not be transferred.
Is there a way to make gsutil skip this file or fail safely without stopping the upload?
This was a bug in gsutil (which it looks like you reported here) and it will be fixed in gsutil 4.23.
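Until a fixed version is available, one possible workaround (a sketch, not from the original answer, assuming GNU find) is to remove the broken symlinks before uploading:
find output -xtype l          # list the broken symlinks
find output -xtype l -delete  # remove them, then rerun the gsutil cp command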

wget --warc-file --recursive, prevent writing individual files

I run wget to create a warc archive as follows:
$ wget --warc-file=/tmp/epfl --recursive --level=1 http://www.epfl.ch/
$ l -h /tmp/epfl.warc.gz
-rw-r--r-- 1 david wheel 657K Sep 2 15:18 /tmp/epfl.warc.gz
$ find .
./www.epfl.ch/index.html
./www.epfl.ch/public/hp2013/css/homepage.70a623197f74.css
[...]
I only need the epfl.warc.gz file. How do I prevent wget from creating all the individual files?
I tried as follows:
$ wget --warc-file=/tmp/epfl --recursive --level=1 --output-document=/dev/null http://www.epfl.ch/
ERROR: -k or -r can be used together with -O only if outputting to a regular file.
tl;dr Add the options --delete-after and --no-directories.
Option --delete-after instructs wget to delete each downloaded file immediately after its download is complete. As a consequence, the maximum disk usage during execution will be the size of the WARC file plus the size of the single largest downloaded file.
Option --no-directories prevents wget from leaving behind a useless tree of empty directories. By default wget creates a directory tree that mirrors the one on the host, and downloads each file into the appropriate directory of the mirrored tree. wget does this even when the downloaded file is temporary due to --delete-after. To prevent that, use option --no-directories.
The below demonstrates the result, using your given example (slightly altered).
$ cd $(mktemp -d)
$ wget --delete-after --no-directories \
--warc-file=epfl --recursive --level=1 http://www.epfl.ch/
...
Total wall clock time: 12s
Downloaded: 22 files, 1.4M in 5.9s (239 KB/s)
$ ls -lhA
-rw-rw-r--. 1 chadv chadv 1.5M Aug 31 07:55 epfl.warc
If you forget to use --no-directories, you can easily clean up the tree of empty directories with find -type d -delete.
For individual files (without --recursive), the option -O /dev/null makes wget not create a file for the output. For recursive fetches, /dev/null is not accepted (I don't know why). But why not just write all the output concatenated into one single file via -O tmpfile and delete that file afterwards?
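For example, a sketch of that last alternative (the temporary file name is arbitrary):
wget --warc-file=/tmp/epfl --recursive --level=1 --output-document=/tmp/wget-discard http://www.epfl.ch/
rm /tmp/wget-discard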

solaris tar for files > 8G

I made a 19 GB archive on Solaris 10 with the tar E option. But now neither tar tvf nor tar xvf works on the tarball! How can I extract the files?
Have you tried GNU tar (gtar)? There is a Solaris SFW package for this, SUNWgtar, or try SunFreeware.
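For example, a sketch of listing and extracting the archive with gtar (the /usr/sfw/bin path is where the SUNWgtar package typically installs it, but that path is an assumption):
/usr/sfw/bin/gtar tvf foo.tar
/usr/sfw/bin/gtar xvf foo.tar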
From the tar(1) man page:
See largefile(5) for the description of the behavior of tar when encountering files greater than or equal to 2 Gbyte (2^31 bytes).
On my Solaris 10 system largefile(5) says that tar is largefile-aware.
Perhaps truss can help:
truss -a -f -o /tmp/truss.out tar xf foo.tar
(please post truss.out if it's not too long, or perhaps just the tail of it otherwise).
EDIT: I just stumbled across patches 138621-02/138622-02, "SunOS 5.10: tar patch" from June 2010. In particular, it fixes bug "6578528 /usr/bin/tar dumps core when extracting large files". (This is not a Recommended or Security patch, so it could have been missed.)