wget files by pattern only from specified directories recursively [closed] - wget

I need to download, on an hourly basis (sometimes more frequently), files that are written in 24-hour segments. The files I am interested in live in specific subdirectories, which I am trying to select with the -I list option, but this doesn't work for some reason.
If I don't specify directories, the files I need download fine with the -A acclist option, but I end up with lots of empty directories that are created locally simply because they exist on the host.
My current command reads:
wget -np -nH --cut-dirs=X -c -N -r -l 0 \
-I /dir1,/dir2,...,/some_dir -A acclist \
http://hostname/X_sub_directories/
How do I download only the files I want and create only the directory hierarchy for those files?
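One frequent gotcha, which may or may not be the problem here, is that the paths given to -I are matched against the full directory path on the host, so they usually need to include the leading X_sub_directories component. A minimal sketch of that idea, where the directory names, the accept pattern, and the --cut-dirs count are placeholders rather than values taken from your setup:
# Sketch only: dir1, dir2, '*.dat' and --cut-dirs=2 are placeholders.
# -I paths are matched from the root of the host, so the parent component is included;
# -A keeps only the wanted files, and -np/-nH/--cut-dirs trim the local hierarchy.
wget -np -nH --cut-dirs=2 -c -N -r -l 0 \
     -I /X_sub_directories/dir1,/X_sub_directories/dir2 \
     -A '*.dat' \
     http://hostname/X_sub_directories/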

Related

Streaming too fast with avconv on Raspbian to justin.tv via RTMP [closed]

I want to stream *.mp4 files to justin.tv using avconv on Raspbian. I'm using the following command to do this:
avconv -i ./${FILE_TO_STREAM} \
-vcodec copy \
-acodec copy \
-threads 0 \
-r 24 \
-f flv rtmp://live-fra.justin.tv/${SECRET_KEY}
I can see my stream for a short time on justin.tv, but it is streaming too fast: the stream jumps to another part of the file, plays that part, jumps again after a while, and so on. The fps is far too high, as you can see in the output of avconv:
frame= 2673 fps=423 q=-1.0 Lsize= 4431kB time=106.58 bitrate= 340.6kbits/s
The frame count and time increase just as fast, as the fps value shows. I hoped I could clamp the frame rate with the -r 24 option, but it still runs at >200 fps. What can I do?
Solved it by adding the -re parameter, which reads the input at its native frame rate.
So this worked for me:
#!/bin/bash
avconv -re \
-i ${FILE_TO_STREAM} \
-threads 0 \
-vcodec copy \
-acodec copy \
-f flv rtmp://live-fra.justin.tv/${SECRET_KEY}
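If the script above is saved as, say, stream.sh (the name is just an example), the two variables can be supplied from the environment when invoking it:
# Hypothetical invocation; the file name and stream key are placeholders.
FILE_TO_STREAM=myvideo.mp4 SECRET_KEY=live_xxxxxxxx ./stream.sh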

delete last character of filenames in a folder [closed]

I have a folder that contains some text files. I need to delete the last character of each filename in this folder (the character just before the .txt extension). The filenames are shown below.
1ADFG.txt
RG25A.txt
5SDFC.txt
Desired output
1ADF.txt
RG25.txt
5SDF.txt
I would do it like this:
for i in *.txt; do echo "mv '$i' '${i/?.txt}.txt'"; done
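With the three sample filenames above, this dry run prints something like:
mv '1ADFG.txt' '1ADF.txt'
mv '5SDFC.txt' '5SDF.txt'
mv 'RG25A.txt' 'RG25.txt'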
If the output looks good, then pipe it to | sh, that is:
for i in *.txt; do echo "mv '$i' '${i/?.txt}.txt'"; done | sh
This awk command, fed with the list of filenames, should do the job, provided there is only one dot (the one before the extension):
awk 'BEGIN {FS="."} {print substr($1,1,length($1)-1) "." $2;}'
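To turn that into actual renames, the awk output has to be fed back into mv; a minimal sketch, assuming plain filenames without newlines:
for f in *.txt; do
  new=$(printf '%s\n' "$f" | awk 'BEGIN {FS="."} {print substr($1,1,length($1)-1) "." $2;}')
  mv -- "$f" "$new"
done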
You could also use sed like this:
$ ls -1
1ADFG.txt
5SDFC.txt
RG25A.txt
$ ls -1|sed "s/\([A-Za-z0-9]\{4\}\)[A-Za-z0-9]*\(\.[A-Za-z0-9]\)/\1\2/g"
1ADF.txt
5SDF.txt
RG25.txt
Another way is to use the Perl rename utility:
$ rename -n 's/.\./\./' *.txt
1ADFG.txt renamed as 1ADF.txt
5SDFC.txt renamed as 5SDF.txt
RG25A.txt renamed as RG25.txt
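Note that -n is a dry run: it only shows what would be renamed. Drop it to actually perform the renames:
rename 's/.\./\./' *.txt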

Grep sed command [closed]

Can anybody tell me what this command means? Thanks.
grep -h -o "\#string\/\(\w*\)" * -R | sed "s!#string\/\(\w*\)!\1!p" | sort | uniq > ..\AndroidProject1\tmp_used_strings.txt
This command will give you the list of strings used in the Android layout XML files.
grep -h -o "\#string\/\(\w*\)" * -R
-R - search recursively
-h - suppress file names in the output
-o - print only the matched part of each line
This gives you just the matched strings. That output is then piped into the sed command.
sed "s!#string\/\(\w*\)!\1!p"
This strips the #string/ prefix and keeps only the name (the trailing p flag makes sed print each substituted line a second time, but those duplicates are removed in the next step). The result is then sorted, and uniq writes only the unique values to the file.
For more information about the options, see the man pages of the individual commands.
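A small worked example may help; app_name here is a made-up resource name. If some file under the current directory contains the text #string/app_name, then the grep stage prints #string/app_name, the sed stage reduces it to app_name, and sort | uniq collapses repeated names before the redirect writes them to tmp_used_strings.txt:
# Hypothetical walk-through of the pipeline on one made-up match.
echo '#string/app_name' | sed "s!#string\/\(\w*\)!\1!p" | sort | uniq
# prints: app_name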

How do I compare two source trees in Linux? [closed]

I have two directories containing source files for a project I've inherited, with little by way of documentation. How do I compare the two directories to see what the differences are?
Try this:
diff -Naur dir1/ dir2/
The -u option makes the output a little easier to read.
The -r option recurses through all subdirectories.
The -N and -a options are really only necessary if you want to create a patch file (see the example below).
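For example, to capture the differences as a patch and apply them later (the directory and file names here are placeholders):
# Create a patch from the two trees, then apply it inside the first tree.
diff -Naur original_tree/ modified_tree/ > changes.patch
cd original_tree && patch -p1 < ../changes.patch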
You can try Meld. It is a wonderful visual diff tool ;-)
diff -u -r dirA dirB
will show you a unified, recursive diff between the files in dirA and dirB.
You may use the diff command in the shell. Or install a tool like KDiff3.
The diff command for comparing directories kept telling me there were no differences, when I knew there were some.
Instead of using diff directly, I used a sorted list of md5sums and then compared those files with diff:
find /path1/dir/ -type f -exec md5sum {} + | awk '{print $2 $1}' | sort > path1.log
find /path2/dir/ -type f -exec md5sum {} + | awk '{print $2 $1}' | sort > path2.log
gvimdiff path1.log path2.log
If the beginning part of the path is causing headaches, then change it. Select the window showing path1.log and type:
:%s|path1|path2|g
This will replace all instances of path1 with path2 in the first file, and the diff should then show only real differences.

copy the symbolic link in Solaris [closed]

I am trying to copy a symbolic link on Solaris, but I find that it does not simply copy the link; instead it copies the whole contents of the directory/file the link is pointing to. This does not happen on other OSes such as AIX, HP-UX, and Linux.
Is this normal behaviour for Solaris?
Charlie was close: you want the -L, -H, or -P flag together with the -R flag (probably just -R -P). Similar flags exist for chmod(1) and chgrp(1). I've pasted an excerpt from the man page below.
Example:
$ touch x
$ ln -s x y
$ ls -l x y
-rw-r--r-- 1 mjc mjc 0 Mar 31 18:58 x
lrwxrwxrwx 1 mjc mjc 1 Mar 31 18:58 y -> x
$ cp -R -P y z
$ ls -l z
lrwxrwxrwx 1 mjc mjc 1 Mar 31 18:58 z -> x
$
Alternatively, plain old tar will happily work with symbolic links by default, even the venerable version that ships with Solaris:
tar -cf - foo | ( cd bar && tar -xf - )
(where foo is a symlink or a directory containing symlinks).
/usr/bin/cp -r | -R [-H | -L | -P] [-fip#] source_dir... target
...
-H Takes actions based on the type and contents of the
file referenced by any symbolic link specified as a
source_file operand.
If the source_file operand is a symbolic link, then cp
copies the file referenced by the symbolic link for
the source_file operand. All other symbolic links
encountered during traversal of a file hierarchy are
preserved.
-L Takes actions based on the type and contents of the
file referenced by any symbolic link specified as a
source_file operand or any symbolic links encountered
during traversal of a file hierarchy.
Copies files referenced by symbolic links. Symbolic
links encountered during traversal of a file hierarchy
are not preserved.
-P Takes actions on any symbolic link specified as a
source_file operand or any symbolic link encountered
during traversal of a file hierarchy.
Copies symbolic links. Symbolic links encountered dur-
ing traversal of a file hierarchy are preserved.
You want cp -P, I believe (check the man page, as I don't have a Solaris box handy right now). I faintly suspect that's a System V-ism, but wouldn't swear to it.
It sounds like you're trying to duplicate a single symlink.
You might want to just do:
link_destination=`/bin/ls -l /opt/standard_perl/link | awk '{print $NF}'`
ln -s "$link_destination" /opt/standard_perl/link_new
If you are trying to copy a directory hierarchy, this can be very difficult to do in general without the GNU tools (or rsync). While there are solutions that often work, there is no solution that works on every "standard" Unix with every type of filename you might encounter. If you're going to be doing this regularly, you should install the GNU coreutils, find, cpio, and tar, as well as rsync.
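For what it's worth, if rsync is available it copies symlinks as symlinks by default in archive mode; the paths below are placeholders:
# -a (archive) implies -l, which copies symbolic links as links rather than following them.
rsync -a /path/to/source/ /path/to/dest/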
Will cpio do the trick for you?