I want to open multiple files with xdg-open, using the following commands.
me@host:~/Downloads$ find . -type f -iregex "./[^.]*"
./3ed090f2dde306e5e9f7200f1022a2c3
./ebd9863a73a5ef22344550a650d169a1
./edbdb765d87586fda75c4287a1e9ea1e
./d9e39bfe0a907ffb580a975d8c8719d2
./2b9cc942c04a8063bd8d4d8fd98814d9
./f5938dd24367ffaf766ef99928660786
./a51accbbf14c8a05cb82caa7d8bec0c6
./0820fb50b412f8e40f63b3bea12e9fb5
./53ef22110569d46b445a1e908a7ae88f
./61ee21f83a33b91674926daf70c34947
Try to open them
me@host:~/Downloads$ find . -type f -iregex "./[^.]*" | xargs xdg-open
xdg-open: unexpected argument './ebd9863a73a5ef22344550a650d169a1'
Try 'xdg-open --help' for more information.
me@host:~/Downloads$ find . -type f -iregex "./[^.]*" -print0 | xargs -0 xdg-open
xdg-open: unexpected argument './ebd9863a73a5ef22344550a650d169a1'
Try 'xdg-open --help' for more information.
What's the problem with my usage of xdg-open?
Your problem is that xdg-open does not accept more than one argument, meaning that you can open only one file with it. This seems to be by design, as there are different underlying commands for opening files in different distros, and some of them accept only one argument.
If you are designing a distribution-specific script, then you might want to find out which command xdg-open invokes. In Ubuntu MATE 16.04 it is gvfs-open, which in turn accepts multiple arguments. I found this out by feeding a malformed file path to xdg-open when I (yet again) tried to open two files with it. The malformation I used was simply two file paths separated by a comma, with no spaces. xdg-open accepted this, but gvfs-open complained in return, exposing itself.
If you are designing distribution-independent script, then you may want to look for a solution from here: https://askubuntu.com/questions/356650/how-to-open-multiple-files-with-the-default-program-from-terminal/
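If you only need something that works on your own machine, one workaround (a sketch, not part of the original answer) is to let find invoke xdg-open once per file, accepting the one-argument limit:
# one xdg-open call per matched file; slower, but each file still opens
find . -type f -iregex "./[^.]*" -exec xdg-open {} \;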
Could you please help me with find syntax? I'm trying to replicate the effect of this command, which opens all files in each of the specified subdirectories:
open mockups/dashboard/* mockups/widget/singleproduct/* mockups/widget/carousel/*
I'd like to make it work for any set of subdirectories below mockups.
I can show all subdirectories with:
find mockups -type d -print
But I'm not sure how to use xargs to add in the "*". Also, I don't want to separately execute open for each file with "-exec open {} \;", because that launches 50 different copies of Preview, when what I need is one instance of Preview with the 50 files loaded into it.
Thanks!
The version of find I have at hand allows you to specify a + sign after the -exec action:
From the man page:
-exec command {} +
This variant of the -exec action runs the specified command on the
selected files, but the command line is built by appending each
selected file name at the end; the total number of invocations of
the command will be much less than the number of matched files.
The command line is built in much the same way that xargs builds
its command lines. Only one instance of `{}' is allowed within
the command. The command is executed in the starting directory.
That means that as few instances of open as possible will be executed, e.g.:
find mockups -type f -exec open {} +
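If you prefer the xargs route mentioned in the question, an equivalent sketch (assuming a find and xargs that support -print0/-0, which keeps filenames with spaces intact) would be:
find mockups -type f -print0 | xargs -0 open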
I have been trying to do a recursive grep command on files in sub folders using grep in NTemacs and Cygwin. So far the "best" results have been using grep in eshell. When I use this:
grep "t" -r *
I get a list of all file names containing the letter t, in all subfolders one layer down, but nothing else. In Cygwin I get nothing. I'm working in a directory that is not in the Cygwin install. I don't know whether that matters or not.
What I want is to match the content of a more complex string in all files (and not just the file names, but the content). And in all sub directories.
I would like to use eshell from Emacs, but I'm open to suggestions, apart from using Linux. This is a work PC and I don't want to do all the setup of a Linux install.
I just wrote a very similar answer to another question, but I suspect it's the same root problem:
My first thought is that your files have Windows line endings (CRLF) as opposed to Unix/Linux line endings (LF), and that is messing with grep's ability to parse the file. Try running this:
dos2unix filename
on each file you need to search, then try your grep statement again.
If you need to convert many files across several directories, I suggest using dos2unix with the -exec action of find:
find . -type f -exec dos2unix {} \;
(add whatever other options you need to find before running that, of course)
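Once the line endings are converted, a plain recursive content search (a sketch; replace the quoted pattern with the string you are actually after) should behave the same in eshell or a Cygwin shell:
grep -r "your complex string" .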
My question is in two parts :
1) Why does grep hang when I grep all files under "/" ?
for example :
grep -r 'h' ./
(Note: right before the hang/crash, I see some "No such device or address" messages regarding sockets.)
Of course, I know that grep shouldn't run against a socket, but I would think that since sockets are just files in Unix, it should return a negative result, rather than crashing.
2) Now, my follow up question : In any case -- how can I grep the whole filesystem? Are there certain *NIX directories which we should leave out when doing this ? In particular, I'm looking for all recently written log files.
As #ninjalj said, if you don't use -D skip, grep will try to read all your device files, socket files, and FIFO files. In particular, on a Linux system (and many Unix systems), it will try to read /dev/zero, which appears to be infinitely long.
You'll be waiting for a while.
If you're looking for a system log, starting from /var/log is probably the best approach.
If you're looking for something that really could be anywhere in your file system, you can do something like this:
find / -xdev -type f -print0 | xargs -0 grep -H pattern
The -xdev argument to find tells it to stay within a single filesystem; this will avoid /proc and /dev (as well as any mounted filesystems). -type f limits the search to ordinary files. -print0 prints the file names separated by null characters rather than newlines; this avoids problems with files having spaces or other funny characters in their names.
xargs reads a list of file names (or anything else) on its standard input and invokes the specified command on everything in the list. The -0 option works with find's -print0.
The -H option to grep tells it to prefix each match with the file name. By default, grep does this only if there are two or more file names on its command line. Since xargs splits its arguments into batches, it's possible that the last batch will have just one file, which would give you inconsistent results.
Consider using find ... -name '*.log' to limit the search to files with names ending in .log (assuming your log files have such names), and/or using grep -I ... to skip binary files.
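Putting those suggestions together, a sketch that assumes GNU tools and log files named *.log might look like this:
find / -xdev -type f -name '*.log' -print0 | xargs -0 grep -I -H 'pattern'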
Note that all this depends on GNU-specific features. Some of these options might not be available on MacOS (which is based on BSD) or on other Unix systems. Consult your local documentation, and consider installing GNU findutils (for find and xargs) and/or GNU grep.
Before trying any of this, use df to see just how big your root filesystem is. Mine is currently 268 gigabytes; searching all of it would probably take several hours. A few minutes spent (a) restricting the files you search and (b) making sure the command is correct will be well worth the time you spend.
By default, grep tries to read every file. Use -D skip to skip device files, socket files and FIFO files.
If you keep seeing error messages, then grep is not hanging. Keep iotop open in a second window to see how hard your system is working to pull all the contents off its storage media into main memory, piece by piece. This operation will be slow unless you have a very bare-bones system.
Now, my follow-up question: in any case -- how can I grep the whole filesystem? Are there certain *NIX directories which we should leave out when doing this? In particular, I'm looking for all recently written log files.
Grepping the whole FS is very rarely a good idea. Try grepping the directory where the log files should have been written; likely /var/log. Even better, if you know anything about the names of the files you're looking for (say, they have the extension .log), then do a find or locate and grep the files reported by those programs.
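For example, to limit the search to log files modified in the last day, something along these lines might do (a sketch; 'pattern' is a placeholder and -mtime -1 means modified less than 24 hours ago):
find /var/log -type f -mtime -1 -exec grep -H 'pattern' {} +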
How do I find and replace every occurrence of:
foo
with
bar
in every text file under the /my/test/dir/ directory tree (recursive find/replace).
BUT I want to be able to do it safely within an SVN checkout and not touch anything inside the .svn directories
Similar to this but now with the SVN restriction: Awk/Sed: How to do a recursive find/replace of a string?
There are several possibilities:
Using find:
Using find to create a list of all files, and then piping them to sed or the equivalent, as suggested in the answer you reference, is fairly straightforward, and only requires scanning through the files once.
You'd use one of the same answers as from the question you referenced, but adding -path '*/.svn' -prune -o after the find . in order to prune out the SVN directories. See this question for a discussion of using the prune option with find -- although note that they've got the pattern wrong. Thus, to print out all the files, you would use:
find . -path '*/.svn' -prune -o -type f -print
Then, you can pipe that into an xargs call or whatever to do the individual replacements, as suggested in the question you referenced. There is a lot of discussion there about different options, which I won't reproduce here, although I prefer the version from John Zwinck's answer:
find . -path '*/.svn' -prune -o -type f -exec sed -i 's/foo/bar/g' {} +
Using recursive grep:
If you have a system with GNU grep, you can use that to find the list of files as well. This is probably less efficient than find, but it does allow you to only call sed on the files that match, and I personally find the syntax a lot easier to remember (or figure out from manpages):
sed -i 's/foo/bar/g' `grep -l -R --exclude-dir='*/.svn' 'foo' .`
The -l option causes grep to only output the list of file names, rather than the matching lines.
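If filenames might contain spaces, a null-separated variant (a sketch assuming GNU grep, xargs, and sed; -Z makes grep -l emit NUL-terminated names and -r stops xargs from running sed when nothing matches) avoids the word-splitting of the backtick form:
grep -lRZ --exclude-dir='*/.svn' 'foo' . | xargs -r -0 sed -i 's/foo/bar/g'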
Using a GUI editor:
Alternately, if you're using Windows, do what I do -- get a copy of the NoteTab editor (available in a free version) and use its search-and-replace-on-disk command, which ignores hidden .svn directories automatically and just works.
Edit: Corrected find pattern to */.svn instead of .svn, added more details and some other possibilities. However, this depends on your platform and svn version: .svn without */ may be required in some cases, like on CentOS 7.
How about this?
grep -i "search_string" `find "*.some_extension"`
That is halfway solution to finding a search_string within files that have a specific extension....once you know the files that has the string, can be easily modified by piping it into sed....
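For instance, a sketch of that modification (assuming GNU sed, filenames without spaces, and a hypothetical replacement string):
sed -i 's/search_string/replacement/g' `find . -name "*.some_extension"`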
I'm wondering if there is a way to change a specific word in all of the files within the /www/ directory using the command line. Just looking for a faster way to change out a specific word so I don't need to open all the files manually! Thanks!
find /www -type f -exec sed -i 's/foo/bar/g' {} \;
This line will replace foo with bar every time foo occurs in any file in /www. Be very sure you know what's under /www and what the replacement would do to those files before running it.
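A slightly safer variant (a sketch, assuming GNU sed) keeps a .bak backup of each modified file and batches the arguments instead of running sed once per file:
find /www -type f -exec sed -i.bak 's/foo/bar/g' {} +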
You might be looking for a grep-sed solution to find and replace, if you are on a Mac (and referring to the Mac's Terminal app).