I have a list of filenames, possibly relative, possibly absolute. I want to open all of these files at once within an existing Emacs session. Is there a way to do this without having to open each file individually?
I could simply start another Emacs session and pass it the list of files on the command line; or, if there were a glob pattern that matched only the files I wanted, I could pass the glob to find-file. But suppose I simply have a list of the form "../relative/a.txt b.txt /absolute/path/c.txt": is there some command I could use to open all of the files in the list?
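Failing a dedicated command, I imagine evaluating something along these lines (a sketch using split-string, which splits on whitespace by default) would do it, but I'm hoping for something built in:

(dolist (file (split-string "../relative/a.txt b.txt /absolute/path/c.txt"))
  (find-file file))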
Related
I have a workspace with a lot of folders. They share a file that I would like to compare. I know I can't compare multiple files at once (wouldn't it be nice), but at least I would like to open the same file from all subfolders and then use the Compare Active File With command.
Is this possible? Ctrl+P only lets me see the results, not open all of them.
Newbie question ... but you've got to start somewhere.
In my PowerShell console, I call System.Windows.Forms.OpenFileDialog to open a GUI file picker, browse to a location on my system, and multi-select the files that I want to copy. I stash that list in a variable.
Next, I use BrowseForFolder to choose my destination folder path and stash that in a variable.
I run Folder.CopyHere() to copy a file using the default Windows file-transfer GUI.
One file at a time works.
I want to present my multi-selected filename list to Folder.CopyHere() and have it process the list ...
Instead, I'm stuck in my ignorance using foreach ($file in $list) and sending each file to CopyHere() one at a time. It's unpretty.
I tried turning my list of full-path file names into an array and feeding $vararray to CopyHere(), but it wouldn't copy even the first file. One at a time, yes, but that's not what I want.
If I send my list to robocopy, it processes each item in order, like a champ.
But I have GUI users and they want the GUI. Any ideas on how to get Folder.CopyHere() to accept an array of names to process?
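For reference, here is roughly what I have now (a sketch; the dialog setup is abbreviated and the destination path is a stand-in):

Add-Type -AssemblyName System.Windows.Forms

# GUI multi-select file picker
$dialog = New-Object System.Windows.Forms.OpenFileDialog
$dialog.Multiselect = $true
$null = $dialog.ShowDialog()

$shell = New-Object -ComObject Shell.Application
$dest  = $shell.NameSpace('C:\destination')   # stand-in for my BrowseForFolder pick

# Works, but one file (and one transfer dialog) at a time.
foreach ($file in $dialog.FileNames) {
    $dest.CopyHere($file)
}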
How can I search for files just like with the / command, but recursively, scanning subfolders?
Or maybe there are other approaches to get a list of files that match some pattern in the current folder, including all subfolders.
:find command
There is the :fin[d] command for that. Internally it invokes the find utility (this is configurable via the 'findprg' option), so you can do everything find is capable of. That said, in most cases the simple form of the command suffices:
:find *.sh
Note that by default the argument is treated as a regular file pattern (the -name option of find), which is different from the regular expressions accepted by /. For searching via a regexp, use:
:find -regex '.*_.*'
If you want to scan only specific subfolders, just select them before running the command, and the search will be limited to those directories.
The :find command brings up a menu with the search results. If you want to process them like regular files (e.g., delete, copy, or move them), hit b to change the list representation.
Alternative that uses /
Alternatively, you can populate the current view with a list of the files in all subdirectories using a command like this (see the %u macro):
:!find%u
and then use /, although this might be less efficient.
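For example, to load only shell scripts into the view and then search them with / (a sketch, assuming your find supports -name, as GNU and BSD find do):

:!find . -name '*.sh'%u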
For example,
#!/usr/bin/perl
open FILE1, '>out/existing_file1.txt';
open FILE2, '>out/existing_file2.txt';
open FILE3, '>out/existing_file3.txt';
versus
#!/usr/bin/perl
use strict;
use warnings;

if (-d 'out') {
    system('rm -f out/*');
}
open my $fh1, '>', 'out/new_file1.txt' or die $!;
open my $fh2, '>', 'out/new_file2.txt' or die $!;
open my $fh3, '>', 'out/new_file3.txt' or die $!;
In the first example, we clobber the files (truncate them to zero length). In the second, we clean the directory and then create new files.
The second method (where we clean the directory) seems redundant and unnecessary. The only advantage to doing this (in my mind) is that it resets permissions, as well as the change date.
Which is considered the best practice? (I suspect the question is pedantic, and the first example is more common.)
Edit: The reason I ask is that I have a script that parses data and writes output files to a directory, each time with the same filenames/paths. This script will be run many times, and I'm curious whether, at the start of the script, I should partially clean the directory (of the files I am writing to) or just let '>' clobber the files for me and take no extra measures myself.
Other than the permissions issue you mentioned, the only significant difference between the two methods is if another process has one of the output files open while you do this. If you remove the file and then recreate it, the other process will continue to see the data in the original file. If you clobber the file, the other process will see the file contents change immediately (although if it's using buffered I/O, it may not notice it until it needs to refill the buffer).
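A minimal sketch that makes this visible (the path is hypothetical, and reading the inode field from stat assumes a POSIX-style filesystem): clobbering reuses the file's inode, while remove-and-recreate allocates a new one, which is why an existing reader keeps seeing the old data in the second case.

use strict;
use warnings;

my $path = 'out/demo.txt';                 # hypothetical file

open my $fh, '>', $path or die $!;         # create it
close $fh;
my $orig = (stat $path)[1];                # remember the inode number

open $fh, '>', $path or die $!;            # method 1: clobber in place
close $fh;
printf "clobber:  inode %d -> %d\n", $orig, (stat $path)[1];   # same inode

unlink $path or die $!;                    # method 2: remove, then recreate
open $fh, '>', $path or die $!;
close $fh;
printf "recreate: inode %d -> %d\n", $orig, (stat $path)[1];   # new inode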
Removing the files will also update the modification time of the containing directory.
Is there any way to make find-name-dired show only filenames that I can move through and select? I have a lot of files buried in subdirectories, and I don't want it to print the entire directory path every time it finds a file.
Two problems with this:
1. How would you distinguish between two files with the same file name in different directories?
2. Dired needs the full path in order to be able to do anything with that file.
You could deal with (2) by using text properties or overlays to hide the directories, but due to (1) I really couldn't recommend that.
Edit: to otherwise customise the output of Dired and reduce unwanted noise, you can use Dired Details (optionally with Dired Details Plus).
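A minimal setup sketch, assuming dired-details.el is already on your load-path:

;; Hide the noisy permissions/owner/size columns in Dired buffers.
(require 'dired-details)
(dired-details-install)   ; then ( hides and ) shows the details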
Related
How do I hide number of links in dired?
Emacs dired: too much information