What are exFAT disk file name limitations?

I have had lots of failures copying my disk to another, exFAT-formatted disk.
Specifically, file names containing ':' or '|' did not get through.
What is the allowed character set?
Is there a program to solve the issue, e.g. one that changes the illegal characters to '.'?

exFAT uses the same naming rules as Windows: the characters \ / : * ? " < > | and control characters (0x00-0x1F) are not allowed in file names. One possible solution (works for me) is to rename all the original files, replacing the illegal characters, e.g.
find . -name '*[:|]*' -execdir rename -v 's/[:|]/./g' '{}' +
(Note the wildcards around the bracket expression: a bare -name "[:|]" would only match files whose entire name is a single ':' or '|'.)
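If the Perl rename script isn't installed, the same substitution can be done in plain bash with parameter expansion; a minimal sketch (the demo directory and sample file names are made up here):

```shell
mkdir -p renamedemo && cd renamedemo
touch 'a:b.txt' 'c|d.txt' 'clean.txt'

# ${f//[:|]/.} replaces every ':' or '|' in the name with '.'
for f in *; do
  new=${f//[:|]/.}
  if [ "$f" != "$new" ]; then
    mv -- "$f" "$new"
  fi
done
ls    # a.b.txt  c.d.txt  clean.txt
```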

Related

Using the perl-based rename command to change filenames commencing with dashes

rename is a perl-based command-line tool to rename multiple files.
I have, by accident, created numerous files with names that commence with, or contain, a dash or double dash (- or --). When I try to use rename to get rid of the dashes in, for example, '--the-file-name' using
rename 's/-//g' --the-file-name
rename (understandably) complains that 'the-file-name' is not an allowable option for the rename command.
Is there a way to tell rename that '--the-file-name' is a file name and not an option?
Many commands, including Perl's rename script, support a double-dash to denote the end of a command's options. Hence, to rename --the-file-name to the-file-name:
rename 's/-//g' -- --the-file-name
Perl's Getopt::Long module supports this and is used by rename.
In general, see also: https://unix.stackexchange.com/questions/11376/what-does-double-dash-mean-also-known-as-bare-double-dash
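The `--` convention is easy to try with ordinary coreutils too; a quick sketch (the file and directory names are invented):

```shell
mkdir -p dashdemo && cd dashdemo
touch -- '--the-file-name'    # '--' stops option parsing for touch as well

# Without '--', mv would parse '--the-file-name' as options and fail;
# after '--', everything is treated as a file name:
mv -- '--the-file-name' 'the-file-name'
ls    # the-file-name
```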
JRFerguson's answer actually works for many commands, not just rename.
Some alternatives also work for commands that do not recognize --:
Prefix the filename with ./ or the full path to it:
rename 's/-//g' ./--*
or
rename 's/-//g' $PWD/--*
Use find (which will also traverse any subdirectories):
find . -name '-*' -exec rename 's/-//g' '{}' ';'
or
find . -name '-*' -exec rename 's/-//g' '{}' +
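The difference between terminating -exec with ';' and '+' is how often the command is invoked; a small sketch:

```shell
mkdir -p execdemo && cd execdemo
touch a b c

# ';' runs the command once per matched file (three echo invocations):
find . -type f -exec echo files: {} \;

# '+' appends as many file names as fit into one invocation (one echo):
find . -type f -exec echo files: {} +
```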

system command accessing folders with spaces

I'm currently trying to write a program that requires accessing files in a OneDrive folder that will be shared with multiple computers. Currently, the 'system' command throws an error when I try to access the OneDrive folder because the full path name has spaces in it.
folder = '/Users/myuser/Desktop/OneDrive\ -\ -\ Company\ Name/foldername-AVL'
STR = sprintf('cd %s',folder);
system(STR)
The error I keep receiving is
/bin/bash: line 0: cd: %s/Users/myuser/Desktop/OneDrive: No such file
or directory
So it is effectively cutting off everything after the first space. I've looked through the documentation and can't seem to find a solution or a guide for using the system command in this specific situation.
I am guessing that you are trying to escape the spaces. In general I prefer to wrap arguments that contain spaces in double quotes. I would have guessed that escaping the path would work as well, but apparently not.
This should work, and it is much easier to read (IMHO):
folder = '"/Users/myuser/Desktop/OneDrive - - Company Name/foldername-AVL"'
STR = sprintf('cd %s',folder);
system(STR)
Or, moving the double quotes into the sprintf format string:
folder = '/Users/myuser/Desktop/OneDrive - - Company Name/foldername-AVL'
STR = sprintf('cd "%s"',folder);
system(STR)
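The reason the quotes help is that the shell invoked by system splits unquoted words at spaces; the same rule can be seen directly in bash (the path here is hypothetical):

```shell
# A directory whose name contains spaces (made up for the demo)
dir="./OneDrive - - Company Name/foldername-AVL"
mkdir -p "$dir"

# Unquoted, $dir would be split into several separate arguments;
# double-quoting keeps it a single argument to cd:
cd "$dir" && pwd
```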

Using PowerShell to replace string that contains + in the string

I am trying to use PowerShell do a simple find and replace. Essentially, I used to have some support files in a directory with the same name of a "master" file. All but one of those support files are no longer necessary. The "master" file has a text reference to the path of the other file. What I want to do is modify this path in the "master" file to remove the deleted directory.
For example, let's say I have the file C:\Temp\this+that.txt. I used to have C:\Temp\this+that\this+that.dat, which has now been moved to C:\Temp\this+that.dat.
C:\Temp\this+that.txt has a line like this:
/temp/this+that/this+that.dat
I would like this line to become:
/temp/this+that.dat
I have lots of these files that a batch file is moving. Everything works fine using the PowerShell command below for all file names that do NOT contain a plus (+) sign. For those files, the call below does not work.
powershell -Command "(gc '!CURRENT_FILE!') -replace '/!BASE_NAME!/', '/' | Set-Content '!CURRENT_FILE!'"
For the example above, CURRENT_FILE would be C:\Temp\this+that.txt and BASE_NAME would be this+that
Can anyone help me with why this isn't working for file names that contain a plus + sign?
@ma_il is exactly right. The '+' character is a special character in regex, so you will need to escape it:
powershell -Command "(gc '!CURRENT_FILE!') -replace [regex]::escape('/!BASE_NAME!/'), '/' | Set-Content '!CURRENT_FILE!'"
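The failure mode can be reproduced outside PowerShell with any regex tool; a sketch using sed -E, where '+' is likewise a quantifier:

```shell
line='/temp/this+that/this+that.dat'

# Unescaped, the regex 'this+that' means 'thi', one or more 's', then
# 'that' -- it matches 'thisthat' or 'thissthat', never the literal '+'.
# Escaping the '+' makes it match the literal text:
echo "$line" | sed -E 's|/this\+that/|/|'
# -> /temp/this+that.dat
```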

Load file.q from path containing space character

How to load script file from a path containing spaces?
For example, this works:
\l F:/file.q
Below attempts throw an error:
\l F:/Folder with spaces/file.q
\l "F:/Folder with spaces/file.q"
\l hsym `$"F:/Folder with spaces/file.q"
system "l "F:/Folder with spaces/file.q""
Not exactly practical, but if you need to load files with spaces in the path, you can use Windows short file names.
So given the script path F:/Folder with spaces/file with spaces.q, where
Folder with spaces gets the short name folder~1
file with spaces.q gets the short name filewi~1.q
You can load the file as follows in q:
q)system "l F://folder~1/filewi~1.q"
Hello from a q script with spaces in file name
You can get the short name of a file or folder by listing the directory at the command prompt with the /x flag (e.g. dir /x).
In general on Windows, you're likely better off avoiding spaces in file paths.
I know this is a very old post, but just had the same issue. Found a solution that works for me:
system "cd c:/your path/with spaces/"
system "l your_script.q"

Why does grep hang when run against the / directory?

My question is in two parts :
1) Why does grep hang when I grep all files under "/" ?
for example :
grep -r 'h' ./
(Note: right before the hang, I see some "no such device or address" messages regarding sockets.)
Of course, I know that grep shouldn't run against a socket, but since sockets are just files in Unix, I would think it should return a negative result rather than hanging.
2) Now, my follow up question : In any case -- how can I grep the whole filesystem? Are there certain *NIX directories which we should leave out when doing this ? In particular, I'm looking for all recently written log files.
As @ninjalj said, if you don't use -D skip, grep will try to read all your device files, socket files, and FIFO files. In particular, on a Linux system (and many Unix systems), it will try to read /dev/zero, which appears to be infinitely long.
You'll be waiting for a while.
If you're looking for a system log, starting from /var/log is probably the best approach.
If you're looking for something that really could be anywhere in your file system, you can do something like this:
find / -xdev -type f -print0 | xargs -0 grep -H pattern
The -xdev argument to find tells it to stay within a single filesystem; this will avoid /proc and /dev (as well as any mounted filesystems). -type f limits the search to ordinary files. -print0 prints the file names separated by null characters rather than newlines; this avoids problems with files having spaces or other funny characters in their names.
xargs reads a list of file names (or anything else) on its standard input and invokes the specified command on everything in the list. The -0 option works with find's -print0.
The -H option to grep tells it to prefix each match with the file name. By default, grep does this only if there are two or more file names on its command line. Since xargs splits its arguments into batches, it's possible that the last batch will have just one file, which would give you inconsistent results.
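What -H changes is easy to see with a single file on the command line:

```shell
printf 'needle\n' > one.txt

grep 'needle' one.txt      # one file: prints just 'needle'
grep -H 'needle' one.txt   # -H forces the prefix: 'one.txt:needle'
```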
Consider using find ... -name '*.log' to limit the search to files with names ending in .log (assuming your log files have such names), and/or using grep -I ... to skip binary files.
Note that all this depends on GNU-specific features. Some of these options might not be available on MacOS (which is based on BSD) or on other Unix systems. Consult your local documentation, and consider installing GNU findutils (for find and xargs) and/or GNU grep.
Before trying any of this, use df to see just how big your root filesystem is. Mine is currently 268 gigabytes; searching all of it would probably take several hours. A few minutes spent (a) restricting the files you search and (b) making sure the command is correct will be well worth the time you spend.
By default, grep tries to read every file. Use -D skip to skip device files, socket files and FIFO files.
If you keep seeing error messages, then grep is not hanging. Keep iotop open in a second window to see how hard your system is working to pull all the contents off its storage media into main memory, piece by piece. This operation is bound to be slow unless you have a very bare-bones system.
Now, my follow up question : In any case -- how can I grep the whole filesystem? Are there certain *NIX directories which we should leave out when doing this? In particular, I'm looking for all recently written log files.
Grepping the whole FS is very rarely a good idea. Try grepping the directory where the log files should have been written; likely /var/log. Even better, if you know anything about the names of the files you're looking for (say, they have the extension .log), then do a find or locate and grep the files reported by those programs.
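Putting the pieces together for the "recently written log files" case; a sketch against a stand-in directory (demo-logs plays the role of /var/log, and 'error' is a made-up pattern):

```shell
logdir=./demo-logs              # stand-in for /var/log
mkdir -p "$logdir"
printf 'error: disk full\n' > "$logdir/app.log"
printf 'nothing here\n'     > "$logdir/notes.txt"

# -name limits the search to .log files, -mtime -1 to files modified in
# the last 24 hours; grep -l prints only the names of matching files
find "$logdir" -type f -name '*.log' -mtime -1 -exec grep -l 'error' {} +
# -> ./demo-logs/app.log
```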