List directories of a different location in fish tab-completion

I would like to create a fish function that acts as a shortcut into a directory.
The function looks like this:
function mydir -d "Navigate to mydir and its subdirectories from anywhere"
cd $HOME/some/path/mydir/$argv
end
The function works well, but I would like to add tab-completion. I currently have this for tab-completion:
complete --no-files --exclusive --command mydir --arguments "(__fish_complete_directories)"
The problem is that this will tab-complete to the directories in the current working directory. I would instead like it to tab-complete to the directories in $HOME/some/path/mydir/. I cannot find an example or flag in the documentation that would help me do this.
I do not know fish very well, but I thought one way to do this would be to manually loop over those directories and add them as options, but it doesn't feel right and I am struggling with that implementation anyway.

Honestly, your "looping over those directories" seems like the most "fish-like" way to me, but given that someone has already gone through the trouble of creating and debugging the existing __fish_complete_directories (Github link), I think I'd just use it as a starting point for your own completion function.
It seems to me that you could just add a pushd $HOME/some/path/mydir before the "non-existent command hack" and then popd at the end.
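For illustration, here is a rough sketch of that approach as a separate helper (the function name __mydir_complete is my own, and I'm assuming __fish_complete_directories accepts the token to complete as its first argument, as the current implementation does):
function __mydir_complete
    # Generate directory completions relative to the fixed base directory instead of $PWD.
    pushd $HOME/some/path/mydir
    __fish_complete_directories (commandline -ct)
    popd
end
complete --no-files --exclusive --command mydir --arguments "(__mydir_complete)"
Wrapping the pushd/popd inside the helper keeps the user's working directory untouched, which is essentially the same trick as patching a copy of __fish_complete_directories itself.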

Related

Zip files with encryption in a remote share, keeping original names and location

My team faces the need to encrypt all files in a repository with AES256. For this purpose, we decided we are going to zip all files with such encryption, using the same key for all of them.
The problem we have is that these files sit on a NAS, so from Windows boxes they are accessed through UNC paths (\\server\share).
The directory structure is something like this:
Original Structure:
Root
-1
|--folder1
|---file1.ext
|---file2.ext
|--folder2
|---filea.ext
|---fileb.ext
|--folder2.a
|---filec.ext
and so on...
Essentially, what we need is to have all the original files contained in a zip file, keeping their original names, which would be something like this:
Desired Outcome:
|-Root
|-1
|--folder1
|---file1.zip
|---file2.zip
|--folder2
|---filea.zip
|---fileb.zip
|--folder2a
|---filec.zip
and so on...
To accomplish this, we tried a batch script that calls 7-Zip, but it only works if it's run from the root directory, which is something we cannot do as the files are not on a local server.
Here is the syntax of the batch script we came up with:
FOR /R %%i IN ("*.wmv") DO "C:\Program Files\7-Zip\7z.exe" a -mx0 -tzip -pPasswordHere "%%~dpni.zip" "%%i"
But, as I wrote previously, it only works when run from the root folder, which is something we cannot do as the files sit on a network location.
Mapping the drive or making a symbolic link to it doesn't do the trick either.
I've also looked at doing this with 7-Zip itself, namely using its "-r" switch, but I couldn't find a way to get the desired outcome (namely, recursing through all folders in the remote tree structure, of which there are a lot, and keeping the original file names).
I'm open to any suggestions, as any kind of script, trick or gizmo that gets the job done will be more than welcome. =)
Thanks a million in advance!
Sebas.
----SOLUTION----
I actually found a solution here, mapping the drive in a different way (it's so simple it just made me feel stupid(er), but it's altogether beautiful).
The remote share can be mapped and entered from within the batch script like so:
You can map a drive using
net use X: \\server\directory
and then you can change to that directory using
pushd X:
(Post from which the answer was taken: Batch File Iterating through files on a local network server)
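Putting the mapping together with the original loop, a sketch of the full batch file might look like this (\\server\share, the drive letter X:, the Root folder and PasswordHere are placeholders, the *.wmv pattern is widened to *.* since every file should be encrypted, and -mem=AES256 is added because plain -tzip with -p defaults to ZipCrypto rather than AES-256):
@echo off
rem Map the NAS share to a drive letter and change into it,
rem so the FOR /R loop walks the remote tree as if it were local.
net use X: \\server\share
pushd X:\Root
rem Zip each file next to itself, keeping the original name.
FOR /R %%i IN (*.*) DO "C:\Program Files\7-Zip\7z.exe" a -mx0 -tzip -mem=AES256 -pPasswordHere "%%~dpni.zip" "%%i"
popd
net use X: /delete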

Delete files in a folder using Perl

I want to delete all files in a folder which contain the word TRAR in their filename. I have tried the following:
CONFIG_DIR=`pwd`
VENDOR=ericsson-msc
RELEASE=v1
BASE_DIR=/appl/virtuo/gways
system ("cd /appl/virtuo/gways/config/ericsson-msc/v1/spool/input_d; rm-rf *TRAR");
Remove all your config lines (are they even Perl?)
CONFIG_DIR=`pwd`
VENDOR=ericsson-msc
RELEASE=v1
BASE_DIR=/appl/virtuo/gways
and
system ("cd /appl/virtuo/gways/config/ericsson-msc/v1/spool/input_d; rm -rf *TRAR")
should work, but you should really be using Perl code (unlink, etc.).
I suspect you are confusing the usage of Perl with how you would use awk in bash scripts.
As @Steffen Ullrich said, that isn't Perl or shell. But I'll try to make it a little more Perlish for you:
First, note that
variables in Perl start with a $
strings need "quotes around them"
statements end with a ;
spaces around = are ok and make it all easier to read
so
$CONFIG_DIR = `pwd`;
$VENDOR = "ericsson-msc";
$RELEASE = "v1";
$BASE_DIR = "/appl/virtuo/gways";
Next, see how you can combine these into a single string like this (I'm guessing that's what you want to do)
$DIR_FOR_CLEANING = "$BASE_DIR/config/$VENDOR/$RELEASE/spool/input_d";
Lastly, you should be really careful whenever using the -r option to rm along with a wildcard like *. Look up the man page for rm and see if -r is something you want to do. I don't think you need it here, unless you have directories named *TRAR that you want to recurse into to remove. I'll bet you only have files matching *TRAR in that input_d directory.
Also, with the command the way you wrote it, the cd could fail if that directory doesn't exist, and the rm would then proceed to recursively remove *TRAR from whatever directory you're running the script from. But you don't need to change directories at all. Try something like this:
system ("echo rm -f $DIR_FOR_CLEANING/*TRAR");
If the echo command lists the files you do in fact want it to remove, then remove the "echo" and the rm will start deleting stuff.
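For completeness, here is a purely Perl sketch of the same idea using glob and unlink instead of shelling out (built on the same directory as $DIR_FOR_CLEANING above):
#!/usr/bin/perl
use strict;
use warnings;

my $BASE_DIR = "/appl/virtuo/gways";
my $VENDOR   = "ericsson-msc";
my $RELEASE  = "v1";
my $dir_for_cleaning = "$BASE_DIR/config/$VENDOR/$RELEASE/spool/input_d";

# Collect the matching files first, then delete them with unlink (no shell involved).
my @files = glob("$dir_for_cleaning/*TRAR");
print "removing $_\n" for @files;    # comment this out once the list looks right
unlink(@files) == @files
    or warn "could not remove all files: $!";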

Current working directory for SXPG_COMMAND_EXECUTE?

Is there a way to specify the current working directory for the system command executed by the function module SXPG_COMMAND_EXECUTE?
I do not see any parameter which would allow me to do that either by defining the command in transaction SM69 or on the list of IMPORTING parameters in SE37.
It looks like by default such commands are started in DIR_HOME which can be viewed by the transaction AL11. Do I have any control over that?
There isn't a way of doing it via SM69, unfortunately. I think the only solution is to create a script and call that.
I was going to suggest wrapping the statements in an SM69 command defined as a call to sh with parameters of -c 'cd <dir> && /path/to/command', but unfortunately that doesn't work. According to note 401095, wildcards are not permitted. When I tested, && was translated into a single &, causing the command to fail.
It would be good to access this information using FM FILE_GET_NAME_USING_PATH (export the script name for which you want to find the physical directory).
The path it returns can then be used in SXPG_COMMAND_EXECUTE.
Because the external commands I called were actually .bat files, I solved this by putting the following expression at the beginning of each and every one:
cd /d %~dp0
This Stack Overflow question helped a lot, actually.
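As an illustration, a wrapper .bat built on that expression might look like the following (myreport.exe is only a placeholder for whatever the external command actually runs):
@echo off
rem %~dp0 expands to the drive and directory this .bat file lives in,
rem so the real command below runs with that directory as its working
rem directory instead of DIR_HOME.
cd /d %~dp0
myreport.exe %*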

Running a script in bash

I have a script in one of my application folders. Usually I just cd into that location in a Unix box and run the script, e.g.
UNIX> cd My\Folder\
My\Folder> MyScript
This prints the required result.
I am not sure how to do this in a Bash script. I have done the following:
#!/bin/bash
mydir=My\Folder\
cd $mydir
echo $(pwd)
This basically puts me in the right folder to run the required script, but I am not sure how to run the script from within the code.
If you can call MyScript (as opposed to ./MyScript), obviously the current directory (".") is part of your PATH. (Which, by the way, isn't a good idea.)
That means you can call MyScript in your script just like that:
#!/bin/bash
mydir=My/Folder/
cd $mydir
echo $(pwd)
MyScript
As I said, ./MyScript would be better (not as ambiguous). See Michael Wild's comment about directory separators.
Generally speaking, Bash considers everything that does not resolve to a builtin keyword (like if, while, do etc.) as a call to an executable or script (*) located somewhere in your PATH. It will check each directory in the PATH, in turn, for a so-named executable / script, and execute the first one it finds (which might or might not be the MyScript you are intending to run). That's why specifying that you mean the very MyScript in this directory (./) is the better choice.
(*): Unless, of course, there is a function of that name defined.
#!/bin/bash
mydir=My/Folder/
cd $mydir
echo $(pwd)
MyScript
I would rather put the name in quotes. This makes it easier to read and guards against mistakes.
#!/bin/bash
mydir="My Folder"
cd "$mydir"
echo $(pwd)
./MyScript
Your nickname says it all ;-)
When a command is entered at the prompt that doesn't contain a /, Bash first checks whether it is an alias or a function. Then it checks whether it is a built-in command, and only then does it start searching the PATH. This is a shell variable that contains a list of directories to search for commands. It appears that in your case . (i.e. the current directory) is in the PATH, which is generally considered to be a pretty bad idea.
If the command contains a /, no look-up in the PATH is performed. Instead, an exact match is required. If it starts with a /, it is an absolute path and the file must exist. Otherwise it is a relative path, and the file must exist relative to the current working directory.
So, you have two acceptable options:
Put your script in some directory that is on your PATH. Alternatively, add the directory containing the script to the PATH variable.
Use an absolute or relative path to invoke your script.
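For instance, a minimal sketch of both options (the location $HOME/My Folder is only an assumption about where MyScript actually lives):
#!/bin/bash
# Option 1: put the script's directory on PATH and call it by name.
export PATH="$HOME/My Folder:$PATH"
MyScript

# Option 2: skip PATH entirely and use an explicit path.
"$HOME/My Folder/MyScript"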

How do I copy from numerous release directories to a single folder

Okay this is and isn't programming related I guess...
I've got a whole bunch of little useful console utilities scattered across a suite of projects that I wrote and I want to dump them all to a single directory to make using them simpler. The only issue is that I have them all compiled in both Debug and Release mode.
Given that I only want the release mode versions in my utilities directory, what switch would allow me to specify that I want all executables from my tree structure but only from within Release folders:
Example:
Projects\
Project1\
Bin\
Debug\
Project1.exe
Release\
Project1.exe
Project2\
etc etc...
To
Utilities\
Project1.exe
Project2.exe
Project3.exe
Project4.exe
...
etc etc...
I figured this would be a cinch with XCopy, but it doesn't seem to allow me to exclude the Debug directories, or rather, to only include items from my Release directories.
Any ideas?
You can restrict it to only release executables with the following. However, I do not believe the other requirement of flattening is possible using xcopy alone. To do the restriction:
First create a file such as exclude.txt and put this inside:
\Debug\
Then use the following command:
xcopy /e /EXCLUDE:exclude.txt *.exe C:\target
You can, however, accomplish what you want using xxcopy (free for non-commercial use). Read technical bulletin #16 for an explanation of the flattening features.
If the claim in that technical bulletin is correct, then it confirms that flattening cannot be accomplished with xcopy alone.
The following command will do exactly what you want using xxcopy:
xxcopy /sgfo /X:*\Debug\* .\Projects\*.exe .\Utilities
I recommend reading the technical bulletin, however, as it gives more sophisticated options for the flattening. I chose one of the most basic above.
Sorry, I haven't tried it yet, but shouldn't you be using:
xcopy release*.exe d:\destination /s
I am currently on my Mac, so I can't really check to be sure.
This might not help you with assembling them all in one place now, but going forward, have you considered adding a post-build event to the projects in Visual Studio? (I'm assuming you are using it, based on the directory names.)
xcopy /Y /I /E "$(TargetDir)\$(TargetFileName)" "c:\somedirectory\$(TargetFileName)"
OK, this is probably not going to work for you since you seem to be on a Windows machine.
Here goes anyway, for the logic.
# From the base directory
mkdir Utilities
find . -type f | grep -w Release > utils.txt
for f in $(<utils.txt); do cp "$f" Utilities/; done
You can combine the find and cp lines into one, I split them for readability.
To do this on a Windows machine you'll need Cygwin or some such Unix utilities handy.
Maybe there are tools in the Windows shell to do this...
This may help get you started:
C:\>for %i in (*) do dir "%~dpi\*.exe"
Used in the dir command as a modifier to i, ~dp uses the drive and path of everything found in (*). If I run the above in a folder that has several subfolders containing executables, I get a dir list of all of the executables in each folder.
You should be able to modify that to add '\bin\release\' following the ~dpi portion and change dir to xcopy. A little experimentation should make it pretty easy.
To use the for statement above in a batch file, change '%' to '%%' in both places.
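Following that suggestion, a sketch of the whole thing as a batch file might be (the Projects and Utilities paths mirror the example above, and /D makes the loop iterate over project directories rather than files):
@echo off
rem For each project folder, flatten its Release executables into Utilities.
for /D %%i in ("C:\Projects\*") do (
    if exist "%%i\Bin\Release\*.exe" xcopy /Y /I "%%i\Bin\Release\*.exe" "C:\Utilities"
)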