String inclusion in Fish Shell - fish

I've looked at all the similar questions on Stack Overflow and tried a bunch of things from the fish documentation. My present attempt is below.
What I'm trying to accomplish is different behavior depending on which user is on the computer. I have a work computer and a home computer, and when I enter "code" on the command line I want it to go to a different directory on the work machine than on the home machine. I tried this (my work username is afk, home is sfk):
function code
    set result (dirh)
    if [contains "sfk" $result]
        cd ~/rails
    else if [contains "afk" $result]
        cd ~/code
    else
        echo $result
    end
end
Currently, this is griping that "Unknown command contains Users/sfk/.config/fish/" (My PWD is Users/sfk/.config/fish/ when I'm entering this). I think maybe contains is just for lists, but the docs aren't particularly clear about this? Any idea how I might do this?
EDIT
Just tried this, and while it no longer errors, it ends up in the else case, which it shouldn't, because my user IS sfk, which IS included in the dirh string:
function code
    set result (dirh)
    if test (contains "sfk" $result)
        cd ~/rails
    else if test (contains "afk" $result)
        cd ~/code
    else
        echo $result
    end
end

As said above in the comments, that is the best solution for this specific use case, but here is an answer in case someone wants to do something similar.
You are right about contains: it does an exact match on list items. You could use the switch statement instead, which supports wildcard matching.
function code
    set -l result (dirh)
    switch $result
        case '*sfk*'
            cd ~/rails
        case '*afk*'
            cd ~/code
        case '*'
            echo $result
    end
end
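For the original goal of branching on which user you are, the approach suggested in the comments presumably boils down to checking $USER directly instead of parsing directory history; a minimal sketch (untested, usernames taken from the question):
function code
    # Branch on the current user rather than on dirh output
    switch $USER
        case sfk
            cd ~/rails
        case afk
            cd ~/code
        case '*'
            echo "unknown user: $USER"
    end
end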

Related

How can I pass unbound arguments from one script as parameters to another?

I have little experience with PowerShell in particular.
I'm trying to refactor some very commonly re-used code into a single script that can be sourced where it's needed, instead of copying and pasting this same code into n different scripts.
The scenario I'm trying to get looks (I think) like this:
#common.ps1:
param(
    # Sure'd be great if clients didn't need to know about these
    $some_params_here
    ...
)
function Common-Func-Uses-Params {
    ...
}
⋮
# foo/bar/bat.ps1:
# sure would love not to have to redefine all the common params() here...
. common.ps1 <pass-the-arguments>
Common-Func-Uses-Params $specific_Foo/Bar/Bat_Data
As the pseudo-comments above indicate, I've only been able to do this so far by capturing the params in the calling script as well.
I want to be in a situation where I can update the common code (say with a -Debug or -DryRun or -Url or whatever parameter) and not have to worry about updating all of the client code to match.
Is this possible?
You're missing two key things:
$args - which captures all of (and only) the unbound arguments to the script
splatting (@) - which is used to pass arrays or hashtables to a command rather than flattening them like you'd get with $args
When you combine these, you can easily pass all arguments onto another script, like so:
# foo.ps1
. common.ps1 @args
With a sourced file like this:
#common.ps1
param ([string]$foo = "foo")
echo "`$foo is $foo"
You get this output:
> foo.ps1 returns $foo is foo
> foo.ps1 -Foo bar returns $foo is bar
Note that, if you're trying to use the PowerShell ISE, it might take you a while to figure this out or debug any of it. When you're in the debugger, both $args and $MyInvocation.UnboundArguments will do their best to hide that information from you; they'll appear to be completely empty.
You can print the args with >> echo "$(@args)", but that also has the very weird side effect of telling the debugger to continue. I think the splatting is adding an extra newline and that's ending up in the Command Window.
The best workaround I have for that is to add $theargs = $args at the top of your script and remember to use $theargs in the debugger.
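Putting that workaround together with the splatting above, a hypothetical foo.ps1 might look like this (the file names and $specificData are illustrative, not from the question):
# foo.ps1
$theargs = $args          # copy the unbound arguments before the ISE debugger hides $args
. .\common.ps1 @theargs   # splat the copy onto the dot-sourced common script
Common-Func-Uses-Params $specificData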

Manage inputs from an external command in a PowerShell script

First, I would like to apologize in case the title is not descriptive enough; I'm having a hard time dealing with this problem. I'm trying to build an automation for an svn merge using a PowerShell script that will be executed by another process. The function that I'm using looks like this:
function Invoke-SvnMerge($target) {
    svn merge $target
}
Now, my problem occurs when there are conflicts in the merge. The default behavior of the command is to request input from the user and proceed accordingly. I would like to automate this process using predefined values (show the differences and then postpone the merge), but I haven't found a way to do it. In summary, the workflow that I am looking to accomplish is the following:
Detect whether the command execution requires any input to proceed
Provide default inputs (in my particular case "df" and then "p")
Is there any way to do this in PowerShell? Thank you so much in advance for any help/clue that you can provide.
Edit:
To clarify my question: I would like to automatically provide a value when a command executed within a PowerShell script requires it, like in the following example:
Requesting user input
Edit 2:
Here is a test using the snippet provided by @mklement0. Unfortunately, it didn't work as expected, but I thought it was worth adding this edit to clarify the question completely.
Expected behavior:
Actual result:
Note:
This answer does not solve the OP's problem, because the specific target utility, svn, apparently suppresses prompts when the process' stdin input isn't coming from a terminal (console).
For utilities that do still prompt, however, the solution below should work, within the constraints stated.
Generally, before attempting to simulate user input, it's worth investigating whether the target utility offers programmatic control over the behavior, via its command-line options, which is both simpler and more robust.
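In svn's case, for example, the conflict policy can likely be set via svn's own options, which would cover the "postpone" part without simulating any input (verify against your svn version; it does not show the diff first):
# Hypothetical sketch: let svn postpone conflicts itself instead of answering prompts
function Invoke-SvnMerge($target) {
    svn merge --non-interactive --accept postpone $target
}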
While it would be far from trivial to detect whether a given external command is prompting for user input:
you can blindly send the presumptive responses,
which assumes that no situational variations are needed (except that if a particular call happens not to prompt at all, the input is simply ignored).
Let's assume the following batch file, foo.cmd, which puts up 2 prompts and echoes the input:
@echo off
echo begin
set /p "input1=prompt 1: "
echo [%input1%]
set /p "input2=prompt 2: "
echo [%input2%]
echo end
Now let's send responses one and two to that batch file:
PS C:\> Set-Content tmp.txt -Value 'one', 'two'; ./foo.cmd '<' tmp.txt; Remove-Item tmp.txt
begin
prompt 1: one
[one]
prompt 2: two
[two]
end
Note:
For reasons unknown to me, the use of an intermediate file is necessary for this approach to work on Windows - 'one', 'two' | ./foo.cmd does not work.
Note how the < must be represented as '<' to ensure that it is passed through to cmd.exe and not interpreted by PowerShell up front (where < isn't supported).
By contrast, 'one', 'two' | ./foo does work on Unix platforms (PowerShell Core).
You can store the svn command-line output in a variable, parse through that, and branch as you desire. Each line of output is stored as a separate element (CLI output captured in a PowerShell variable is an array):
$var = & svn merge $target
$var
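A rough sketch of that idea (the exact text svn prints for a conflict may vary, so treat the pattern as an assumption):
$var = & svn merge $target
# Branch on whether any captured line mentions a conflict
if ($var -match 'Conflict') {
    Write-Host "Merge hit conflicts; postpone or resolve here."
} else {
    Write-Host "Merge completed cleanly."
}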

Variables may not be used as commands

Using fish shell, I'm writing a very simple script that checks a command's execution status:
#!/usr/bin/fish
command
if $status
    echo "Oops error"
else
    echo "Worked OK"
    #...
end
And get the error message:
fish: Variables may not be used as commands. Instead, define a function like “function status; 0 $argv; end”. See the help section for the function command by typing “help function”.
The message looks pretty straightforward, but neither "defining a function like..." nor "help function" helps solve the problem.
There is also a test command that sounds promising, but the docs say it is to be used to check files...
How should this simple thing be done in fish?
Heh... And why is all the documentation SO misleading?..
P.S. Please, don't write about the 'and' command.
Fish's test command currently works exactly like POSIX test (i.e. the one you'll find in bash or similar shells). It has a number of operations, including -gt, -eq and -lt to check whether a number is bigger than, equal to or less than another number, respectively.
So if you want to use test, you'd write if test $status -eq 0 (a 0 traditionally denotes success). Otherwise, you can check the return value of a command by putting it in the if clause directly, like if command (which will be true if the command returns 0) - that's what fish is trying to do here, which is why it complains about a variable being used in place of a command.
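Applied to the script from the question, a minimal sketch (with somecommand standing in for the real command):
#!/usr/bin/fish
# Variant 1: inspect $status explicitly
somecommand
if test $status -eq 0
    echo "Worked OK"
else
    echo "Oops error"
end

# Variant 2: let if run the command directly
if somecommand
    echo "Worked OK"
else
    echo "Oops error"
end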

Call custom vim completion menu with information from a Perl script

I wrote a script analyzing Perl files (totally without PPI, because it will be used on servers where the admins don't want PPI to be installed, and so on and so forth, but let's not talk about that).
Now, let's say I have this code:
my $object = MySQL->new();
my $ob2 = $object;
$ob2->
(Where MySQL is one of our modules).
My script correctly identifies that $ob2 is a MySQL-Object and sees where it came from, and then returns a list of found subs in that module.
My idea was that, since I use vim for editing, this could be a really cool way to do "CTRL-n" completion.
So, when...
$ob2->[CTRL-n]
It shows the CTRL-n box, which opens my Perl script and gives it a few parameters (I would need: the line that I am currently on, the cursor position and the whole file as it is in vim).
I already found things like vim-perl, which allows me to write something like
if has('perl')
    function DefPerl()
        perl << EOF
use MyModule;
return call_to_my_function(); # returns all the methods from the object for example
EOF
    endfunction
    call DefPerl()
endif
But somehow this does not get executed (I tried writing something to a file with a system call just for the sake of testing)...
So, in short:
Does anyone here know how to achieve that? Calling a Perl function from vim by pressing CTRL-n, with the full file content, the line vim is currently in and the cursor position, and then opening a completion menu with the results it got from the Perl script?
I hope someone knows what I mean here. Any help would be appreciated.
The details and tips for invoking embedded Perl code from Vim can be found in this Vim Tips Wiki article. Your attempts are already pretty close, but to return stuff from Perl, you need to use Vim's Perl API:
VIM::DoCommand "let retVal=". aMeaningfullThingToReturn
For the completion menu, your Perl code needs to return a List of entries that adhere to the format described by :help complete-items. And :help complete-functions shows how to trigger the completion. Basically, you define an insert-mode mapping that sets 'completefunc' and then triggers your function via <C-x><C-u>. Here's a skeleton to get you started:
function! ExampleComplete( findstart, base )
    if a:findstart
        " Locate the start of the keyword.
        let l:startCol = searchpos('\k*\%#', 'bn', line('.'))[1]
        if l:startCol == 0
            let l:startCol = col('.')
        endif
        return l:startCol - 1 " Return byte index, not column.
    else
        " Find matches starting with a:base.
        let l:matches = [{'word': 'example1'}, {'word': 'example2'}]
        " TODO: Invoke your Perl function here, input: a:base, output: l:matches
        return l:matches
    endif
endfunction
function! ExampleCompleteExpr()
    set completefunc=ExampleComplete
    return "\<C-x>\<C-u>"
endfunction
inoremap <script> <expr> <Plug>(ExampleComplete) ExampleCompleteExpr()
if ! hasmapto('<Plug>(ExampleComplete)', 'i')
    imap <C-x><C-z> <Plug>(ExampleComplete)
endif
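One hypothetical way to fill in the TODO above, reusing the embedded-Perl approach from the question (MyModule and call_to_my_function are assumed names from your analyzer; the quoting is naive and only meant as a sketch):
function! ExamplePerlMatches( base )
    " Ask the Perl analyzer for completion words and hand them back to Vim.
    let matches = []
    if has('perl')
        perl << EOF
use MyModule;                                   # assumed module from the question
my $base  = VIM::Eval('a:base');                # text typed so far
my @words = call_to_my_function($base);         # assumed to return sub names
my $list  = join(',', map { "{'word': '$_'}" } @words);
VIM::DoCommand("let matches = [$list]");        # hand back a List of complete-items
EOF
    endif
    return matches
endfunction
The TODO line could then read let l:matches = ExamplePerlMatches(a:base).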

zsh filename globbing/substitution

I am trying to create my first zsh completion script, in this case for the command netcfg.
Lame as it may sound, I am stuck on the first hurdle. Disclaimer: I know how to do this crudely; however, I seek the "zsh way" to do it.
I need to list the files in /etc/network.d, but only the files, not the directory component, so I do the following.
echo $(ls /etc/network.d/*(.))
/etc/network.d/ethernet-dhcp /etc/network.d/wireless-wpa-config
What I wanted was:
ethernet-dhcp wireless-wpa-config
So I try (excuse my naivity) :
echo ${(s/*\/)$(ls /etc/network.d/*(.))}
/etc/network.d/ethernet-dhcp /etc/network.d/wireless-wpa-config
It seems that this doesn't work. I'm sure there must be some clever way of doing this by splitting into an array and getting the last part, but as I say, I'm a complete noob at this.
Any advice gratefully received.
General note: There is no need to use ls to generate the filenames; you might as well use echo some*glob. But if you want to protect against possibly embedded newline characters, even that is a bad idea. The first example below globs directly into an array to protect embedded newlines. The second one uses printf to generate NUL-terminated data to accomplish the same thing without using a variable.
It is easy to do if you are willing to use a variable:
typeset -a entries
entries=(/etc/network.d/*(.)) # generate the list
echo ${entries#/etc/network.d/} # strip the prefix from each one
You can also do it without a variable, but the extra stuff to isolate individual entries is a bit ugly:
# From the inside, to the outside:
# * glob the entries
# * NUL terminate them into a single string
# * split at NUL
# * strip the prefix from each one
echo ${${(0)"$(printf '%s\0' /etc/network.d/*(.))"}#/etc/network.d/}
Or, if you are going to use a subshell anyway (i.e. the command substitution in the previous example), just cd to the directory so it is not part of the glob expansion (plus, you do not have to repeat the directory name):
echo ${(0)"$(cd /etc/network.d && printf '%s\0' *(.))"}
Chris Johnsen's answer is full of useful information about zsh; however, it doesn't mention the much simpler solution that works in this particular case:
echo /etc/network.d/*(:t)
This is using the t history modifier as a glob qualifier.
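If you also want to restrict the match to plain files, as in the question, the :t modifier can presumably be combined with the . qualifier inside the same parentheses (untested sketch):
print -l /etc/network.d/*(.:t)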
Thanks for your suggestions guys, having done yet more reading of ZSH and coming back to the problem a couple of days later, I think I've got a very terse solution which I would like to share for your benefit.
echo ${$(print /etc/network.d/*(.)):t}
I'm used to seeing basename(1) stripping off directory components; also, you can use echo /etc/network/* to get the file listing without running the external ls program. (Running external programs can slow down completion more than you'd like; I didn't find a zsh-builtin for basename, but that doesn't mean that there isn't one.)
Here's something I hope will help:
haig% for f in /etc/network/* ; do basename $f ; done
if-down.d
if-post-down.d
if-pre-up.d
if-up.d
interfaces