By default PowerShell requires ".\" before any local executable. So to execute a local command in PowerShell you need to type something like
.\foo arg1
How do I turn off the requirement to type ".\", so that I can run it just like
foo arg1
And what is the correct name of the ".\"?
. is the current directory and \ is the path separator. That means .\foo is the relative path to the foo program.
If . is not included in your PATH environment variable, you have to specify it as above. To avoid that, you could add . to your PATH variable. But as a rule of thumb, never do so, since it leads to security risks. Better to use .\foo, or, even better, provide the absolute path.
It is a security feature to make sure you run only the executables at the desired location. This ensures that you are running the right executable and not a hijacked one.
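For illustration, here is a minimal PowerShell sketch of the options mentioned above (foo.exe and its install path are hypothetical):

# Recommended: be explicit about where the executable lives
.\foo.exe arg1                  # relative path
C:\tools\foo.exe arg1           # absolute path

# Discouraged (security risk): add the current directory to PATH for this session only
$env:PATH += ";."
foo arg1                        # now also resolves to .\foo.exe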
This is part of the language/shell implementation and you cannot turn it off, as far as I know.
You can't turn it off. However, you could put your functions in a module (or even a script you load from somewhere else) and load that; then at least calling those functions looks nicer (without the .\foo).
Perhaps . isn't in the path, but the tab-completion function detects that, so if you type foo[TAB] it is automatically expanded to .\foo. For this reason I don't see where the "extra work" of typing the dot and the slash comes in, if tab completion takes care of it.
Related
I find myself missing the . in dot-sourced files and spending a little time getting my footing when switching from C# to PS. Is there a way to alias the . or use a similar command that is more blatantly noticeable?
In short: No, you cannot define an alias for . :
., the (dot-)sourcing operator, is an operator and as such it cannot be the target of an alias (only commands can).
Defining a function is also not an option, because it would itself have to be dot-sourced when called in order to be able to dot-source other commands (scripts whose definitions are to be loaded into the caller's scope).
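To illustrate why a wrapper function does not help, here is a small sketch (helpers.ps1, Get-Foo, and Import-Here are hypothetical names):

# Works: the definitions from helpers.ps1 land in the current scope.
. .\helpers.ps1
Get-Foo

# Does not work as a substitute for '.': the definitions land in the
# wrapper's own scope and disappear as soon as the function returns.
function Import-Here([string]$Path) { . $Path }
Import-Here .\helpers.ps1
# Get-Foo would now fail as "not recognized"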
I have some Makefiles whose behaviour varies based on the existence of certain variables, using ifdef to check for them. It is a bit annoying that I have to actually set the variable equal to something on the command line: make all DEBUG does not trigger the ifdef, but make all DEBUG=1 does. Perhaps I am just applying the C pre-processor approach where it does not belong.
Q1) Is it possible to specify a variable on the command line to be empty? Without even more characters?
Q2) What is the preferred approach for such boolean parameters to a make?
I assume you mean make all DEBUG= here, right? Without the =, make will consider DEBUG to be a target to build, not a variable assignment.
The manual specifies that a variable that has a non-empty value causes ifdef to return true. A variable that does not exist, or exists but contains the empty string, causes ifdef to return false. Note that ifdef does not expand the variable; it just tests whether the variable has any value.
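For instance, a minimal sketch of that behaviour (using the DEBUG variable from the question):

# ifdef only looks at whether DEBUG currently has a non-empty value;
# plain "make all" and "make all DEBUG=" both take the else branch.
ifdef DEBUG
$(info DEBUG is set to '$(DEBUG)')
else
$(info DEBUG is not set or is empty)
endif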
You can use the $(origin ...) function to test whether a variable is really not defined at all, or is defined but empty, like this:
ifeq ($(origin DEBUG),undefined)
$(info Variable DEBUG is not defined)
else
$(info Variable DEBUG is defined)
endif
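With this in place, plain make all reports that DEBUG is not defined, whereas make all DEBUG= reports that it is defined (its origin is then "command line"), even though its value is empty.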
As @MadScientist explained a few minutes ago,
make all DEBUG
adds a target DEBUG to your make. Luckily, there is a workaround:
ifneq (,$(filter DEBUG,$(MAKECMDGOALS)))
DEBUG:=1 # or do whatever you want
DEBUG: all; @echo -n
endif
It is essential to supply a dummy rule (e.g., echo nothing, as above) for the dummy target. Also, either put this statement at the bottom of your makefile, or specify the prerequisite target explicitly as in the example. Otherwise, make may wrongly choose the DEBUG target instead of all.
Note that this is not the preferred approach; the convention is to use something like V=1 to turn echoing on.
Another caveat is that make processes the command-line goals sequentially, e.g. make A B will first take care of the A target, then of the B target, whether these targets are independent or depend on one another. Therefore writing make DEBUG PERFECT and make PERFECT DEBUG could produce different results. The order of variable assignments, however, is irrelevant, so make PERFECT=1 DEBUG=1 and make DEBUG=1 PERFECT=1 are equivalent.
It has already been clarified why you can't use just DEBUG, but I would like to add something.
You can use a shell script that sets up all the variables you need before running make. For example, in a Linux shell it would look like this:
$ source debug_setup.sh
$ make all
Make is starting...
Debug is enabled
...
where debug_setup.sh contains all environment variables you need to set up:
export DEBUG=1
export DEBUG_OPTION=some_option
This is nice since you can put comments there, comment out lines you don't need at the moment but would like to keep for the future, etc.
Then you can have several setup scripts that must or can be used as part of a standard routine. It all depends on how many variables you need to set up, how many sets of variables you would like to have, etc.
Note that it is a good idea to notify the user somehow of which set of variables is selected.
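As a small sketch of such a setup script (reusing the variable names from the example above), including a notification of which set is active:

# debug_setup.sh -- source this before running make
export DEBUG=1
export DEBUG_OPTION=some_option
# export PERFECT=1    # kept around for later, currently disabled
echo "Debug variable set selected (DEBUG=$DEBUG, DEBUG_OPTION=$DEBUG_OPTION)"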
I have written a set of PowerShell helper functions for the Microsoft TFS and Microsoft TFPT command-line tools (some of which in turn use the PowerShell cmdlets included with TFPT). To shorten the commands from their standard naming conventions, like Get-TfsStatus and Invoke-TfsCommit, I created aliases as well (e.g. tf-status and tf-commit). I use PowerTab for tab completion, but v0.99.6 does not support tab completion of aliases by default.
How do you configure tab completion so that my aliases, which all start with tf-, can show me the list of available commands?
I see that PowerTab includes an editor for modifying tab-expansion behavior, but it is not clear to me exactly what I would need to configure. I also know that with PowerTab turned off, the default PowerShell tab completion works with aliases.
Example function and alias:
function Get-TfsStatus([switch]$all) {
# Do something
}
Set-Alias tf-status Get-TfsStatus
Set-Alias tf-st Get-TfsStatus
Note: Originally, I had the actual function names as tf-status, tf-commit, etc., but when I used Import-Module, PowerShell complained that I was not following the naming standards for PowerShell functions.
This is not a use case I had anticipated, so it is not well supported. However, there is an easy way to hack this in so long as your aliases continue to have a "-" in their name.
Edit line 957 of TabExpansionCore.ps1 to add "Alias" to the list of command types.
Get-Command -CommandType Alias,Function,ExternalScript,Filter,Cmdlet -Name "$($Matches[1])*" |
First off, you can use non-standard names for your functions without getting warnings.
Import-Module <<path_to_your_module>> -DisableNameChecking
Although, from experience, it is a good habit to follow (in most cases) the naming convention.
By default, autocomplete does not work on aliases. What does work, even for aliases, is parameter completion. So whether you type tf-status -a or Get-TfsStatus -a, if you press Tab, it will autocomplete to -All.
Neither the PowerShell console nor the ISE editor has an option to enable alias completion.
For your case, I would suggest using the "wrong" name (no alias) and importing with -DisableNameChecking; autocompletion will then work for tf-status.
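As a sketch of that suggestion (TfsHelpers.psm1 and the function body are hypothetical):

# TfsHelpers.psm1: define the command directly under the non-standard name,
# so tab completion (which works on function names) expands tf-st<TAB>.
function tf-status {
    param([switch]$All)
    # call the real Get-TfsStatus logic here
}
Export-ModuleMember -Function tf-status

# Importing with -DisableNameChecking suppresses the unapproved-verb warning:
Import-Module .\TfsHelpers.psm1 -DisableNameChecking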
Concerning PowerTab -- it is not fully supported in all hosts. Check this page to see which functionality is supported where. I don't know it well, but I'm sure it's possible to add aliases; it might be a tedious task, though, and would need to be done for each user. If someone doesn't have PowerTab, they won't be able to use autocompletion for the aliases you define in your module.
I am optimizing a very time- and memory-consuming program by running it over a dataset and under multiple parameters. For each "run", I have a CSV file, "setup.csv", set up with "runNumber","Command" for each run. I then import this into a Perl script to read the command for the run number I would like, extrapolate the variables, and then execute it on the system via the system command. Should I be worried about the potential for this to be exploited (I am worried right now)? If so, what can I do to protect our server? My plan for now is to change the file permissions of "setup.csv" to read-only and its ownership to root, then go in as root whenever I need to append another run to the list.
Thank you very much for your time.
Run your code in taint mode with -T. That will force you to carefully launder your data. Only pass through strings that you are expecting. Do not launder with .*; rather, check against a list of good strings.
Ideally, there is a list of known acceptable values, and you validate against that.
Either way, you want to avoid the shell by using the multi-argument form of system or by using IPC::System::Simple's systemx.
If you can't avoid the shell, you must properly convert the text to pass to the command into shell literals.
Even then, you have to be careful of values that start with -. Lots of tools accept -- to denote the end of options, allowing subsequent values to be passed safely.
Finally, you might want to make sure the args don't contain the NUL character (\0).
systemx('tool', '--', @args);
Note: Passing arbitrary strings is not possible in Windows. Extra validation is required.
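Putting those pieces together, here is a minimal sketch under taint mode (it reads from @ARGV for brevity; the tool names, paths, and argument pattern are hypothetical -- adjust them to whatever your setup.csv actually contains):

#!/usr/bin/perl -T
use strict;
use warnings;

# perlsec recommends pinning PATH and clearing other risky variables under -T.
$ENV{PATH} = '/usr/bin:/bin';
delete @ENV{qw(IFS CDPATH ENV BASH_ENV)};

# Validate the command against a whitelist of known-good tools; the value we
# actually run is our own literal, so it is not tainted.
my %allowed_tool = (
    run_simulation => '/usr/local/bin/run_simulation',
    run_analysis   => '/usr/local/bin/run_analysis',
);

my ($tool, @args) = @ARGV;
my $cmd = $allowed_tool{ $tool // '' }
    or die "unknown tool\n";

for (@args) {
    die "argument contains NUL\n" if /\0/;
    /\A([\w.\/=-]+)\z/ or die "bad argument: $_\n";
    $_ = $1;    # untaint the validated value
}

# Multi-argument system() bypasses the shell; '--' marks the end of options,
# so values starting with '-' are passed through safely.
system($cmd, '--', @args) == 0
    or die "command failed with status $?\n";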
We're struggling to come up with a command name for our all-purpose "developer helper" tool, which we are using on our project. It's like a wrapper for our existing tools like cmake and hg. The purpose of the command is really just to make our lives easier by combining multiple commands into one (for example, publishing packages). For example, we have commands like:
do conf
do build
do install
do publish
We've considered a few ambiguous names like do (as above) and run, but obviously do is a bash keyword and run is pretty ambiguous.
We'd like our command to be 2 chars short, preferably - but who thinks we're asking the impossible? Is there a practical way to check the availability of command names (other than just typing them into your terminal), or is it just a case of choose one and hope nobody else will use it? Are we worrying about nothing?
Since it's a "developer helper" tool why not use hm [run|build|port|deploy|test], Help Me ...
Give it a verbose name, then let everyone alias it to whatever they want. Make sure you use the verbose name in other scripts so that it removes ambiguity.
This way, each user gets to use whatever makes sense to him/her, and the scripts are more readable and more easily searchable (for example, grepping for "our_cool_tool" will usually yield better results than grepping for "run").
How many 2-character words are useful in this context? I think you need four characters. With that in mind, here are some suggestions:
omni
torq
fluf
mega
spif
crnk
splt
argh
quat
drul
scud
prun
sqat
zoom
sizl
I have more if you need them.
Pick one: http://en.wikipedia.org/wiki/List_of_all_two-letter_combinations
To check the availability of command names, I suggest looking for all two-letter filenames that are in the directories in your path. You can use a script like this
for item in `echo $PATH | sed 's/:/ /g'` ; do
ls -1d $item/??
done
It won't show builtins in your shell (like "do" as you mentioned) but it's a good start.
Change ?? to ??? for three-letter files, etc.
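To also catch shell builtins, keywords, aliases, and functions (which ls cannot see), you could additionally ask the shell itself; a small sketch, assuming bash and a few candidate names from this thread:

# Print what, if anything, each candidate name already resolves to
for name in do run qp dt hm ; do
    type "$name" 2>/dev/null
done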
I'm going to vote for qp (quick package?) since it's easy to pronounce, easy to type, and easy to remember where the keys are on the keyboard.
I use "asd". it's short and most developers type it without thinking
(oh, and you can always claim later that it stands for some "Advanced Script for Developers" if you need to justify yourself a few years from now)
How about fu? As in Kung Fu. It's a special purpose tool. And it's really easy to type.
I think that run is a good name; at least anybody who downloads your project will know what to do. Calling it without parameters should reveal your options.
Even 'do' will do; I think you can use backquotes to run it from bash scripts.
Also remember that running the tools without parameters will tell you what options you have.
Use makefiles to do everything for you.
How about calling it something descriptive, like 'build_runner', and then just aliasing it to 'br' (or preferred acronym) in your .bashrc?
There is a really crappy tool called cleartool (part of ClearCase), and people will alias it on their machines to "ct". Perhaps you can give yours a longer command name and suggest users alias it.
It would probably be best to do something like ire_and_curses suggested, name it descriptively then alias it to a 2 letter command. If I was choosing, I would name it dev_help and alias it to dh.
I think you're worrying about nothing. Install the program as 'the-command-to-do-everything-and-if-you-dont-make-your-own-alias-for-it-you-should'. I don't think that will be too long for any modern filesystem, but you might need to shorten it to 'tctdeaiydmyoafiys'. See what common aliases are used, and then change the program's name to that. In other words: don't decide, let natural selection decide for you. If you are working with a team of < 10, this should not even remotely cause any problems.
Call it devtool and alias it to dt.
I like to start custom tools like that with the prefix 'jj-'. I can type 'jj-' (with big index-finger power) and tab-complete to see all my personal commands. Also, they group together in alphabetical lists. 'J' is not a very common character for built-in commands, but you can pick your own.
Since you want two characters, you can use just 'zz', or something starting with 'z'.
Are you sure you want to put all your functionality in one command? That might be simultaneously over-constraining and over-loading the interface a little.
do conf
do build
do install
do publish