Directory Bookmarks in PowerShell?

One of my favorite Bash tips involves creating aliases for marking and returning to directories as described here: http://www.huyng.com/archives/quick-bash-tip-directory-bookmarks/492/.
In Bash, it looks like this:
alias m1='alias g1="cd `pwd`"'
Is it possible to create a similar function in powershell?
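For reference, a minimal PowerShell analogue of the Bash trick can be sketched with a closure; the names m1/g1 mirror the question, and this is just one possible approach:

```powershell
# Sketch: m1 bookmarks the current directory; g1 jumps back to it.
function m1 {
    $dir = (Get-Location).Path
    # Define g1 in the global scope so it survives after m1 returns;
    # GetNewClosure() bakes the current value of $dir into the script block.
    Set-Item -Path function:global:g1 -Value { Set-Location $dir }.GetNewClosure()
}
```

After running m1 in a directory, g1 returns to it from anywhere in the session.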

This project works very well: http://www.powershellmagazine.com/2016/05/06/powershell-location-bookmark-for-easy-and-faster-navigation/
Install:
Install-Module -Name PSBookmark
Use:
#This will save $PWD as scripts
save scripts
#This will save C:\Documents as docs
save docs C:\Documents
#You don't have to type the alias name.
#Instead, you can just tab complete. This function uses dynamic parameters.
goto docs

You can add the following to the $profile:
$marks = @{};
$marksPath = Join-Path (split-path -parent $profile) .bookmarks
if(test-path $marksPath){
import-csv $marksPath | %{$marks[$_.key]=$_.value}
}
function m($number){
$marks["$number"] = (pwd).path
}
function g($number){
cd $marks["$number"]
}
function mdump{
$marks.getenumerator() | export-csv $marksPath -notype
}
function lma{
$marks
}
I didn't like the idea of defining an alias for each bookmark, like m1, m2 and so on. With this approach you type m 1 and g 1 instead.
You can also add the line
Register-EngineEvent PowerShell.Exiting -Action { mdump } | out-null
so that it will run mdump when you exit the PowerShell session. Unfortunately, this only fires when you type exit, not when you close the console window.
PS: Also have a look at CDPATH: CDPATH functionality in Powershell?

A long time ago I created a module with this functionality. I now think it is stable enough to add a link to the PowerShell Gallery and the GitHub project :)
Install the module:
Install-Module -Name Bookmarks
Basic usage:
$pwd | ba -Name "BookmarkName" #add
bo BookmarkName #open
br BookmarkName #remove
bl #list
The entire list of functions is on the GitHub project page

I use this trick:
In my $profile I wrote functions like this:
function win { Set-Location c:\windows}
$ProfileDir = $PROFILE.Substring(0 , $PROFILE.LastIndexOf("\")+1)
function pro { Set-Location $profiledir}
and so on...

Related

Call to reload functions/aliases in a script after it runs in Windows PowerShell is not working

I wrote this small alias script to set up some settings, but somehow I cannot get the aliases to reload with a single function/alias. Under Linux
I just do source file and it reloads inside the script, and the new aliases are
available.
Under Windows 10 / PowerShell 7.2 I have tried with ., with &, with a function, and with calling another script; nothing works. So after the script exits I have to type . repo manually to reload the
aliases.
So, as stated, nothing worked inside that function eal. Examples I found include
. repo
. $profile
. aliases.ps1
& $profile
are not reloading the settings in the current session.
Any ideas how to edit a file with aliases and then reload after the editing in
the current shell?
#Set Paths
$Env:PATH += ";C:\Tools\git\usr\bin"
#Aliases
Set-Alias wo Write-Output
Set-Alias ll ls
Set-Alias lld ls
Set-Alias tc 'C:\Program Files (x86)\Total Commander\TCUP64.exe'
# Utilities since windows can not do multiple commands in one alias
#
#function to reload the profile
function repo {
@(
$Profile.AllUsersAllHosts,
$Profile.AllUsersCurrentHost,
$Profile.CurrentUserAllHosts,
$Profile.CurrentUserCurrentHost
) | % {
if(Test-Path $_){
Write-Verbose "Running $_"
. $_
}
}
}
#edit aliases, functions
function eal {
vim $env:USERPROFILE\.config\aliases.ps1
Write-Output "Please run . repo to reload the profile"
$alias_config = Join-Path $env:USERPROFILE .config\aliases.ps1
# any code to put here to reload the file?
# . $profile etc. - nothing works
# . or & with the path to aliases.ps1 does not work either
}
function epro {
vim $env:USERPROFILE\.config\powershell\user_profile.ps1
Write-Output "Please run . repo to reload the profile"
}
function which ($command) {
Get-Command -Name $command -ErrorAction SilentlyContinue |
Select-Object -ExpandProperty Path -ErrorAction SilentlyContinue
}
function .. {
cd ..
}
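For what it's worth, the failure mode here is scope: dot sourcing inside a function loads the aliases into that function's scope, which vanishes when it returns (the next question explores the same behavior). One possible workaround is to create the aliases in the global scope explicitly; the function name repo2 below is made up for illustration:

```powershell
# Sketch: aliases created with -Scope Global survive after the function
# returns, unlike aliases picked up by dot sourcing a file inside the body.
function repo2 {
    Set-Alias -Name wo -Value Write-Output -Scope Global
    Set-Alias -Name ll -Value Get-ChildItem -Scope Global
}
```

After calling repo2, wo and ll are available at the prompt.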

PowerShell functions load from function

I have a module with several files with functions and module loader.
The example function:
Function1.ps1
function Init() {
echo "I am the module initialization logic"
}
function DoStuff() {
echo "Me performing important stuff"
}
Module loader file:
Module1.psm1:
$script:Functions = Get-ChildItem $PSScriptRoot\*.ps1
function LoadModule {
Param($path)
foreach ($import in @($path)) {
. $import.FullName
}
}
LoadModule $script:Functions
Init # function not found
So I'm trying to load the functions from the file Function1.ps1 via the LoadModule procedure.
Debugging LoadModule shows the external functions being loaded, but after LoadModule finishes the functions are no longer accessible, so the script fails at the Init line.
However, a module loader rewritten without the LoadModule function works fine
Module1.psm1:
Get-ChildItem $PSScriptRoot\*.ps1 | %{
. $_.FullName
}
Init # In this case - all works fine
So, as I understand it, functions loaded from within a function are placed in some isolated scope, and to access them I need to add some scope flag.
Maybe somebody knows what I should add to make the function Init() accessible from the Module1.psm1 script body, but not accessible externally (without using Export-ModuleMember)?
Note: Edit 1, a clarification on what dot sourcing actually does, is included at the end.
First up, you are intermingling terminology and usage for Functions and Modules. Modules, which have the .psm1 extension, should be imported into the terminal using the Import-Module cmdlet. When dot sourcing, such as what you are doing here, you should only be targeting script files which contain functions, which are files with the .ps1 extension.
I too am relatively new to PowerShell, and I ran into the same problem. After spending around an hour reading up on the issue I was unable to find a solution, but a lot of the information I found points to it being an issue of scope. So I created a test, utilising three files.
foo.ps1
function foo {
Write-Output "foo"
}
bar.psm1
function bar {
Write-Output "bar"
}
scoping.ps1
function loader {
echo "dot sourcing file"
. ".\foo.ps1"
foo
echo "Importing module"
Import-Module -Name ".\bar.psm1"
bar
}
foo
bar
loader
foo
bar
pause
Let's walk through what this script does.
First we define a dummy loader function. This isn't a practical loader, but it is sufficient for testing scopes and the availability of functions within files that are loaded. This function dot sources the ps1 file containing the function foo, and uses Import-Module for the file containing the function bar.
Next, we call on the functions foo and bar, which will produce errors, in order to establish that neither are within the current scope. While not strictly necessary, this helps to illustrate their absence.
Next, we call the loader function. After dot sourcing foo.ps1, we see foo successfully executed because foo is within the current scope of the loader function. After using Import-Module for bar.psm1, we see bar also successfully executed. Now we exit the scope of the loader function and return to the main script.
Now we see the execution of foo fail with an error. This is because we dot sourced foo.ps1 within the scope of a function. However, because we imported bar.psm1, bar successfully executes. This is because modules are imported into the Global scope by default.
How can we use this to improve your LoadModule function? The main thing for this functionality is that you need to switch to using modules for your imported functions. Note that, from my testing, you cannot Import-Module the loader function; this only works if you dot source the loader.
LoadModule.ps1
function LoadModule($Path) {
Get-ChildItem -Path "$Path" -Filter "*.psm1" -Recurse -File -Name | ForEach-Object {
$File = "$Path$_"
echo "Import-Module -Name $File"
Import-Module -Name "$File" -Force
}
}
And now in a terminal:
. ".\LoadModule.ps1"
LoadModule ".\"
foo
bar
Edit 1: A further clarification on dot sourcing
Dot sourcing is equivalent to copy-pasting the contents of the specified file into the file performing the dot source. The file performing the operation "imports" the contents of the target verbatim, performing no additional actions before proceeding to execute the "imported" code. e.g.
foo.ps1
Write-Output "I am foo"
. ".\bar.ps1"
bar.ps1
Write-Output "I am bar"
is effectively
Write-Output "I am foo"
Write-Output "I am bar"
Edit: You don't actually need to use Import-Module. So long as you have the modules in your $env:PSModulePath PowerShell will autoload any exported functions when they are first called. Source.
Depending on the specifics of your use case, there's another method you can use. This method addresses when you want to mass-import modules into a PowerShell session.
When you start PowerShell it looks at the values of the environment variable $PSModulePath in order to determine where it should look for modules. It then looks under this directory for directories containing psm1 and psd1 files. You can modify this variable during the session, and then import modules by name. Here's an example, using what I've added to my PowerShell profile.ps1 file:
$MyPSPath = [Environment]::GetFolderPath("MyDocuments") + "\WindowsPowerShell"
$env:PSModulePath = $env:PSModulePath + ";$MyPSPath\Custom\Modules"
Import-Module `
-Name Confirm-Directory, `
Confirm-File, `
Get-FileFromURL, `
Get-RedirectedURL, `
Get-RemoteFileName, `
Get-ReparseTarget, `
Get-ReparseType, `
Get-SpecialPath, `
Test-ReparsePoint
In the event that you're new to PowerShell profiles (they're pretty much the same as Unix's ~/.profile file), you can find:
more information about PowerShell profiles here.
a summary of what profile files are used and when here.
While this may not seem as convenient as an auto-loader, installing & importing modules is the intended and accepted approach for this. Unless you have a specific reason not to, you should try to follow the established standards so that you aren't later fighting your way out of bad habits.
You can also modify the registry to achieve this.
After some research, I found that during the execution of the LoadModule function all registered functions are added to the Function: provider.
So from the LoadModule function body they can be enumerated via Get-ChildItem -Path Function:
[DBG]: PS > Get-ChildItem -Path Function:
CommandType Name Version Source
----------- ---- ------- ------
Function C:
Function Close-VSCodeHtmlContentView 0.2.0 PowerShellEditorServices.VSCode
Function Init 0.0 Module1
Function ConvertFrom-ScriptExtent 0.2.0
Function Module1 0.0 Module1
So we can store the function list in a variable at the beginning of the LoadModule invocation
$loadedFunctions = Get-ChildItem -Path Function:
and, after the dot-source loading, retrieve the list of added functions
Get-ChildItem -Path Function: | where { $loadedFunctions -notcontains $_ }
So the modified LoadModule function will look like:
function LoadModule {
param ($path)
$loadRef = Get-PSCallStack
$loadedFunctions = Get-ChildItem -Path Function:
foreach ($import in @($path)) {
. $import.FullName
}
$functions= Get-ChildItem -Path Function: | `
Where-Object { $loadedFunctions -notcontains $_ } | `
ForEach-Object{ Get-Item function:$_ }
return $functions
}
The next step just assigns the functions to a list (More about this):
$script:functions = LoadModule $script:Private ##Function1.ps1
$script:functions += LoadModule $script:PublicFolder
After this step, we can invoke the initializer:
$initScripts = $script:functions | where { $_.Name -eq 'Initalize' } # filter
$initScripts | ForEach-Object { & $_ } ## execute
and export Public functions:
$script:functions| `
where { $_.Name -notlike '_*' } | ` # do not export _Name functions
%{ Export-ModuleMember -Function $_.Name}
I moved the full code of the module load function to the ModuleLoader.ps1 file; it can be found in the GitHub repo PowershellScripts.
And the complete version of the Module.psm1 file is
if($ModuleDevelopment){
. $PSScriptRoot\..\Shared-Functions\ModuleLoader.ps1 "$PSScriptRoot"
}
else {
. $PSScriptRoot\Shared\ModuleLoader.ps1 "$PSScriptRoot"
}

PowerShell: provide parameters in a file

Is there a way to provide powershell parameters with a file?
At the moment I have a script which is called My_Script.ps1. To start this script I have to provide the right parameters in the command:
.\My_Script.ps1 -param1="x" -param2="x" -param3="x" -param4="x" -param5="x" -param6="x" ...
This works but it isn't a very easy way to start the script. Is it possible in powershell to use a file in which you store your parameters and to use that file when you start the script?
Example
In My_Script.ps1 I add something like:
Param(
[string]$File="Path/to/file"
)
In my file I have something like
param1="x"
param2="x"
param3="x"
param4="x"
...
To execute the script you can edit the file and just start the script with .\My_Script.ps1
Another option:
Just use a ps1 file as config file and define your variables as you would do in your main script
$Param1 = "Value"
$Param2 = 42
Then you can use dot-sourcing or import-module to get the data from the config file
. .\configfile.ps1
or
Import-Module .\Configfile.ps1
afterwards you can just use the variables
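Splatting is another standard option: collect the parameters in a hashtable and pass it with @. A minimal sketch, where Invoke-Demo is a stand-in for the real script (the same @params syntax works for .\My_Script.ps1):

```powershell
# Sketch: splatting - pass parameter names/values from a hashtable with @.
function Invoke-Demo {
    param([string]$param1, [string]$param2)
    "$param1/$param2"
}
$params = @{
    param1 = 'x'
    param2 = 'y'
}
Invoke-Demo @params    # same as: Invoke-Demo -param1 'x' -param2 'y'
```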
In addition to splatting you can create variables from = separated values in a file.
param1=foo
param2=bar
param3=herp
param4=derp
Don't quote the values. The parameter names should be valid for a variable (no spaces etc.)
PowerShell 3 and newer:
(Get-Content c:\params.ini -raw | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value }
PowerShell 2:
([IO.File]::ReadAllText('c:\params.ini') | ConvertFrom-StringData).GetEnumerator() |
ForEach { Set-Variable $_.name $_.value }
The code creates variables in current scope. It's possible to create in a global/script/parent scope.
You can use this blog posting for a start and declare your parameters in an ini-like format.
You could also use a csv-like format and work with the Import-Csv cmdlet.
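The CSV idea can be sketched like this; the two-column Name,Value layout is an assumption:

```powershell
# Sketch: read parameters from a CSV file and create matching variables.
# The file layout (Name,Value columns) is assumed.
@"
Name,Value
param1,x
param2,y
"@ | Set-Content .\params.csv

Import-Csv .\params.csv | ForEach-Object {
    # ForEach-Object runs in the caller's scope, so the variables land there
    Set-Variable -Name $_.Name -Value $_.Value
}
```

Afterwards $param1 and $param2 hold the values from the file.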

Powershell run script with Import-Module

I have created a module for my admin group with some functions that automate some procedures we commonly perform (add administrators to remote machines, C drive cleanup, etc...)
One of the prerequisites for these functions is the generation of a series of 7 credentials, one for each domain we work in.
Is there a way to get a scriptblock to run when you import a module, or is this something I should add to each persons profile?
A commenter mentioned I could just add it to the module.psm1 file, but that didn't work. Here is the code I am trying to run.
$creds = Import-Csv [csvfile]
$key = Get-Content [keyfile]
foreach ($cred in $creds) {
$user = $cred.User
$password = $cred.Hash | ConvertTo-SecureString -Key $key
$i = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user,$password
New-Variable -Name ($cred.Domain + "_Cred") -Value $i -Force
}
Running this manually works fine, but it isn't creating the credentials when run from the Import-Module command.
Any code that's not a function will run when you import the module.
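That said, a likely reason the credentials don't show up is scope: New-Variable inside a .psm1 creates the variables in the module's scope, not the importer's session. A sketch of the loop with -Scope Global; the inline $creds and the plain-text password are stand-ins for the question's CSV and key file, just to keep the sketch runnable:

```powershell
# Sketch: create the credential variables globally so they are visible
# after Import-Module. Dummy data replaces the CSV/key from the question.
$creds = @(
    [pscustomobject]@{ Domain = 'CONTOSO'; User = 'CONTOSO\admin'; Hash = 'p@ss' }
)
foreach ($cred in $creds) {
    # The real code would use ConvertTo-SecureString -Key $key on the stored hash
    $password = ConvertTo-SecureString $cred.Hash -AsPlainText -Force
    $i = [System.Management.Automation.PSCredential]::new($cred.User, $password)
    New-Variable -Name ($cred.Domain + '_Cred') -Value $i -Scope Global -Force
}
```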
A handy tip when working with modules: & and . have what may be undocumented functionality. With either you can give two arguments, the first is a module reference (from get-module or similar) and second is a script. With the module reference parameter the script will run in the context of the module. So for example:
& $myMod {$usa_cred}
will output the value of $usa_cred even if it hasn't been exported. This is useful for debugging scripts. Also, modules can have embedded modules, and & $myMod {gmo} will list those sub-modules. By nesting & or . you can access a sub-module's context.

Is it possible to override powershell's behavior on attempting to execute a file?

When you try to execute a file in PowerShell, i.e. .\file, it seems to execute it as though through the Invoke-Item cmdlet. Is it possible to override, replace, or augment this functionality within my PowerShell profile? I'd like to replace the default inspect-by-extension/user-prompt open behavior with something more intelligent.
The canonical way for solving a problem like that in Windows would be to change the default action for the type in question to a custom script/program that implements the logic you want applied to files of that type.
Changing the action can easily be done by changing a couple of registry entries. Example for zip files:
Windows Registry Editor Version 5.00
[HKEY_CLASSES_ROOT\CompressedFolder\shell]
;change default action
@="Open2"
[HKEY_CLASSES_ROOT\CompressedFolder\shell\Open2]
;change label of this action
@="My Custom Action"
[HKEY_CLASSES_ROOT\CompressedFolder\shell\Open2\command]
;command for this action
@="powershell.exe -File \"C:\path\to\zip-handler.ps1\" \"%1\""
The script could (for instance) look like this:
$zip = Get-Item $args[0]
if ($zip.Length -le 10MB) {
[Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem') | Out-Null
$targetFolder = Split-Path -Parent $zip.FullName
[IO.Compression.ZipFile]::ExtractToDirectory($zip.FullName, $targetFolder)
} else {
explorer.exe $zip.FullName
}