Run a PowerShell script from another one

What is the best and correct way to run a PowerShell script from another one?
I have a script a.ps1 from which I want to call b.ps1, which does a different task.
Let me know your suggestions. Is dot sourcing the best option here?

Dot sourcing will run the second script as if it were part of the caller: all script-scope changes will affect the caller. If that is what you want, then dot source.
However, it is more usual to call the other script as if it were a function (a script can use param and function-level attributes just like a function). In many ways a script is a PowerShell function, with the name of the file taking the place of the name of the function.
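For illustration, a minimal sketch of both patterns from inside a.ps1 (the -TaskName parameter is hypothetical, and $PSScriptRoot in plain scripts requires PowerShell 3.0 or later):
& "$PSScriptRoot\b.ps1" -TaskName 'cleanup'   # call like a function: b.ps1 runs in its own scope
. "$PSScriptRoot\b.ps1"                       # dot source: b.ps1 runs in a.ps1's scope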

Dot sourcing makes it easier to convert your script(s) into a module at a later stage: you won't have to change the script(s) into functions.
Another advantage of dot sourcing is that you can add the function to your shell by adding the file that holds it to Microsoft.PowerShell_profile.ps1, meaning it is available at all times (eliminating the need to worry about paths etc.).
I have a short Write-Host at the top of each dot-sourced file with the name of the function and its common parameters, and I dot source those files in my profile. Each time I open PowerShell, the list of functions in my profile scrolls by. If, like me, you frequently forget the exact names of your functions/files, you'll appreciate this as the number of functions starts to pile up.
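A sketch of that profile setup, with hypothetical paths and function names:
# In Microsoft.PowerShell_profile.ps1: dot source each function file
. "$HOME\Scripts\Get-Uptime.ps1"
. "$HOME\Scripts\Remove-OldLogs.ps1"
# And at the top of each dot-sourced file, a one-line reminder that scrolls by on startup:
Write-Host 'Get-Uptime [-ComputerName <name>]'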

Old but still relevant.
I work with modules and Import-Module, which imports a module into the current PowerShell session.
To avoid stale cached copies and to always have the latest changes from the module, I precede it with Get-Module | Remove-Module, which clears all modules loaded in the current session:
Get-Module | Remove-Module
Import-Module '.\IIS\Functions.psm1'

Related

Can I make a module from a bunch of single-function scripts?

We've accumulated a bunch of scripts, each of which looks and feels like a cmdlet: it has a set of declared params and then immediately calls a Main function which does the work, calling private sub-functions within.
An example is Remove-ContentLine.ps1 which just spits out the contents of a file or piped input except for lines matching some pattern.
So they're like little "function-scripts".
Is there any way I can aggregate these scripts into a module while also keeping them exactly as they are in files?
Edit
If your hunch is that it's easier to just copy/paste and refactor them into a psm1, then just say so ;)
You ask:
Is there any way I can aggregate these scripts into a module while
also keeping them exactly as they are in files?
But I am certain that is not what you really want: if the scripts stayed exactly as they are, all of your code would execute immediately when you load the module! Rather, I think what you want is that each of your scripts should be contained within a function; that group of functions is then loaded when you import the module; and you can then execute any of your functions on demand.
The process is very straightforward, and I have written an extensive article on just how to do that (Further Down the Rabbit Hole: PowerShell Modules and Encapsulation) but I will summarize here:
(1) Edit each file to wrap the entire contents into a function and conclude with exporting the function. I would suggest naming the function after the file. Thus, Remove-ContentLine.ps1 should now look like this:
function Remove-ContentLine
{
    # original content of Remove-ContentLine.ps1 here
}
Export-ModuleMember Remove-ContentLine
(2) Decide on a name for your module and create a directory of that name. Let's call it MyModule. Within the MyModule directory, create a subdirectory to place all your .ps1 files; let's call that ScriptCmdlets.
(3) Create a module file MyModule.psm1 within MyModule whose contents will be exactly this:
Resolve-Path $PSScriptRoot\ScriptCmdlets\*.ps1 |
? { -not ($_.ProviderPath.Contains(".Tests.")) } |
% { . $_.ProviderPath }
Yes, every module (.psm1) file I write contains that identical code!
(4) Create a module manifest MyModule.psd1 within MyModule using the New-ModuleManifest cmdlet.
Then to use your module, just use Import-Module. But I urge you to review my article for more details to gain a better understanding of the process.
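For example (the module name matches the walkthrough above; the version is illustrative, and -RootModule requires PowerShell 3.0+, with -ModuleToProcess as the 2.0 equivalent):
New-ModuleManifest -Path .\MyModule\MyModule.psd1 -RootModule 'MyModule.psm1' -ModuleVersion '1.0.0'
Import-Module .\MyModule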
I doubt you can if the scripts already execute something (a "main"). If they just expose a function, like Remove-ContentLine for Remove-ContentLine.ps1, you could dot source all the scripts in a single script to aggregate them, or use the ScriptsToProcess = @() section when working with a module manifest.
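For reference, that manifest key would look something like this (the script path is illustrative; scripts listed here run in the caller's session state on import):
# In the module's .psd1 manifest
ScriptsToProcess = @('ScriptCmdlets\Remove-ContentLine.ps1')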
I think it would be best to refactor the functions from within each .ps1 into a proper module. It should essentially just be a matter of copy/pasting the scripts into a single .psm1 file and creating a .psd1 for it. Be sure to check for and properly handle anything that is set in the script or global scopes, and make sure there are no naming conflicts between functions.
If you have Sapien PowerShell Studio, there is a 'New Module from Functions' option in the File menu which would help automate the bulk of this for you.

Execute PowerShell function on change of path

Is it possible to execute some code (e.g. a function, script block, cmdlet, etc.) whenever the current path changes in the PowerShell console or ISE?
The scenario I am thinking of is to modify some environment variables and dot source some location-specific PowerShell functions depending on the current folder.
You have several options. You can remove the "cd" alias and then write a "cd" function and add the desired logic. The downside to this approach is that if someone uses Set-Location, your function is bypassed. Another option is to create a proxy command for Set-Location. Shay (and Kirk) have a video on how to do this here. The nice thing about this approach is that the built-in ways to change dir (cd and Set-Location) will go through your proxy command.
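A minimal sketch of the first approach (the body logic is illustrative; as noted, Set-Location still bypasses it):
Remove-Item Alias:cd -Force    # the built-in cd alias is AllScope, hence -Force
function cd {
    param([string]$Path)
    Set-Location $Path
    # location-specific logic here, e.g. dot source helpers found in $PWD
}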

PowerShell: is there some hacky subroutine implementation?

I was wondering if there is some way to call (or load) a function located lower in the script execution path.
I wrote a script to run a deployment, and as one of the last steps, the script parses web.config, making a ton of changes based on a configuration file. A feature request came in, asking for a switch to generate the web.config without an actual deployment.
The only way I can think of doing it is making all the parsing logic into a gigantic function and loading it at the start of the script. However, that approach will make the script horribly ugly. Nor do I want to carve all the logic out into another script and dot source it.
Any suggestions?
Thank you.
Make it two functions: one for deploy, one for web.config. Use a separate function to check for the switches and call the functions based on those variables.
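A minimal sketch of that layout (function and switch names are hypothetical):
param([switch]$ConfigOnly)    # -ConfigOnly generates web.config without deploying

function Invoke-Deployment {
    # actual deployment steps here
}
function Update-WebConfig {
    # web.config parsing/generation here
}

if (-not $ConfigOnly) { Invoke-Deployment }
Update-WebConfig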
I ended up reading another 30 or so articles and decided to simply bracket the functionality and move it up higher in the script, then dot source the function from inside the script.
Thanks.

PowerShell: Include another script that has includes?

I want to make my life easier when writing scripts. I'm starting a little framework that will have a hierarchy of include files. The problem is that dot sourcing a ps1 script which itself dot sources other files breaks the relative paths the original scripts rely on.
It looks like this:
\config\loadvariables.ps1
$var = "shpc0001"
\config\config.ps1
. '.\loadvariables.ps1'
\test.ps1
. '.\config\config.ps1'
echo $var
The problem is that test.ps1 tries to load loadvariables.ps1 as if it were located beside the test.ps1 script.
How can I solve this?
The easiest way to manage a collection of scripts with inter-dependencies is to convert them to modules. This feature is only available in PowerShell 2.0 and later, but it allows you to separate a group of scripts into independent components with declared dependencies.
Here is a link to a tutorial on getting modules up and running:
https://learn.microsoft.com/en-us/powershell/scripting/developer/module/how-to-write-a-powershell-script-module
As Jared said, modules are the way to go. But since you may even dot-source inside your modules, it is best to use full paths (which can still be calculated at run time), like so:
## Inside modules, you can refer to the module's location like so
. "$PSScriptRoot\loadvariables.ps1"
## Outside a module, you can do this
$ScriptRoot = Split-Path $MyInvocation.MyCommand.Path
. "$ScriptRoot\loadvariables.ps1"

Please help me with a PowerShell script which rearranges paths

I have both Sybase and MSFT SQL Servers installed. There are times when Sybase interferes with MS SQL because they have some overlapping commands.
So, I need two scripts:
A) When run, script A backs up the current path, grabs all paths that contain sybase or SYBASE or SyBASE (you get the point) and moves them all to the very end of the path, preserving their order.
B) When run, script B restores the path from the backup.
Both script A and script B should affect the path immediately. So an a.bat that calls patha.ps1 and pathb.ps1 would look like this:
REM Old path here
call patha.ps1
REM At this point the effective path should be different.
call pathb.ps1
REM Effective old path again
Please let me know if this does not make sense. I am not sure if the call command is the best one to use.
I have never used PowerShell before. I can try to formulate the same thing in Python (I know Stack Overflow users tend to ask "What have you tried so far?"). At this point I am very slow at writing anything in PowerShell.
Please help.
First of all: call will be of no use here, as you are apparently writing a batch file and PowerShell scripts have no file association to run them by default; call is for batch files or subroutines.
Secondly, any PowerShell script you call from a batch file cannot change the environment variables of the calling process. That's a fundamental property of how processes behave, and since you are calling another process, this is never going to work.
I'm not so sure why you are even using a batch file here in the first place if you have PowerShell. You might just as well solve this in PowerShell completely.
However, what I take from your problem is that the best way to resolve it is probably the following: create two batch files that each set the PATH appropriately. You can probably leave both the MSSQL and Sybase paths out of your usual PATH and add them solely in the batch files. Then create shortcuts to
cmd /k set_mssql_path.cmd
and
cmd /k set_sybase_path.cmd
each of which is now a shortcut to a shell set up for the appropriate database's tools. This is how the Visual Studio Command Prompt works, and it's probably the cleanest solution you have. You can use the color and prompt commands in those batches to make the two shells visually distinct so you always know which environment you are in. For example, the following two lines will color the console white on blue and set a prompt indicating MSSQL:
color 1f
prompt MSSQL$S$P$G
This can be quite handy, actually.
Generally, trying to rearrange the PATH environment variable isn't exactly easy. While you could trivially split at a ';', this will fail for paths that themselves contain a semicolon (and which therefore need to be quoted). Even in PowerShell this will take a while to get right, so I think creating shortcuts specific to the tools is probably the nicest way to deal with this.
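For completeness, a naive PowerShell sketch of the rearranging itself (it splits on ';' and therefore mishandles quoted entries, and, per the above, it only affects the current PowerShell process, not a calling batch file):
$backup = $env:PATH                                       # keep a copy so it can be restored later
$parts  = $env:PATH -split ';' | Where-Object { $_ }      # drop empty entries
$sybase = $parts | Where-Object { $_ -match 'sybase' }    # -match is case-insensitive by default
$others = $parts | Where-Object { $_ -notmatch 'sybase' }
$env:PATH = ($others + $sybase) -join ';'                 # Sybase entries moved to the end, order preserved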