PowerShell middleware to wrap PowerShell module commands

I am looking for a middleware pipeline option for PowerShell. That means I want to provide each function with pre and post statements, for a module that is not a compiled C# cmdlet.
Is there already something in this direction?
The background is that I don't want to add debug code to every command; instead, I want to measure all my functions from a central place.
Thanks a lot

There is no built-in way to put code before or after a call to a cmdlet the way you can when writing your own functions or using something like try/catch/finally. I have not found a way to truly emulate that kind of workflow with cmdlet calls.
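That said, one workaround is a generic wrapper function that splats its arguments to the real command and runs pre and post statements around the call. A minimal sketch, assuming timing is the measurement you want (Invoke-WithTiming is a hypothetical helper name):

function Invoke-WithTiming {
    param(
        [Parameter(Mandatory)][string]$CommandName,
        [hashtable]$Parameters = @{}
    )
    $sw = [System.Diagnostics.Stopwatch]::StartNew()   # pre statement
    try {
        & $CommandName @Parameters                     # the wrapped call
    }
    finally {
        $sw.Stop()                                     # post statement
        Write-Verbose "$CommandName took $($sw.ElapsedMilliseconds) ms" -Verbose
    }
}

# Usage: Invoke-WithTiming -CommandName Get-ChildItem -Parameters @{ Path = $env:TEMP }

Every call still has to go through the wrapper, so this gives you central measurement by convention, not true middleware.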

Using a dot-sourced function in params

I'm going to start by saying I'm still pretty much a rookie at PowerShell, and I'm hoping there is a way to do this.
We have a utils.ps1 script that contains just functions, which we dot source in other scripts. One of the functions returns a default value if a value is not passed in. I know I could check $args and such, but what I wanted was to use the function for the default values in the parameters:
param(
[string]$dbServer=$(Get-DefaultParam "dbServer"),
[string]$appServer=$(Get-DefaultParam "appServer")
)
This doesn't work, since the Utils script hasn't been sourced yet. I can't put the dot source first, because then param doesn't work, as it's no longer the first statement. Utils.ps1 isn't a module, and I can't use #Requires.
What I got working was this:
param(
[ValidateScript({ return $false; })]
[bool]$loadScript=$(. ./Utils.ps1; $true),
[string]$dbServer=$(Get-DefaultParam "dbServer"),
[string]$appServer=$(Get-DefaultParam "appServer")
)
Create a parameter that loads the script, and prevent passing a value into that parameter. This loads the script in the correct scope; if I load it inside the ValidateScript block, it's not in the correct scope. The rest of the parameters then have access to the functions in Utils.ps1. This is probably an unsupported side effect, i.e. a hack: if I move loadScript below the other parameters, they fail, since the script hasn't been loaded.
Does PowerShell guarantee that parameters will always be evaluated sequentially?
Should we instead put all the functions in Utils.ps1 into the global scope? That would require running Utils.ps1 before the other scripts, which seems OK for scripted use but less than ideal when running the scripts by hand.
Is there a more supported way of doing this besides modules and #Requires?
Or is it better not to use default values for the parameters at all, and instead code all the checks after dot sourcing, checking $args to see if we need to run the function?
It would be beneficial to turn that script into a PowerShell module, despite your stated desire to avoid one. That way, your functions are always available for use as long as the module is installed. Also, despite not wanting to use it, the #Requires directive is how you put execution constraints on your script, such as the PowerShell version or modules that must be installed for the script to function.
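A sketch of what that looks like, assuming the functions live in Utils.ps1 (the module folder path below is illustrative; any folder on $env:PSModulePath works, and on PowerShell 3+ module auto-loading should resolve the function even inside parameter defaults):

# Rename Utils.ps1 to Utils.psm1 and place it on the module path, e.g.:
#   ~\Documents\WindowsPowerShell\Modules\Utils\Utils.psm1
# A consuming script can then declare the dependency and use the functions:
#Requires -Modules Utils
param(
[string]$dbServer=$(Get-DefaultParam "dbServer"),
[string]$appServer=$(Get-DefaultParam "appServer")
)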
If you really don't want to put this into a module, you can dot-source utils.ps1 from the executing user's $profile. As long as you don't run powershell.exe with the -NoProfile parameter, the profile loads with each session and your functions will be available for use.
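For example, as a one-time setup step (the path to Utils.ps1 is an assumption):

# Append a dot-source line to the current user's profile so the
# functions are available in every new session:
Add-Content -Path $PROFILE -Value '. C:\Scripts\Utils.ps1'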

How to share data between cmdlets in a module?

I'm currently working on a module in PowerShell which uses a standard REST API in the background. For that, I wrote a Connect-Server cmdlet that retrieves an auth key for later calls.
My question is: Is there any best practice regarding sharing the data with other cmdlets? I know I could easily just return it from the Connect function and pass it to the following cmdlet, but that's not what I'm looking for.
Until now, I've been using global variables for that exchange of data. But as I've read in some best practice guidelines you should try not to pollute the global scope.
Other solutions I've seen use Get and Set cmdlets, but I don't think that's the best PowerShell way of doing it.
So are there any other ways of solving that?
The normal way is to return data from one cmdlet and store it either in a variable or forward it down the pipeline. Another way of sharing data is to serialize it (ConvertTo-Json, ConvertTo-Csv, ...) to a file (located in e.g. $env:TEMP, or created via New-TemporaryFile) and deserialize it back again in another cmdlet (at the cost of disk I/O). Personally, I always store the result in a variable for later usage and inject it into the next cmdlet (or use the pipeline).
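A minimal sketch of the file-based variant (the hashtable contents are placeholders; in practice the second cmdlet would receive or compute the same path):

# First cmdlet: persist connection data to a temp file
$tmp = New-TemporaryFile
@{ AuthKey = 'token-from-api'; Server = 'srv01' } | ConvertTo-Json |
    Set-Content -Path $tmp.FullName

# Later cmdlet: read it back
$conn = Get-Content -Path $tmp.FullName -Raw | ConvertFrom-Json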
Using global variables is not the best idea, since you can no longer see which parameters your cmdlet/function depends on.
So, as the folks at PoshCode stated, the best way to do such a thing is to use a variable in script scope, as this is available to all cmdlets in the module but not visible to users.
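A minimal sketch of that pattern, assuming a module file such as MyApi.psm1 (Connect-Server matches the question; Get-ServerData, the token value, and the URL are placeholders):

$script:AuthKey = $null   # shared module state, not visible to callers

function Connect-Server {
    param([Parameter(Mandatory)][pscredential]$Credential)
    # ... authenticate against the REST API here ...
    $script:AuthKey = 'token-from-api'   # cache the retrieved key
}

function Get-ServerData {
    if (-not $script:AuthKey) { throw 'Call Connect-Server first.' }
    Invoke-RestMethod -Uri 'https://example.invalid/data' -Headers @{ Authorization = $script:AuthKey }
}

Export-ModuleMember -Function Connect-Server, Get-ServerData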

FileMaker MissingFunction

Set Variable [$Write; Value: <Function Missing>("filepath";$inputedText)]
I'm trying to determine what the missing function is. The script writes data to an external file, and this is one line of code from it. I can't post the rest of the code for security reasons. Any direction as to what the missing function might be would be greatly appreciated.
The <Function Missing> message means that this code was written with the expectation that a now-missing plugin would be present. To resolve this, you'll need to determine which plugin it is and install it on your development machine (and likely on all machines that need to use this script, unless you choose to execute it as a PSOS (Perform Script on Server) script running on the server).
My best guess based on functionality and the arguments being passed is that the missing plugin may be the Monkeybread Plugin.
It's the Write To File function in ScriptMaster.

How to protect a PowerShell file, and call a single function

I've been having this problem for a while now, and Google has its limits.
I'm writing a PowerShell file that contains several generic functions.
I use the functions in various scripts, and now I want to let other personnel at my work use them as well.
The problem is that, due to sensitive operations, I want to lock and protect the script (compile it to a DLL, EXE, etc.).
How do I create a PowerShell library like a C# DLL?
One option I tried, but could not figure out how to take further, was to compile the script to an executable (.exe) using PowerGUI, but then I cannot access the functions in it, let alone pass parameters to them.
hope you understood me :)
thank you.
You don't. Rather than trying to obscure this information (if you compile the scripts, they can be decompiled, and your "protected" resources will no longer be protected), remove the sensitive values entirely and make them parameters of your functions. This both protects your sensitive data and makes the code much more reusable.
You can then package your functions into a module.
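A minimal sketch of that combination (MyTools.psm1, Invoke-SensitiveOperation, and the parameter names are hypothetical):

# MyTools.psm1 -- the secret is supplied by the caller, never stored in the script
function Invoke-SensitiveOperation {
    param(
        [Parameter(Mandatory)][pscredential]$Credential,
        [Parameter(Mandatory)][string]$Server
    )
    # ... perform the operation using the supplied credential ...
}
Export-ModuleMember -Function Invoke-SensitiveOperation

# Usage after Import-Module MyTools:
#   Invoke-SensitiveOperation -Credential (Get-Credential) -Server 'srv01'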

PowerShell in SQLCLR?

In the past, I've been able to embed a scripting language (like JScript) inside SQLCLR, so scripts can be passed as parameters of functions to perform certain calculations. Here is a simplistic example (the function ssScriptExecute returns a concatenation of all the print output in the script):
select dbo.ssScriptExecute( 'print("Calculation: "+(1+2/3) );' )
-- Calculation: 1.6666666666666665
I'd love to be able to embed a PowerShell runtime in the same way. But I've had all sorts of problems, because the runtime tries to find assemblies by path, and there are no paths inside SQLCLR. I'm happy to provide more information on the errors I get, but I was wondering if anybody has tried this!
Thanks!
I used IL code injection to modify System.Management.Automation so that the version variable in GetPSVersionTable() is "2.0". Then I can run PowerShell code in SQL Server.
Be sure to reference this modified DLL in your Visual Studio project.
http://www.box.net/shared/57122v6erv9ss3aopq7p
By the way, to automate registering all the DLLs you need for running PowerShell in SQL Server, you can use this ps1 code:
http://www.box.net/shared/tdlpu1875clsu8azxq4b
I think the only way to do this is to create a WCF service hosting PowerShell, and let SQLCLR send the request dbo.ssScriptExecute(...) to that service for execution.
Besides that, I've also successfully embedded paxScript.net in SQLCLR (an interpreter that does not have the memory-leak problems of the DLR languages).
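To illustrate the out-of-process idea without the WCF plumbing, here is a hedged sketch that uses a plain HttpListener inside an ordinary PowerShell host instead of WCF; the port and URL prefix are assumptions, SQLCLR would call it with a normal HTTP request, and on Windows the prefix may need a URL ACL registration (or an elevated session):

$listener = [System.Net.HttpListener]::new()
$listener.Prefixes.Add('http://localhost:8731/ps/')
$listener.Start()
while ($listener.IsListening) {
    $context = $listener.GetContext()
    $reader  = [System.IO.StreamReader]::new($context.Request.InputStream)
    $script  = $reader.ReadToEnd()          # script text sent by the caller
    $output  = ''
    try   { $output = Invoke-Expression $script | Out-String }   # run it; trusted callers only
    catch { $output = $_ | Out-String }
    $bytes = [System.Text.Encoding]::UTF8.GetBytes($output)
    $context.Response.OutputStream.Write($bytes, 0, $bytes.Length)
    $context.Response.Close()
}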
I thought SQLCLR was restricted to just a certain set of supported assemblies, and System.Management.Automation is not one of them.