We've written a PowerShell script which processes images from an internal system and sends them off to another system. Now another part of the business would like to hook into this, do their own processing of the input data, and push it to yet another system. Asking around, there are several interested parties around the company, so I'd like to make it simple to add these new systems.
A first prototype simply opens all .ps1 files in a folder, runs a specially named function in each, and hopes for the best, basically. However, this seems like it could be improved. Is there an established PowerShell best practice for building a plugin-like system? If not, given that this executes in a fairly secured environment and new modules will be checked in by administrators, are there any problems with my approach above?
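For reference, a minimal sketch of what the prototype does (the folder name and the required function name are placeholders):
# Prototype: dot-source every .ps1 in a plugin folder, then call a
# specially named function each plugin is expected to define.
foreach ($pluginScript in Get-ChildItem -Path .\Plugins -Filter *.ps1) {
    . $pluginScript.FullName   # brings the plugin's functions into scope
    Invoke-Plugin              # the conventionally named entry point
}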
Why wouldn't you use a config file for your main script, explicitly telling it which script and which function to call? Something like this (warning: this is copy/pasted and adapted code from something I wrote. It might contain a few glitches, but it gives you the general idea):
<?xml version="1.0"?>
<configuration>
<Plugins>
<Plugin Path="c:\blah\plugin1.ps1" PowerShellFunction="My-Plugin-Function" />
</Plugins>
</configuration>
In your main script:
function Load-Plugins
{
param (
[parameter(Mandatory = $true)][xml] $config,
[parameter(Mandatory = $true)][string] $nodeType
)
$plugins = @{}
foreach ($pluginNode in $config.SelectNodes($nodeType))
{
if ($pluginNode)
{
$Path = $pluginNode.Path
$powerShellFunction = $pluginNode.PowerShellFunction
$plugin = New-Object Object |
Add-Member -MemberType NoteProperty -Name "Path" -Value $Path -PassThru |
Add-Member -MemberType NoteProperty -Name "PowerShellFunction" -Value $powerShellFunction -PassThru
$plugins[$Path] = $plugin
}
}
return $plugins
}
function Execute-Plugins
{
param (
[parameter(Mandatory = $true)][hashtable] $plugins
)
$Error.Clear()
if (!$plugins.Values)
{ return }
foreach ($plugin in $plugins.Values)
{
. $plugin.Path               # dot-source so the plugin's function is defined in this scope
& $plugin.PowerShellFunction # then invoke it by name
}
}
function Load-Script-Config
{
param (
[parameter(Mandatory = $false)][string] $configFile
)
if (!$configFile)
{ $configFile = (Get-PSCallStack)[1].Location.Split(':')[0].Replace(".ps1", ".config") }
return [xml](Get-Content $configFile)
}
$pluginConfig = Load-Script-Config
$plugins = Load-Plugins $pluginConfig "configuration/Plugins/Plugin"
Execute-Plugins $plugins
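And a sketch of what c:\blah\plugin1.ps1 itself might contain - only the function name has to match the config entry; the body here is a placeholder:
function My-Plugin-Function
{
    # Hypothetical plugin work: read the shared input data, transform it,
    # and push it to this team's target system.
    Write-Host "plugin1: processing images..."
}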
I have this function I found online which updates an IIS web.config XML file based on values I am feeding it from CSV files. I would like to prevent the output of every XML entry modification from being printed when the function is executed. I have tried numerous times to utilize Out-Null, but I cannot seem to locate which portion of this XML update function is causing the output.
I would like to remove all of the XML update output, like that shown below, after the line "Setting web.config appSettings for D:\Inetpub\WWWRoot\test". The output seems to be the key/value pair the function is reading from the CSV file and updating the web.config file with.
Setting web.config appSettings for D:\Inetpub\WWWRoot\test
#text
-----
allowFrame TRUE
authenticationType SSO
cacheProviderType Shared
Here is the function I am utilizing:
function Set-Webconfig-AppSettings
{
param (
# Physical path for the IIS Endpoint on the machine without the "web.config" part.
# Example: 'D:\inetpub\wwwroot\cmweb510\'
[parameter(mandatory = $true)]
[ValidateNotNullOrEmpty()]
[String] $path,
# web.config key that you want to create or change
[parameter(mandatory = $true)]
[ValidateNotNullOrEmpty()]
[String] $key,
# Value of the key you want to create or change
[parameter(mandatory = $true)]
[ValidateNotNullOrEmpty()]
[String] $value
)
Write-Host "Setting web.config appSettings for $path" -ForegroundColor DarkCyan
$webconfig = Join-Path $path "web.config"
[bool] $found = $false
if (Test-Path $webconfig)
{
$xml = [xml](get-content $webconfig);
$root = $xml.get_DocumentElement();
foreach ($item in $root.appSettings.add)
{
if ($item.key -eq $key)
{
$item.value = $value;
$found = $true;
}
}
if (-not $found)
{
$newElement = $xml.CreateElement("add");
$nameAtt1 = $xml.CreateAttribute("key")
$nameAtt1.psbase.value = $key;
$newElement.SetAttributeNode($nameAtt1);
$nameAtt2 = $xml.CreateAttribute("value");
$nameAtt2.psbase.value = $value;
$newElement.SetAttributeNode($nameAtt2);
$xml.configuration["appSettings"].AppendChild($newElement);
}
$xml.Save($webconfig)
}
else
{
Write-Error -Message "Error: File not found '$webconfig'"
}
}
Both SetAttributeNode and AppendChild output information, so we just need to $null or [void] those out:
function Set-Webconfig-AppSettings
{
param (
# Physical path for the IIS Endpoint on the machine without the "web.config" part.
# Example: 'D:\inetpub\wwwroot\cmweb510\'
[parameter(mandatory = $true)]
[ValidateNotNullOrEmpty()]
[String] $path,
# web.config key that you want to create or change
[parameter(mandatory = $true)]
[ValidateNotNullOrEmpty()]
[String] $key,
# Value of the key you want to create or change
[parameter(mandatory = $true)]
[ValidateNotNullOrEmpty()]
[String] $value
)
Write-Host "Setting web.config appSettings for $path" -ForegroundColor DarkCyan
$webconfig = Join-Path $path "web.config"
[bool] $found = $false
if (Test-Path $webconfig)
{
$xml = [xml](get-content $webconfig);
$root = $xml.get_DocumentElement();
foreach ($item in $root.appSettings.add)
{
if ($item.key -eq $key)
{
$item.value = $value;
$found = $true;
}
}
if (-not $found)
{
$newElement = $xml.CreateElement("add");
$nameAtt1 = $xml.CreateAttribute("key")
$nameAtt1.psbase.value = $key;
$null = $newElement.SetAttributeNode($nameAtt1);
$nameAtt2 = $xml.CreateAttribute("value");
$nameAtt2.psbase.value = $value;
$null = $newElement.SetAttributeNode($nameAtt2);
$null = $xml.configuration["appSettings"].AppendChild($newElement);
}
$xml.Save($webconfig)
}
else
{
Write-Error -Message "Error: File not found '$webconfig'"
}
}
In PowerShell, return values from .NET methods (and, more generally, all output) that are neither assigned to a variable, nor sent through the pipeline to another command, nor redirected to a file are implicitly output ("returned") from functions - even without an explicit output command such as Write-Output or a return statement.
See the bottom section of this answer for the rationale behind this implicit-output feature.
It follows that, in PowerShell:
When you call .NET methods, you need to be aware of whether or not they return a value (technically, not returning a value is indicated by a return type of void).
If they do and if you don't need this return value, you must explicitly discard (suppress) it, so that it doesn't accidentally become part of the function's output.
In your case, it is the calls to the System.Xml.XmlElement.SetAttributeNode and System.Xml.XmlElement.AppendChild methods that cause your problems, because they have return values that you're not suppressing.
Doug Maurer's helpful answer shows how to discard these return values by assigning the method calls to $null ($null = ...).
While two other methods for suppressing output exist as well - casting to [void] and piping to Out-Null - $null = ... is the best overall choice, as discussed in this answer.
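As a quick illustration, here are all three forms applied to System.Text.StringBuilder, whose Append method returns the builder itself:
$sb = [System.Text.StringBuilder]::new()
$null = $sb.Append('a')       # preferred: assign to $null
[void] $sb.Append('b')        # cast to [void]
$sb.Append('c') | Out-Null    # pipe to Out-Null (slowest)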
There is a script for users to log in; it calls other scripts in turn, depending on the conditions.
In order to be able to call the scripts separately and manually, the [switch]$Silent parameter has been added. Question - how do I pass this parameter inside Start-Job? I tried adding it to the list of arguments in different ways - the value always falls into the neighboring parameter, regardless of the order.
Main script example
Param(
[string]$location = 'C:\Users',
[switch]$Silent
)
Start-Job -FilePath ".\Fonts_Install.ps1" -ArgumentList ($Silent,$location) | Wait-Job
Fonts_Install.ps1
Param(
[switch]$Silent = $false,
[string]$location = '.'
)
$path_fonts = "$env:LOCALAPPDATA\Microsoft\Windows\Fonts"
$Registry = "HKCU:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts"
function WriteLog {
Param ([string]$LogString)
$Stamp = (Get-Date).toString("yyyy/MM/dd HH:mm:ss")
$LogMessage = "$Stamp $LogString"
Add-content $LogFile -value $LogMessage
}
$Logfile = "$env:LOCALAPPDATA\Temp\fonts_install.log"
WriteLog "Silent $Silent"
WriteLog "location $location"
Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName PresentationFramework
Add-Type -AssemblyName System.Drawing
Add-Type -AssemblyName PresentationCore
$SourceFolder = "$location\Fonts_Install"
$WindowsFonts = [System.Drawing.Text.PrivateFontCollection]::new()
$Fonts = Get-ChildItem -Path $SourceFolder -Include *.ttf, *.otf -Recurse -File
ForEach ($Font in $Fonts) {
$Font_Name = $Font.Name
$font_fullname = $Font.fullname
if (Test-Path -PathType Leaf -Path "$path_fonts\$Font_Name") {
WriteLog "Previously installed $Font_Name"
}
else {
Copy-Item $Font -Destination "$path_fonts" -Force -Confirm:$false -PassThru
$WindowsFonts.AddFontFile("$font_fullname")
$ValueFont = "$path_fonts" + "\" + "$Font_Name"
$Typeface = New-Object -TypeName Windows.Media.GlyphTypeface -ArgumentList "$font_fullname"
[string]$FamilyFaceNames = $Typeface.FamilyNames.Values + $Typeface.FaceNames.Values
$RegistryValue = @{
Path = $Registry
Name = $FamilyFaceNames
Value = $ValueFont
}
if (Test-Path $Registry\$FamilyFaceNames) {
Remove-ItemProperty -name $FamilyFaceNames -path $Registry
}
New-ItemProperty @RegistryValue
WriteLog "New fonts installed $Font_Name"
}
}
switch ($Silent) {
$false {
if ($Error.Count -gt 0) {
for ($i = 0; $i -le ($Error.Items.Count + 1); $i++) {
$errMSG = "$Error"
}
[System.Windows.Forms.MessageBox]::Show("$errMSG", "Error", "OK", "Error")
}
else {
[System.Windows.Forms.MessageBox]::Show("ок", "Fonts", "OK", "Asterisk") | out-null
}
}
}
Unfortunately, specifying pass-through arguments via Start-Job's -ArgumentList (-Args) is limited to positional arguments, which prevents binding [switch] parameters, whose arguments must by definition be named.
As a workaround, instead of using -FilePath, invoke your script via the -ScriptBlock parameter. Inside a script block ({ ... }), named arguments may be used in script calls, as usual:
Start-Job -ScriptBlock {
# Set the current location to the same location as the caller.
# Note: Only needed in *Windows PowerShell*.
Set-Location -LiteralPath ($using:PWD).ProviderPath
.\Fonts_Install.ps1 -Silent:$using:Silent $using:Location
} | Receive-Job -Wait -AutoRemoveJob
Note the use of the $using: scope in order to embed variable values from the caller's scope in the script block that will execute in the background.
You still need to refer to the -Silent parameter by name, and whether the switch is on or off can be communicated by appending :$true or :$false to it, which is what :$using:Silent does.
In Windows PowerShell, background jobs execute in a fixed location (working directory), namely the user's Documents folder, hence the Set-Location call to explicitly use the same location as the caller, so that the script file can be referenced by a relative path (.\). This is no longer necessary in PowerShell (Core) 7+, which now thankfully uses the same location as the caller.
Here is a different alternative to mklement0's helpful answer. This answer does not use Start-Job; it uses a PowerShell instance instead, and with this method we can leverage the automatic variable $PSBoundParameters.
Do note that, for this to work properly, both .ps1 scripts must share the same parameter names, or have Alias attribute declarations matching the corresponding parameters from the caller. See this answer for more details.
You can use the snippets below as an example to test how it works.
caller.ps1
param(
[string] $Path = 'C:\Users',
[switch] $Silent
)
try {
if(-not $PSBoundParameters.ContainsKey('Path')) {
$PSBoundParameters['Path'] = $Path
}
$ps = [powershell]::Create().
AddCommand('path\to\myScript.ps1').
AddParameters($PSBoundParameters)
$iasync = $ps.BeginInvoke()
# Do something else here while the .ps1 runs
# ...
# Get async result from the PS Instance
$ps.EndInvoke($iasync)
}
finally {
if($ps -is [IDisposable]) {
$ps.Dispose()
}
}
myScript.ps1
# Note: since we're binding these parameters from caller.ps1,
# we don't want to assign default values here!
param(
[string] $Path,
[switch] $Silent
)
foreach($param in $MyInvocation.MyCommand.Parameters.Keys) {
[pscustomobject]@{
Parameter = $param
Value = Get-Variable $param -ValueOnly
}
}
A few examples:
PS /> .\caller.ps1
Parameter Value
--------- -----
Path C:\Users
Silent False
PS /> .\caller.ps1 -Path hello
Parameter Value
--------- -----
Path hello
Silent False
PS /> .\caller.ps1 -Path world -Silent
Parameter Value
--------- -----
Path world
Silent True
I want to add a custom method to an existing object. My problem is that I cannot figure out how to make it accept parameters.
In this greatly simplified example, I want to add a script block to a System.IO.FileInfo object that outputs a specific property to the screen:
$NewMethodScript = {
param(
[String] $Param1
)
write-host $this.$Param1
#Do lots of more stuff, call functions, etc...
}
$FInfo = [System.IO.FileInfo]::new("C:\File.txt")
$FInfo | Add-Member -MemberType ScriptMethod -Name NewMethod -Value $NewMethodScript
$FInfo.NewMethod "DirectoryName"
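For what it's worth, a ScriptMethod added via Add-Member takes its arguments in parentheses, like a .NET method call, so the snippet above can be invoked as:
$FInfo.NewMethod("DirectoryName")   # prints the file's DirectoryName via $this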
I am working on writing Pester tests for our PowerShell scripts that are used during task sequences. Several of them work with the task sequence variables, so I wrote a mock that allows for testing reading variables, and I am now trying to figure out how to do the same for writing variables.
This is the code to read a task sequence variable:
$TsEnv = New-Object -COMObject Microsoft.SMS.TSEnvironment
$Value = $TsEnv.Value('VariableNameToRead')
By passing in the $TsEnv to a function I can then mock it with the following:
$TsEnv = @{
'VariableNameToRead' = 'TestValue'
}
Add-Member -InputObject $TsEnv -MemberType ScriptMethod -Name Value -Value {
Param( [String]$Key )
$This[$Key]
}
This is the code for writing a task sequence variable:
$TsEnv = New-Object -COMObject Microsoft.SMS.TSEnvironment
$TsEnv.Value('VariableNameToWrite') = 'ValueToWrite'
With it being in parentheses after $TsEnv.Value, I am thinking it is treating it as a method, but I am unable to find any examples of how to assign values to a method.
With Pester 4.3.3+, you might be able to use New-MockObject to create a usable mock of that COM object.
Alternatively, you can do something similar to the below to allow you to mock the functionality of the COM object.
If that COM object is available on the machines where your CI is running, I might consider skipping the mocks and writing an integration test.
# functions.ps1
Set-StrictMode -Version Latest
$ErrorActionPreference = "Stop";
function GetTaskSequenceValue(
[Parameter(Mandatory=$true)]
[string] $varNameToRead,
[Parameter(Mandatory=$false)]
[System.Management.Automation.ScriptBlock] $readAction = {
param([string] $readKey)
$tsEnv = New-Object -COMObject 'Microsoft.SMS.TSEnvironment'
$tsEnv.Value($readKey)
}
) {
$value = Invoke-Command `
-ScriptBlock $readAction `
-ArgumentList @($varNameToRead)
return $value
}
function SetTaskSequenceValue(
    [Parameter(Mandatory=$true)]
    [string] $varNameToWrite,
    # Value to assign to the task sequence variable
    [Parameter(Mandatory=$false)]
    [string] $valueToWrite,
    [Parameter(Mandatory=$false)]
    [System.Management.Automation.ScriptBlock] $writeAction = {
        param([string] $writeKey, [string] $value)
        $tsEnv = New-Object -COMObject 'Microsoft.SMS.TSEnvironment'
        $tsEnv.Value($writeKey) = $value
    }
) {
    try {
        Invoke-Command `
            -ScriptBlock $writeAction `
            -ArgumentList @($varNameToWrite, $valueToWrite)
return $true
}
catch {
# Swallow it
}
return $false
}
Tests for the functions above. The tests manually mock out the COM objects:
# functions.test.ps1
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.', '.'
. "$here\$sut"
Set-StrictMode -Version Latest
$ErrorActionPreference = "Stop";
Describe "GetTaskSequenceValue" {
It "gets the expected value" {
$expected = 'value'
$mockAction = {
param($dummy)
return 'value'
}
$actual = GetTaskSequenceValue `
-varNameToRead 'dummyName' `
-readAction $mockAction
$actual | Should Be $expected
}
}
Describe "SetTaskSequenceValue" {
It "sets the expected value" {
$expected = 'value'
$mockAction = {
param($dummy)
return 'value'
}
$actual = SetTaskSequenceValue `
-varNameToWrite 'dummyValue' `
-writeAction $mockAction
$actual | Should Be $true
}
}
Anything that deals with environment variables, WMI, or .NET static method calls, I like to contain within a small helper function; then it's very easy to mock it. Here's what that helper could look like.
Function Get-SMSTsVariable($VariableName){
    return $TSEnv.Value($VariableName)
}
Then you can easily mock this in various contexts to check how your code acts when various task sequence variables are set.
For instance, if you want it to return a value of BitLockerProvisioning when you run Get-SMSTsVariable -VariableName _SMSTSCurrentActionName, and to return 'C:' when you run Get-SMSTsVariable -VariableName _OSDDetectedWinDir, you set up mocks like this:
mock Get-SMSTsVariable `
-parameterFilter { $VariableName -eq '_SMSTSCurrentActionName'} `
-mockWith {return 'BitLockerProvisioning'}
mock Get-SMSTsVariable `
-parameterFilter { $VariableName -eq '_OSDDetectedWinDir'} `
-mockWith {return 'C:'}
In this way, you can begin your test by setting up a handful of responses for the various ways your functions operate. It's really a breeze.
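For example, here is a hypothetical function under test plus a Pester test that wires up one of those mocks (the function name and message format are assumptions for illustration):
# Hypothetical function that consumes the helper.
function Get-CurrentStepMessage
{
    return "Running step: $(Get-SMSTsVariable -VariableName '_SMSTSCurrentActionName')"
}
Describe "Get-CurrentStepMessage" {
    It "reports the current task sequence step" {
        mock Get-SMSTsVariable `
            -parameterFilter { $VariableName -eq '_SMSTSCurrentActionName' } `
            -mockWith { return 'BitLockerProvisioning' }
        Get-CurrentStepMessage | Should Be 'Running step: BitLockerProvisioning'
    }
}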
I have a few functions stored in a .psm1 file that is used by several different .ps1 scripts. I created a logging function (shown below), which I use throughout these .ps1 scripts. Typically, I import the module within a script by simply calling something like:
Import-Module $PSScriptRoot\Module_Name.psm1
Then within the module, I have a write logger function:
Write-Logger -Message "Insert log message here." @logParams
This function is used throughout the main script and the module itself. The splatted parameter set @logParams is defined in my main .ps1 file and is not explicitly passed to the module; I suppose the variables are implicitly within the module's scope upon being imported. What I have works, but I feel like it isn't a great practice. Would it be better practice to add a param block within my module to require @logParams to be explicitly passed from the main .ps1 script? Thanks!
function Write-Logger() {
[cmdletbinding()]
Param (
[Parameter(Mandatory=$true)]
[string]$Path,
[Parameter(Mandatory=$true)]
[string]$Message,
[Parameter(Mandatory=$false)]
[string]$FileName = "Scheduled_IDX_Backup_Transcript",
[switch]$Warning,
[switch]$Error
)
# Creates new log directory if it does not exist
if (-Not (Test-Path ($path))) {
New-Item ($path) -type directory | Out-Null
}
if ($error) {
$label = "Error"
}
elseif ($warning) {
$label = "Warning"
}
else {
$label = "Normal"
}
# Mutex allows for writing to single log file from multiple runspaces
$mutex = new-object System.Threading.Mutex $false,'MutexTest'
[void]$mutex.WaitOne()
Write-Host "$(Format-LogTimeStamp) $label`: $message"
"$(Format-LogTimeStamp) $label`: $message" | Out-file "$path\$fileName.log" -encoding UTF8 -append
[void]$mutex.ReleaseMutex()
}
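For context, this is roughly how @logParams might be defined in the main .ps1 (the values are assumptions; Path is the only mandatory key, since FileName has a default):
$logParams = @{
    Path     = "D:\Logs\IDX_Backup"
    FileName = "Scheduled_IDX_Backup_Transcript"
}
Write-Logger -Message "Backup started." @logParams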
I have this code in a .ps1 that I dot-source into scripts where I want to produce my own logs. The .ps1 contains the simpleLogger class as well as the routine below that creates a global variable. The script can be dot-sourced again and the global variable value passed to subsequently spawned jobs to maintain a single log file.
class simpleLogger
{
[string]$source
[string]$target
[string]$action
[datetime]$datetime
hidden [string]$logFile = $global:simpleLog
simpleLogger()
{
$this.datetime = Get-Date
}
simpleLogger( [string]$source, [string]$target, [string]$action )
{
$this.action = $action
$this.source = $source
$this.target = $target
$this.datetime = Get-Date
}
static [simpleLogger] log( [string]$source, [string]$target, [string]$action )
{
$newLogger = [simpleLogger]::new( $source, $target, $action )
do {
$done = $true
try {
$newLogger | export-csv -Path $global:simpleLog -Append -NoTypeInformation
}
catch {
$done = $false
start-sleep -milliseconds $(get-random -min 1000 -max 10000)
}
} until ( $done )
return $newLogger
}
}
if( -not $LogSession ){
$global:logUser = $env:USERNAME
$global:logDir = $env:TEMP + "\"
$startLog = (get-date).tostring("MMddyyyyHHmmss")
$global:LogSessionGuid = (New-Guid)
$global:simpleLog = $global:logDir + $global:logUser + "-" + $global:LogSessionGuid + ".log"
[simpleLogger]::new() | export-csv -Path $global:simpleLog -NoTypeInformation
$global:LogSession = [simpleLogger]::log( $global:logUser, $global:LogSessionGuid, 'Log init' )
}
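A minimal usage sketch, assuming the code above is saved as SimpleLogger.ps1 (the file name and the logged values are placeholders):
# Dot-source the logger; the first load initializes the global log file.
. .\SimpleLogger.ps1
# Each call appends one CSV row to the shared log file.
[simpleLogger]::log($env:USERNAME, 'C:\target\file.txt', 'Copied file') | Out-Null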