Remove Class from Memory in PowerShell

I've created a class called "Application" and loaded it in my main script with:
Import-Module -NAME "C:\PowerShell_Scripts\Class\Application.ps1" -GLOBAL -FORCE;
However, if I ONLY make changes to the class file and run the code in PowerShell ISE, none of the changes are applied. It's almost as if the class is still in memory, even though I've used -FORCE.
I've also tried to remove the module before loading it and the same issue happens:
Remove-Module "Application" -ErrorAction Ignore -FORCE;
Import-Module -NAME "C:\PowerShell_Scripts\Class\Application.ps1" -GLOBAL -FORCE;
If I make a single character change in my main script then it reloads the class! But I shouldn't have to modify the main script to force PowerShell to reload the class, that just seems silly.
Is there a way to remove the Application class from memory if it exists?
NOTE: Files with just functions in them work fine. This only applies to class imports.
Addition: In the console, if I run the Remove-Module command it runs successfully but I can STILL create new objects with:
$appDetails = [Application]::new($applicationID);
Doesn't make sense to me...
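To illustrate, here is a console sketch of that behavior:
Remove-Module "Application" -ErrorAction Ignore -FORCE   # reports no error
Get-Module Application                                    # returns nothing - the module is gone
$appDetails = [Application]::new($applicationID)          # ...yet the class still resolves and the object is created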
MAIN SCRIPT:
# Application Details
# -----------------
#ID
$applicationID = 1;
############################################
#
# Load Supporting Scripts
#
############################################
try
{
    Remove-Module "Application" -ErrorAction Ignore -FORCE;
    Remove-Module "Common" -ErrorAction Ignore -FORCE;
    Remove-Module "ServerData" -ErrorAction Ignore -FORCE;
    Import-Module -NAME "C:\PowerShell_Scripts\Common.ps1" -GLOBAL -FORCE;
    Import-Module -NAME "C:\PowerShell_Scripts\ServerData.ps1" -GLOBAL -FORCE;
    Import-Module -NAME "C:\PowerShell_Scripts\Class\Application.ps1" -GLOBAL -FORCE;
}
catch
{
    Write-Host "`nError: Cannot load required PowerShell scripts. Ensure C:\PowerShell_Scripts\ exists and has the required files." -ForegroundColor Red;
    EXIT;
}
############################################
#
# Load the SharePoint Snapin Module.
#
############################################
LoadSharePointModule;
############################################
#
# Display component details to user.
#
############################################
#Create object of "Application" to get app details based on the ID.
$appDetails = [Application]::new($applicationID);
Write-Host "Ending ......";
APPLICATION CLASS FILE:
Class Application
{
    #Class Properties
    [STRING] $appName;
    [INT32] $appID;
    [INT32] $versionMajor;
    [INT32] $versionOS;
    [INT32] $versionCentraAdmin;
    [INT32] $versionMain;
    [INT32] $versionGUI;
    [INT32] $versionWorkflow;
    [INT32] $versionForm;
    [INT32] $versionVS;
    [INT32] $versionOther;
    [INT32] $versionFull;
    [OBJECT] $spDevSite;
    [OBJECT] $versionList;
    #Constructor: Setup class properties.
    Application ([INT32] $appID)
    {
        Write-Host "`nGathering application details ..." -ForegroundColor Yellow;
        try
        {
            #Get the SharePoint Developer site Object.
            $this.spDevSite = Get-SPWeb -ErrorAction Stop $GLOBAL:spDevURL;
        }
        catch
        {
            Write-Host "`nUnable to connect to SharePoint Developer site!: $($GLOBAL:spDevURL)";
            #EXIT;
        }
        #Assign class property.
        $this.appID = $appID;
    }
}
I have deliberately set the URL in $GLOBAL:spDevURL so that the constructor fails for this test. It fails as expected and displays:
Write-Host "`nUnable to connect to SharePoint Developer site!: $($GLOBAL:spDevURL)";
But if I make a change to this line and run the script, the change is not applied.

The Known Issue
There is a known issue in PowerShell 5.0 and 5.1 that explains this behavior. The issue was acknowledged by DongBo Wang on the PowerShell 6 team in November 2016. He wrote the following:
"The module analysis result is stored in a cache with the module file path as the key and the PSModuleInfo object as the value. The cache entries are not properly invalidated based on the LastWriteTime of the module file, and thus same cached value got reused."
In other words, PowerShell 5.0, 5.1, and 6.0 keep (and use) old copies of classes in memory when they shouldn't.
Implications
This issue causes considerable problems for development with PowerShell classes if you do not compensate for it. I wrote a test suite that covers about 100 scenarios where class reloading matters. Roughly speaking, in about 17 of those scenarios PowerShell 5.0 and 5.1 do not reload the class when they should. This means that reusing the same session across edits creates a real likelihood that the interpreter has cached duplicate copies of the same or similar classes. That makes behavior unpredictable and produces strange results that are very difficult to troubleshoot.
Workaround
I have found that you can still be productive when developing with PowerShell classes. You just need to perform each test run in a fresh PowerShell session whenever the project involves PowerShell classes whose source may have changed. The customary way to do this is to invoke your test command from the PowerShell console via powershell.exe:
powershell.exe -Command { Invoke-Pester }
That's not a terribly inefficient test-edit-test cycle if you've got tight unit tests. If you need to step through code, you'll need to launch a fresh copy of ISE each time you make an edit.
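For example (a sketch; the script path is hypothetical), each run can be started in its own process from the console:
powershell.exe -NoProfile -File "C:\PowerShell_Scripts\Main.ps1"   # fresh session per run
powershell_ise.exe "C:\PowerShell_Scripts\Main.ps1"                # fresh ISE instance for step-through debugging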
With this workaround, I have found the productivity impact of this bug to be manageable. I developed this and this entirely using this workaround. Each of those projects involves a significant amount of code built on PowerShell classes.

Related

I created a custom powershell .psm1 module but it won't update after an edit

I created a custom powershell module in the
C:\Program Files\WindowsPowerShell\Modules\PennoniAppManagement directory. Whenever I make changes to a function in the module, then import the module into a script, the updated code won't take effect. Any solutions?
Make sure you remove the already-loaded version of the module from the session before re-importing it:
Remove-Module PennoniAppManagement -Force
Import-Module PennoniAppManagement
Normally, Import-Module -Force - by itself - is enough to force reloading of an updated module into the current session.
Import-Module -Force implicitly performs Remove-Module before reloading the module (if the module isn't currently loaded, -Force just loads the module normally).
Also note that force-reloading a module is not an option if you're loading it via a using module statement (at least as of PowerShell 7.1.2). Notably the using module method of importing is required if a module exports custom class definitions that the caller should see - see this answer for details.
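For reference, a minimal sketch of the using module form mentioned above (Foo.psm1 and [MyClass] are placeholders; the statement must precede everything else in the script):
using module .\Foo.psm1   # parsed at compile time; force-reloading is not available here
[MyClass]::new()          # the class defined in the module is visible to the caller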
Mathias' two-step approach - Remove-Module -Force, followed by Import-Module - is apparently needed in some cases, and seems to be required in yours.
It would be good to understand when the two-step approach is needed. Mathias thinks it is related to cached versions of custom class definitions (used module-internally) lingering instead of getting reloaded and redefined when Import-Module -Force is called. That is, while the module overall may get reloaded, it may be operating on stale classes. At least in the simple scenario below I was not able to reproduce this problem, neither in Windows PowerShell 5.1, nor in PowerShell (Core) 7.2.1, but there may be scenarios where the problem does surface.
The Remove-Module documentation describes the -Force parameter solely as relating to the - rarely used - .AccessMode property available on a loaded module's module-information object (you can inspect it with (Get-Module ...).AccessMode). The default value is ReadWrite, which allows unloading (removal) of the module anytime. If the property value is ReadOnly, Remove-Module -Force is needed to unload; if it is Constant, the module cannot be removed from the session at all, once loaded - at least not with Remove-Module.
Notably, the implicit unloading that happens with Import-Module -Force is not subject to these restrictions and implicitly unloads a module even if its .AccessMode is Constant (as of PowerShell 7.1.2; I am unclear on whether that is by design).
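A quick way to see AccessMode in action (a sketch; Foo is a placeholder for any loaded module):
(Get-Module Foo).AccessMode                 # ReadWrite by default
(Get-Module Foo).AccessMode = 'ReadOnly'    # after this, Remove-Module Foo requires -Force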
Test code involving reloading a module with a modified class definition, to see if Import-Module -Force is enough:
# Create a template for the content of a sample script module.
# Note: The doubled { and } are needed for use of the string
# with the -f operator later.
$moduleContent = @'
class MyClass {{
  [string] $Foo{0}
}}
function Get-Foo {{
  # Print the property names of custom class [MyClass]
  [MyClass]::new().psobject.Properties.Name
}}
'@
# Create the module with property name .Foo1 in the [MyClass] class.
$moduleContent -f 1 > .\Foo.psm1
# Import the module and call Get-Foo to echo the property name.
Import-Module .\Foo.psm1; Get-Foo
# Now update the module on disk by changing the property name
# to .Foo2
$moduleContent -f 2 > .\Foo.psm1
# Force-import (reload) the module and
# see if the property name changed.
Import-Module -Force .\Foo.psm1; Get-Foo
# Clean up.
Remove-Item .\Foo.psm1
In both Windows PowerShell (whose latest and last version is v5.1) and PowerShell (Core) 7.2.1 (current as of this writing), the above yields, as expected:
Foo1 # Original import.
Foo2 # After modifying the class and force-reloading

PowerShell ignoring Write-Verbose while running Import-Module

For presenting the problem, I have this simple script saved as PowerShell module (test.psm1)
Write-Verbose 'Verbose message'
In real life, it includes command to import additional functions, but that is irrelevant at the moment.
If I run Import-Module .\test.psm1 -Verbose -Force I get only
VERBOSE: Loading module from path 'C:\tmp\test.psm1'.
My Write-Verbose is ignored 😟
I tried adding cmdletbinding, but it also did not work.
[cmdletbinding()]
param()
Write-Verbose 'Verbose message'
Any clue how to provide Verbose output while importing the PowerShell module?
P.S. I do not want to display Verbose information always, but only if -Verbose is specified. Here would be my expected output for these two different cases:
PS C:\> Import-Module .\test.psm1 -Verbose -Force # with verbose output
VERBOSE: Loading module from path 'C:\tmp\test.psm1'.
VERBOSE: Verbose message
PS C:\> Import-Module .\test.psm1 -Force # without verbose output
PS C:\>
That is an interesting situation. I have a theory, but if anyone can prove me wrong, I would be more than happy.
The short answer: you probably cannot do what you want by playing with -Verbose only. There may be some workarounds, but the shortest path could be setting $VerbosePreference.
First of all, we need to understand the lifetime of a module when it is imported:
When a module is imported, a new session state is created for the
module, and a System.Management.Automation.PSModuleInfo object is
created in memory. A session-state is created for each module that is
imported (this includes the root module and any nested modules). The
members that are exported from the root module, including any members
that were exported to the root module by any nested modules, are then
imported into the caller's session state. [..] To send output to the host, users should run the Write-Host cmdlet.
The last line is the first hint that pointed me to a solution: when a module is imported, a new session state is created, but only exported elements are attached to the global session state. This means that test.psm1 code is executed in a session different than the one where you run Import-Module, therefore the -Verbose option, related to that single command, is not propagated.
Instead, and this is an assumption of mine, since I did not find it in the documentation, configurations from the global session state are visible to all the child sessions. Why is this important? Because there are two ways to turn on verbosity:
-Verbose option, not working in this case because it is local to the command
$VerbosePreference, which sets the verbosity for the entire session using a preference variable.
I tried the second approach and it worked, though it is not very elegant.
$VerbosePreference = "Continue" # print all the verbose messages, disabled by default
Import-Module .\test.psm1 -Force
$VerbosePreference = "SilentlyContinue" # restore default value
Now some considerations:
Specifying -Verbose on the Import-Module command is redundant
You can still override the verbosity configuration inside your module script, by using
Write-Verbose -Message "Verbose message" -Verbose:$false
As @Vesper pointed out, $false will always suppress the Write-Verbose output. Instead, you may want to parameterize that option with a Boolean variable assigned in a previous check, perhaps. Something like:
if (...)
{
    $forceVerbose=$true
}
else
{
    $forceVerbose=$false
}
Write-Verbose -Message "Verbose message" -Verbose:$forceVerbose
There might be other less invasive workarounds (for instance centered on Write-Host), or even a real solution. As I said, it is just a theory.
Marco Luzzara's answer is spot on (and deserves the bounty in my opinion) in regards to the module being run in its own session state, and that by design you can't access those variables.
An alternative solution to setting $VerbosePreference and restoring it, is to have your module take a parameter specifically for this purpose. You touched on this a little bit by trying to add [CmdletBinding()] to your module; the problem is you have no way to pass in named parameters, only unnamed arguments, via Import-Module -ArgumentList, so you can't specifically pass in a $true for -Verbose.
Instead you can specify your own parameter and use it.
(psm1)
[CmdletBinding()]param([bool]$myverbose)
Write-Verbose "Message" -Verbose:$myverbose
followed with:
Import-Module test.psm1 -Force -ArgumentList $true
In the above example, it would apply only to a specific command, where you were setting -Verbose:$myverbose every time.
But you could apply it to the module's $VerbosePreference:
[CmdletBinding()]param([bool]$myverbose)
$VerbosePreference = if ($myverbose) { 'Continue' } else { 'SilentlyContinue' }
Write-Verbose "Message"
That way it applies throughout.
At this point I should mention the drawback of what I'm showing: you might notice I didn't include -Verbose in the Import-Module call, and that's because it doesn't change the behavior inside the module. The verbose messages from inside will be shown purely based on the argument you passed in, regardless of the -Verbose setting on Import-Module.
An all-in-one solution then goes back to Marco's answer: manipulating $VerbosePreference on the caller's side. I think it's the only way to get both behaviors aligned, but only if you don't use -Verbose switch on Import-Module to override.
On the other hand, within a scope, like within an advanced function that can take -Verbose, setting the switch changes the local value of $VerbosePreference. That can lead us to wrap Import-Module in our own function:
function Import-ModuleVerbosely {
    [CmdletBinding()]
    param($Name, [Switch]$Force)
    Import-Module $Name -Force:$Force
}
Great! Now we can call Import-ModuleVerbosely test.psm1 -Force -Verbose. But... it didn't work. Import-Module did recognize the verbose setting but it didn't make it down into the module this time.
Although I haven't been able to find a way to see it, I suspect it's because the variable is set to Private (even though Get-Variable seems to say otherwise) and so that value doesn't make it through this time. Whatever the reason... we could go back to making our module accept a value. This time let's make it the same type for ease of use:
(psm1)
[CmdletBinding()]param([System.Management.Automation.ActionPreference]$myverbose)
if ($myverbose) { $VerbosePreference = $myverbose }
Write-Verbose "message"
Then let's change the function:
function Import-ModuleVerbosely {
    [CmdletBinding()]
    param($Name, [Switch]$Force)
    Import-Module $Name -Force:$Force -ArgumentList $VerbosePreference
}
Hey, now we're getting somewhere! But... it's kind of clunky, isn't it?
You could go farther with it, making a full on proxy function for Import-Module, then making an alias to it called Import-Module to replace the real one.
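A rough sketch of that idea (not a full proxy; it only forwards the parameters used above, and the real cmdlet is called by its module-qualified name to avoid recursion):
function Import-ModuleVerbosely {
    [CmdletBinding()]
    param($Name, [switch]$Force)
    # The alias below shadows Import-Module, so call the real cmdlet by its module-qualified name.
    Microsoft.PowerShell.Core\Import-Module $Name -Force:$Force -ArgumentList $VerbosePreference
}
Set-Alias -Name Import-Module -Value Import-ModuleVerbosely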
Ultimately you're trying to do something not really supported, so it depends how far you want to go.

Install software from a list using Powershell

I am building a script to automate computer build and configuration: the idea is that the machine comes from WDS as clean as possible and automatically runs this script, which will check the serial number, query our Workday database of assets, and configure the OS according to what the user assigned to that system needs.
Right now I am focusing on 3 big groups: Laptop, Desktop, and Lab. All 3 will have some SW that will be the same and some that will be specific for each. My issue is with msiexec: initially, I hard-coded all the installations for each group, but this means that I will have to change the script each time something is updated (say a new app is rolled out as default), which is not ideal.
function Install-Desktop {
    #Write-Output "Here will be the install Desktop computer script"
    $IPATH="<Path To root sw folder>"
    #Software List
    <# SOFTWARE LIST #>
    $office="$IPATH\script\o365"
    $webex="$IPATH\script\webex"
    $chrome="$IPATH\script\chrome"
    #Install Office:
    Invoke-Expression "$office\setup.exe /configure $office\O365.xml"
    $params = '/i', "$webex\webexapp.msi",'/qb!','/norestart'
    Start-Process msiexec -ArgumentList "$params" -Wait -PassThru
    $params = '/i', "$chrome\GoogleChromeStandaloneEnterprise64.msi",'/qb!','/norestart'
    Start-Process msiexec -ArgumentList $params -Wait -PassThru
}
This piece of code works well.
Now my idea was to import the software to be installed from a list (it is easier to maintain a list than to modify the script every time), something like:
function install-software {
    param (
        [String]$Type
    )
    $IPATH=<ROOT SW Folder>
    $SoftWares=Import-Csv -Path "$IPath\script\$Type`.csv" #there will be a Laptop.csv in that path
    foreach ($Software in $SoftWares) {
        #detect if it is msiexec or other:
        # (this has to do with how the csv is built, the first parameter is '/i' if it is an msi installer)
        if ($Software.param1 -eq "'/i'") {
            Start-Process msiexec -ArgumentList $Software -Wait -PassThru
        }
        else {
            $Params=[string]::Join(" ",$Software.param1,$Software.param2,$Software.param3,$Software.param4)
            Invoke-Expression "$Params"
        }
    }
}
This only works for the else part. However, on the msiexec side of the if, the MSI installer opens as if no arguments were passed. I tried a lot of ways to pass the args; none worked. I am not a PowerShell guru in any way, so there is probably something I am missing here.
Well, it looks like you have to pass the full path; it doesn't even let you use a mapped network drive. So the answer was in the CSV: instead of S:\<path to installer> it had to be the full UNC path (\\<full path to installer>), and I had to get rid of all the quotes and double quotes as well.
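For what it's worth, a hedged sketch of how the msiexec branch could pass the row's columns as an argument array instead of the whole row object (column names param1..param4 are taken from the question; adjust them to your CSV):
if ($Software.param1 -eq '/i') {
    # Build the msiexec argument array from the row, skipping empty columns.
    $msiArgs = @($Software.param1, $Software.param2, $Software.param3, $Software.param4) |
        Where-Object { $_ }
    Start-Process msiexec -ArgumentList $msiArgs -Wait -PassThru
}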

Powershell run script with Import-Module

I have created a module for my admin group with some functions that automate some procedures we commonly perform (add administrators to remote machines, C drive cleanup, etc...)
One of the prerequisites for these functions is the generation of a series of 7 credentials, one for each domain we work in.
Is there a way to get a scriptblock to run when you import a module, or is this something I should add to each person's profile?
A commenter mentioned I could just add it to the module.psm1 file, but that didn't work. Here is the code I am trying to run.
$creds = Import-Csv [csvfile]
$key = Get-Content [keyfile]
foreach ($cred in $creds) {
    $user = $cred.User
    $password = $cred.Hash | ConvertTo-SecureString -Key $key
    $i = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $user,$password
    New-Variable -Name ($cred.Domain + "_Cred") -Value $i -Force
}
Running this manually works fine, but it isn't creating the credentials when run from the Import-Module command.
Any code that's not a function will run when you import the module.
A handy tip when working with modules: & and . have what may be undocumented functionality. With either you can give two arguments: the first is a module reference (from Get-Module or similar) and the second is a script block. With the module reference argument, the script block will run in the context of the module. So, for example:
& $myMod {$usa_cred}
will output the value of $usa_cred even if it hasn't been exported. This is useful for debugging scripts. Also, modules can have embedded modules, and & $myMod {gmo} will list those sub-modules. By nesting & or . you can access a sub-module's context.
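For example, with the credential-creating module from the question loaded (the module name AdminTools is hypothetical):
$myMod = Get-Module AdminTools
& $myMod { Get-Variable *_Cred }   # lists the credential variables created inside the module's session state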

Refresh PowerShell DSC resource

I am debugging PowerShell DSC resources that come with v4.0.
More specifically, I am testing MSFT_ProcessResource by adding a diagnostic log.
After I make a change to the resource and run my configuration that exercises the resource, I don't see the logging I just added. Eventually, after several minutes, PowerShell seems to refresh whatever cache of resources it has.
I've tried
Get-DscResource, and
Import-Module MSFT_ProcessResource
Neither of which worked.
Is there a way to force re-loading the resource?
DSC engine caches resources to increase performance.
There are two ways to reload the resource:
1) Restart process hosting DSC engine (kill WMI Provider Host and re-run the configuration)
2) Use debug mode which will cause DSC to reload resources automatically (useful when developing resources, but not recommended for regular work):
LocalConfigurationManager
{
    DebugMode = $true
}
You can read more about debug mode here:
http://blogs.msdn.com/b/powershell/archive/2014/04/22/debug-mode-in-desired-state-configuration.aspx
DSC has a caching model which frankly seems buggy and poorly designed as of Sep 2016.
The blog entries describing the mechanisms to get around the caching don't always work.
In your configuration, include the following block.
Also perform a full restart of the winmgmt service. Simply killing the dsctimer process doesn't appear to always work.
{
    LocalConfigurationManager
    {
        DebugMode = "All"
    }
}
A PowerShell script to clear the cache is:
$dscProcessID = Get-WmiObject msft_providers |
    Where-Object {$_.provider -like 'dsctimer'} |
    Select-Object -ExpandProperty HostProcessIdentifier
if ($dscProcessID -eq $null) {
    Write-Host "DSC timer is not running."
    return
}
Write-Host "Process ID: $dscProcessID"
Get-Process -Id $dscProcessID | Stop-Process -Force
Restart-Service -Name winmgmt -Force -Verbose
This has now changed with WMF 5; instead of $true, DebugMode has the following options:
None - Signifies that DebugMode is False and not applicable.
ForceModuleImport - Enforce the resource module to be reloaded instead of using the cache. This is similar to the "true" value in previous versions.
ResourceScriptBreakAll - Helps in debugging DSC resources when the Local Configuration Manager tries to execute their functions. More on it in subsequent blog posts!
All - Signifies that debugging as well as reloading of modules are both enabled.
Using this in an example DSC config would look like this:
Configuration myChocoConfig2
{
    Import-DscResource -Module cChoco
    Node "localhost"
    {
        LocalConfigurationManager
        {
            DebugMode = 'All'
        }
        cChocoInstaller installChoco
        {
            InstallDir = "c:\choco"
        }
        cChocoPackageInstaller installChrome
        {
            Name = "sysinternals"
            DependsOn = "[cChocoInstaller]installChoco"
        }
    }
}
https://techstronghold.com/blogs/scripting/how-to-setup-debug-mode-in-windows-powershell-desired-state-configuration-dsc
I have a set of scripts that load on start of PowerShell, and I often needed the same thing: I would edit one of my scripts and need it to be updated in the current session.
Because I have these scripts loading via a series of scripts in my $profile, I am able to use one command to refresh any of the scripts that I load on init.
C:\> powershell
This command will refresh your session and keep you in the same folder you are currently in. If you are not loading your module on startup, you will need to use the answer from Karol.