Import PowerShell module after install

I'm working on automating an AppFabric installation with PowerShell, and I've run into a problem: the script calls the installer and waits for it to complete, but I can't import the newly installed modules afterwards from the same session. For example:
Start-Process "C:\provision\WindowsServerAppFabricSetup_x64.exe" -ArgumentList "/i /GAC" -Wait
Import-Module DistributedCacheConfiguration
# ...do configuration things...
This fails with: The specified module 'DistributedCacheConfiguration' was not loaded because no valid module file was found in any module directory.
If you close and re-open PowerShell, the script runs fine. Adding a Start-Sleep 60 between the installer and the configuration didn't help, so I tried launching the configuration in what I hoped would behave like a fresh PowerShell:
C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe C:\provision\appfabric_config.ps1
The same errors were thrown. How do I get PowerShell to recognize the newly installed modules?

PowerShell looks for modules in subdirectories of the directories listed in the PSModulePath environment variable. Environment variables are read from the registry key HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment when the session is initialized.
If the installer places the new module in a directory that's not already in PSModulePath and then adds that directory to the environment variable, it's modifying the environment variable in the registry, not in the current PowerShell console session's environment, so only PowerShell sessions started after the installation will have the updated PSModulePath.
You can manually update the value from the registry by adding the following line after the installation and before attempting to import the module:
$env:PSModulePath = (Get-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Session Manager\Environment' -Name PSModulePath).PSModulePath
Note that although it may appear redundant, the reason you need
(Get-ItemProperty -Path [...] -Name PSModulePath).PSModulePath
rather than just
Get-ItemProperty -Path [...] -Name PSModulePath
is that Get-ItemProperty doesn't return the data of the named registry value; it returns a PSCustomObject that describes the registry value, and the data lives in a property of that PSCustomObject named after the registry value (PSModulePath in this case). If you prefer, you could also do it this way:
$env:PSModulePath = Get-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Session Manager\Environment' -Name PSModulePath | select -ExpandProperty PSModulePath
(There's no practical difference; it's six of one, half a dozen of the other.)
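If you want to see the difference for yourself, a quick sketch:
$item = Get-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Session Manager\Environment' -Name PSModulePath
$item.GetType().Name   # PSCustomObject, not a string
$item.PSModulePath     # the actual data of the registry value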

You can use the .NET System.Environment class to access your environment variables. Your new module path was most likely added to the "User" scope of the variable; you can also experiment with the "Machine" and "Process" targets. See Environment.GetEnvironmentVariable for more info. Guides on modifying PATH, including adding and removing entries, adapt easily to the PSModulePath environment variable.
This example appends the user-scope PSModulePath value to the end of the PSModulePath variable in your session. It will result in some duplicate entries, but should work just fine.
$env:PSModulePath = $env:PSModulePath+';'+[System.Environment]::GetEnvironmentVariable("PSModulePath","User")
Your code would now look like this:
Start-Process "C:\provision\WindowsServerAppFabricSetup_x64.exe" -ArgumentList "/i /GAC" -Wait
$env:PSModulePath = $env:PSModulePath+';'+[System.Environment]::GetEnvironmentVariable("PSModulePath","User")
Import-Module DistributedCacheConfiguration
# ...do configuration things...

Related

How to permanently remove a UNC path from $env:PSModulePath for PowerShell?

Where I work, users' folders are redirected to a UNC path to save data. This seems to have affected my PowerShell: every time I start PowerShell, it attempts to load modules from the UNC path, and that takes a long time. It also slows down my use of cmdlets, because PowerShell searches the UNC path when resolving commands.
When I look at the output of $env:PSModulePath I can see the UNC directory. But it doesn't show up in System's Environment Variables dialog editor.
How can I get rid of this so that PowerShell doesn't keep looking for modules in the UNC directory? I understand that I can edit an existing session's $env:PSModulePath, but I want it gone permanently.
The Microsoft documentation page about_PSModulePath explains it. By default, PowerShell builds the $env:PSModulePath value by concatenating:
The user's "Documents" folder, which is likely the UNC path you are mentioning.
The contents of the PSModulePath system environment variable.
You can find out the path of the "Documents" folder as follows:
[Environment]::GetFolderPath('MyDocuments')
If you define a user-level PSModulePath environment variable in addition to the system-level one, it replaces your "Documents" folder in $env:PSModulePath, which then no longer contains the UNC path.
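For example, a minimal sketch of defining that user-level variable (it uses the SetEnvironmentVariable method described just below; the Documents-style path is an assumption, point it at any local folder you prefer):
# User-level value; new PowerShell sessions will use it instead of the "Documents" default
[Environment]::SetEnvironmentVariable('PSModulePath', "$env:USERPROFILE\Documents\WindowsPowerShell\Modules", 'User')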
To permanently change the PSModulePath environment variable, you can use the .NET method SetEnvironmentVariable on the System.Environment class.
Example:
[Environment]::SetEnvironmentVariable("PSModulePath", "Some_Path", "Machine")
Keep in mind that this overwrites the variable's previous value, so check what is there before you run the method; PSModulePath almost certainly contains paths you still need.
Also, if your company uses GPOs to set the PSModulePath variable, the only way to remove the UNC path is to talk to the administrator who manages that policy.
I don't know why it is not shown in your system variables GUI. You should be able to remove it with something along these lines:
# The entry you want to remove from PSModulePath
$valueToDelete = 'somepath'
# Note: without a target argument, Get/SetEnvironmentVariable operate on the
# current process only; pass 'User' or 'Machine' to change the persisted value.
$oldVal = [Environment]::GetEnvironmentVariable("PSModulePath")
$newVal = ($oldVal -split ";" | Where-Object { $_ -ne $valueToDelete }) -join ";"
[Environment]::SetEnvironmentVariable("PSModulePath", $newVal)
The way I resolved this was to set the module path in the all-users profile.
For me $PROFILE.AllUsersAllHosts returned:
C:\Program Files\PowerShell\7\profile.ps1
I created that file and added the following:
$ModulePath = "c:\program files\powershell\7\Modules"
[Environment]::SetEnvironmentVariable("PSModulePath", $ModulePath)
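If you'd rather not overwrite whatever is already in the variable, a sketch of an append-only variant (my variation on the above, not part of the original answer):
# Append the module folder only if it isn't already listed
$ModulePath = "c:\program files\powershell\7\Modules"
if (($env:PSModulePath -split ';') -notcontains $ModulePath) {
    $env:PSModulePath += ";$ModulePath"
}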
I ran into the same issue after removing PowerShell 7 from my device. What appears to happen is that the PSModulePath entry pointing to My Documents is removed from the environment variables under Advanced System Settings; PowerShell then falls back to the UNC path, which is configured by group policy.
The solution that worked for me was to add a new environment variable called PSModulePath with the value C:\Users\YourUserNameHere\Documents. Now PowerShell has a place to look for those modules.

wget not found by PowerShell script?

I have an old notebook with Windows 7 64-bit that executes a PowerShell script perfectly every Sunday. Unfortunately it started to crash as soon as the load increased, so I decided to get a new PC. This PC originally ran Windows 10 Pro 64-bit, and there too the script was executed every Sunday. Because of Microsoft's update policy I removed Windows 10 from the new PC and installed Windows 7 64-bit. But now the same script crashes because it does not find wget:
$wg = Start-Process wget.exe -wait -NoNewWindow -PassThru -ArgumentList $argList
GNU Wget is installed correctly (I think). It is located at:
C:\Program Files (x86)\GnuWin32\bin\wget.exe
It is even entered in the registry under HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\GnuWin32\Wget\1.11.4-1\setup, where InstallPath is C:\Program Files (x86)\GnuWin32.
But despite this, if I open the CMD console and enter wget (or wget.exe), I get:
The command "wget" is either misspelled or could not be found.
What do I have to do so that PowerShell finds wget reliably, even after a restart of the PC? Even Notepad++, for example, cannot be found from the CMD console although it is installed properly. What's wrong here?
If you want to be able to run a command without specifying its path you need to add the directory it resides in to the PATH environment variable. The install path in the SOFTWARE branch of the registry has nothing to do with it.
To add a directory to the PATH for the current and all future sessions you need to do something like this:
$dir = "${env:ProgramFiles(x86)}\GnuWin32\bin"
# set PATH environment variable for current session
$env:Path += ";${dir}"
# set PATH environment variable for future sessions
$path = [Environment]::GetEnvironmentVariable('PATH', 'Machine')
$path += ";${dir}"
[Environment]::SetEnvironmentVariable('PATH', $path, 'Machine')
Note, however, that the second step (setting the variable for future sessions) only works correctly if there are no Windows environment variables (%something%) used in $path, because the method saves the value as a REG_SZ in the registry. Windows only expands environment variables in the PATH variable if it's stored as a REG_EXPAND_SZ value.
If you do have regular Windows environment variables somewhere in $path you must manually write the value to the registry with the correct type.
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager\Environment'
Set-ItemProperty -Path $key -Name 'Path' -Value $path -Type ExpandString
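If you want to confirm the stored value kind afterwards, one way (a sketch, reusing $key from above):
# Get-Item on a registry path returns a RegistryKey; this should report ExpandString
(Get-Item $key).GetValueKind('Path')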
Addendum:
All of the above applies only if you want to do this programmatically, of course. For a manual approach you can always edit the environment variables via the GUI and restart PowerShell.

How to properly set path in Powershell and 7zip?

I have a Powershell script to create a self-extracting archive via 7zip. But it's receiving this error:
cannot find specified SFX module
The Powershell code is:
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"
sz a -t7z -sfx -ppassword $fullpath $filetostore
Both variables are valid. I've tried -sfx and -sfx7z.sfx; same error. The 7z.sfx file is indeed in the correct folder alongside 7-Zip. I can also verify the alias is working, as the 7-Zip copyright banner appears when running the code (so the 7-Zip command line is being invoked). This command works outside PowerShell.
I've also tried Set-Location into the 7-Zip folder, but got the same error. What am I missing?
It seems you should add the 7-Zip folder to your PATH environment variable to make things easier:
#find the 7-zip folder path
$7zPath = (Get-ChildItem "C:\Program Files","C:\Program Files (x86)" -Include "7-zip" -Recurse -ErrorAction SilentlyContinue).FullName
#add it to PATH environment variable
$env:Path += ";$7zPath;"
Then you can run 7z -sfx with no errors about the SFX module.
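A quick way to confirm the folder was picked up (assuming 7z.exe lives in the discovered path):
# Resolves 7z.exe through PATH and shows where it was found
Get-Command 7z.exe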
While Sodawillow has an answer that will work for the active session, a more permanent answer would be to add 7zip to the path for the Environment you are working in:
[Environment]::SetEnvironmentVariable("Path",$env:Path+";C:\Program Files\7-zip", [EnvironmentVariableTarget]::User)
The above one-liner adds 7-Zip to the active user account's path. Change 'User' to 'Machine' for the whole computer, or 'Process' for the currently running window. If you set 'User' or 'Machine', you will need to open a new PowerShell instance to see the change reflected.
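If you're not sure which target currently holds an entry, you can read each scope separately (a small sketch):
# Each call reads PATH from a different store
[Environment]::GetEnvironmentVariable('Path', 'Process')   # current session
[Environment]::GetEnvironmentVariable('Path', 'User')      # HKCU\Environment
[Environment]::GetEnvironmentVariable('Path', 'Machine')   # HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment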

Remove alias in script

I can remove an alias like so:
Remove-Item Alias:wget
Then trying the alias gives the expected result:
PS > wget
wget : The term 'wget' is not recognized as the name of a cmdlet, function,
script file, or operable program.
However, if I put the same into a script,
PS > cat wget.ps1
Remove-Item Alias:wget
wget
it gives unexpected result
cmdlet Invoke-WebRequest at command pipeline position 1
Supply values for the following parameters:
Uri:
The following seems to work
If (Test-Path Alias:wget) {Remove-Item Alias:wget}
If (Test-Path Alias:wget) {Remove-Item Alias:wget}
wget
The first line removes the Script level alias, and the second removes the Global level alias. Note that both need to be tested, because if the Global doesn't exist the Script isn't created.
Note also that this misbehaves in that it modifies the parent (global) scope even though the script is not dot-sourced.
Alternatively you can just dot source your original and that would work
. .\wget.ps1
The reason behind this is scope...
There are four scopes: local, script, private, and global
The rules for variables, functions and aliases say that if they are not defined in the current scope then PowerShell will search the parent scopes.
Built-in aliases like wget default to the AllScope option, which makes them visible from any child scope and inherited into any new child scope (I think that's a fair definition).
Get-Alias -Name wget | select Name,Options

Name Options
---- -------
wget AllScope
Once you remove the alias in the script scope (scope 0), PowerShell then finds the alias in the global/parent scope (scope 1).
When you dot-source the script, you are telling PowerShell to run it in the calling (global) scope in the first place, so you remove the global alias directly.
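One way to observe this (a sketch: Get-Alias takes a -Scope parameter; inside a running script, 0 is the current scope and 1 its parent):
# From inside a script, before removing anything:
Get-Alias -Name wget -Scope 0   # visible in the script's own scope (AllScope)
Get-Alias -Name wget -Scope 1   # the copy in the parent/global scope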
Try these...
E.g #1.
1.1) Remove the alias from the global scope.
Remove-Item -Path Alias:\wget
1.2) Create a new one (global scope) and make it private.
New-Alias -Name wget -Value dir -Scope private
1.3) Now the alias is not visible from a nested scope.
So if you run the script, the alias will not be found.
E.g #2.
2.1) Remove the alias from the global scope.
Remove-Item -Path Alias:\wget
2.2) Create a new one and make it AllScope
New-Alias -Name wget -Value dir -Option AllScope
2.3) Now run your script and it will work fine (it picks up the new alias, which points to dir, from the parent scope).
You could try the same with variables as well. They should be easier to demo (and play around with) since you can more easily play with the scope parameter when you use New/Get/Set/Remove -scope 1 or scope 0 OR use the scope modifiers, for example, $global:a.
Get-Command -Noun variable -ParameterName scope
Clearly, some of the other answers are on-track and have identified the fact that the alias is present in several scopes.
But if you're trying to invoke wget.exe rather than Invoke-WebRequest (or wget.ps1) in some script, the simplest answer might be to put the .exe extension on the command name when you invoke it. That way you don't have to fiddle with the aliases.
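For example (assuming wget.exe is on the PATH):
# Bypasses the wget alias entirely
wget.exe --version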
It looks like when you run the script from the PowerShell console, it inherits the Alias: drive from that session. So if wget is there, the removal works; if it's not, it fails. If instead you run the script in a new process, the alias is removed normally.
Here's the code I used for my demo:
Remove-Item -Path Alias:\wget -Force
wget
Start-Sleep -Seconds 100
The Start-Sleep will keep your console from closing immediately. Try running that script in a new process; I used the Run dialog, and PowerShell itself by typing powershell.exe -File pathtoscript.

Why don't .NET objects in PowerShell use the current directory?

When you use a .NET object from PowerShell, and it takes a filename, it always seems to be relative to C:\Windows\System32.
For example:
[IO.File]::WriteAllText('hello.txt', 'Hello World')
...will write C:\Windows\System32\hello.txt, rather than C:\Current\Directory\hello.txt
Why does PowerShell do this? Can this behaviour be changed? If it can't be changed, how do I work around it?
I've tried Resolve-Path, but that only works with files that already exist, and it's far too verbose to be doing all the time.
You can sync .NET's working directory with PowerShell's working directory:
[Environment]::CurrentDirectory = (Get-Location -PSProvider FileSystem).ProviderPath
After this line, .NET methods like [IO.Path]::GetFullPath and [IO.File]::WriteAllText will work without problems.
The reasons PowerShell doesn't keep the .NET notion of current working directory in sync with PowerShell's notion of the working dir are:
PowerShell working dirs can be in a provider that isn't even file system based, e.g. HKLM:\Software.
A single PowerShell process can have multiple runspaces, and each runspace can be cd'd into a different file system location. However, the .NET/process "working directory" is essentially a global for the whole process, which wouldn't work for a scenario where there can be multiple working dirs (one per runspace).
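The first point is easy to see for yourself (a quick demo; HKLM: has no file system equivalent):
Set-Location HKLM:\Software
$PWD                              # HKLM:\Software
[Environment]::CurrentDirectory   # unchanged; .NET knows nothing about registry paths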
For convenience, I added the following to my prompt function, so that it runs whenever a command finishes:
# Make .NET's current directory follow PowerShell's
# current directory, if possible.
if ($PWD.Provider.Name -eq 'FileSystem') {
    [System.IO.Directory]::SetCurrentDirectory($PWD)
}
This is not necessarily a great idea, because it means that some scripts (that assume that the Win32 working directory tracks the PowerShell working directory) will work on my machine, but not necessarily on others.
When you use filenames with .NET methods, the best practice is to use fully qualified path names, or to prefix the file name with the current location explicitly:
$pwd\foo.cer
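Or, slightly more robustly (Join-Path is a standard cmdlet; the file name here is illustrative):
[IO.File]::WriteAllText((Join-Path $PWD 'hello.txt'), 'Hello World')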
You can see which folder .NET uses by checking from the PowerShell console:
C:\> [Environment]::CurrentDirectory
C:\WINDOWS\system32\WindowsPowerShell\v1.0
That's probably because PowerShell is running in System32. When you cd to a directory in PowerShell, it doesn't actually change the working directory of powershell.exe.
See:
PowerTip article on syncing the two directories
Channel9 forum thread
I ran into the same problem a long time ago and now I add the following to the beginning of my profile:
# Setup user environment when running session under alternate credentials and
# logged in as a normal user.
if ((Get-PSProvider FileSystem).Home -eq "")
{
Set-Variable HOME $env:USERPROFILE -Force
$env:HOMEDRIVE = Split-Path $HOME -Qualifier
$env:HOMEPATH = Split-Path $HOME -NoQualifier
(Get-PSProvider FileSystem).Home = $HOME
Set-Location $HOME
}