Can I load a .ps1 file from within a .ps1 file?
The end goal is a .ps1 file that I reference from my profile on all my computers: a floating profile where I can put paths to pocket utilities. I'll probably keep it in a code repo or share it through some other mechanism.
This is most definitely supported in PowerShell, and I have this exact setup on my machine.
Normal Profile.ps1 Contents
. ~\winconfig\PowerShell\Profile.ps1
The Profile.ps1 in WinConfig\PowerShell is my version-controlled profile, which has all of my custom fun inside of it. I have a script that simply generates the standard Profile.ps1 in the normal PowerShell directory whenever I get a new machine.
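That generator can be as small as this sketch (assuming the repo is cloned to ~\winconfig, as in the stub above):
# Create the profile directory if needed, then write the one-line stub
New-Item -Path (Split-Path $PROFILE) -ItemType Directory -Force | Out-Null
Set-Content -Path $PROFILE -Value '. ~\winconfig\PowerShell\Profile.ps1'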
I keep all my scripts primarily on a flash drive (with some backups on computers at home/work). On the flash drive I have a profile script that creates all my custom conversion functions, drives, etc.
What I want is to load that profile script from the flash drive every time I run PowerShell.
So the code in my $profile on all the computers I work with looks like this:
# Drive letter of my flash drive; obviously different on each computer
$global:psflash = "g:\"
# If the flash drive is available, load my profile script from it
if (Test-Path $psflash) {
    . (Join-Path $psflash 'dev\powershell\PsProfile.ps1')
}
A nice side effect is that all my scripts can use the global variable $psflash to import other scripts or modules they depend on, in the same way my profile does (using Join-Path), at any later point.
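For example, a script on the flash drive can pull in its own dependencies the same way (Tools.ps1 and MyUtils are hypothetical names):
# Dot-source a helper script and import a module, both relative to the flash drive
. (Join-Path $global:psflash 'dev\powershell\Tools.ps1')
Import-Module (Join-Path $global:psflash 'dev\powershell\modules\MyUtils')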
Related
I'm trying to create a PowerShell module to store some reusable utility functions. I created the script module PsUtils.psm1 and the script module manifest PsUtils.psd1 (I used these docs). My problem is that when I import this module in another script, VS Code does not suggest parameter names, and when I hover the cursor over the function I see no parameter information, either.
PsUtils.psm1
function Get-Filelist {
    Param(
        [Parameter(Mandatory = $true)]
        [string[]]
        $DirectoryPath
    )
    Write-Host "DIR PATH: $DirectoryPath"
}
PsUtils.psd1 (excerpt)
...
FunctionsToExport = '*'
I have the PowerShell extension installed. Do I need to install anything else to make the suggestions work? What am I missing?
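For context, the importing script does essentially the following (a reconstruction, since the exact script isn't shown; the path assumes the module sits next to the script):
Import-Module "$PSScriptRoot\PsUtils.psd1"
Get-Filelist -DirectoryPath 'C:\temp'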
Generally speaking, only auto-loading modules - i.e., those in one of the directories listed in the environment variable $env:PSModulePath - are automatically discovered.
As of version v2022.7.2 of the PowerShell extension, the underlying PowerShell editor services make no attempt to infer from the current source-code file what modules in nonstandard directories are being imported via source code in that file, whether via Import-Module or using module.
Doing so would be the prerequisite for discovering the commands exported by the modules being imported.
Doing so robustly sounds virtually impossible to do with the static analysis that the editor services are limited to performing, although it could work in simple cases; if I were to guess, such a feature request wouldn't be entertained, but you can always ask.
Workarounds:
Once you have imported a given module from a nonstandard location into the current session - either manually via the PIC (PowerShell Integrated Console) or by running your script (assuming the Import-Module call succeeds) - the editor will provide IntelliSense for its exported commands from that point on, so your options are (use one of them):
Run your script in the debugger at least once before you start editing. You can place a breakpoint right after the Import-Module call and abort the run afterwards - the only prerequisite is that the file must be syntactically valid.
Run your Import-Module command manually in the PIC, replacing $PSScriptRoot with your script file's directory path.
Note: It is tempting to place the cursor on the Import-Module line in the script in order to use F8 to run just this statement, but, as of v2022.7.2, this won't work in your case, because $PSScriptRoot is only valid in the context of running an entire script.
GitHub issue #633 suggests adding special support for $PSScriptRoot; while the proposal has been green-lighted, no one has stepped up to implement it since.
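For example, if your script lives in C:\src\myproject (a hypothetical path), running the following once in the PIC is enough to light up IntelliSense afterwards:
Import-Module C:\src\myproject\PsUtils.psd1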
(Temporarily) modify the $env:PSModulePath variable to include the path of your script file's directory.
The most convenient way to do that is via the $PROFILE file that is specific to the PowerShell extension, which you can open for editing with psedit $PROFILE from the PIC.
Note: Make sure that profile loading is enabled in the PowerShell extension's settings.
E.g., if your directory path is /path/to/my/module, add the following:
$env:PSModulePath += "$([IO.Path]::PathSeparator)/path/to/my/module"
The caveat is that all scripts / code that is run in the PIC will see this updated $env:PSModulePath value, so at least hypothetically other code could end up mistakenly importing your module instead of one expected to be in the standard locations.
Note that GitHub issue #880 is an (old) proposal to allow specifying $env:PSModulePath entries as part of the PowerShell extension settings instead.
On a somewhat related note:
Even when a module is auto-discovered / has been imported, IntelliSense only covers its exported commands, whereas while you're developing that module you'd also like to see its private commands. Overcoming this limitation is the subject of GitHub issue #104.
I'm fairly new to the world of PowerShell, and currently I'm trying to push a PowerShell script via Intune to the company devices (all Windows 10 21H2 machines) that will show file extensions in File Explorer.
So far, I've found this:
Set-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced' -Name 'HideFileExt' -Value 0
The PS script is pushed via Intune to a test device, and the monitor tells me the policy is applied successfully, but the file extensions are still not visible.
Is there something wrong with the line of code?
My original comment which helped:
The script works fine. I am positive that it is not being applied successfully, despite Intune telling you it did. While it is not part of the question, I suggest you check the user context in which the script is applied, and whether Event Viewer or any other possible source tells you why the script did not apply correctly. Also, having tried the script locally myself: you need to refresh the Explorer window via F5 for the change to show.
The solution was to deploy the script with system/device rights, since it was indeed being run in the user context, which solved the problem.
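For reference, a minimal sketch that sets the value and then restarts Explorer so the change is picked up without a manual F5 (an assumption on my part: restarting Explorer closes open windows, so check whether that's acceptable in your environment):
# Show file extensions for the current user
Set-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced' -Name 'HideFileExt' -Value 0
# Explorer restarts itself automatically and re-reads the setting
Stop-Process -Name explorer -Force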
I am building a simple PowerShell script for AD management.
I need to run this script as Admin from a .exe file (portable between Domain Controllers and/or environments). Any suggestions on how to make the .exe request admin privileges from the end user (the UAC shield pop-up, "Do you want to allow this app to make changes...")?
Consider using ps2exe - https://github.com/MScholtes/PS2EXE
This can create an exe from a ps1 file and add things like required admin privileges, etc.
ps2exe .\source.ps1 .\target.exe -requireAdmin
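If you'd rather keep a plain .ps1, a common self-elevation pattern (my suggestion, not part of PS2EXE) triggers the same UAC prompt:
# Relaunch elevated if not already running as Administrator
$principal = [Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Start-Process powershell.exe -ArgumentList "-File `"$PSCommandPath`"" -Verb RunAs
    exit
}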
Every day when I log into VDI, my VS Code extensions get removed.
So I need to install them every day. Is there any workaround to keep the extensions persistent, so that I don't have to download/install them again every time I log in?
Any help would be much appreciated and thanks in advance.
It seems that you are using a nonpersistent VDI, so you should ask your IT to install the VS Code extensions that you need in the image stored on the servers.
Excerpt retrieved here:
There are two main approaches to VDI: persistent and nonpersistent. Persistent VDI provides each user with his or her own desktop image, which can be customized and saved for future use, much like a traditional physical desktop. Nonpersistent VDI provides a pool of uniform desktops that users can access when needed. Nonpersistent desktops revert to their original state each time the user logs out.
Found this on GitHub and it is down near the bottom... https://github.com/microsoft/vscode/issues/17691
Create an environment variable named VSCODE_EXTENSIONS. Set it to the path where you want the extensions to be stored. We used a network share in our implementation to keep extensions persistent in a nonpersistent VDI (e.g., VSCODE_EXTENSIONS = \\Server\Share\%USERNAME%\.vscode).
This environment variable must be in place before VSCode launches. We are utilizing this with VSCode 1.52.1 and it is working for us.
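If you want to set that variable from PowerShell rather than through the system dialog, a sketch (the share path is hypothetical; 'User' persists it for the current user, and VS Code must be restarted afterwards):
# Persist VSCODE_EXTENSIONS in the user's environment
[Environment]::SetEnvironmentVariable('VSCODE_EXTENSIONS', "\\Server\Share\$env:USERNAME\.vscode", 'User')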
This is how I got it to work in my environment. You need to install the extensions (the .vsix packages), then copy them from that local profile to a location any user can access. After that, create a GPO that runs the script below at every logon, with its scope set to your VDI access AD group. The logon GPO is located at User Configuration > Policies > Windows Settings > Scripts > Logon > PowerShell. This may be a crude way of doing it, but it's working in my environment.
# This will not work unless there are extensions on the root of the default user
# folder. Install Visual Studio Code and its extensions first, then copy the
# entire "\.vscode" folder from the user profile you installed it into and onto
# the default user profile root folder.

# This script tests whether the extensions already exist in the profile of the
# user logging in, using the path in the variable below.
$vscodeextensions = "$env:USERPROFILE\.vscode\extensions\ms-vscode.cpptools-1.13.2"

# Allow this session to run scripts (Process scope avoids needing admin rights).
Set-ExecutionPolicy Bypass -Scope Process

if (-not (Test-Path $vscodeextensions)) {
    Copy-Item -Path "C:\Users\Default\.vscode\" -Destination "$env:USERPROFILE\" -Force -Recurse
}
else {
    Write-Host "Extensions already copied"
}
I have created a custom module as a PowerShell class following, roughly, the instructions available at Writing a custom DSC resource with PowerShell classes. The intent is to connect to Azure File Storage and download some files. I am using Azure Automation DSC as my pull server.
Let me start by saying that, when run through the PowerShell ISE, the code works a treat. Something goes wrong when I upload it to Azure, though - I get the error Unable to find type [CloudFileDirectory]. This type specifier comes from assemblies referenced by the module Azure.Storage, which is definitely in my list of automation assets.
At the tippy top of my psm1 file I have
Using namespace Microsoft.WindowsAzure.Storage.File

[DscResource()]
class tAzureStorageFileSync
{
    ...

    # Create the search context
    [CloudFileDirectory] GetBlobRoot()
    {
        ...
    }

    ...
}
I'm not sure whether this Using is supported in this scenario or not, so let's call that Question 1.
To date I have tried:
Adding RequiredModules = @( "Azure.Storage" ) to the psd1 file (see the manifest sketch after this list)
Adding RequiredAssemblies = @( "Microsoft.WindowsAzure.Storage.dll" ) to the psd1 file
Shipping the actual Microsoft.WindowsAzure.Storage.dll file in the root of the module zip that I upload (that has a terrible smell about it)
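In manifest form, those first two attempts look like this (a sketch; RootModule and ModuleVersion are hypothetical placeholders, not my real values):
@{
    RootModule         = 'MyDscModule.psm1'   # hypothetical root module name
    ModuleVersion      = '1.0.0'
    RequiredModules    = @( 'Azure.Storage' )
    RequiredAssemblies = @( 'Microsoft.WindowsAzure.Storage.dll' )
}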
When I deploy the module to Azure with New-AzureRmAutomationModule it uploads and processes just fine. The Extracting activities... step works and gives no errors.
When I compile a configuration, however, the compilation process fails with the Unable to find type error I mentioned.
I have contemplated adding an Import-Module Azure.Storage above the class declaration, but I've never seen that done anywhere else before.
Question 2 Is there a way I can compile locally using a similar process to the one used by Azure DSC so I can test changes more quickly?
Question 3 Does anyone know what is going wrong here?
Question 1/3:
If you create classes in PowerShell and use other classes within them, ensure that those classes are present BEFORE loading the script file that contains your new class.
I.e.:
Loader.ps1:
Import-Module Azure.Storage
. .\MyDSC-Class.ps1
PowerShell checks whether it can find all the types you refer to while interpreting the script, so all types must be loaded before that happens. You can do this by creating a loader script that loads all dependencies first and then loads your script.
For question 2, if you register your machine as a hybrid worker you'll be able to run the script faster and compile locally. (For more details on hybrid workers, see https://azure.microsoft.com/en-us/documentation/articles/automation-hybrid-runbook-worker/.)
If you want an easy way to register the hybrid worker, you can run this script on your local machine (https://github.com/azureautomation/runbooks/blob/master/Utility/ARM/New-OnPremiseHybridWorker.ps1). Just make sure you have WMF 5 installed on your machine beforehand.
For authoring DSC configurations and testing locally, I would look at the Azure Automation ISE Add-On available on https://www.powershellgallery.com/packages/AzureAutomationAuthoringToolkit/0.2.3.6 You can install it by running the below command from an Administrator PowerShell ISE window.
Install-Module AzureAutomationAuthoringToolkit -Scope CurrentUser
For loading libraries, I have also noticed that I need to call Import-Module in order to be able to call methods. I need to do some research to determine the requirement for this. You can see an example I wrote to copy files from Azure Storage using a storage key at https://github.com/azureautomation/modules/tree/master/cAzureStorage
As you probably don't want to have to deploy the storage library on all nodes, I included the storage library in the sample module above so that it will be automatically distributed to all nodes by the automation service.
Hope this helps,
Eamon