I'm new to Visual Studio Code and starting to use it to develop PowerShell scripts.
I see that when I hover my cursor over the name of a built-in function I get an overview of the function's parameters, yet when I do the same with my own custom functions, nothing shows up.
How do I write documentation for my PowerShell functions so that Visual Studio Code displays it in that overview?
I tried this:
<#
.Description
Get-Function displays the name and syntax of all functions in the session.
#>
function test([string]$print){}
<# Setup the working directories if they do not exist #>
If(!(Test-Path $WORKING_DIRECTORY)) {test}
But that doesn't seem to work.
Thanks a lot
Adding a .PARAMETER entry to the comment-based help worked for me. Also note that I believe you have to save and run the script once for the tooltip to show.
<#
.Description
Get-Function displays the name and syntax of all functions in the session.
.PARAMETER print
The text to print.
#>
function test([string]$print){Write-Host $print}
If(Test-Path $env:userprofile) {test -print "Test123"}
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_comment_based_help?view=powershell-6
My previous answer was incorrect, though reading up on Comment Based Help is still a good idea for documenting your cmdlets in general.
In order for VS Code to know about your function definitions, the function needs to be defined in its internal PowerShell host. If you can open the PowerShell Integrated terminal, you can dot-source the script (run it like . .\myScript.ps1) to read the function definitions in. You may need to make sure the script doesn't execute its runtime code when dot-sourced: put your runtime code in an if block that checks whether the script was dot-sourced, but leave your function definitions outside of this conditional.
Once the script has been dot-sourced in the PowerShell Integrated terminal you will get the usage tooltip popup like you want. To make this "automatic", dot-source the script from $profile as resolved in the VS Code terminal.
This isn't particularly ideal and I am hoping to find a more streamlined solution, as having to dot-source every script I'm working with is cumbersome.
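The dot-source guard described above can be sketched like this (a minimal example; checking $MyInvocation.InvocationName for "." is the usual idiom for detecting dot-sourcing, and Get-Thing is just a placeholder function):

```powershell
# Function definitions stay at top level so dot-sourcing picks them up.
function Get-Thing {
    "thing"
}

# Runtime code only executes when the script is run normally;
# when dot-sourced, $MyInvocation.InvocationName is "." and this is skipped.
if ($MyInvocation.InvocationName -ne '.') {
    Get-Thing
}
```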
You need to save and run the script so that changes to the inline documentation are recognized. I can also recommend using the .SYNOPSIS 'tag'; that way the hover text even includes a readable description.
Example:
<#
.SYNOPSIS
Returns something
#>
function Get-SomeThing {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true)]
        [string]$Path
    )
}
shows something like this when hovering the function name or for code completion:
As the images below show, I have two PS1 script files.
main.ps1 (from which we call the function defined in call.ps1)
call.ps1
See the screenshot below for the structure; both are in the same folder.
When we call main.ps1 the first time in a new PowerShell window, it works fine (see below).
The real issue is that if we call the same function a second time, it stops working and gives an error (see below).
Am I missing something or understanding things incorrectly? Or is this not the way to call one PS1 from another PS1 file?
Surprisingly, if I close the existing terminal and open a new one, it works fine the first time again!
Does anyone have a clue what we're doing wrong?
Someone might write this up better than me.
You need to rename call.ps1 to call.psm1
Then change the code in main.ps1 to the following
Import-Module '.\call.psm1'
callingthefunction
Also, in your main.ps1 you used './call.ps1'; the slash direction is wrong. Look at my code above.
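For completeness, here is what call.psm1 might contain after the rename (the function name callingthefunction is taken from the calling code above; its body is just an illustration):

```powershell
# call.psm1 - a script module; every function defined here
# is exported by default when the module is imported.
function callingthefunction {
    Write-Host "Hello from call.psm1"
}
```

Note that if you edit the module after importing it, you may need Import-Module '.\call.psm1' -Force to reload it in the same session.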
For more information about modules, which is what you are creating, please read:
https://learn.microsoft.com/en-us/powershell/scripting/developer/module/how-to-write-a-powershell-script-module?view=powershell-7.2
I've created some modules and want to add some Get-Help files to give a little description or tutorial for my coworkers.
I've added the specific help files in the right directory folder and converted the files to the right Unicode format as described on the MSDN site.
Now when I run Get-Help ModuleName, nothing loads except the default help that PowerShell generates itself.
One way is to write the help inside the module file directly:
<#
.Synopsis
Whatever you want to display.
.Description
A description for the same.
.Parameter Start
Whatever you want to say about the Start parameter.
.Parameter End
Whatever you want to display about the End parameter.
.Example
# Show an example below.
Example of the function
#>
Other ways are listed HERE and in External Help for PS Modules.
Hope it helps.
I am trying to create a windows form using PowerShell and then manipulate the data entered. The problem I am having is that I can get the code to work if I copy/paste it into a PowerShell window however I cannot get it to work if I save the exact same code to a PS1 file. I don't understand that.
If you try out the example in this article: http://technet.microsoft.com/en-us/library/ff730941.aspx it works fine when you paste it into an open command prompt. But if I save the same code as a PS1 file and run the PS1 in the PowerShell window, I get nothing back when I click OK on the dialog.
Can someone help me understand why it doesn't work as a PS1 file?
The variable assignment statement ($x = $objTextBox.Text) sets the value in the default scope, which is Local.
Because the assignment statement is inside a script block ({...}), the variable's value is not visible outside that scope.
You can replace the assignment statement $x = $objTextBox.Text with:
$global:x=$objTextBox.Text
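Here is a minimal sketch of the scope problem and the fix, without any of the WinForms code (the literal strings are just placeholders):

```powershell
$x = 'before'

# Assignment inside a script block lands in the child (Local) scope;
# the caller's $x is untouched.
& { $x = 'from child scope' }
$x    # still 'before'

# Qualifying the assignment with global: writes to the global scope,
# so the caller sees the new value.
& { $global:x = 'from child scope' }
$x    # now 'from child scope'
```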
More info:
Set-Variable
about_Scopes
As Matej stated this is a scope issue, and he provided an excellent solution to being able to manipulate variables from within a child scope.
I wanted to provide an alternative way to work around this issue. That is to declare variables at the beginning of the script with New-Variable and use the -Option AllScope argument. For example I will use the script that was referenced in the OP. Insert a line at line 3 to read:
New-Variable -Name x -Option AllScope
Now when you run the script it will output whatever you typed into the box because the variable $x is consistent across all scopes.
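A minimal sketch of the AllScope behavior (placeholder values, no form code):

```powershell
# With -Option AllScope the variable is shared with every child scope,
# so an assignment inside a script block is visible to the caller.
New-Variable -Name x -Option AllScope -Value 'before'
& { $x = 'set in child scope' }
$x    # 'set in child scope'
```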
Is it possible to execute some code (eg. a function, script block, cmdlet etc) whenever the current path changes in the PowerShell console or ISE?
The scenario I am thinking of is to modify some environment variables and dot source some location-specific PowerShell functions depending on the current folder.
You have several options. You can remove the "cd" alias, then write a "cd" function and add the desired logic. The downside to this approach is that if someone uses Set-Location directly, your function is bypassed. Another option is to create a proxy command for Set-Location. Shay (and Kirk) have a video on how to do this here. The nice thing about this approach is that both built-in ways to change directory (cd and Set-Location) will go through your proxy command.
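A sketch of the first option, removing the cd alias and defining a cd function (the environment variable is just an example of location-specific logic):

```powershell
# Remove the built-in alias so our function takes over "cd".
Remove-Item Alias:cd -ErrorAction SilentlyContinue

function cd {
    param([string]$Path)
    Set-Location $Path
    # Location-specific logic runs after every directory change,
    # e.g. exposing the current folder name in an environment variable.
    $env:CURRENT_DIR_NAME = Split-Path -Leaf (Get-Location)
}
```

Remember that Set-Location (and its alias sl) still bypasses this function, which is why the proxy-command approach is more robust.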
What is the best and correct way to run a PowerShell script from another one?
I have a script a.ps1 from which I want to call b.ps1 which does different task.
Let me know your suggestions. Is dot sourcing the best option here?
Dot sourcing will run the second script as if it were part of the caller: all script-scope changes will affect the caller. If this is what you want, then dot-source.
However, it is more usual to call the other script as if it were a function (a script can use param and function-level attributes just like a function). In many ways a script is a PowerShell function, with the name of the file taking the place of the function's name.
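To make the two styles concrete, suppose b.ps1 takes a parameter (the file names follow the question; the param block and variable are illustrative assumptions):

```powershell
# b.ps1 - a script can declare parameters just like a function
param([string]$Name)
$greeting = "Hello, $Name"
$greeting
```

```powershell
# a.ps1
& .\b.ps1 -Name 'World'    # call operator: b.ps1 runs in its own scope;
                           # $greeting does not leak into a.ps1
. .\b.ps1 -Name 'World'    # dot source: runs in a.ps1's scope;
                           # $greeting is now visible in a.ps1
```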
Dot sourcing also makes it easier to convert your script(s) into a module at a later stage; you won't have to change the script(s) into functions.
Another advantage of dot sourcing is that you can add the function to your shell by adding the file that holds the functions to Microsoft.PowerShell_profile.ps1, meaning you have them available at all times (eliminating the need to worry about paths etc).
I have a short Write-Host at the top of my dot-sourced files with the name of the function and its common parameters, and I dot-source the functions in my profile. Each time I open PowerShell, the list of functions in my profile scrolls by. If, like me, you frequently forget the exact names of your functions/files, you'll appreciate this as the number of functions piles up over time.
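That setup might look like this in Microsoft.PowerShell_profile.ps1 (the path and function name here are examples):

```powershell
# Dot-source the file so its functions are available in every session.
. "$HOME\Scripts\Get-SomeThing.ps1"
```

And at the top of the dot-sourced file itself, before the function definitions:

```powershell
# Scrolls by on every new session as a reminder of what is available.
Write-Host 'Get-SomeThing -Path <string>'
```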
Old but still relevant.
I work with modules using Import-Module, which imports the module into the current PowerShell session.
To avoid stale cached versions and to always pick up the latest changes from the module, I first run Get-Module | Remove-Module, which removes all modules loaded in the current session.
Get-Module | Remove-Module
Import-Module '.\IIS\Functions.psm1'