I am trying to create a Windows form using PowerShell and then manipulate the data entered. The problem I am having is that the code works if I copy/paste it into a PowerShell window, but not if I save the exact same code to a PS1 file. I don't understand why.
If you try out the example in this article: http://technet.microsoft.com/en-us/library/ff730941.aspx it works fine when you paste it into an open PowerShell prompt. If you save the same code as a PS1 file and run the PS1 in the PowerShell window, I get nothing back when I click OK on the dialog.
Can someone help me understand why it doesn't work as a PS1 file?
The variable assignment statement ($x = $objTextBox.Text) sets the value within the default scope, which is Local.
Because the assignment statement is inside the {...} script block, the variable's value is not visible outside that scope.
You can replace the assignment statement $x = $objTextBox.Text with:
$global:x=$objTextBox.Text
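In the TechNet form script, the assignment happens inside the OK button's Add_Click script block. Here is a minimal self-contained sketch of the same effect, where the & { } block stands in for the event handler's child scope:

# Without the global: prefix, this assignment would create a local $x
# that disappears when the child scope ends
& {
    $global:x = 'text from the dialog'
}

$x   # now prints: text from the dialog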
More info:
Set-Variable
about_Scopes
As Matej stated, this is a scope issue, and he provided an excellent solution for manipulating variables from within a child scope.
I wanted to provide an alternative way to work around this issue: declare variables at the beginning of the script with New-Variable and use the -Option AllScope argument. As an example I will use the script that was referenced in the OP. Insert a line at line 3 of that script to read:
New-Variable -Name x -Option AllScope
Now when you run the script it will output whatever you typed into the box because the variable $x is consistent across all scopes.
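A minimal self-contained sketch of the AllScope behavior (again using & { } to stand in for the event handler's child scope):

# Declare $x once; AllScope makes the same variable visible in every
# child scope, so assignments inside script blocks stick
New-Variable -Name x -Option AllScope

& {
    $x = 'text from the dialog'   # writes to the AllScope variable
}

$x   # prints: text from the dialog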
I'm new to Visual Studio Code, and I'm starting to use it to develop PowerShell scripts.
I see that when I hover my cursor over the name of a built-in function I get an overview of the function's parameters, yet when I do it with my own custom functions, nothing appears.
How do I write documentation for my PowerShell functions so that it can be displayed in this overview by Visual Studio Code?
I tried to do this :
<#
.Description
Get-Function displays the name and syntax of all functions in the session.
#>
function test([string]$print){}
<# Setup the working directories if they do not exist #>
If(!(Test-Path $WORKING_DIRECTORY)) {test}
But that doesn't seem to work.
Thanks a lot
Adding .PARAMETER to the description worked for me. Also note I believe you have to save and run the script once for it to show.
<#
.Description
Get-Function displays the name and syntax of all functions in the session.
.PARAMETER print
The text to print.
#>
function test([string]$print){Write-Host $print}
If(Test-Path $env:userprofile) {test -print "Test123"}
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_comment_based_help?view=powershell-6
My previous answer was incorrect, though reading up on Comment Based Help is still a good idea for generally documenting your cmdlets.
In order for VS Code to know about your function definitions, the function needs to be defined in its internal PowerShell host. If you open the PowerShell Integrated terminal, you can dot-source the script (run it like . .\myScript.ps1) to read the function definitions in. You may need to make sure the script wasn't dot-sourced before running any code meant to execute: basically, put your runtime code in an if block checking whether the script was dot-sourced, but leave your function definitions outside of this conditional (see the sketch below).
Once the script has been dot-sourced in the PowerShell Integrated terminal you will get the usage tooltip popup like you want. To make this "automatic", dot-source the script from $profile as resolved from the VS Code terminal.
This isn't particularly ideal and I am hoping to find a more streamlined solution, as having to dot-source every script I'm working with is cumbersome.
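A minimal sketch of that guard (it relies on $MyInvocation.InvocationName being '.' when a script is dot-sourced; myScript.ps1 and Get-Widget are hypothetical names):

# myScript.ps1
function Get-Widget {
    'doing the actual work'
}

# Only execute the runtime code when the script is run normally; skip
# it when the script is dot-sourced just to load the function definitions
if ($MyInvocation.InvocationName -ne '.') {
    Get-Widget
}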
You need to save and run the script so that changes in the inline documentation are recognized. I can also recommend using the .SYNOPSIS 'tag'; that way the descriptive text is shown in the tooltip as well.
Example:
<#
.SYNOPSIS
Returns something
#>
function Get-SomeThing {
[CmdletBinding()]
param(
[Parameter(Mandatory=$true)]
[string]$Path
)
}
This shows the synopsis text when hovering over the function name, and in the code-completion popup.
I have the following parameter defined in Team City:
I want to pass this parameter into a powershell script I have (that will update the xml file with the version number).
But this inserts the literal text %version% into the script (no substitution is made for the actual value of the parameter).
However, I know my script is working because if I hardcode the values like this then it works:
Is there a way to get %version% to convert to the actual value when used as a PowerShell script argument?
If you put the parameter in quotes, "%version%", and change the script execution mode to "Execute .ps1 script with -File argument", then it should resolve and inject correctly.
e.g.
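A hypothetical setup, since the original screenshot is not available: set the build step's Script arguments field to -Version "%version%" and read it in the script with a param block (the Version parameter name is just an example):

# update-version.ps1
param(
    # TeamCity substitutes %version% before the script runs
    [string]$Version
)
Write-Host "Updating XML files to version $Version"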
Hope this helps
You can also use environment variables (the env. prefix); that worked for me.
Is it possible to execute some code (eg. a function, script block, cmdlet etc) whenever the current path changes in the PowerShell console or ISE?
The scenario I am thinking of is to modify some environment variables and dot source some location-specific PowerShell functions depending on the current folder.
You have several options. You can remove the "cd" alias and then write a "cd" function and add the desired logic. The downside to this approach is that if someone uses Set-Location, your function is bypassed. Another option is to create a proxy command for Set-Location. Shay (and Kirk) have a video on how to do this here. The nice thing about this approach is that the built-in ways to change dir (cd and Set-Location) will go through your proxy command.
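A minimal sketch of the first approach (the Write-Host line is a placeholder for your own location-specific logic):

# Aliases take precedence over functions, so remove the built-in alias first
Remove-Item Alias:cd -ErrorAction SilentlyContinue

function cd {
    param([string]$Path)
    Set-Location $Path
    # Hypothetical hook: react to the new location here
    Write-Host "Directory changed to $PWD"
}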
What is the best and correct way to run a PowerShell script from another one?
I have a script a.ps1 from which I want to call b.ps1 which does different task.
Let me know your suggestions. Is dot sourcing the best option here?
Dot sourcing will run the second script as if it were part of the caller: all script-scope changes will affect the caller. If this is what you want, then dot-source.
However, it is more usual to call the other script as if it were a function (a script can use param and function-level attributes just like a function). In many ways a script is a PowerShell function, with the name of the file replacing the name of the function.
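For example, a minimal sketch (b.ps1 and its -Name parameter are hypothetical):

# b.ps1
param([string]$Name)
"Hello, $Name"

# a.ps1 -- calls b.ps1 as a command; b.ps1's variables stay in its own scope
& "$PSScriptRoot\b.ps1" -Name 'World'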
Dot sourcing makes it easier to convert your script(s) into a module at a later stage, since you won't have to change the script(s) into functions.
Another advantage of dot sourcing is that you can add the functions to your shell by adding the file that holds them to Microsoft.PowerShell_profile.ps1, meaning you have them available at all times (eliminating the need to worry about paths, etc.).
I have a short Write-Host at the top of each dot-sourced file with the name of the function and its common parameters, and I dot-source the functions in my profile. Each time I open PowerShell, the list of functions in my profile scrolls by (if, like me, you frequently forget the exact names of your functions/files, you'll appreciate this as the number of functions starts to pile up).
Old but still relevant.
I work with modules using Import-Module, which imports the module into the current PowerShell session.
To avoid stale cached definitions and to always pick up the latest changes to the module, I first run Get-Module | Remove-Module, which removes all modules loaded in the current session.
Get-Module | Remove-Module
Import-Module '.\IIS\Functions.psm1'
How can I add a custom function/object to the standard set of recognized functions in PowerShell so that I can call it from the PowerShell shell?
Thanks
You can put the function into your profile script. You can find out where this is by looking at the variable $profile. That script runs automatically when PowerShell starts (if you are allowed to run scripts), and functions declared in it will be available in every session.
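A minimal sketch (Get-Hello is a hypothetical function name):

# Create the profile script if it does not exist yet
if (-not (Test-Path $profile)) {
    New-Item -ItemType File -Path $profile -Force | Out-Null
}

# Append a function definition; it will be available in every new session
Add-Content -Path $profile -Value @'
function Get-Hello {
    "Hello from my profile!"
}
'@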