I've created some modules and want to add Get-Help files to give a short description or tutorial for my coworkers.
I've added the specific help files to the right directory and converted them to the right Unicode format, following the description on the MSDN site.
Now when I run Get-Help ModuleName, nothing loads except the normal help that PowerShell provides. What am I missing?
One way is to write the help inside the module file directly:
<#
.Synopsis
Whatever you want to display.
.Description
A description for the same.
.Parameter Start
Whatever you want to say about the Start parameter.
.Parameter End
Whatever you want to say about the End parameter.
.Example
# Show an example below.
Example of the function
#>
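A minimal sketch of how that looks inside a module file (MyModule.psm1 and Get-Thing are hypothetical names):

# MyModule.psm1 - comment-based help placed inside the exported function
function Get-Thing {
    <#
    .Synopsis
    Returns the thing you ask for.
    .Parameter Name
    Name of the thing to return.
    .Example
    Get-Thing -Name 'Report'
    #>
    param([string]$Name)
    "You asked for $Name"
}
Export-ModuleMember -Function Get-Thing

# After Import-Module, the help is queried per command rather than per module:
#   Get-Help Get-Thing -Full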
Other approaches are listed HERE and in External Help for PS Modules.
Hope it helps.
I would like to know if it is possible to get a list of all the command-line arguments for the "winword" executable.
In the official docs there are some examples:
https://support.microsoft.com/en-us/office/command-line-switches-for-microsoft-office-products-079164cd-4ef5-4178-b235-441737deb3a6
But not all the commands are there; for example, "/mFilePrintDefault" and "/mFileExit" are missing.
The objective is to be able to print a PDF without user action.
The macro commands are going to vary from one version to the next, and of course you can start with your own macro file or use an AutoMacro.
The two examples you give are part of the "Fixed Commands".
There are three lists available, but I have no idea which commands work as a command-line macro in a specific version; you would need to try them out.
ECMA-376 Office Open XML File Formats [ECMA-376], Fixed commands are here https://learn.microsoft.com/en-us/openspecs/office_standards/ms-oe376/37bf80f7-1d74-47f6-8721-aa077cadca4d
ISO/IEC-29500 Office Open XML File Formats [ISO/IEC-29500:2012], Fixed commands are here
https://learn.microsoft.com/en-us/openspecs/office_standards/ms-oi29500/1ecf33cf-3601-45f0-89fb-0ab824739343
And listed under Basic Types as https://learn.microsoft.com/en-us/openspecs/office_file_formats/ms-doc/e86aecb4-bb67-4de2-9b06-37771115f274
A common example showing more than one command in sequence is:
"path to\winword.exe" "path and filename.docx" /q /n /mFilePrintDefault /mFileExit
But your default printer needs to be a PDF printer; otherwise your macro needs to be more complex.
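As a sketch, the same call from PowerShell might look like this (the Word path and document path are assumptions you would adjust):

# /q suppresses the splash screen, /n starts a new instance without an extra blank document
$word = 'C:\Program Files\Microsoft Office\root\Office16\WINWORD.EXE'
$doc  = 'C:\Temp\report.docx'
Start-Process -FilePath $word -ArgumentList "`"$doc`"", '/q', '/n', '/mFilePrintDefault', '/mFileExit' -Wait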
I'm new to Visual Studio Code, and I'm starting to use it to develop PowerShell scripts.
I see that when I hover my cursor over the name of a function I get an overview of the function's parameters, yet when I do it with my own custom functions, nothing shows up.
How do I declare documentation for my PowerShell functions so that Visual Studio Code can display it in that overview?
I tried this:
<#
.Description
Get-Function displays the name and syntax of all functions in the session.
#>
function test([string]$print){}
<# Setup the working directories if they do not exist #>
If(!(Test-Path $WORKING_DIRECTORY)) {test}
But that doesn't seem to work.
Thanks a lot
Adding .PARAMETER to the description worked for me. Also note I believe you have to save and run the script once for it to show.
<#
.Description
Get-Function displays the name and syntax of all functions in the session.
.PARAMETER print
The text to print.
#>
function test([string]$print){Write-Host $print}
If(Test-Path $env:userprofile) {test -print "Test123"}
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_comment_based_help?view=powershell-6
My previous answer was incorrect, though reading up on Comment Based Help is still a good idea for generally documenting your cmdlets.
In order for VS Code to know about your function definitions, the function needs to be defined in its internal PowerShell host. If you open the PowerShell Integrated Console, you can dot-source the script (run it like . .\myScript.ps1) to read the function definitions in. You may need to guard any code meant to execute at run time (basically, put your runtime code in an if block that checks whether the script was dot-sourced, but leave your function definitions outside of this conditional).
Once the script has been dot-sourced in the PowerShell Integrated Console, you will get the usage tooltip popup like you want. To make this "automatic", dot-source the script from $profile as resolved in the VS Code terminal.
This isn't particularly ideal, and I am hoping to find a more streamlined solution, as having to dot-source every script I'm working with is cumbersome.
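A rough sketch of that dot-source guard (the $MyInvocation check is a common way to detect it, but treat it as an assumption for your setup):

function Test-Something {
    <#
    .SYNOPSIS
    Example function whose definition VS Code should pick up after dot-sourcing.
    #>
    param([string]$Name)
    Write-Host "Hello $Name"
}

# Run the script body only when the file is executed, not when it is dot-sourced
if ($MyInvocation.InvocationName -ne '.') {
    Test-Something -Name 'World'
}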
You need to save and run the script so that changes to the inline documentation are recognized. I can also recommend using the .SYNOPSIS keyword; that way a readable summary is shown in the tooltip.
Example:
<#
.SYNOPSIS
Returns something
#>
function Get-SomeThing {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$true)]
        [string]$Path
    )
}
This shows the synopsis when hovering over the function name, and in code completion.
When you do Get-Help SomeCommand -Full, under each parameter, after the description, there are some additional parameter properties. One of those properties is 'Accept wildcard characters?'. When I create the help information for a custom script cmdlet, how do I specify that a parameter accepts wildcards?
In the param section of your script, add the attribute SupportsWildcards().
ex.:
param (
[SupportsWildcards()][String]$variable
)
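Combined with comment-based help, a small sketch (Get-Thing is a hypothetical function) would be:

<#
.SYNOPSIS
Gets items whose names match a pattern.
.PARAMETER Name
Name of the item. Wildcards are permitted.
#>
function Get-Thing {
    param (
        [SupportsWildcards()][string]$Name
    )
    Get-ChildItem -Filter $Name
}

# Get-Help Get-Thing -Full should now show 'Accept wildcard characters?  true' under -Name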
If you want to be able to do this, it will require a few things. First off, you either have to create a .dll file (which you are not doing) or you have to create a module. I am not going to go into all of the ins and outs of creating a module; there are already many well-written guides on that out there on the internet that you can look up.
As part of your module you can include .XML files that provide help information, similar to the comment-based help available for individual scripts. The XML style does have some advantages, such as consistency and some advanced features, but it requires more effort. Towards this end I would strongly suggest reading Writing Help for Windows PowerShell Modules, as it explains where to place your XML files, how to structure them, the required headers, and so on.
If it were me I'd probably copy an existing XML help file and edit it to suit my needs for the cmdlet, find and read one of the quick-and-dirty HowTo's about creating a module, and then give up on the idea since it's not worth the effort involved to just add that 'Supports Wildcards' flag (in my opinion) if this all started out just as a basic script with commented help.
But the answer is, create a module and supporting XML based Help file for your cmdlet. With that you can add support for the Accepts Wildcards flag for your parameters.
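For reference, a rough sketch of what the module side of that looks like (file and function names are hypothetical): each function points at the MAML file via the .ExternalHelp comment keyword, and the XML sits in a culture subfolder next to the module.

# MyModule.psm1
function Get-Thing {
    # .ExternalHelp MyModule-help.xml
    [CmdletBinding()]
    param(
        [SupportsWildcards()]
        [string]$Name
    )
}

# Layout Get-Help expects:
#   MyModule\MyModule.psm1
#   MyModule\en-US\MyModule-help.xml   (the MAML help content)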
Is it possible to execute some code (eg. a function, script block, cmdlet etc) whenever the current path changes in the PowerShell console or ISE?
The scenario I am thinking of is to modify some environment variables and dot source some location-specific PowerShell functions depending on the current folder.
You have several options. You can remove the "cd" alias and then write a "cd" function and add the desired logic. The downside to this approach is that if someone uses Set-Location, your function is bypassed. Another option is to create a proxy command for Set-Location. Shay (and Kirk) have a video on how to do this here. The nice thing about this approach is that the built-in ways to change dir (cd and Set-Location) will go through your proxy command.
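A minimal sketch of the first option (the extra logic is just a placeholder for your own setup code):

# Replace the built-in 'cd' alias with a function that runs extra logic after each change
Remove-Item Alias:cd -Force -ErrorAction SilentlyContinue

function cd {
    param([Parameter(Mandatory)][string]$Path)
    Set-Location -Path $Path
    # Location-specific setup goes here: adjust environment variables, dot-source helpers, etc.
    Write-Host "Changed directory to $PWD"
}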
What is the best and correct way to run a PowerShell script from another one?
I have a script a.ps1 from which I want to call b.ps1 which does different task.
Let me know your suggestions. Is dot sourcing the best option here?
Dot sourcing will run the second script as if it were part of the caller: all script-scope changes will affect the caller. If this is what you want, then dot-source.
However, it is more usual to call the other script as if it were a function (a script can use param and function-level attributes just like a function). In many ways a script is a PowerShell function, with the name of the file replacing the name of the function.
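Sketched with hypothetical paths and a hypothetical -SomeParameter, the two variants from inside a.ps1 would be:

# 1. Call b.ps1 like a function: it runs in its own scope and can take parameters
& "$PSScriptRoot\b.ps1" -SomeParameter 'value'

# 2. Dot-source b.ps1: its functions and variables land in the caller's scope
. "$PSScriptRoot\b.ps1"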
Dot sourcing makes it easier to convert your script(s) into a module at a later stage; you won't have to change the scripts into functions.
Another advantage of dot sourcing is that you can add the functions to your shell by adding the file that holds them to Microsoft.PowerShell_profile.ps1, meaning they are available at all times (eliminating the need to worry about paths etc.).
I have a short Write-Host at the top of my dot-sourced files with the name of the function and its common parameters, and I dot-source the functions in my profile. Each time I open PowerShell, the list of functions in my profile scrolls by (if, like me, you frequently forget the exact names of your functions/files, you'll appreciate this as the number of functions starts to pile up over time).
Old but still relevant.
I work with modules using Import-Module; this imports the module into the current PowerShell session.
To avoid caching and to always have the latest changes from the module, I first run Get-Module | Remove-Module, which clears all the loaded modules from the current session.
Get-Module | Remove-Module
Import-Module '.\IIS\Functions.psm1'
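As far as I know, Import-Module -Force achieves a similar effect for a single module in one step:

Import-Module '.\IIS\Functions.psm1' -Force   # removes the loaded copy and re-imports it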