Assuming PowerShell has a limit of N characters in a command, how can I pass more than N characters to a PowerShell cmdlet? According to https://support.microsoft.com/en-in/kb/830473, the character limit is 8191, but that applies to cmd.exe; I am not sure what the limit is for PowerShell. So if I have an input of more than 8k characters, can I redirect the input to PowerShell to work around this problem (a solution along the lines of what the referenced document suggests)?
Eg:
powershell console $> echo "a very long string" // The whole command, including echo and the very long string, totals fewer than 8192 characters on the PowerShell console. When I execute this, I get the whole string as output on the console.
powershell console $> echo "a very long string // I try to add more characters to the very long string, but PowerShell doesn't let me once the total goes above 8192, presumably because I have reached the limit on the number of characters I can enter.
What I want:
powershell console $> echo // Place my input (which is more than 8192 characters) in a file and provide that file as input to echo; echo should then display the complete string on the console, thereby circumventing the limit on the number of characters in a command.
The echo command is only used for illustration; I actually want to use a custom cmdlet instead, so please consider this a valid scenario.
Edit 2:
psm1 file:
Function DoSomething {
[CmdletBinding()]
Param(
[Parameter(Mandatory = $False)]
[string]$v1,
[Parameter(Mandatory = $False)]
[string]$v2
)
Begin {}
Process {
Write-Output "hello $v1 | $v2"
}
}
Text file content, say content.txt (kept short for illustration, but assume it can be more than 8k characters):
-v1 "t1" -v2 "qwe"
Now when I do
powershell Console$> DoSomething (Get-Content content.txt)
the output that I get is
hello -v1 "t1" -v2 "qwe" |
I expect the output to be
hello -v1 "t1" | -v2 "qwe"
so that the execution of the cmdlet can happen without any issues. I tried this with an example of more than 8k characters in the text file and it is able to print the output; it's just that the parameters aren't getting separated. The command used to provide the input to the cmdlet doesn't have to be Get-Content; it can be anything, as long as it works.
You misunderstand how parameters in PowerShell functions work. The output of Get-Content is an array of strings (one string for each line in the file), but the entire array is passed to the first parameter. Also, a string isn't magically split so that the substrings can go to several parameters. How should PowerShell know which way to split the string?
A better way to deal with such input data is to have your function accept input from the pipeline:
Function DoSomething {
[CmdletBinding()]
Param(
[Parameter(
Mandatory=$false,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true
)]
[string]$v1,
[Parameter(
Mandatory=$false,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true
)]
[string]$v2
)
Process {
Write-Output "hello $v1 | $v2"
}
}
and define the data as a CSV (column names matching the parameter names):
v1,v2
"-v1 ""t1""","-v2 ""qwe"""
so that you can pipe the data into the function:
Import-Csv content.csv | DoSomething
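With the function defined as above, that pipeline should produce output along these lines (property-name binding fills $v1 and $v2 from the matching CSV columns):
hello -v1 "t1" | -v2 "qwe"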
With the function built like this you could also define the data as a hashtable and splat it:
$data = @{
'v1' = '-v1 "t1"'
'v2' = '-v2 "qwe"'
}
DoSomething @data
For more information about function parameters see about_Parameters and about_Functions_Advanced_Parameters.
Alternatively, you can pass the path of a file and read that file's content inside your PowerShell script. There is no limit then...
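A minimal sketch of that approach (the cmdlet name and the -Path parameter are made up for illustration):
Function DoSomethingFromFile {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory = $True)]
        [string]$Path
    )
    Process {
        # Only the short file path travels on the command line;
        # the long content is read here, inside the cmdlet.
        $content = Get-Content -Path $Path -Raw
        Write-Output "hello $content"
    }
}
DoSomethingFromFile -Path C:\tmp\content.txt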
Related
I want to use a string inside a PowerShell script. It should be handed over like a variable when executing the script like this:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -c C:\temp\myscript.ps1 "string"
That is working well with just one word. But my string looks like this and should be handed over:
<UserInputs><UserInput Question="Gruppenname" Answer="<Values Count="1"><Value DisplayName="Humanresources" Id="af05c5d3-2312-c897-8439-08979d4d0a49" /></Values>" Type="System.SupportingItem.PortalControl.InstancePicker" /><UserInput Question="Ausgabe" Answer="Namen" Type="richtext" /></UserInputs>
This string contains some quotation marks and I have problems injecting it into my script.
Inside the script I have this:
$mystring = $Null
if($args[0] -ne $Null) { $mystring = $args[0] }
$result = $mystring | Select-String -Pattern "DisplayName="(.*?)""
$result = $result.Matches.Groups[1]
$group = $result.value
Write-Output "$group" | Out-file C:\temp\group.txt
PowerShell scripts can take parameters, which become variables that are available to the rest of the script. Use this at the top of your script to set up three variables: one boolean (true/false) type, one string type and one integer type.
#stack.ps1
param(
# Param1 help description
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
[string]$MyString="MyDefaultValue",
[int]$MyIntInput,
[bool]$MyTrueFalseInput
)
"Script running!"
"Values recieved!"
"MyString = $MyString"
"MyIntInput = $MyIntInput"
"MyTrueFalse = $MyTrueFalseInput"
Then to pass in values
C:\temp> .\stack.ps1 -MyString Ham -MyIntInput 75 -MyTrueFalseInput $true
Script running!
Values received!
MyString = Ham
MyIntInput = 75
MyTrueFalse = True
How to work with odd or complex strings
Now, to pass in a complex string that has quotes, use this syntax:
#using single quotes as a delimiter, ignoring the doubles inside.
$myWeirdString = 'ThisIs"SomeString"WhichHasQuotes"WhichIsWeird'
## extra case, if it's really odd and has both quote types
$myWeirdString = #"
'ThisIs"SomeString"WhichHasQuotes"WhichIsWeird'
"#
I'm trying to get my head around PowerShell and write a function as a cmdlet. I found the following code sample in one of the articles, but it doesn't seem to work as a cmdlet even though it has the [cmdletbinding()] declaration at the top of the file.
When I try to do something like
1,2,3,4,5 | .\measure-data
it returns an empty response (the function itself works just fine if I invoke it at the bottom of the file and run the file itself).
Here's the code that I am working with, any help will be appreciated :)
Function Measure-Data {
<#
.Synopsis
Calculate the median and range from a collection of numbers
.Description
This command takes a collection of numeric values and calculates the
median and range. The result is written as an object to the pipeline.
.Example
PS C:\> 1,4,7,2 | measure-data
Median Range
------ -----
3 6
.Example
PS C:\> dir c:\scripts\*.ps1 | select -expand Length | measure-data
Median Range
------ -----
1843 178435
#>
[cmdletbinding()]
Param (
[Parameter(Mandatory=$True,ValueFromPipeline=$True)]
[ValidateRange([int64]::MinValue,[int64]::MaxValue)]
[psobject]$InputObject
)
Begin {
#define an array to hold incoming data
Write-Verbose "Defining data array"
$Data=@()
} #close Begin
Process {
#add each incoming value to the $data array
Write-Verbose "Adding $inputobject"
$Data+=$InputObject
} #close process
End {
#take incoming data and sort it
Write-Verbose "Sorting data"
$sorted = $data | Sort-Object
#count how many elements in the array
$count = $data.Count
Write-Verbose "Counted $count elements"
#region calculate median
if ($sorted.count%2) {
<#
if the number of elements is odd, add one to the count
and divide by two to get the middle number. But arrays start
counting at 0 so subtract one
#>
Write-Verbose "processing odd number"
[int]$i = (($sorted.count+1)/2-1)
#get the corresponding element from the sorted array
$median = $sorted[$i]
}
else {
<#
if number of elements is even, find the average
of the two middle numbers
#>
Write-Verbose "processing even number"
$i = $sorted.count/2
#get the lower number
$x = $sorted[$i-1]
#get the upper number
$y = $sorted[-$i]
#average the two numbers to calculate the median
$median = ($x+$y)/2
} #else even
#endregion
#region calculate range
Write-Verbose "Calculating the range"
$range = $sorted[-1] - $sorted[0]
#endregion
#region write result
Write-Verbose "Median = $median"
Write-Verbose "Range = $range"
#define a hash table for the custom object
$hash = @{Median=$median;Range=$Range}
#write result object to pipeline
Write-Verbose "Writing result to the pipeline"
New-Object -TypeName PSobject -Property $hash
#endregion
} #close end
} #close measure-data
This is the article I took the code from:
https://mcpmag.com/articles/2013/10/15/blacksmith-part-4.aspx
edit: Maybe I should add that versions of this code from previous parts of the article worked just fine, but after adding all the things that make it a proper cmdlet, like the help section and the verbose lines, it just doesn't want to work, and I believe something is missing. I have a feeling this could be because it was written for PowerShell 3 and I am testing it on Windows 10 with PowerShell 5.x, but honestly I don't even know in which direction to look, which is why I'm asking for help.
There is nothing wrong with the code (apart from possible optimizations), but the way you call it can't work:
1,2,3,4,5 | .\measure-data
When you call a script file that contains a named function, it is expected that "nothing happens". Actually, the script runs, but PowerShell does not know which function it should call (there could be multiple). So it just runs any code outside of functions.
You have two options to fix the problem:
Option 1
Remove the function keyword and the curly braces that belong to it. Keep the [cmdletbinding()] and Param sections.
[cmdletbinding()]
Param (
[Parameter(Mandatory=$True,ValueFromPipeline=$True)]
[ValidateRange([int64]::MinValue,[int64]::MaxValue)]
[psobject]$InputObject
)
Begin {
# ... your code ...
} #close Begin
Process {
# ... your code ...
} #close process
End {
# ... your code ...
}
Now the script itself is the "function" and can be called as such:
1,2,3,4,5 | .\measure-data
Option 2
Turn the script into a module. Basically you just need to save it with .psm1 extension (there is more to it, but for getting started it will suffice).
In the script where you want to use the function you have to import the module before you can use its functions. If the module is not installed, you can import it by specifying its full path.
# Import module from directory where current script is located
Import-Module $PSScriptRoot\measure-data.psm1
# Call a function of the module
1,2,3,4,5 | Measure-Data
A module is the way to go when there are multiple functions in a single script file. It is also more efficient when a function will be called multiple times, because PowerShell needs to parse it only once (it remembers Import-Module calls).
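As a minimal sketch of the "more to it" part: the .psm1 can explicitly export the functions it wants to make public, so helper functions stay private:
# measure-data.psm1
Function Measure-Data {
    # ... function body as shown above ...
}
# Only exported functions are visible after Import-Module
Export-ModuleMember -Function Measure-Data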
It works as-is, you just need to call it properly. Since the code is now a function, you cannot call it like before, when the code was directly in the file:
# method when code is directly in file with no Function Measure-Data {}
1,2,3,4,5 | .\measure-data
Now that you've defined the function, you instead need to dot-source the file so that it loads your function(s) into memory. Then you can call your function by its name (which happens to be the same as the filename, but doesn't have to be):
# Load the functions by dot-sourcing
. .\measure-data.ps1
# Use the function
1,2,3,4,5 | Measure-Data
You're not passing it an Object but an array of integers. If you change the parameter to:
Param (
[Parameter(Mandatory=$True,ValueFromPipeline=$True)]
[ValidateRange([int64]::MinValue,[int64]::MaxValue)]
[Int[]]$InputObject
)
Now things work:
PS> 1,2,3,4,5 | Measure-Data
Median Range
------ -----
3 4
My application should write its errors as literal JSON objects on stderr. This is proving difficult with PowerShell (5, 6 or 7), since PowerShell seems to want to prevent you from writing to stderr and, if you do succeed, it changes what you write.
In all examples we are running the following from within a powershell/pwsh console:
./test.ps1 2> out.json
test.ps1
Write-Error '{"code": "foo"}'
out.json
[91mWrite-Error: [91m{"code": "foo"}[0m
PowerShell is changing my stderr output. Bad PowerShell.
test.ps1
$Host.UI.WriteErrorLine('{"code": "foo"}')
out.json
PowerShell is not writing to stderr (or 2> is not capturing it). Bad PowerShell.
test.ps1
[Console]::Error.WriteLine('{"code": "foo"}')
out.json
PowerShell is not writing to stderr (or 2> is not capturing it). Bad PowerShell.
Update
I now understand that PowerShell does not have a stderr as such, but rather numbered streams, of which stream 2 corresponds to Write-Error and [Console]::Error.WriteLine() output and is sent to stderr of the pwsh/powershell.exe process only if that output is redirected.
In short, stderr only exists outside of PowerShell, and you can only access it via redirection:
pwsh ./test.ps1 2> out.json
Inside of PowerShell you can only redirect (2>) the output from Write-Error; [Console]::Error.WriteLine() is not captured internally but is written straight to the console.
Problem
When you write to the error stream, PowerShell creates an ErrorRecord object for each message. When you redirect the error stream and output it, PowerShell formats it like an error message by default. The substrings like [91m are ANSI escape sequences that colorize the message when written to the console.
Solution
To output plain text messages, convert the error records to strings before redirecting them to the file:
./test.ps1 2>&1 | ForEach-Object {
if( $_ -is [System.Management.Automation.ErrorRecord] ) {
# Message from the error stream
# -> convert error message to plain text and redirect (append) to file
"$_" >> out.json
}
else {
# Message from the success stream
# -> already a String, so output it directly
$_ # Shortcut for Write-Output $_
}
}
Remarks:
2>&1 merges the error stream with the success stream, so we can process both by the pipeline.
$_ is the current object processed by ForEach-Object. It is of type ErrorRecord, when the message is from the error stream of "test.ps1". It is of type String, when the message is from the success stream of "test.ps1".
Using the -is operator we check the type of the message to handle messages originating from the error stream differently than those from the success stream.
"$_" uses string interpolation to convert the ErrorRecord to the plain text message.
The >> operator redirects to the given file, but appends instead of overwriting.
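Assuming the test.ps1 from the question, out.json should then end up containing just the plain message:
{"code": "foo"}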
Bonus code - a reusable cmdlet
If we regularly need to redirect error streams as plain text to a file, it makes sense to wrap the whole thing in a reusable cmdlet:
Function Out-ErrorMessageToFile {
[CmdletBinding()]
param (
[Parameter( Mandatory )] [String] $FilePath,
[Parameter( Mandatory, ValueFromPipeline )] [PSObject] $InputObject,
[Parameter( )] [Switch] $Append
)
begin {
if( ! $Append ) {
$null > $FilePath # Create / clear the file
}
}
process {
if( $InputObject -is [System.Management.Automation.ErrorRecord] ) {
# Message from the error stream
# -> convert error message to plain text and redirect (append) to file
"$InputObject" >> $FilePath
}
else {
# Message from the success stream
# -> already a String, so output it directly
$InputObject # Shortcut for Write-Output $InputObject
}
}
}
Usage examples:
# Overwrite "out.json"
./test.ps1 2>&1 | Out-ErrorMessageToFile out.json
# Append to "out.json"
./test.ps1 2>&1 | Out-ErrorMessageToFile out.json -Append
I am porting a script from bash to PowerShell, and I would like to keep the same support for argument parsing in both. In the bash version, one of the possible arguments is --, and I want to also detect that argument in PowerShell. However, nothing I've tried so far has worked. I cannot define it as a parameter like param($-), as that causes a compile error. Also, if I decide to completely forgo PowerShell argument processing and just use $args, everything appears fine, but when I run the function, the -- argument is missing.
Function Test-Function {
Write-Host $args
}
Test-Function -- -args go -here # Prints "-args go -here"
I know about $PSBoundParameters as well, but the value isn't there, because I can't bind a parameter named $-. Are there any other mechanisms here that I can try, or any solution?
For a bit more context, note that me using PowerShell is a side effect. This isn't expected to be used as a normal PowerShell command, I have also written a batch wrapper around this, but the logic of the wrapper is more complex than I wanted to write in batch, so the batch wrapper just calls the PowerShell function, which then does the more complex processing.
I found a way to do so, but instead of a double hyphen you have to pass three of them.
This is a simple function, you can change the code as you want:
function Test-Hyphen {
param(
${-}
)
if (${-}) {
write-host "You used triple-hyphen"
} else {
write-host "You didn't use triple-hyphen"
}
}
Sample 1
Test-Hyphen
Output
You didn't use triple-hyphen
Sample 2
Test-Hyphen ---
Output
You used triple-hyphen
As an aside: PowerShell allows a surprising range of variable names, but you have to enclose them in {...} in order for them to be recognized; that is, ${-} technically works, but it doesn't solve your problem.
The challenge is that PowerShell quietly strips -- from the list of arguments, and the only way to preserve that token is to precede it with the PSv3+ stop-parsing symbol, --%. That, however, fundamentally changes how the arguments are passed and is an extra requirement, which is what you're trying to avoid.
Your best bet is to try (suboptimal) workarounds:
Option A: In your batch-file wrapper, translate -- to a special argument that PowerShell does preserve and pass it instead; the PowerShell script will then have to re-translate that special argument to --.
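For Option A, the PowerShell side of the re-translation could look like this sketch (the placeholder token __dashdash__ is arbitrary; it just has to be something PowerShell preserves and that the batch wrapper substitutes for --):
# At the top of the .ps1: map the placeholder back to "--", pass everything else through
$mappedArgs = $args | ForEach-Object { if ($_ -eq '__dashdash__') { '--' } else { $_ } }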
Option B: Perform custom argument parsing in PowerShell:
You can analyze $MyInvocation.Line, which contains the raw command line that invoked your script, and look for the presence of -- there.
Getting this right and making it robust is nontrivial, however.
Here's a reasonably robust approach:
# Don't use `param()` or `$args` - instead, do your own argument parsing:
# Extract the argument list from the invocation command line.
$argList = ($MyInvocation.Line -replace ('^.*' + [regex]::Escape($MyInvocation.InvocationName)) -split '[;|]')[0].Trim()
# Use Invoke-Expression with a Write-Output call to parse the raw argument list,
# performing evaluation and splitting it into an array:
$customArgs = if ($argList) { @(Invoke-Expression "Write-Output -- $argList") } else { @() }
# Print the resulting arguments array for verification:
$i = 0
$customArgs | % { "Arg #$((++$i)): [$_]" }
Note:
There are undoubtedly edge cases where the argument list may not be correctly extracted or where the re-evaluation of the raw arguments causes side effects, but for the majority of cases (especially when called from outside PowerShell) this should do.
While useful here, Invoke-Expression should generally be avoided.
If your script is named foo.ps1 and you invoked it as ./foo.ps1 -- -args go -here, you'd see the following output:
Arg #1: [--]
Arg #2: [-args]
Arg #3: [go]
Arg #4: [-here]
I came up with the following solution, which also works well inside pipelines and multi-line expressions. I am using the PowerShell parser to parse the invocation expression string (while ignoring any incomplete tokens that might be present at the end of the $MyInvocation.Line value) and then Invoke-Expression with Write-Output to get the actual argument values:
# Parse the whole invocation line
$code = [System.Management.Automation.Language.Parser]::ParseInput($MyInvocation.Line.Substring($MyInvocation.OffsetInLine - 1), [ref]$null, [ref]$null)
# Find our invocation expression without redirections
$myline = $code.Find({$args[0].CommandElements}, $true).CommandElements | % { $_.ToString() } | Join-String -Separator ' '
# Get the argument values
$command, $arguments = Invoke-Expression ('Write-Output -- ' + $myline)
# Fine-tune arguments to be always an array
if ( $arguments -is [string] ) { $arguments = @($arguments) }
if ( $arguments -eq $null ) { $arguments = @() }
Please be aware that the original values in the function call are reevaluated in Invoke-Expression, so any local variables might shadow values of the actual arguments. Because of that, you can also use this (almost) one-liner at the top of your function, which prevents the pollution of local variables:
# Parse arguments
$command, $arguments = Invoke-Expression ('Write-Output -- ' + ([System.Management.Automation.Language.Parser]::ParseInput($MyInvocation.Line.Substring($MyInvocation.OffsetInLine - 1), [ref]$null, [ref]$null).Find({$args[0].CommandElements}, $true).CommandElements | % { $_.ToString() } | Join-String -Separator ' '))
# Fine-tune arguments to be always an array
if ( $arguments -is [string] ) { $arguments = @($arguments) }
if ( $arguments -eq $null ) { $arguments = @() }
I'm having an issue getting a loop in a function to work properly. The goal is to compare the output of some JSON data to existing unified groups in Office 365: if the group already exists, skip it; otherwise, create a new group. The tricky part is that the function that creates the group prepends "gr-" to the group name. Because the compare function is comparing the original JSON data (without the prepended prefix) to Office 365, it has to prepend "gr-" on the fly. If there is a better way to accomplish this last piece, I am certainly open to suggestions.
Here is the latest version of the function. There have been other variations, but none so far have worked. There are no errors, but the code does not identify lists that definitely do exist. I am using simple echo statements for testing; the actual code will include the function to create a new group.
# Test variable that cycles through each .json file.
$jsonFiles = Get-ChildItem -Path "c:\tmp\json" -Filter *.json |
Get-Content -Raw
$allobjects = ForEach-Object {
$jsonFiles | ConvertFrom-Json
}
$alreadyCreatedGroup = ForEach-Object {Get-UnifiedGroup | select alias}
# Determine if list already exists in Office 365
function checkForExistingGroup {
[CmdletBinding()]
Param(
[Parameter(Position=0, Mandatory=$true, ValueFromPipeline=$true)]
$InputObject
)
Process {
if ("gr-$($InputObject.alias)" -like $alreadyCreatedGroup) {
echo "Group exists"
} else {
echo "Group does not exist"
}
}
}
$allobjects | checkForExistingGroup
#$alreadyCreatedGroup | checkForExistingGroup
The above code always produces "Group does not exist" for each alias from the JSON data.
The individual variables appear to be outputting correctly:
PS> $alreadyCreatedGroup
Alias
-----
gr-jsonoffice365grouptest1
gr-jsonoffice365grouptest2
gr-jsonoffice365grouptest3
PS> $allobjects.alias
jsonoffice365grouptest3
jsonoffice365grouptest4
If I run the following on its own:
"gr-$($allobjects.alias)"
I get the following output:
gr-jsonoffice365grouptest3 jsonoffice365grouptest4
So on its own it appends the output from the JSON files, but I had hoped by using $InputObject in the function, this would resolve that issue.
Well, a single group will never be -like a list of groups. You want to check if the list of groups contains the alias.
if ($alreadyCreatedGroup -contains "gr-$($InputObject.Alias)") {
echo "Group exists"
} else {
echo "Group does not exist"
}
In PowerShell v3 or newer you could also use the -in operator instead of the -contains operator, which feels more natural to most people:
if ("gr-$($InputObject.Alias)" -in $alreadyCreatedGroup) {
echo "Group exists"
} else {
echo "Group does not exist"
}
And I'd recommend passing the group list to the function as a parameter rather than using a global variable:
function checkForExistingGroup {
[CmdletBinding()]
Param(
[Parameter(Position=0, Mandatory=$true, ValueFromPipeline=$true)]
$InputObject,
[Parameter(Mandatory=$true, ValueFromPipeline=$false)]
[array]$ExistingGroups
)
...
}
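A call would then look something like this (a sketch; .Alias expands the objects returned by Get-UnifiedGroup | select alias into plain strings):
$allobjects | checkForExistingGroup -ExistingGroups $alreadyCreatedGroup.Alias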
"gr-$($allobjects.Alias)" doesn't produce the result you expect, because the expression basically means: take the Alias properties of all elements in the array/collection $allobjects, concatenate their values with the $OFS character (output field separator), then insert the result after the substring "gr-". That doesn't affect your function, though, because the pipeline already unrolls the input array, so the function sees one input object at a time.