I have a PowerShell script:
param (
[Parameter(Mandatory=$true)][string]$input,
[Parameter(Mandatory=$true)][string]$table
)
Write-Host "Args:" $Args.Length
Get-Content $input |
% { [Regex]::Replace($_, ",(?!NULL)([^,]*[^\d,]+[^,]*)", ",'`$1'") } |
% { [Regex]::Replace($_, ".+", "INSERT INTO $table VALUES (`$1)") }
The Write-Host part is for debugging.
I run it as .\csvtosql.ps1 mycsv.csv dbo.MyTable (from a PowerShell prompt), and get:
Args: 0
Get-Content : Cannot bind argument to parameter 'Path' because it is an empty string.
At C:\temp\csvtosql.ps1:7 char:12
+ Get-Content <<<< $input |
+ CategoryInfo : InvalidData: (:) [Get-Content], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyStringNotAllowed,Microsoft.PowerShell.Commands.GetContentCommand
I get exactly the same error with any parameters I pass, and the same error if I try to use named parameters.
What can cause parameters not to be passed in?
UPDATE: PowerShell ISE asks me for these parameters using GUI prompts, then gives me the same error about them not being passed in.
Unless you mark a parameter with the ValueFromRemainingArguments attribute (which indicates that the parameter accepts all remaining command-line arguments), $args is "disabled". If all you need is the argument count, use the special variable:
$PSBoundParameters.Count
Do not mix the two: use either $args or declared parameters.
Also note that $input is an automatic variable; do not declare it as a parameter. See http://dmitrysotnikov.wordpress.com/2008/11/26/input-gotchas/
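For example, a minimal sketch of the question's header with the parameter renamed (the name $InputFile is only an illustration) and the debug line switched to $PSBoundParameters:
param (
[Parameter(Mandatory=$true)][string]$InputFile,   # renamed from $input, which is automatic
[Parameter(Mandatory=$true)][string]$table
)
# $args is effectively disabled here; $PSBoundParameters.Count gives the bound-parameter count
Write-Host "Bound parameters:" $PSBoundParameters.Count
Get-Content $InputFile   # rest of the pipeline as in the question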
You're calling your script with positional parameters (i.e. unnamed) and PowerShell doesn't know how to map them to your script parameters. You need to either call your script using the parameter names:
.\csvtosql.ps1 -input mycsv.csv -table dbo.MyTable
or update your script to specify your preferred order of positional parameters:
param (
[Parameter(Mandatory=$true,Position=0)]
[string]
$input,
[Parameter(Mandatory=$true,Position=1)]
[string]
$table
)
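With Position specified, the positional call from the question is meant to map mycsv.csv to -input and dbo.MyTable to -table:
.\csvtosql.ps1 mycsv.csv dbo.MyTable
(Keep in mind the caveat from the previous answer: $input is an automatic variable, so renaming that parameter is still advisable.)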
Related
I'm trying to write a script which uses the PowerShell cmdlet Get-Content with -Tail and inserts the new lines into a SQL Server table. I can't get the syntax right to pipe the tail output to the sqlinsert.ps1 file that handles the table insert.
I'm looking for help on how to pipe Get-Content -Tail to a sqlinsert.ps1 file that does a SQL database insert statement, using the following:
$startTime = get-date
Write-Host "\\iisserver\logs\Logs-$("{0:yyyyMMdd}" -f (get-date)).txt"
get-content "\\iisserver\logs\Logs-$("{0:yyyyMMdd}" -f (get-date)).txt" -tail 1 -wait | & "sqlinsert.ps1" -stmp $("{0:yyyy-MM-dd hh:mm:ss.fff}" -f (get-date)) -method "Error" -msg $_
# % { "$_ read at $(Get-Date -Format "hh:mm:ss")" }
In sqlinsert.ps1:
param ([string]$stmp, [string]$method, [string]$msg )
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$serverName';database='$databaseName';User ID = $uid; Password = $pwd;"
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
$sql = "insert into [tbl_iiserrors] (errstamp, method, msg) values (#stmp , #method, #msg) "
.
.
.
The error I get:
& : The term 'sqlinsert.ps1' is not recognized as the name of a
cmdlet, function, script file, or operable program. Check the spelling
of the name, or if a path was included, verify that the path is
correct and try again. At C:\Temp\ob\iislog\tst_tail.ps1:3 char:95
... Mdd}" -f (get-date)).txt" -tail 1 -wait | & "sqlinsert.ps1" -stmp $ ...
~~~~~~~~~~~~~~~
CategoryInfo : ObjectNotFound: (sqlinsert.ps1:String) [], CommandNotFoundException
FullyQualifiedErrorId : CommandNotFoundException
Suggestion [3,General]: The command sqlinsert.ps1 was not found, but
does exist in the current location. Windows PowerShell does not load
commands from the current location by default. If you trust this
command, instead type: ".\sqlinsert.ps1". See "get-help
about_Command_Precedence" for more details.
The sqlinsert.ps1 works when I run it from the PowerShell command line:
PS c:\temp> .\sqlinsert -stmp 2020-11-20 00:00:00 -method 'eek' -msg 'uh hello'
In order to bind pipeline input to a parameter, you need to decorate it with a [Parameter] attribute and specify that it accepts pipeline input, like this:
param (
[string]$stmp,
[string]$method,
[Parameter(ValueFromPipeline = $true)]
[string]$msg
)
See the about_Functions_Advanced_Parameters help file for more details about how to modify the behavior of parameters.
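One point worth adding: because Get-Content -Wait keeps the pipeline open indefinitely, the script also needs a process block so that its body runs once per incoming line rather than once at the end. A structural sketch, with the connection and command details from the question elided:
param (
[string]$stmp,
[string]$method,
[Parameter(ValueFromPipeline = $true)]
[string]$msg
)
begin {
# runs once, before the first pipeline object arrives:
# open the SQL connection here, as in the question
}
process {
# runs once for every line emitted by Get-Content -Tail 1 -Wait:
# build and execute the INSERT for the current $msg here
}
end {
# runs when the pipeline ends (e.g. when the tail is interrupted):
# close the connection here
}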
By design, for security reasons, PowerShell requires you to signal the intent to execute a script located in the current directory explicitly, using a path - .\sqlinsert.ps1 - rather than a mere file name - sqlinsert.ps1; that is what the suggestion following the error message is trying to tell you.
Note that you only need &, the call operator, if the script path is quoted and/or contains variable references - and .\sqlinsert.ps1 doesn't require quoting.
You can only use the automatic $_ variable, which represents the current input object from the pipeline, inside a script block ({ ... }), such as one passed to the ForEach-Object cmdlet, which invokes that block once for each object received via the pipeline.
Re the content of your script: inside expandable strings ("..."), you cannot use @ to refer to variables to be expanded (interpolated); use regular, $-prefixed variable references, or $(...), the subexpression operator, to embed expressions. Also, it looks like you're inserting string values into the SQL table, so you'll have to enclose the expanded variable values in embedded '...'.
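For instance, the insert statement from the question could be written along these lines (a sketch only; building SQL strings by interpolation is fragile, and SqlParameter-based parameterized commands would be the more robust choice):
$sql = "insert into [tbl_iiserrors] (errstamp, method, msg) values ('$stmp', '$method', '$msg')"
As for the calling side, wrapping the script invocation in a ForEach-Object script block makes $_ available for each new log line: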
$startTime = get-date
Get-Content "\\iisserver\logs\Logs-$("{0:yyyyMMdd}" -f (get-date)).txt" -Tail 1 -Wait |
ForEach-Object {
.\sqlinsert.ps1 -stmp ("{0:yyyy-MM-dd hh:mm:ss.fff}" -f (get-date)) -method "Error" -msg $_
}
The alternative to using a ForEach-Object call is to modify your script to directly receive its -msg argument from the pipeline, as shown in Mathias' answer, in which case you must omit the -msg $_ argument from your script call:
Get-Content ... |
.\sqlinsert.ps1 -stmp ("{0:yyyy-MM-dd hh:mm:ss.fff}" -f (get-date)) -method "Error"
I have a PowerShell function I'm writing to build and execute a variety of logman.exe commands for me so I don't have to reference the provider GUIDs and type up the command each time I want to capture from a different source. One of the parameters is the file name and I am performing some validation on the parameter. Originally I used -match '.+?\.etl$' to check that the file name had the .etl extension and additionally did some validation on the path. I later decided to remove the path validation but neglected to change the validation attribute to ValidatePattern.
What I discovered was that, while it worked perfectly on the machine I was using to author and validate it, on my Server 2016 Core machine it seemed to misbehave when calling the function, yet if I just ran the same check at the prompt it worked as expected.
The PowerShell:
[Parameter(ParameterSetName="Server", Mandatory=$true)]
[Parameter(ParameterSetName="Client", Mandatory=$true)]
[ValidateScript({$FileName -match '.+?\.etl$'})]
[string] $FileName = $null
The Output:
PS C:\Users\Administrator> Start-TBLogging -ServerLogName HTTPSYS -FileName ".\TestLog.etl"
PS C:\Users\Administrator> Start-TBLogging : Cannot validate argument on parameter 'FileName'. The "$FileName -match '.+?\.etl$'" validation script
for the argument with value ".\TestLog.etl" did not return a result of True. Determine why the validation script failed,
and then try the command again.
At line:1 char:50
+ Start-TBLogging -ServerLogName HTTPSYS -FileName ".\TestLog.etl"
+ ~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Start-TBLogging], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationError,Start-TBLogging
Trying it manually worked:
PS C:\Users\Administrator> $FileName = ".\TestLog.etl"
PS C:\Users\Administrator> $FileName -match '.+?\.etl$'
True
After changing the function to use ValidatePattern it works just fine everywhere but I was wondering if anyone could shed light on the discontinuity.
As Joshua Shearer points out in a comment on the question, you must use the automatic variable $_ (or its alias form, $PSItem), not the parameter variable, to refer to the argument being validated inside [ValidateScript({ ... })].
Therefore, instead of:
# !! WRONG: The argument at hand has NOT yet been assigned to parameter
# variable $FileName; by design, that assignment
# doesn't happen until AFTER (successful) validation.
[ValidateScript({ $FileName -match '.+?\.etl$' })]
[string] $FileName
use:
# OK: $_ (or $PSItem) represents the argument to validate inside { ... }
[ValidateScript({ $_ -match '.+?\.etl$' })]
[string] $FileName
As briantist points out in another comment on the question, inside the script block $FileName will have the value, if any, from the caller's scope (or its ancestral scopes).
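For completeness, the ValidatePattern form the question ultimately switched to sidesteps the issue, because the attribute is handed the argument directly; a minimal sketch using the question's own pattern:
[ValidatePattern('.+?\.etl$')]
[string] $FileName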
I am trying to run this job with parameters:
$courses = {
param($securitytoken_path_a1 ,$EmailPasswordPath_a1 ,$EmailTo_a1)
Write-Host $securitytoken_path_a1 | Format-Table -Property *
C:\Users\so\Desktop\CanvasColleagueIntergration\PowerShells\DownloadInformation.ps1 -securitytoken_path ($securitytoken_path_a1) -emailPasswordPath $EmailPasswordPath_a1 -object "courses" -EmailTo $EmailTo_a1 -test $false
}
I am passing these parameters
$args1 = @{ "securitytoken_path_a1" = "C:\Credentials\CANVAS_API_PROD_FRANCO.TXT" ; "EmailPasswordPath_a1" = "C:\Credentials\EMAILFRANCO.txt"; "EmailTo_a1" = 'fpettigrosso@holyfamily.edu'}
When I invoke the job with this command, it fails:
Start-Job -ScriptBlock $courses -Name "Test" -ArgumentList $args1
When I try to see what the issue is, I get this error back:
Cannot bind argument to parameter 'emailPasswordPath' because it is an empty string.
+ CategoryInfo : InvalidData: (:) [DownloadInformation.ps1], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorEmptyStringNotAllowed,DownloadInformation.ps1
+ PSComputerName : localhost
help
What you're looking for is splatting: the ability to pass a set of parameter values to a command via a hashtable (or, less commonly, via an array).
Generally, in order to signal the intent to splat, a special sigil - @ - is required, so as to distinguish splatting from passing a single argument that just happens to be a hashtable:
$args1 passes a single argument that happens to be a hashtable.
@args1 - note how sigil $ has been replaced with @ - tells PowerShell to apply splatting, i.e., to consider the hashtable's key-value pairs to be parameter-name-value pairs (note that the hashtable keys mustn't start with -, which is implied)
However, splatting only works directly for a given command, and you cannot relay a splatted hashtable via a command's single parameter.
That is, attempting to use -ArgumentList @args1 actually fails.
Your own solution works around that by passing the hashtable as-is to the script block and then explicitly accessing that hashtable's entries one by one.
An alternative solution is to use the hashtable argument to apply splatting inside the script block:
$courses = {
param([hashtable] $htArgs) # pass the hashtable - to be splatted later - as-is
$script = 'C:\Users\fpettigrosso\Desktop\CanvasColleagueIntergration\PowerShells\DownloadInformation.ps1'
& $script #htArgs # use $htArgs for splatting
}
Note, however, that the target command's parameter names must match the hashtable keys exactly (or as an unambiguous prefix, but that's ill-advised), so the _a1 suffix would have to be removed from the keys.
If modifying the input hashtable's keys is not an option, you can use the following command to create a modified copy whose keys have the _a1 suffix removed:
# Create a copy of $args1 in $htArgs with keys without the "_a1" suffix.
$args1.Keys | % { $htArgs = @{} } { $htArgs.($_ -replace '_a1$') = $args1.$_ }
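The job can then be started with the rewritten hashtable, for example (reusing the job name from the question):
Start-Job -ScriptBlock $courses -Name "Test" -ArgumentList $htArgs |
Receive-Job -Wait -AutoRemoveJob   # wait for the job to finish and return its output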
I changed the parameters in $courses so it takes a hashtable:
$courses = {
param($a1)
Write-Host $a1.securitytoken_path_a1 | Format-Table -Property *
C:\Users\fpettigrosso\Desktop\CanvasColleagueIntergration\PowerShells\DownloadInformation.ps1 -securitytoken_path $a1.securitytoken_path_a1 -emailPasswordPath $a1.EmailPasswordPath_a1 -object "courses" -EmailTo $a1.EmailTo_a1 -test $false
}
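With the script block reshaped this way, the original invocation should work unchanged, because the hashtable is passed as a single argument that binds to $a1:
Start-Job -ScriptBlock $courses -Name "Test" -ArgumentList $args1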
Consider the following toy example script test.ps1:
Param(
[Parameter(ParameterSetName='readfile',Position=0,Mandatory=$True)]
[string] $FileName,
[Parameter(ParameterSetName='arg_pass',Mandatory=$True)]
[switch] $Ping
)
if ($Ping.isPresent) {
&$env:ComSpec /c ping $args
} else {
Get-Content $FileName
}
The desired effect would be that
.\test.ps1 FILE.TXT
displays the contents of FILE.TXT and
.\test.ps1 -Ping -n 5 127.0.0.1
pings localhost 5 times.
Unfortunately, the latter fails with the error
A parameter cannot be found that matches parameter name 'n'.
At line:1 char:18
+ .\test.ps1 -Ping -n 5 127.0.0.1
+ ~~
+ CategoryInfo : InvalidArgument: (:) [test.ps1], ParameterBindingException
+ FullyQualifiedErrorId : NamedParameterNotFound,test.ps1
This is just a minimal example, of course.
In general, I am looking for a way to introduce a [switch] parameter to my script that lives inside its own parameter set and when that switch is present, I want to consume all remaining arguments from the commandline and pass them on to another commandline application. What would be the way to do this in PowerShell?
You can use the ValueFromRemainingArguments parameter attribute. I would also recommend specifying a default parameter set name in CmdletBinding. Example:
[CmdletBinding(DefaultParameterSetName="readfile")]
param(
[parameter(ParameterSetName="readfile",Position=0,Mandatory=$true)]
[String] $FileName,
[parameter(ParameterSetName="arg_pass",Mandatory=$true)]
[Switch] $Ping,
[parameter(ParameterSetName="arg_pass",ValueFromRemainingArguments=$true)]
$RemainingArgs
)
if ( $Ping ) {
ping $RemainingArgs
}
else {
Get-Content $FileName
}
(Aside: I don't see a need for & $env:ComSpec /c. You can run commands in PowerShell without spawning a copy of cmd.exe.)
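With the revised param block, both invocations from the question should bind, since anything after -Ping that doesn't match a declared parameter is collected into $RemainingArgs:
.\test.ps1 FILE.TXT                 # binds FILE.TXT to -FileName via Position=0
.\test.ps1 -Ping -n 5 127.0.0.1     # -n, 5 and 127.0.0.1 end up in $RemainingArgs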
I have the following powershell script:
param (
[Parameter(Mandatory=$true)][int[]]$Ports
)
Write-Host $Ports.count
foreach($port in $Ports) {
Write-Host `n$port
}
When I run the script with $ powershell -File ./test1.ps1 -Ports 1,2,3,4 it works (but not as expected):
1
1234
When I try to use larger numbers, $ powershell -File .\test.ps1 -Ports 1,2,3,4,5,6,10,11,12, the script breaks entirely:
test.ps1 : Cannot process argument transformation on parameter 'Ports'. Cannot convert value "1,2,3,4,5,6,10,11,12" to type "System.Int32[]". Error: "Cannot convert value "1,2,3,4,5,6,10,11,12" to type "System.Int32". Error: "Input string was not in a correct format.""
+ CategoryInfo : InvalidData: (:) [test.ps1], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : ParameterArgumentTransformationError,test.ps1
It seems like powershell is trying to process any numbers passed via the Ports param as a single number, though I'm not sure why this is happening, or how to get past it.
The issue is that a parameter passed through powershell.exe -File is a [string].
So for your first example,
powershell -File ./test1.ps1 -Ports 1,2,3,4
$Ports is passed as [string]'1,2,3,4', which PowerShell then attempts to cast to [int[]]. You can see what happens with:
[int[]]'1,2,3,4'
1234
Knowing that it ends up as just a regular [int32] with the commas removed means that casting 1,2,3,4,5,6,10,11,12 produces a number too large for [int32], which causes your error.
[int[]]'123456101112'
Cannot convert value "123456101112" to type "System.Int32[]". Error: "Cannot convert value "123456101112" to type "System.Int32". Error: "Value was either too
large or too small for an Int32.""
To continue using -File, you could parse the string yourself by splitting on commas:
param (
[Parameter(Mandatory=$true)]
$Ports
)
$PortIntArray = [int[]]($Ports -split ',')
$PortIntArray.count
foreach ($port in $PortIntArray ) {
Write-Host `n$port
}
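With that workaround in place, the original -File invocation should print the count followed by each port on its own line, since the comma-separated string is now split inside the script:
powershell -File .\test.ps1 -Ports 1,2,3,4,5,6,10,11,12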
But luckily that is unnecessary, because there is also powershell.exe -Command. You can call the script and let the PowerShell engine parse the arguments, which correctly sees the Ports parameter as an array:
powershell -Command "& .\test.ps1 -Ports 1,2,3,4,5,6,10,11,12"