I'm very new to PowerShell. I wrote a cmdlet which works fine. However, when I try to invoke it inside a job...
. .\MyCmdlet.ps1 # Dot Source
$GetProcesssJob = Start-Job -ScriptBlock {
MyCmdlet
} -Credential $specialCredentials
...I get the error that it's "not recognized as the name of a cmdlet, function, script file, or operable program". What am I doing wrong?
My problem was twofold. As TheIncorrigible1 pointed out, I needed to put the dot-sourcing inside the ScriptBlock. However, I had tried that previously and it didn't work. I now realize that was because the credentials I was using in $specialCredentials didn't have permission to read the file MyCmdlet.ps1!
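For reference, a minimal sketch of the working pattern; the path C:\Scripts\MyCmdlet.ps1 is a placeholder for wherever the script actually lives, and the account in $specialCredentials must be able to read it. Because Start-Job runs the ScriptBlock in a separate PowerShell process, anything dot-sourced in the parent session isn't available inside the job:
$GetProcesssJob = Start-Job -ScriptBlock {
    . 'C:\Scripts\MyCmdlet.ps1'   # dot-source inside the job; use a full path, since the job won't inherit your working directory
    MyCmdlet
} -Credential $specialCredentials

Receive-Job -Job $GetProcesssJob -Wait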
Related
I made a PowerShell script that sends a toast notification based on arguments passed to it. At the beginning of the script I declare the parameters with a param block, something like this:
param($UserID)
Now I open a Command Prompt and run the PowerShell script with these arguments:
powershell C:\file\in\directory\PowershellScript.ps1 -UserID "Mikey (TEST)"
When the command runs in PowerShell, it gives me this red error:
TEST : The term 'TEST' is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or if a path
was included, verify that the path is correct and try again.
At line:1 char:64
Presumably the parentheses ( ) are the culprit: PowerShell seems to treat (TEST) as something to execute rather than as part of the string value. I even tried escaping the parentheses with backslashes, \(TEST\), but that didn't work either; it said TEST\ : The term 'TEST\' is not recognized as the name of a cmdlet, function, script file, or operable program.
Is there something I'm missing, or something I should add to the script?
Use single quotes inside the PowerShell command:
powershell C:\file\in\directory\PowershellScript.ps1 -UserID 'Mikey (TEST)'
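A quick way to confirm the value arrives intact is to echo the parameter back; the Write-Output line below is just for illustration and isn't part of the original script:
param($UserID)
Write-Output "UserID is: $UserID"   # should print: UserID is: Mikey (TEST)
Roughly what happens: cmd.exe passes single quotes through untouched, so PowerShell still sees a quoted string literal. The double quotes, by contrast, are stripped during command-line parsing, leaving the bare (TEST), which PowerShell then tries to run as a command.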
I have a lot of PowerShell scripts, so I created a custom logging PowerShell module with a couple of cmdlets. The problem is that we have had trouble getting all of the users to download the custom module and put it in their PSModulePath.
Ideally I'd like the script to continue silently and not show any errors if the cmdlet cannot be found, but I haven't found a good way to do it. Right now we get messages like this:
C:> foo-bar
foo-bar : The term 'foo-bar' is not recognized as the name of a cmdlet, function, script file, or operable program...
Setting $ErrorActionPreference will suppress the message, but it also suppresses all other error messages, which isn't desirable.
I could write a custom function in every script that checks whether the module is loaded before calling the custom cmdlets, but it seems like there should be a better way to do it.
For a given cmdlet foo-bar, you can do the following, once, at the beginning of the script:
if (-not (Get-Command foo-bar -ErrorAction Ignore)) {
# Define a dummy function that will be a no-op when invoked.
function foo-bar { }
}
All subsequent calls to foo-bar then either call the actual cmdlet, if available, or the dummy function, resulting in a quiet no-op.
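As a quick usage sketch, a later call in the script then behaves like this (the log message is just an example):
foo-bar "Deployment started"   # the real cmdlet logs the message; the dummy function silently ignores its arguments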
You can also use try/catch to suppress it. Since foo-bar isn't a recognized command at all, -ErrorAction won't work directly, so you can do it like this:
try
{
foo-bar
}
catch [Management.Automation.RuntimeException]
{
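# Intentionally empty: swallow the "not recognized" error so the script continues quietly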
}
I am working on PowerShell deployment scripts for BizTalk. I want to import an itinerary in XML format using PowerShell. The tool available for this task is esbimportutil.exe, but it works only in Command Prompt and not in PowerShell.
The error shown is:
The term 'esbimportutil.exe' is not recognized as the name of a cmdlet, function, script file,
or operable program.
I ran PowerShell as Administrator and even tried running the command from the tool's source location, but still no luck.
I found the solution. The problem was resolved by using a simple command:
Start-Process -FilePath "...\esbimportutil.exe" -ArgumentList $argument
The command "start-process" did the magic.
Say I have two scripts:
Script 1:
helper.ps1 with contents:
#helper.ps1
Function foo
{
# do something
}
Script 2:
worker.ps1 with contents:
#worker.ps1
. 'c:\helper.ps1' # This is the correct file location, verified
Write-Output "Starting foo..."
foo
Write-Output "Done"
Both files have already been uploaded to the remote server, and I try to run them in a remote session with Invoke-Command:
Invoke-Command -ScriptBlock {
param($script)
& $script
} -Args 'worker.ps1'
It turns out that most of worker.ps1 works correctly: in the example above, we get the output of both Write-Output lines ("Starting foo..." and "Done").
However, it cannot run the function foo; the exception says foo is not a function, script, or anything else, which basically means helper.ps1 is not being loaded correctly:
The term 'foo' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included.
The question is: is this expected behavior? Can one script not load another script over a remote session, even when both files exist on the remote server?
The Invoke-Command call used:
Invoke-Command -ScriptBlock {param($script) & $script} -Args 'worker.ps1'
Exception: The term 'foo' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included. I think Invoke-Command itself is doing its job, since the other lines execute without a problem.
I messed with this a bit and found the following to work:
worker.ps1 file:
#worker.ps1
. "c:\helper.ps1"
Write-Output "Starting foo..."
foo
Write-Output "Done"
helper.ps1 file:
#helper.ps1
Write-Host "Loading foo function..."
Function foo
{
# do something
Write-Host "The foo is alive!"
}
Write-Host "Foo loaded"
Command on remote system:
Invoke-Command -ScriptBlock {param($script) & $script} `
-ArgumentList 'c:\Worker.ps1' -ComputerName Machine
Output:
Loading foo function...
Foo loaded
Starting foo...
The foo is alive!
Done
So it may just be an issue with the single quotes instead of double quotes around the filename. I was able to execute the Invoke-Command block from two different systems to this machine and had the same results on both. (All my systems are running PS v4 so you may see different results on PS v3.)
I have a PowerShell script where the user passes in a script path as a parameter. After it is passed in, I cannot call that script by using the variable. Is there any way to call a PowerShell script from within another PowerShell script when the script to run comes from a variable?
param(
[string]$hostval,
[string]$scriptpath
)
Invoke-Command -Computer $hostval -Scriptblock { $scriptpath } -credential $cred
This does not work, and I'm not sure if what I want is possible. Is there a parameter type (ex: [script]$scriptpath) that I can use so the script can be called from $scriptpath?
It sounds like you need to use the -FilePath parameter instead of -ScriptBlock:
-FilePath <String>
Runs the specified local script on one or more remote computers. Enter the path and file name of the script, or pipe a script path to Invoke-Command. The script must reside on the local computer or in a directory that the local computer can access. Use the ArgumentList parameter to specify the values of parameters in the script.
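Applied to the script in the question, that would look something like this (a sketch using the question's own parameter names):
param(
    [string]$hostval,
    [string]$scriptpath
)

# -FilePath reads the local script file and runs its contents on the remote computer
Invoke-Command -ComputerName $hostval -FilePath $scriptpath -Credential $cred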