check_nrpe puts $ at end of check string - powershell

I have a problem with my PowerShell Nagios script, installed on a Windows Server 2008 64-bit machine with the NRPE_NT daemon.
I've declared the command like this:
command[check_files]=cmd /c echo C:\nrpe\libexec\check_file.ps1 $ARG1$; exit($lastexitcode) | powershell.exe -command -
I've set the ExecutionPolicy to Unrestricted.
I've restarted the NRPE_NT service and declared the command on the console like this:
$USER1$/check_nrpe -H $HOSTADDRESS$ -t 60 -c check_files -a $ARG1$
Now, if I run it locally it works fine:
C:\>cmd /c echo C:\nrpe\libexec\check_file.ps1 C:\nrpe; exit($lastexitcode) | powershell.exe -command -
No file/s present with this string
But if I run it via check_nrpe I receive this output:
'-' was specified with the -Command parameter: no other arguments to -Command are permitted.
In Debug mode, on NRPE.log i can see this:
Running command: cmd /c echo C:\nrpe\libexec\check_file.ps1 C:\nrpe;
exit($lastexitcode) | powershell.exe -command - $
Command completed with return code 0
Why does check_nrpe add a dollar character ($) at the end of the string this way, breaking the entire check?
Thanks in advance

I'm not sure if this will help your situation, but I just figured out something in my environment that was causing something similar. Here was my NRPE command configuration:
command[foo]=grep file '^pattern$'
Everything was fine until I wanted to add another parameter after the '^pattern$' parameter... that new parameter (at the end of the command-line) would get an extra $ appended to the end.
It seems that NRPE requires $ to be escaped, otherwise it thinks that it's a variable reference and does odd things with it. I was expecting that the quoting would make it so that no escaping was required, but NRPE's configuration files don't follow shell-style quoting rules. So, changing my NRPE configuration to this solved my problem:
command[foo]=grep file '^pattern$$'
Note the double $$ in the revised NRPE command definition.
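Applied to the question's command definition, the only unpaired $ is the one in PowerShell's $lastexitcode (the $ARG1$ macro pairs up), so escaping it the same way should look roughly like this (untested):
command[check_files]=cmd /c echo C:\nrpe\libexec\check_file.ps1 $ARG1$; exit($$lastexitcode) | powershell.exe -command -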


Keep Windows Terminal tab open after invoked WSL command

I'm trying to open a WSL (Ubuntu) tab in Windows Terminal, and then run a command in that tab using WSL. I use the following PowerShell command for that:
wt new-tab -p "WSL (Ubuntu)" wsl echo "hallo"
The problem is, after echo has run, the tab closes immediately.
Is there a way to keep it open?
When you pass a command line to wt.exe, the Windows Terminal CLI, it is run instead of a shell, irrespective of whether you also specify a specific shell profile with -p.
Thus, to achieve what you want:
Your command line must explicitly invoke the shell of interest...
...and that shell must support starting an interactive, stay-open session in combination with startup commands.
While PowerShell supports this, POSIX-compatible shells such as bash - WSL's default shell - do not.
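For contrast, this is what such a stay-open startup looks like with a PowerShell profile, where -NoExit requests it directly (the profile name 'Windows PowerShell' is an assumption; adjust it to your setup):
wt -p 'Windows PowerShell' powershell -NoExit -Command Get-Date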
A - suboptimal - workaround is the following:
wt -p 'WSL (Ubuntu)' wsl -e bash -c 'echo ''hello''\; exec $BASH'
Note:
Inexplicably, as of Windows Terminal v1.13.11431.0, the ; character inside the quoted -c argument requires escaping as \; - otherwise it is interpreted by wt.exe as its separator for opening multiple tabs / windows.
The above executes the specified echo command first, and then uses exec to replace the non-interactive, auto-closing original shell with an interactive, stay-open session. The limitation of this approach is that any state changes other than environment-variable definitions made by the startup command(s) are lost when the original shell is replaced.
A better, but more elaborate solution is possible: create a temporary copy of Bash's initialization file, ~/.bashrc, append your startup commands, and pass the temporary copy's file path to bash's --rcfile option; delete the temporary copy afterwards.
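A rough, untested sketch of the Bash side of that approach (the startup command echo hello is a placeholder, and embedding it in the wt ... wsl -e bash -c '...' call is subject to the same quoting caveats as above):
# Copy ~/.bashrc, append the startup command, make the copy clean itself up,
# then replace this shell with an interactive one that uses the copy as its rc file.
rc=$(mktemp) &&
  cat ~/.bashrc > "$rc" &&
  echo 'echo hello' >> "$rc" &&
  printf 'rm -f -- %q\n' "$rc" >> "$rc" &&
  exec bash --rcfile "$rc"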

How to run Bash commands with a PowerShell Core alias?

I am trying to run a Bash command where an alias exists in PowerShell Core.
I want to clear the bash history. Example code below:
# Launch PowerShell core on Linux
pwsh
# Attempt 1
history -c
Get-History: Missing an argument for parameter 'Count'. Specify a parameter of type 'System.Int32' and try again.
# Attempt 2
bash history -c
/usr/bin/bash: history: No such file or directory
# Attempt 3
& "history -c"
&: The term 'history -c' is not recognized as the name of a cmdlet, function, script file, or operable program.
It seems the issue is related to history being an alias for Get-History - is there a way to run Bash commands in PowerShell Core when an alias gets in the way?
history is a Bash builtin, i.e. an internal command that can only be invoked from inside a Bash session; thus, by definition you cannot invoke it directly from PowerShell.
In PowerShell, history is an alias of PowerShell's own Get-History cmdlet, where -c references the -Count parameter, which requires an argument (the number of history entries to retrieve).
Unfortunately, Clear-History is not enough to clear PowerShell's session history as of PowerShell 7.2, because it only clears one history (PowerShell's own), not also the one provided by the PSReadLine module used for command-line editing by default - see this answer.
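For reference, a sketch of clearing both of PowerShell's histories from within a session (this assumes PSReadLine is loaded with its default, file-backed history):
# Clear PowerShell's own session history
Clear-History
# Clear PSReadLine's in-memory (up-arrow) history for the current session
[Microsoft.PowerShell.PSConsoleReadLine]::ClearHistory()
# Remove PSReadLine's persisted history file (its path is exposed by Get-PSReadLineOption)
Remove-Item -Force (Get-PSReadLineOption).HistorySavePath -ErrorAction Ignore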
Your attempt to call bash explicitly with your command - bash history -c - is syntactically flawed (see bottom section).
However, even fixing the syntax problem - bash -c 'history -c' - does not clear Bash's history - it seemingly has no effect (and adding the -i option doesn't help) - I don't know why.
The workaround is to remove the file that underlies Bash's (persisted) command history directly:
if (Test-Path $HOME\.bash_history) { Remove-Item -Force $HOME\.bash_history }
To answer the general question implied by the post's title:
To pass a command with arguments to bash for execution, pass it to bash -c, as a single string; e.g.:
bash -c 'date +%s'
Without -c, the first argument would be interpreted as the name or path of a script file.
Note that any additional arguments following the -c command string become arguments to that command string; that is, it acts as a mini-script that can receive arguments the way scripts usually do, via $1, ...:
# Note: the second argument, "-", becomes $0 in Bash terms,
# i.e. the name of the script
PS> bash -c 'echo $0; echo arg count: $#' self one two
self
arg count: 2
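As a usage note, the same mechanism can pass PowerShell values through as positional arguments; a small, hypothetical example:
# $HOME is expanded by PowerShell; Bash receives it as $1 ("-" fills the $0 slot)
PS> bash -c 'ls -ld "$1"' - $HOME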

Running two cmd commands in one go?

cd "C:\Program Files\GPSoftware\Directory Opus\"
followed by
dopusrt.exe /info documents\filelist1.txt,listsel,0
Attempting to run it like so:
2::
Run, %comspec% /k cd "C:\Program Files\GPSoftware\Directory Opus\" && %ComSpec% /k dopusrt.exe /info documents\filelist1.txt,listsel,0,, Hide
Return
Gives me an error. ==> The following variable name contains an illegal character: ", Hide"
It thinks the commas in the second CMD command are AHK parameters.
I've tried quoting the second command in its entirety, but the CMD window seems to only receive the first command.
Thank you.
The commas are indeed one problem; another problem is your usage of %comspec% /k.
Right now, what you're doing is using the Run(docs) command, where the parameters end up as follows:
Target = C:\WINDOWS\system32\cmd.exe /k cd "C:\Program Files\GPSoftware\Directory Opus\" && C:\WINDOWS\system32\cmd.exe /k dopusrt.exe /info documents\filelist1.txt
WorkingDir = listsel
Options = 0
OutputVarPID = , Hide
The comspec(docs) variable contains the path to cmd.exe, and the /k switch(docs) tells cmd to run the specified command and then stay open.
So you of course don't want to specify these things twice; just once, at the start of the command. (Run a program (cmd.exe) with the specified parameters (/k, cd, "C:\Program Files\...).)
And about the commas: in legacy syntax (which is what you're writing here) you need to escape(docs) them as `,.
So in legacy syntax your finished command would look like this:
Run, %ComSpec% /k cd "C:\Program Files\GPSoftware\Directory Opus\" && dopusrt.exe /info documents\filelist1.txt`,listsel`,0, , Hide
And in modern expression syntax it'd look like this:
Run, % A_ComSpec " /k cd ""C:\Program Files\GPSoftware\Directory Opus\"" && dopusrt.exe /info documents\filelist1.txt,listsel,0", , Hide
I'd recommend ditching the legacy syntax and starting to just write expression syntax.
Here's a documentation page to get you started on the differences between legacy syntax and expression syntax, if you're interested.
But really, this whole approach of cding to the directory where dopusrt.exe is located seems unnecessary to me; I'm not seeing the point of it.
It should be fine to just run the dopusrt.exe program directly:
Run, % """C:\Program Files\GPSoftware\Directory Opus\dopusrt.exe"" /info documents\filelist1.txt,listsel,0", , Hide

javac powershell classpath separator

So I'm aware that different operating systems require different classpath separators. I'm running a build of Windows where CMD has been replaced with PowerShell, which is causing problems when using the semicolon separator.
The command I'm trying to run begins with cmd /c to try to get it to run in Command Prompt instead, but I think when PowerShell parses the whole command it sees the semicolon and thinks that is the end!
My whole command is:
cmd /c javac -cp PATH1;PATH2 -d DESTINATION_PATH SOURCE_PATH
I have tried using a space, colon and period to no avail. Can anybody suggest a solution?
This is my first question on Stack Overflow; I hope the community can help and that it will eventually help others. :)
I suggest you start the process in the following way using PowerShell:
Start-Process cmd.exe -ArgumentList "/c javac -cp PATH1;PATH2 -d DESTINATION_PATH SOURCE_PATH" -NoNewWindow
Running javac in CMD shouldn't be required. Just put quotes around arguments that (may) contain whitespace or special characters. I'd also recommend using the call operator (&). It's optional in this case, but required if you put the executable in quotes (e.g. because the executable or path contains spaces or you want to put it in a variable).
& javac -cp "PATH1;PATH2" -d "DESTINATION_PATH" "SOURCE_PATH"
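If you'd rather not quote at all, another option (not used above, just an alternative) is PowerShell's stop-parsing token --%, which passes everything after it to the external program verbatim:
javac --% -cp PATH1;PATH2 -d DESTINATION_PATH SOURCE_PATH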
You could also use splatting for providing the arguments:
$javac = "$env:JAVA_HOME\bin\javac.exe"
$params = '-cp', "PATH1;PATH2",
'-d', "DESTINATION_PATH",
"SOURCE_PATH"
& $javac @params
javac -classpath "path1:path2:." main.java does the trick in PowerShell. cmd doesn't need the double quotes, but in PowerShell we need to put the quotes, and then it works smoothly.

How does PUTTY/PLINK determine a command has returned?

Is it a newline? The prompt? What exactly?
I'm trying to run PowerShell over plink and the command executes, but plink doesn't recognise it has finished and the session hangs. Most curiously, the command executes successfully when sent through the shell (via PuTTY). However, when sent via plink, the same command hangs...
Any ideas?
Telnet is nearly a raw TCP connection. All PuTTY needs back is a response from the server; the rest is controlled by the shell and the SSH/Telnet server.
While your task is running, it's not going to return a command prompt.
On Linux, Unix, and Mac OS X you could put a & after the command to run it in the background and return to the command prompt.
Try running it in the local terminal/command shell. You should basically see the same thing.
OK, well I'm still not quite sure what the problem is, but I've found a workaround via the TeamCity forums.
Basically you want to echo some arbitrary string and pipe that output into your PowerShell executable, like this:
echo 'executing powershell...' | C:\windows\system32\windowspowershell\v1.0\powershell.exe exit 1
So then your full plink command becomes:
plink.exe user@someIp -i key.ppk -P 22 -batch -v "echo 'executing powershell...' | C:\windows\system32\windowspowershell\v1.0\powershell.exe exit 1"
Nb. Plink will still pass through return codes and console output using this method.
Link to TeamCity forum:
http://youtrack.jetbrains.net/issue/TW-6021
Hope this helps
I had the same problem with another program. I used >&2 (redirect output to stderr) after the last command, and this worked fine for me.
Just add "return XX" into remote shell script, it will be return value to local console. After plink has been finished, type echo %errorlevel% to see return code XX.