I use the following PowerShell command to run a Perl command (pp):
pp --% -u -x -g --link openssl.exe --link libpng16-16_.dll -o D:\Dati\file.exe -F Bleach="^(AT_|DB_)" "G:\Scripts work\script.pl"
When I copy and paste it into PowerShell, it works like a charm. Now I want to run the same command from a .bat file. I tried:
set CMD_LINE_ARGS="%*"
powershell -Command "{pp --% -u -x -g --link openssl.exe --link libpng16-16_.dll -o D:\Dati\file.exe -F Bleach="^(AT_^|DB_)" "G:\Scripts work\script.pl"}"
I tried to escape the pipe ("|") with a backtick (`).
My .bat file runs with no errors, but my script is not launched. Any ideas?
You need to escape special CMD characters in the batch file so that they are passed through to PowerShell.exe correctly. These characters include " and ^. Inner double quotes can be doubled or escaped with a backslash; CMD's special characters (such as |) need to be escaped with ^.
powershell.exe -Command "{pp --% -u -x -g --link openssl.exe --link libpng16-16_.dll -o D:\Dati\file.exe -F Bleach=\"^^(AT_^|DB_)\" \"G:\Scripts work\script.pl\"}"
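For reference, a minimal wrapper .bat would contain little more than that single line (a sketch; the paths and the escaped command are taken verbatim from the line above):

@echo off
rem Launch the pp build via PowerShell; quoting/escaping as shown above.
powershell.exe -Command "{pp --% -u -x -g --link openssl.exe --link libpng16-16_.dll -o D:\Dati\file.exe -F Bleach=\"^^(AT_^|DB_)\" \"G:\Scripts work\script.pl\"}"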
Related
I'm experiencing an issue running a pg_dump command from within a package.json script
My hypothesis is that it has something to do with escaping quotes.
Environment = Windows
Here is the setup to reproduce.
Try running this command from a Bash terminal on Windows. Replace your_database, your_schema and YourTable with settings relevant to your own environment.
pg_dump -U postgres -d your_database -n your_schema -a -F p -f ./your_file.sql -t '"YourTable"'
For me, this works fine. ✅
Try using the same command from within a package.json file (note the need to escape the double quotes surrounding YourTable so that the value is a valid JSON string)...
{
  "name": "test_pg_dump_script",
  "version": "0.0.1",
  "scripts": {
    "db-dump-test": "pg_dump -U postgres -d your_database -n your_schema -a -F p -f ./your_file.sql -t '\"YourTable\"'"
  }
}
Then try running this script from your terminal with npm run db-dump-test.
For me, this does not work. ❌ It results in the following:
> test_pg_dump_script@0.0.1 db-dump-test
> pg_dump -U postgres -d your_database -n your_schema -a -F p -f ./your_file.sql -t '"YourTable"'
pg_dump: error: no matching tables were found
To try and solve this, I have attempted various combinations of quotes around the YourTable section in the package.json script, alas to no avail. E.g. I've tried removing the single quotes, removing the double quotes, having no quotes at all, etc.
I'm wondering if anyone else might have an insight as to what is going on here?
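One thing worth checking (an assumption on my part, not a confirmed fix): on Windows, npm runs package.json scripts through cmd.exe by default, and cmd.exe does not treat single quotes as quoting characters, so the '"YourTable"' wrapper may not reach pg_dump the way it does from Bash. A variant that keeps literal double quotes around the table name by backslash-escaping them in the JSON instead might behave differently:

"db-dump-test": "pg_dump -U postgres -d your_database -n your_schema -a -F p -f ./your_file.sql -t \"\\\"YourTable\\\"\""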
I need to expand variables before running the SCP command, so I can't use single quotes. If I run the script using double quotes in PowerShell ISE, it works fine.
But it doesn't work if I run the script through the command prompt.
I'm using Zabbix to run the script, which calls it as [cmd /C "powershell -NoProfile -ExecutionPolicy Bypass -File .\myscript.ps1"].
Here is the code that needs to run SCP using Cygwin bash.
if ((test-path "$zipFile"))
{
C:\cygwin\bin\bash.exe -l "set -x; scp /cygdrive/e/logs/$foldername/dir1/$foldername.zip root@10.10.10.10:~/"
}
Output:
/usr/bin/bash: set -x; /cygdrive/e/logs/myfolder/dir1/server.zip root@10.10.10.10:~/: No such file or directory
If I run the same command above in Cygwin manually it works.
I even tried to use bash -l -c, but then the SSH session gets stuck, maybe because root@10.10.10.10 becomes $1 according to the documentation.
Documentation link
-c If the -c option is present, then commands are read from
the first non-option argument command_string. If there are
arguments after the command_string, the first argument is
assigned to $0 and any remaining arguments are assigned to
the positional parameters. The assignment to $0 sets the
name of the shell, which is used in warning and error
messages.
Figured it out. The hang when using bash -c was due to StrictHostKeyChecking, the known-hosts prompt (where you get asked to type yes/no). I added the -v switch to scp, and the debug output showed where it was halting.
I had to add the -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null options to scp.
The complete line now looks like the following:
c:\$cygwin_folder\bin\bash.exe -c ("/usr/bin/scp -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -v -i /cygdrive/c/cygwin/home/myuser/.ssh/id_rsa /cygdrive/e/logs/$foldername/dir1/$foldername.zip root@10.10.10.10:~/")
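Putting it back into the original test-path block, the script could look like this (a sketch based on the question's paths and variables; I've added the call operator & and quotes around the bash path since it contains a variable, which differs slightly from the line above):

if (Test-Path "$zipFile")
{
    # $cygwin_folder, $foldername and $zipFile are assumed to be set earlier in the script.
    & "C:\$cygwin_folder\bin\bash.exe" -c "/usr/bin/scp -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -v -i /cygdrive/c/cygwin/home/myuser/.ssh/id_rsa /cygdrive/e/logs/$foldername/dir1/$foldername.zip root@10.10.10.10:~/"
}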
I need to execute the following command in WSL:
sudo curl -L "https://github.com/docker/compose/releases/download/1.23.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
In order to execute it from PowerShell, I tried to run:
Ubuntu1804 run "sudo curl -L 'https://github.com/docker/compose/releases/download/1.23.2/docker-compose-$(uname -s)-$(uname -m)' -o /usr/local/bin/docker-compose"
But errors occur because it cannot resolve the values of uname -s and uname -m:
uname : The term 'uname' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling
of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:107
+ ... ocker/compose/releases/download/1.23.2/docker-compose-$(uname -s)-$(u ...
+ ~~~~~
+ CategoryInfo : ObjectNotFound: (uname:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
But the following command works, as I manually entered the values of uname -s and uname -m:
Ubuntu1804 run "sudo curl -L 'https://github.com/docker/compose/releases/download/1.23.2/docker-compose-Linux-x86_64' -o /usr/local/bin/docker-compose"
Can anyone help me figure out how, when using PowerShell, to incorporate the results of some commands into another command and execute it in WSL?
Also, how can I incorporate the value of environment variables like $USER into WSL commands executed from PowerShell?
The problem is the subexpressions $( ) in the argument to the command. PowerShell interprets these in expression mode (because of the double quotes, which allow expansion) and looks for an executable named uname before passing the argument to Ubuntu1804.
Solutions:
Use the stop-parsing operator after run: --%
Flip your quotes so expansion doesn't happen: ... 'sudo curl -L " ...
Escape the subexpressions with a backtick before the dollar sign: `$(uname -s) (see the sketches below)
To answer how you include environment variables in the WSL command:
& Ubuntu1804.exe ... $Env:USER ...
about_Parsing
about_Environment_Variables
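As a concrete illustration (untested sketches based on the command in the question), the flipped-quotes form keeps PowerShell from evaluating the subexpressions while bash still expands them, because the inner quotes are double quotes:

Ubuntu1804 run 'sudo curl -L "https://github.com/docker/compose/releases/download/1.23.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose'

The stop-parsing form passes everything after --% through untouched, so bash receives the double-quoted URL and performs the substitution itself:

Ubuntu1804 run --% sudo curl -L "https://github.com/docker/compose/releases/download/1.23.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose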
For normal commands like sudo apt-get update -y, just use the stop-parsing operator:
Ubuntu1804 run --% sudo apt-get update -y
For commands like curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -, which include the pipeline character (|), using only --% will not work. In such cases, enclosing the command in double quotes works:
Ubuntu1804 run --% "curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -"
Also, avoid backslash line continuations and newlines in commands when using the --% operator. For example, instead of this:
Ubuntu1804 run --% sudo apt-get install \
apt-transport-https \
ca-certificates \
curl \
gnupg-agent \
software-properties-common
execute it on a single line:
Ubuntu1804 run --% sudo apt-get install apt-transport-https ca-certificates curl gnupg-agent software-properties-common
Including environment variables in the WSL command by using $Env:USER didn't work for the command sudo usermod -aG docker $USER.
It is an error if I use it like this:
Ubuntu1804 run sudo usermod -aG docker $Env:USER
or like this:
Ubuntu1804 run --% sudo usermod -aG docker $Env:USER
What worked for me to reference WSL's own environment variables is the stop-parsing operator --%, so that $USER is passed through literally and expanded by bash inside WSL:
Ubuntu1804 run --% sudo usermod -aG docker $USER
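Alternatively, in line with the backtick-escape suggestion above, escaping the dollar sign should also stop PowerShell from expanding it, so that bash inside WSL expands $USER instead (an untested sketch):

Ubuntu1804 run "sudo usermod -aG docker `$USER"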
I've read the documentation (sqlcmd Utility), but I can't get sqlcmd's -v parameter to work as expected.
Assuming this SQL script (echo.sql):
:setvar THE_PATH C:\Users\Craig\Desktop
PRINT 'THE_PATH: $(THE_PATH)'
:setvar THE_TOP 10
PRINT 'THE_TOP: $(THE_TOP)'
When run at the PowerShell prompt without the -v argument set:
PS> sqlcmd -E -S 'server' -d 'database' -i '.\echo.sql'
THE_PATH: C:\Users\Craig\Desktop
THE_TOP: 10
Attempts to set the numeric variable (THE_TOP) are ignored:
PS> sqlcmd -E -S 'server' -d 'database' -i '.\echo.sql' -v THE_TOP=5
PS> sqlcmd -E -S 'server' -d 'database' -i '.\echo.sql'
THE_PATH: C:\Users\Craig\Desktop
THE_TOP: 10
If I eliminate the default value for THE_TOP in echo.sql, it reinforces the impression that the parameter is being ignored:
:setvar THE_TOP
PRINT 'THE_TOP: $(THE_TOP)'
PS> sqlcmd -E -S 'server' -d 'database' -i '.\echo.sql' -v THE_TOP=5
THE_PATH: C:\Users\Craig\Desktop
THE_TOP: $(THE_TOP)
If I attempt to set the THE_PATH parameter, I get:
PS> sqlcmd -E -S 'server' -d 'database' -i '.\echo.sql' -v THE_PATH="C:\path"
Sqlcmd: ':\path': Invalid argument. Enter '-?' for help.
What is the correct -v syntax?
OK, this is ridiculous.
If :setvar is used in the script:
:setvar THE_TOP
PRINT 'THE_TOP: $(THE_TOP)'
and you then attempt to set it at the PowerShell prompt, you'll get an error:
PS> sqlcmd -E -S server -d database -i .\echo.sql -v THE_TOP=5
'THE_TOP' scripting variable not defined.
However, if you DO NOT declare it with :setvar in the script:
-- variable disabled
-- :setvar THE_TOP
PRINT 'THE_TOP: $(THE_TOP)'
and you then set it at the PowerShell prompt, it will work as (not really) expected:
PS> sqlcmd -E -S server -d database -i .\echo.sql -v THE_TOP=5
THE_TOP: 5
** edit **
To supply a dynamically-generated path as a parameter to a script file (extract.sql):
:out $(THE_PATH)\extract.csv
SELECT *
FROM the_table
WHERE the_key = $(THE_ID)
...
Execute in a PowerShell session:
PS> $cd = Get-Location
PS> sqlcmd -E -S 'server' -d 'database' -i '.\extract.sql' -v THE_ID=5 THE_PATH=`"$cd\build`"
I am writing a batch job for Postgres for the first time. I have written a ".sh" file which contains a command that produces no output in the log or on the console.
Code
export PGPASSWORD=<password>
psql -h <host> -p <port> -U <user> -d <database> --file cleardata.sql > log\cleardata.log 2>&1
What I did at the command line:
su postgres
and then ran ./cleardatasetup.sh
Nothing is happening.
Please note: when I try the psql command directly at the Unix command line, I get an SQL exception message, which is valid.
Can anyone please help me with this?
You probably wanted to create log/cleardata.log but you have a backslash where you need a slash. You will find that the result is a file named log\cleardata.log instead.
The backslash is just a regular character in the file's name, but it's special to the shell, so you'll need to quote or escape it to (unambiguously) manipulate it from the shell:
ls -l log\\cleardata.log # escaped
mv 'log\cleardata.log' log/cleardata.log # quoted
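For completeness, the fix inside cleardatasetup.sh itself is simply to use a forward slash in the redirection, and to make sure the log directory exists first (a sketch keeping the placeholders from the question):

mkdir -p log   # create the log directory if it doesn't exist yet
psql -h <host> -p <port> -U <user> -d <database> --file cleardata.sql > log/cleardata.log 2>&1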