kdb string path missing quotation marks - kdb

I would like to join strings for the file name and shell script so I can run the command line in kdb for an ftp transfer.
But I need quotation marks inside quotation marks, and I'm not sure how to add the escape character in there.
This is the code I have:
host:"abc.com";
usr:"def";
path:"get /home/eddie/abc.csv /home/terry/";
cmd:" " sv ("/home/kdb/eddie/ftp.sh";host;usr;path);
system cmd;
So the path will not have quotation marks around it and the command errors when run. How can I solve this problem?

You can escape quotes with \ e.g. "\"Matt\"" but I don't think that's your issue. It looks like you are attempting to use get in the system command. This is a kdb keyword and your OS will not recognise it. You should just be passing the location of the csv to your ftp script.
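For example, a minimal sketch that drops the kdb get and just passes the file locations to the script, assuming ftp.sh itself issues the ftp get and takes the host, user, source file and target directory as separate arguments (that interface is an assumption here):
/ ftp.sh's argument list below is an assumption; adjust to whatever the script expects
host:"abc.com";
usr:"def";
src:"/home/eddie/abc.csv";   / remote file to fetch
dst:"/home/terry/";          / local target directory
cmd:" " sv ("/home/kdb/eddie/ftp.sh";host;usr;src;dst);
system cmd;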
Edit:
You may also need to invoke the script via sh in the system command, for example when the script is not executable:
$ cat test.sh
echo $1

q)system "test.sh hello"
sh: ./test.sh: Permission denied
'os
q)system "sh test.sh hello"
"hello"

Assuming that you simply want quotes within a string, it may be as simple as using .Q.s1, aka -3!; see https://code.kx.com/q/ref/dotq/#qs1-string-representation
q)" " sv ("/home/kdb/eddie/ftp.sh";host;usr;.Q.s1 path)
"/home/kdb/eddie/ftp.sh abc.com def \"get /home/eddie/abc.csv /home/terry/\""
q)" " sv ("/home/kdb/eddie/ftp.sh";host;usr;-3!path)
"/home/kdb/eddie/ftp.sh abc.com def \"get /home/eddie/abc.csv /home/terry/\""

Related

Perl SYSTEM command fails with "Bad file descriptor" when running via Jenkins

I have a simple system command to copy a file from one folder to another:
my $cmd = "xcopy /Y c:\DBs\Support\db.bak c:\jenkins\workdir\sql-bak-files";
When I try to run the command in the following ways, they all fail:
1. my $res = qx/$cmd/;
2. my $res = qx($cmd);
3. using back ticks
All attempts returned the error: Error number -1, error message: "Bad file descriptor".
When trying to use system($cmd) the error was Error number 65280, error message: "No such file or directory".
This Perl code runs via Jenkins (ver 2.190.1) with Perl v5.26.0.
This problem started after migrating the code from Mercurial to Git, but I don't think that's related.
It worked before, but now it always fails :(
A backslash has a special meaning in a Perl quoted string. It is used to escape the following character - to "turn off" any special meaning. If you want to use a backslash in a Perl quoted string, then you need to use another backslash to escape it.
my $cmd = 'xcopy /Y c:\\DBs\\Support\\db.bak c:\\jenkins\\workdir\\sql-bak-files';
Alternatively, Perl recognises forward slashes in Windows paths, so it might be easier to replace your code with this:
my $cmd = 'xcopy /Y c:/DBs/Support/db.bak c:/jenkins/workdir/sql-bak-files';
Note that in both cases I have replaced your double-quotes with single-quotes. This has no effect on your problem, but it seems strange to use double-quoted strings if you're not using any of their special characteristics (like the expansion of variables).
Update: To debug a problem like this, you can try printing the string.
$ perl -E'say "xcopy /Y c:\DBs\Support\db.bak c:\jenkins\workdir\sql-bak-files"'
xcopy /Y c:DBsSupportdb.bak c:jenkinsworkdirsql-bak-files

Why does robocopy use its own command line parser?

If I execute the following command on a Windows 8.1 machine:
robocopy "C:\Temp\A\" "C:\Temp\B\"
Robocopy fails due to the following problem:
Source : C:\Temp\A" C:\Temp\B"\
Dest -
...
ERROR : No Destination Directory Specified.
It looks like \ is used as some kind of escape character (which is not normal behavior on the Windows command line). The final \" is even transformed to "\, which I do not understand at all. Why is that so?
Note: this is not the default behavior of the command line; if they had used argv[1] and argv[2] within robocopy, they would have retrieved the correct arguments.
Why are they using their own command line parsing? It has really confused me for the last hour...
You should omit the trailing backslashes.
From http://ss64.com/nt/robocopy.html :
If either the source or destination are a "quoted long foldername" do not include a trailing backslash, as this will be treated as an escape character, i.e. "C:\some path\" will fail but "C:\some path\\" or "C:\some path." or "C:\some path" will work.
robocopy is not an exception. Any executable uses its own line parser to determine the arguments that were sent to it. The OS just uses the API to create the process and pass to it a string to be handled as arguments. The process can handle the string as it wants.
In the case of robocopy, the parser used is the standard Microsoft C startup code. This parser follows the rules described here, and in the full list you can find:
A double quotation mark preceded by a backslash, \", is interpreted as a literal double quotation mark (").

tcl exec to open a program with arguments

I want to open a text file in notepad++ at a particular line number. If I do this on the command line, the command is:
start notepad++ "F:\Path\test.txt" -n100
And it works fine from the command line. Now I have to do this from tcl, but I can't make the command work with exec. When I try to execute this:
exec "start notepad++ \"F:\Path\test.txt\" -n100"
I am getting this error:
couldn't execute "start notepad++ "F:\Path\test.txt" -n100": no such file or directory.
What am I missing? Please guide.
Similar to this question:
exec {*}[auto_execok start] notepad++ F:/Path/test.txt -n10
First, you need to supply each argument of the command as separate values, instead of a single string/list. Next, to mimic the start command, you would need to use {*}[auto_execok start].
I also used forward slashes instead of backslashes, since you would otherwise get a first-level substitution (\t becomes a tab character) and end up with F:Path	est.txt.
EDIT: It escaped me that you could keep the backslashes if you used braces to prevent substitution:
exec {*}[auto_execok start] notepad++ {F:\Path\test.txt} -n10
You can simply surround the entire exec statement in curly braces. Like this:
catch {exec start notepad++.exe f:\Path\test.txt -n10}
I haven't found a perfect solution to this yet; all my execs seem to be different from each other. On Windows there are various issues:
Preserving double quotes around filename (or other) arguments.
e.g. in tasklist /fi "pid eq 2060" /nh the quotes are required.
Preserving spaces in filename arguments.
Preserving backslash characters in filename arguments.
[Internally, Windows doesn't care whether pathnames have / or \, but some programs will parse the filename arguments and expect the backslash character].
The following will handle the backslashes and preserve spaces, but will not handle double-quoted arguments. This method is easy to use. You can build up the command line using list and lappend.
set cmd [list notepad]
set fn "C:\\test 1.txt"
lappend cmd $fn
exec {*}$cmd
Using a string variable rather than a list allows preservation of quoted arguments:
set cmd [auto_execok start]
append cmd " notepad"
append cmd " \"C:\\test 1.txt\""
exec {*}$cmd
Note that if you need to supply the full path to the command to be executed, it often needs to be quoted also due to spaces in the pathname:
set cmd "\"C:\\Program Files\\mystuff\\my stuff.exe\" "

bash script to build complex command syntax, print it first then execute - problems with variable expansion

I want to create a script to facilitate producing local text file extracts from Hive.
This is to basically execute commands like below:
hive -e "SET hive.cli.print.header=true;SELECT * FROM dropme"|perl -pe 's/(?:\t|^)\KNULL(?=\t|$)//g'>extract/outbound/dropme.txt
While the above works like a charm, I find it quite problematic to implement through the following parametrized script (much simplified):
#!/bin/sh
TNAME=dropme
SQL="SELECT * FROM $TNAME"
echo $SQL
echo "SQL: $SQL"
EXTRACMD="hive -e \"SET hive.cli.print.header=true;$SQL\"|perl -pe 'BEGIN{if(defined(\$_=<ARGV>)){s/\b\w+\.//g;print}}s/(?:\t|^)\KNULL(?=\t|$)//g'>extract/outbound/$TNAME.txt"
echo "CMD: $EXTRACMD";
${EXTRACMD}
When run I get: Exception in thread "main" java.lang.NumberFormatException: For input string: "e"
I know there are many ways to print the text or execute the command. For instance, the line echo $SQL prints a list of files in the directory instead:
SELECT file1.txt file2.txt file3.txt file4.txt FROM dropme
while the next one: echo "SQL: $SQL" gives just what I want: SQL: SELECT * FROM dropme
echo "CMD: $EXTRACMD" prints the (almost) the command to be executed. Almost, as I see \t in perl code being expanded:
CMD: hive -e "SET hive.cli.print.header=true;SELECT * FROM dropme"|perl -pe 'BEGIN{if(defined($_=<ARGV>)){s\w+\.//g;print}}s/(?: |^)\KNULL(?= |$)//g'>extract/outbound/dropme.txt
Maybe that's still OK, but what I want is to be able to copy and paste this command into another terminal and execute it as the command I put at the top. Ideally I would like that command to be exactly the same (so with \t in there).
The biggest problem comes when I try to execute it (the ${EXTRACMD} line). I get the error:
Exception in thread "main" java.lang.NumberFormatException: For input string: "e" ...and so on, which is irrelevant, as bash treats every 'word' of the string separately here. I don't even know what it really tries to run (the prior print attempt obviously doesn't help).
I'm aware that I have multiple options, like:
escaping special characters in the command definition string (like I did with doublequotes)
experimenting with echo and $VAR, '$VAR' or "$VAR"
experimenting with "${EXTRACMD}" or evaluating through eval "${EXTRACMD}"
experimenting with shopt -s extglob or set -f
but as the number of combinations is quite large, and with my limited bash experience, I feel it's better to ask for good practice here. So my question is:
Is there a way to print a (complex/compound shell) command first and subsequently be able to execute it (exactly as per printed output)? In this case it would be printing the exact command from the top, then executing it the same way as by manually copying that output into terminal prompt and pressing Enter.
Do not construct commands as strings. See http://mywiki.wooledge.org/BashFAQ/050 for details.
That page also talks about a built-in way of getting the shell to tell you what it is running (section 6).
If that doesn't do what you want, you can also, with bash, try using printf %q\\n "${arr[*]}".
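For example, here is a minimal sketch of the array-based approach applied to the command at the top of the question; the hive and perl invocations are taken from that command, and the printf %q lines print an approximate copy-and-pasteable form (the pipe and the redirection cannot live inside the arrays, so they are written out separately):
#!/bin/bash
tname=dropme
sql="SELECT * FROM $tname"

# each command is an array; the elements are never re-parsed by the shell
hive_cmd=(hive -e "SET hive.cli.print.header=true;$sql")
perl_cmd=(perl -pe 's/(?:\t|^)\KNULL(?=\t|$)//g')

# print a shell-quoted approximation of what will run
printf '%q ' "${hive_cmd[@]}"; printf '| '
printf '%q ' "${perl_cmd[@]}"; printf '> extract/outbound/%s.txt\n' "$tname"

# run it; the pipe and redirection stay outside the arrays
"${hive_cmd[@]}" | "${perl_cmd[@]}" > "extract/outbound/$tname.txt"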

Single quotes and double quotes in perl

I'm a newbie in perl and I'm trying to execute an operating system command from a Perl script.
The operating system command is executed with IPC::Open3, and the command is like
$cmd = "mycommand --add \"condition LIKE '%TEXT%' \"";
This command is supposed to insert the string contained after the "add" into a database.
The problem is that it inserts the record in the database without the single quotes around %TEXT%, like this:
condition LIKE %TEXT%
If I execute the command at a command prompt it inserts the string correctly.
How should I format the double and single quotes so that it is inserted in the database correctly?
Thanks for the help
By putting the command in a single string, you are inflicting upon it a pass through your system's shell. (You don't mention if it's cmd.exe or bash or other fun stuff.)
Suggestions:
Creating your system command as an array of strings will avoid the shell re-interpolating your command line.
@cmd = ('mycommand', '--add', q(condition LIKE '%TEXT%'));
Throw in extra backslashes to protect the single quotes from your shell. Prepending echo to your command could help with the debugging....
(my personal favorite) Don't shell out for your database access; use DBI. A rough sketch follows below.
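A minimal sketch of the DBI route might look like the following; the DSN, credentials, table and column names are all hypothetical placeholders, since they depend on whatever database mycommand actually talks to:
use strict;
use warnings;
use DBI;

# hypothetical connection details; substitute whatever mycommand connects to
my $dbh = DBI->connect('dbi:Pg:dbname=mydb;host=localhost', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 1 });

# a bound placeholder sidesteps both shell quoting and SQL quoting
my $sth = $dbh->prepare('INSERT INTO conditions (expression) VALUES (?)');
$sth->execute(q{condition LIKE '%TEXT%'});

$dbh->disconnect;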
$cmd = q{mycommand --add "condition LIKE '%TEXT%'"};
qq is for double-quoting, q is for single-quoting.
This way it takes the whole command as it is.