DBeaver - Statement Terminator not OK? - tsql

I am using DECLARE and SET statements to define #end_dt as CAST(...), but in DBeaver, if I end my DECLARE and SET statements with a semicolon (;), it treats them as separate from the main query that follows and gives me a scalar variable error. Once I remove the statement terminators (;), it runs.
This is my first month using DBeaver. Is there a setting that lets statement terminators work smoothly?
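For reference, a minimal sketch of the pattern described (a sketch only; @end_dt and dbo.some_table are placeholder names, not taken from the question). If the client sends each semicolon-terminated statement as its own batch, the variable declared in the first statement is out of scope by the time the SELECT runs, which would produce exactly the scalar-variable error described; running the whole script as a single batch avoids it.

DECLARE @end_dt date;
SET @end_dt = CAST(GETDATE() AS date);

SELECT *
FROM dbo.some_table
WHERE created_dt <= @end_dt;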

Related

Is it possible to use uppercase strings in pgTap without it assuming it's a prepared statement?

We're attempting to run tests against a Postgres function that returns a SETOF VARCHAR of uppercase strings.
Whenever this test case runs, however, pgTap tries to look up a prepared statement with the same name as the uppercase value we're expecting to be returned. Is there any way of escaping this behaviour, or calling this test case in another way, to check the output?
SELECT set_eq(
the_function_call(_property := 'value'),
$$ VALUES ('FIRST_RETURN'), ('SECOND_RETURN') $$,
'returns both expected values'
);
psql:/path/to/test.test.sql:20: ERROR: prepared statement "second_return" does not exist
CONTEXT: SQL statement "CREATE TEMP TABLE __taphave__ AS EXECUTE SECOND_RETURN"
I've tried backslashes, quotes, using UPPER on lowercase strings, casting the values as VARCHAR and ARRAY ['FIRST_RETURN', 'SECOND_RETURN'] as the second argument but still get the same prepared statement issue.
I was trying to avoid creating a prepared statement for each value that literally returns the same value again, but if that's the only way I guess I'll have to relent! Any help greatly appreciated.
Managed to get this working by changing the way the test is called:
SELECT set_eq(
$$ SELECT * FROM the_function_call(_property := 'value') $$,
ARRAY ['FIRST_RETURN', 'SECOND_RETURN'],
'returns both expected values'
);
Now it passes with no issues.
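Presumably this works because set_eq now receives the function call as a $$-quoted query string to execute, rather than evaluating it first and treating each returned value as a prepared statement name. For reference, the prepared-statement route mentioned above would have looked roughly like this (a sketch only; the statement name expected_values is made up), since pgTap also accepts the name of a prepared statement in place of a query string:
PREPARE expected_values AS
VALUES ('FIRST_RETURN'), ('SECOND_RETURN');

SELECT set_eq(
$$ SELECT * FROM the_function_call(_property := 'value') $$,
'expected_values',
'returns both expected values'
);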

How to execute file with multiple DDL statements

I'm running redshift_connector in Python in a Windows environment. I have a .sql file that is a collection of DDL statements for the database.
When I execute it, I get the following error:
'cannot insert multiple commands into a prepared statement'
You've probably found a solution for this already, but for the sake of closing the loop, here is mine:
I switched from using redshift_connector to using psycopg2, which does allow multiple commands.
If you need to use redshift_connector, however, there is a little workaround. It might mean updating some of your SQL code:
Add a function to your Python script which splits the SQL into multiple commands, delimited by the ';' character, for example.
The only issue with this is where you need a literal ';' inside a statement. For this reason, I added an escape argument, which is where you'd need to update your SQL: replace any literal ';' with the escape character, and the function swaps it back to ';' after it has split out the individual commands.
Here's the function:
def rc_multicommand(rc_cursor, cs, escape=None):
    # Split a multi-statement SQL string on ';' and run each piece separately.
    cs_split = cs.split(";")
    for split_command in cs_split:
        # Skip empty or whitespace-only fragments left over from the split
        if split_command.strip():
            if escape is not None:
                # Restore literal semicolons that were escaped in the source SQL
                split_command = split_command.replace(escape, ";")
            rc_cursor.execute(split_command)
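For illustration, here is a hypothetical DDL file that this helper could split (the schema and table names are made up, and '~' is an arbitrary choice for the character you would pass as escape='~' wherever a statement needs a literal semicolon):
CREATE SCHEMA IF NOT EXISTS demo_schema;

CREATE TABLE IF NOT EXISTS demo_schema.events (
    id   BIGINT IDENTITY(1,1),
    note VARCHAR(100) DEFAULT 'step one~ step two'
);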

Entering values into a table from the command line does not work [duplicate]

I have a table test(id,name).
I need to insert values like: user's log, 'my user', customer's.
insert into test values (1,'user's log');
insert into test values (2,''my users'');
insert into test values (3,'customer's');
I am getting an error if I run any of the above statements.
If there is any method to do this correctly, please share. I don't want to use prepared statements.
Is it possible using an SQL escaping mechanism?
String literals
Escaping single quotes ' by doubling them up → '' is the standard way and works of course:
'user's log' -- incorrect syntax (unbalanced quote)
'user''s log'
Plain single quotes (ASCII / UTF-8 code 39), mind you, not backticks `, which have no special purpose in Postgres (unlike certain other RDBMS) and not double-quotes ", used for identifiers.
In old versions or if you still run with standard_conforming_strings = off or, generally, if you prepend your string with E to declare Posix escape string syntax, you can also escape with the backslash \:
E'user\'s log'
Backslash itself is escaped with another backslash. But that's generally not preferable.
If you have to deal with many single quotes or multiple layers of escaping, you can avoid quoting hell in PostgreSQL with dollar-quoted strings:
'escape '' with '''''
$$escape ' with ''$$
To further avoid confusion among dollar-quotes, add a unique token to each pair:
$token$escape ' with ''$token$
Which can be nested any number of levels:
$token2$Inner string: $token1$escape ' with ''$token1$ is nested$token2$
Pay attention to whether the $ character has special meaning in your client software; you may have to escape it there in addition. This is not the case with standard PostgreSQL clients like psql or pgAdmin.
That is all very useful for writing PL/pgSQL functions or ad-hoc SQL commands. It cannot alleviate the need to use prepared statements or some other method to safeguard against SQL injection in your application when user input is possible, though. Craig's answer has more on that. More details:
SQL injection in Postgres functions vs prepared queries
Values inside Postgres
When dealing with values inside the database, there are a couple of useful functions to quote strings properly:
quote_literal() or quote_nullable() - the latter outputs the unquoted string NULL for null input.
There is also quote_ident() to double-quote strings where needed to get valid SQL identifiers.
format() with the format specifier %L is equivalent to quote_nullable().
Like: format('%L', string_var)
concat() or concat_ws() are typically no good for this purpose as those do not escape nested single quotes and backslashes.
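A few quick examples of these helpers (expected output shown as comments; this is an illustrative sketch, not part of the original answer, reusing the question's test table in the last one):
SELECT quote_literal('user''s log');   -- 'user''s log'
SELECT quote_nullable(NULL::text);     -- NULL (the unquoted string)
SELECT quote_ident('user table');      -- "user table"
SELECT format('INSERT INTO test VALUES (%s, %L)', 1, 'user''s log');
-- INSERT INTO test VALUES (1, 'user''s log')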
According to PostgreSQL documentation (4.1.2.1. String Constants):
To include a single-quote character within a string constant, write
two adjacent single quotes, e.g. 'Dianne''s horse'.
See also the standard_conforming_strings parameter, which controls whether escaping with backslashes works.
This is so many worlds of bad, because your question implies that you probably have gaping SQL injection holes in your application.
You should be using parameterized statements. For Java, use PreparedStatement with placeholders. You say you don't want to use parameterised statements, but you don't explain why, and frankly it has to be a very good reason not to use them because they're the simplest, safest way to fix the problem you are trying to solve.
See Preventing SQL Injection in Java. Don't be Bobby's next victim.
There is no public function in PgJDBC for string quoting and escaping. That's partly because it might make it seem like a good idea.
There are built-in quoting functions quote_literal and quote_ident in PostgreSQL, but they are for PL/PgSQL functions that use EXECUTE. These days quote_literal is mostly obsoleted by EXECUTE ... USING, which is the parameterised version, because it's safer and easier. You cannot use them for the purpose you explain here, because they're server-side functions.
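As a purely server-side illustration (it does not solve the client-side problem in the question), a minimal PL/pgSQL sketch of EXECUTE ... USING against the question's test table:
DO $$
DECLARE
    v_name text := 'user''s log';
BEGIN
    -- $1 and $2 are bound parameters, so the value needs no quoting or escaping
    EXECUTE 'INSERT INTO test VALUES ($1, $2)' USING 1, v_name;
END
$$;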
Imagine what happens if you get the value ');DROP SCHEMA public;-- from a malicious user. You'd produce:
insert into test values (1,'');DROP SCHEMA public;--');
which breaks down to two statements and a comment that gets ignored:
insert into test values (1,'');
DROP SCHEMA public;
--');
Whoops, there goes your database.
In PostgreSQL, if you want to insert a value with ' in it, you have to add an extra ':
insert into test values (1,'user''s log');
insert into test values (2,'''my users''');
insert into test values (3,'customer''s');
You can use the PostgreSQL chr(int) function:
insert into test values (2, chr(39)||'my users'||chr(39));
When I used Python to insert values into PostgreSQL, I also ran into the problem: column "xxx" does not exist.
Then I found the reason on the PostgreSQL wiki:
PostgreSQL uses only single quotes for this (i.e. WHERE name = 'John'). Double quotes are used to quote system identifiers; field names, table names, etc. (i.e. WHERE "last name" = 'Smith').
MySQL uses ` (accent mark or backtick) to quote system identifiers, which is decidedly non-standard.
It means PostgreSQL uses single quotes only for string values, and double quotes for identifiers such as field names and table names. So a bare single quote inside a value breaks the string literal.
My situation is: I want to insert values "the difference of it’s adj for sb and it's adj of sb" into PostgreSQL.
How I worked around this problem:
I replaced ' with ’, and I replaced " with ' (only the single quote actually causes trouble, since a bare ' ends a single-quoted value).
So I think you can use the following statements to insert the values:
insert into test values (1,'user’s log');
insert into test values (2,'my users');
insert into test values (3,'customer’s');
If you need to get the work done inside Pg:
to_json(value)
https://www.postgresql.org/docs/9.3/static/functions-json.html#FUNCTIONS-JSON-TABLE
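For example, casting to text and passing it through to_json() produces a JSON-escaped string (output shown as a comment; this snippet is illustrative, not from the original answer):
SELECT to_json('It''s a "quoted" value'::text);
-- "It's a \"quoted\" value"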
You have to add an extra single quote, doubling it up (' becomes ''), as in the examples below. This is the standard way and works of course:
Wrong way: 'user's log'
Right way: 'user''s log'
Problem:
insert into test values (1,'user's log');
insert into test values (2,''my users'');
insert into test values (3,'customer's');
Solutions:
insert into test values (1,'user''s log');
insert into test values (2,'''my users''');
insert into test values (3,'customer''s');

DB2 trigger illegal token

Fairly new to DB2 SQL, so forgive my ignorance :)
I have a trigger with a condition inside; I want to insert some parameters depending on the condition. Here it is:
I've looked at the DB2 documentation for triggers and also for IF statements, and at least to my eyes it appears to comply with it; however, I get a -104 error (illegal symbol/token) on the INSERT line.
The INSERT works fine provided I use values not taken from 'N'.
OK, it works if I have nothing in the IF ... THEN statement, but only if I have nothing!
Thanks
My guess would be that DB2 is confused by your statement terminators. You use a semicolon to terminate the CREATE TRIGGER statement, but at the same time you use the same terminator inside the CREATE TRIGGER statement, after the INSERT.
In whatever client you use to execute your code, redefine the statement terminator to something else and place that terminator at the end of CREATE TRIGGER, after END IF. For example, if you use the command line processor, save this to a file mytrig.sql:
CREATE OR REPLACE TRIGGER AUTO_INSERT_DEPO_NOMINEE
AFTER INSERT ON CA_ENTITLEMENT
REFERENCING NEW AS N
FOR EACH ROW
IF NOT EXISTS (SELECT D.DEPO_CD FROM DEPO D WHERE D.DEPO_CD = N.DEPO_CD)
THEN
INSERT INTO DEPO (DEPO_CD, DEPO_NME, BRANCH_CD, AUTO_GENERATED)
VALUES(N.DEPO_CD, NULL, N.BRANCH_CD, 'Y');
END IF#
then run it, specifying "#" as the statement terminator:
db2 -td# -f mytrig.sql

execute command in while loop

I have a postgres function which runs the following loop
while x<=colnum LOOP
EXECUTE
'Update attendrpt set
slot'||x||' = pres
from (SELECT branch, semester, attend_date , div, array_to_string(ARRAY_AGG(first_name||':'||alias_name||':'||lect_type||':'||
to_char(present,'99')),';')
As pres
from attend1 where lecture_slot_no ='||x||'
group by branch, semester, attend_date , div ) j
where attendrpt.branch=j.branch and attendrpt.semester=j.semester
and attendrpt.attenddate=j.attend_date and attendrpt.div=j.div;';
x := x + 1;
END LOOP;
The problem is that the single quotes inside the query conflict with the quotes that delimit the string passed to EXECUTE. Is there any way to solve this?
Thanks in advance.
Quote your function definition with dollar-quoting (like $BODY$ or just $$) as per the manual.
Use execute ... using instead of string substitution. For substituting identifiers use the %I format specifier from the format function.
If you absolutely must use || string concatenation, say if you're on some ancient version of PostgreSQL, you need to use the quote_literal and quote_ident functions to avoid issues with quoting and potential security problems.
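A rough sketch of the execute ... using / format approach from the points above (untested, reusing the column and table names from the original query, and assuming this runs inside a function body quoted with $$ or $BODY$ so that a different tag, $q$, can be used for the nested format string): the dynamic slotN column name goes through %I and the loop variable x is bound with USING.
EXECUTE format($q$
UPDATE attendrpt SET %I = j.pres
FROM (SELECT branch, semester, attend_date, div,
             array_to_string(
                 ARRAY_AGG(first_name || ':' || alias_name || ':' ||
                           lect_type  || ':' || to_char(present, '99')),
                 ';') AS pres
        FROM attend1
       WHERE lecture_slot_no = $1
       GROUP BY branch, semester, attend_date, div) j
WHERE attendrpt.branch = j.branch
  AND attendrpt.semester = j.semester
  AND attendrpt.attenddate = j.attend_date
  AND attendrpt.div = j.div
$q$, 'slot' || x)
USING x;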
Beyond that, it looks like the whole approach is completely unnecessary; you're doing something that looks like it can be done in simple SQL.