Dynamic T-SQL input to T-SQL Task in SSIS

My SSIS package includes a T-SQL task. I have a package parameter that I want to pass into my T-SQL task but I can't find the correct syntax to do this:
DECLARE @myVariable int;
SET @myVariable = $Package::myParameter --not working
SET @myVariable = @[$Package::myParameter] -- also not working
What is the correct way to pass parameters to a T-SQL task?

I'd recommend using an Execute SQL Task, as it provides more functionality than an Execute T-SQL Statement Task. However, if you want to use the T-SQL task with a parameter, you can do so by creating a string variable with an expression that includes the parameter; an example is below.
To set this as the statement for the T-SQL task, go to the Properties window of the task (press F4), click the ellipsis next to the Expressions field, select the SqlStatementSource property, and add the string variable containing the T-SQL as the Expression.
Since the variable in your SQL is of the INT data type, I'm assuming the package parameter is too, so it needs to be cast to a string to be included in the expression; it will still be parsed as a numeric data type and submitted to SQL Server as such. The cast is done with the (DT_STR, length, code page) function below, using an example length of 10. As a side note, the (DT_WSTR, length) function would be used for Unicode data.
Make sure to enclose the SQL text in quotes as done below. Also be aware that parameter names are case sensitive within an expression; for example, @[$Package::MyParameter] would return an error if the parameter name was actually @[$Package::myParameter], starting with a lowercase m.
"DECLARE #myVariable INT;
SET #myVariable = " + (DT_STR, 10, 1252)#[$Package::myParameter] + "
UPDATE Database.Schema.Table
SET NAME = 'TW'
WHERE ID = #myVariable"

You can't pass parameters to a T-SQL task.
According to the documentation:
If you need to run parameterized queries, save the query results to variables, or use property expressions, you should use the Execute SQL task instead of the Execute T-SQL Statement task. For more information, see Execute SQL Task.
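If you do switch to an Execute SQL Task, a minimal sketch of the parameterized statement (assuming an OLE DB connection, which uses ? as the parameter marker; on the task's Parameter Mapping page, bind $Package::myParameter to parameter name 0):
UPDATE Database.Schema.Table
SET NAME = 'TW'
WHERE ID = ?  -- ? is bound to $Package::myParameter via Parameter Mapping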

How to return integer value from notebook in ADF pipeline

I have a use case where I need to return an integer as output from a Synapse notebook in a pipeline and pass this output to the next stage of my pipeline.
Currently mssparkutils.notebook.exit() takes only string values. Are there any utility methods available for this?
I know we can cast the integer to string type and send it to the exit("") method. I wanted to know if I could achieve this without casting.
The cast() function is the standard and official method suggested by Spark itself. AFAIK, there is no other method; otherwise, you need to manage it programmatically.
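For the notebook side, a minimal sketch (assuming a Synapse notebook where mssparkutils is available; result is a hypothetical integer computed by the notebook):
from notebookutils import mssparkutils

result = 42  # hypothetical integer computed by the notebook
# exit() only accepts a string, so convert before returning it to the pipeline
mssparkutils.notebook.exit(str(result))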
You can also try @equals in dynamic content to check whether the exitValue fetched from the notebook activity output equals some specific value.
@equals(activity('Notebook').output.status.Output.result.exitValue, '<value>')
Refer: Spark Cast String Type to Integer Type (int), Transform data by running a Synapse notebook
Instead, you can convert the string number to an integer in dynamic content, like this:
@equals(int(activity('Notebook').output.status.Output.result.exitValue), 1)
Or add an activity that sets the string value to a variable that is an int.

Escape string interpolation in anorm

I want to insert the literal '${a}' into a table using anorm 2.5.2, which means I want to execute the bare SQL query
INSERT INTO `db`.`table` (`a`) VALUES ('${a}');
without using any anorm / string interpolation. When I try to do the following
SQL("INSERT INTO `db`.`table` (`a`) VALUES ('${a}');").execute()
I get an anorm.Sql$MissingParameter: Missing parameter value exception, because anorm tries to apply its interpolation to ${a} but no value a is available in scope.
How do I escape the anorm / string interpolations $... and ${...}?
The approach from Escape a dollar sign in string interpolation doesn't seem to work here.
You can make ${a} the value of a parameter, i.e.
SQL("""INSERT INTO db.table (a) VALUES ({x})""").on("x" -> s"$${a}")
(s"$${a}" is the way to write "${a}" without getting a warning about possible missing interpolators).
The same can be written equivalently as
val lit = s"$${a}"
SQL"""INSERT INTO db.table (a) VALUES ($lit)"""
The below will probably work, but I am not sure:
SQL"INSERT INTO db.table (a) VALUES ('$${a}')"
It may also be worth asking whether this is intentional behavior or a bug: when talking about parameterized SQL queries, it doesn't make sense to have a parameter inside single quotes.

Using variables in a Postgres SQL block

Can you have an SQL block that accepts a variable for input, uses that variable in a join, and returns a result set?
The kicker is that I have been asked to do this outside a function, i.e. within an SQL block.
So, for example, I want to pass the value 1234567890 to the variable v_hold:
DO $$
declare
v_hold integer;
BEGIN
select * from t_table_A where ID = v_hold ;
--return alert_mesg;
END$$;
--$$ LANGUAGE plpgsql
The testing I've done says that in order to return a result set, you have to define it in a RETURNS TABLE declaration. I've tried to define this outside a function, but I haven't figured it out.
Can this be done outside a function, i.e. pass a variable and return a result set based on a SELECT statement that references the variable in its WHERE clause?
You could try using a prepared statement. For example:
PREPARE myStatement (int) AS SELECT * FROM t_table_A where ID = $1;
To run the statement, use the EXECUTE command:
EXECUTE myStatement(1234567890);
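Note that a prepared statement lasts for the rest of the database session; if you need to redefine it, deallocate it first:
DEALLOCATE myStatement;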
From the documentation:
DO executes an anonymous code block, or in other words a transient anonymous function in a procedural language.
The code block is treated as though it were the body of a function with no parameters, returning void. It is parsed and executed a single time.
You could generate your code block with a shell script and get the sort of effect you are looking for.
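As a side note, if you only need to inspect a value rather than return an actual result set, a DO block can print it with RAISE NOTICE. A sketch using the table from the question (the name column is hypothetical):
DO $$
DECLARE
    v_hold integer := 1234567890;
    v_name text;
BEGIN
    SELECT name INTO v_name FROM t_table_A WHERE ID = v_hold;
    RAISE NOTICE 'name: %', v_name;
END$$;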

String passed into cursor.callproc becomes unknown (psycopg2, python 2.7, postgres 9.3)

For some reason, passing a string from Python into a Postgres function and calling it using a psycopg2 cursor causes the function to be unrecognized because the argument is unknown.
The function is very simple like so:
CREATE OR REPLACE FUNCTION text_test (some_text text)
RETURNS void AS $$
BEGIN
END;
$$ LANGUAGE plpgsql;
On the python side, I have:
cursor.callproc("text_test", ("test",))
And I get the following error:
psycopg2.ProgrammingError: function text_test(unknown) does not exist
LINE 1: SELECT * FROM text_test('test')
                                ^
Hint: No function matches the given name and argument types. You might need to add explicit type casts.
Why does this only happen with strings and what do I need to do to have a function successfully accept a string? For some reason numeric data types are unaffected by this problem.
This happens because there is no way to cast the string to the "correct" text type. Is it a char(N)? A varchar(N)? A text?
Unfortunately .callproc() doesn't provide an easy way to specify the argument types, but you can always use .execute() and cast the arguments explicitly, and everything works:
curs.execute("SELECT * FROM text_test(%s::text)", ("test",))
You could also make a list with the parameters you need to send:
param_list = ["test"]
curs.callproc(proc_name, param_list)
Here is a good answer about it:
python + psycopg2 = unknown types?

How to deal with complex branching in iReport or Jasper

I need to fill a field in a report based on the end result of a complex branching statement. How do I do this in iReport? The report needs to show a different string depending on what is in different fields in the database. Do I make a really complex SQL statement? Do I use variables?
So, for instance,
If x = 1
  If y = 1
    If z = 1
      Field should read A
If x = 1
  If y = 1
    If z = 2
      Field should read B
You could do something similar to the following:
( $F{staff_type} == null ? "" :
  ( $F{staff_type}.equalsIgnoreCase("Permanent") ? "1" :
    ( $F{staff_type}.equalsIgnoreCase("Non-permanent") ? "2" : "" )))
Basically, you need to use nested conditional expressions.
In the TextFieldExpression, write an expression like this:
(($F{PAYMENTMODE}.equals("CS")) ? "Cash" : ($F{PAYMENTMODE}.equals("CQ")) ? "Cheque" : "Bank")
I think the easiest way to do this would be to have the field(s) filled by a parameter passed from the backing bean. The JDBC connection is created in the bean and passed to the report, so it should be relatively easy to access the field or fields you need and run the data through a method that determines the branching outcome. Assign the outcome to a parameter and pass it to the report in the parameters map of JasperFillManager.fillReport(file, parameters, jdbcConnection), as sketched below.
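A minimal sketch of that approach (determineOutcome and the BRANCH_RESULT parameter name are hypothetical; JasperFillManager.fillReport is the standard JasperReports API):
import java.sql.Connection;
import java.util.HashMap;
import java.util.Map;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;

// inside a method that throws JRException; jdbcConnection is the bean's Connection
Map<String, Object> parameters = new HashMap<String, Object>();
parameters.put("BRANCH_RESULT", determineOutcome(x, y, z)); // hypothetical method holding the branching logic
JasperPrint print = JasperFillManager.fillReport("report.jasper", parameters, jdbcConnection);
In the report itself, the text field's expression would then simply be $P{BRANCH_RESULT}, declared as a String parameter.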
Usually, I handle all the programming logic before passing the data to the Jasper Reports engine, but in some cases post-processing or post-checking is required. In that scenario, if I had MANY cases (strings) to check, I would code a Jasper Reports scriptlet and handle the logic there (so that the code/report stays readable and maintainable, and for code reuse). If it is just 2 or 3 strings to check, I would use the ternary operator.
If you want to use a report scriptlet, create a scriptlet class (or use an existing one), code a method to handle this logic (e.g. a checkString method), and put $P{REPORT_SCRIPTLET}.checkString(someString) in the TextField's expression.
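A sketch of such a scriptlet (the class name, checkString method, and return values are placeholders; JRDefaultScriptlet is the standard base class):
import net.sf.jasperreports.engine.JRDefaultScriptlet;

public class BranchingScriptlet extends JRDefaultScriptlet {
    // checkString holds the branching logic; the cases below are placeholders
    public String checkString(String someString) {
        if ("1".equals(someString)) return "A";
        if ("2".equals(someString)) return "B";
        return "";
    }
}
Set this class as the report's scriptletClass so that $P{REPORT_SCRIPTLET} resolves to an instance of it.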