Azure Data Factory: replace special characters in dynamic content - azure-data-factory

I created a dynamic SQL script with line breaks to make it more readable, but these line breaks are not accepted by Azure Databricks, which throws a syntax error. Is there a way to remove the line breaks? Is it possible to make a replacement?
Dynamic content created in Azure Data Factory:
@concat('SELECT ',activity('Get Parameters').output.firstRow.SourceSelectFields,'
FROM ',activity('Get Parameters').output.firstRow.SourceTable,' ',activity('Get Parameters').output.firstRow.CrossingTables,'
WHERE ',activity('Get Parameters').output.firstRow.Conditions,
' AND doc.status in (''Complete'',''Draft'')',
' AND PRJT.taskname IN ',variables('ListadoTareas'),
' AND (COALESCE (CAST(DATE_FORMAT(cw.begindate,''yyyy-MM-dd'') AS DATE )))> ''',activity('Get Parameters').output.firstRow.BeginDate,'''',
' AND ''',utcnow('yyyy-MM-dd'),''' <=
CASE WHEN CAST(DATE_FORMAT(cw.expirationdate,''yyyy-MM-dd'') AS DATE) !=''1970-01-01'' THEN (CAST(ADD_MONTHS(DATE_FORMAT(cw.expirationdate,''yyyy-MM-dd''),4) AS DATE))
ELSE CAST(ADD_MONTHS(DATE_FORMAT(DATE_ADD(cw.effectivedate,CAST(cw.cus_plazodias_wjcc2 AS INT)),''yyyy-MM-dd''),4)AS DATE)
END',
' AND SUBSTRING(doc.title,1,3) IN ',variables('ListadoDocumentos'),
' AND dl.projectid is null')
Line breaks appear before the SELECT, FROM and WHERE clauses and before the AND conditions.

I tried to pass the below script, with line breaks, as dynamic content.
@concat('select ',activity('Lookup1').output.firstRow.SourceSelectFields, '
from
',
activity('Lookup1').output.firstRow.SourceTable)
When the pipeline is run, the input of the Script activity is as in the below image.
The pipeline runs successfully without error.
But if the pipeline does error out, try replacing the line breaks with a space. Below is the approach.
Variable V1 is taken, and the same script is given.
In variable V2, the below expression is given to replace each line break with a space.
@replace(variables('v1'),'
',' ')
Variable V1 Output
Variable V2 Output
This way, the line-break special character can be replaced.
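Outside ADF, the effect of that replace expression can be sketched in plain Python (a hypothetical illustration, not ADF code; the SQL string and names here are made up):

```python
# Hypothetical sketch of the ADF fix: a dynamic SQL string is built with
# line breaks for readability, then the breaks are swapped for spaces
# before the script runs, mirroring @replace(variables('v1'), '<newline>', ' ').
sql = (
    "SELECT col1, col2\n"
    "FROM my_table\n"
    "WHERE status IN ('Complete', 'Draft')"
)

# Collapse both Windows (\r\n) and Unix (\n) line breaks into single spaces.
flattened = sql.replace("\r\n", " ").replace("\n", " ")

print(flattened)
```

The result is a single-line statement that engines rejecting embedded newlines can accept.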

Related

PostgreSQL script control structures

I have an SQL script that copies data from a file:
--myscript.sql
\set file :dir 'data.csv'
copy data from :'file' csv;
and execute it with psql providing the dir variable
psql -v dir="D:\data\" -f myscript.sql
Now I would like the copy command executed only if some other variable is set, e.g. -v doit=
Are there any script control structures available for this? Looking for something like
$if :{?doit}
copy data from :'file' csv;
$endif;
I have tried to wrap it in an anonymous block:
do '
begin
if ':{?doit}' then
copy data from :'file' csv;
end if;
end';
But it gives the error
Error: syntax error (approximate position "TRUE")
LINE 3: if 'TRUE' then
^
Answering my own question.
There were two issues here:
psql variables cannot be directly substituted in do statements
This can be solved by putting a psql variable into a server setting as suggested here
the copy command does not accept any functions for the file path name, so the first solution won't work here.
I ended up formatting the do block like this:
select format('
begin
if ''%s''=''t'' then
copy rates from ''%s'' csv header delimiter E''\t'';
end if;
end;',:{?doit},:'file') as do \gset
do :'do';
The format function lets us use a multiline string. The resulting string is then assigned to a new variable with the help of \gset and fed to do.
Note: the format function renders the boolean TRUE as 't', so we have to compare against that.
Thanks to @Mark for the directions.

Escape '\' in dynamic content in Azure Data Factory

I am running a query against a DB2 database through a Lookup activity in ADF.
Query is:
But when I execute the query, every '\' gets doubled, as shown in the output screenshot below.
I need a solution to remove these extra '\' characters from my query.
By default, the backslash escape character \ gets doubled when used in Azure Data Factory.
Use the replace function in your expression to replace the double backslash with a single backslash.
Example:
@replace('\\abc\\','\\','\')
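To see what that substitution does to the string itself, here is a minimal Python sketch of the same replacement (the escaping rules differ between ADF expressions and Python, so a raw string is used to keep the backslash counts explicit):

```python
# A value whose backslashes were doubled: literally \\abc\\ where \abc\ was intended.
doubled = r"\\abc\\"

# Replace each pair of backslashes with a single one, mirroring the
# ADF expression @replace('\\abc\\', '\\', '\').
single = doubled.replace("\\\\", "\\")

print(single)  # \abc\
```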

Error Code: 1582. Incorrect parameter count in the call to native function 'SUBSTRING_INDEX'

When I tried to copy the text from before the first comma in the first column to the second column using the following command:
UPDATE Table_Name
SET second_column = SUBSTRING_INDEX(first_column, ‘,’, 1);
I got the error message:
Error Code: 1582. Incorrect parameter count in the call to native function 'SUBSTRING_INDEX'
What is going wrong?
Thanks in advance.
The error came from copying and pasting the command from a non-programming text editor (e.g. Word):
The quotes around the delimiter were changed from two normal single quotes (') and (') to left and right single quotes (‘) and (’).
Changing the quotes around the delimiter back to two normal single quotes solved the problem:
UPDATE Table_Name
SET second_column = SUBSTRING_INDEX(first_column, ',', 1);
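A defensive way to catch this class of bug is to normalize "smart" quotes before running pasted SQL. Here is a hypothetical Python sketch (the statement and mapping are illustrative, not part of MySQL):

```python
# A statement pasted from Word, with curly quotes around the delimiter.
pasted = "SET second_column = SUBSTRING_INDEX(first_column, \u2018,\u2019, 1);"

# Map left/right single and double smart quotes back to plain ASCII quotes.
SMART_QUOTES = {
    "\u2018": "'",  # left single quote
    "\u2019": "'",  # right single quote
    "\u201c": '"',  # left double quote
    "\u201d": '"',  # right double quote
}

cleaned = pasted
for smart, plain in SMART_QUOTES.items():
    cleaned = cleaned.replace(smart, plain)

print(cleaned)
```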

Use of column names in Redshift COPY command which is a reserved keyword

I have a table in Redshift where the column names are 'begin' and 'end'. They are Redshift keywords. I want to use them explicitly in the Redshift COPY command. Is there a workaround other than renaming the columns in the table? That would be my last option.
I tried to enclose them in single/double quotes, but it looks like the COPY command only accepts comma-separated column names.
The COPY command fails if you don't escape keywords used as column names, e.g. begin or end.
copy test1(col1,begin,end,col2) from 's3://example/file/data1.csv' credentials 'aws_access_key_id=XXXXXXXXXXXXXXX;aws_secret_access_key=XXXXXXXXXXX' delimiter ',';
ERROR: syntax error at or near "end"
But it works fine if begin and end are enclosed in double quotes (") as below.
copy test1(col1,"begin","end",col2) from 's3://example/file/data1.csv' credentials 'aws_access_key_id=XXXXXXXXXXXXXXX;aws_secret_access_key=XXXXXXXXXXX' delimiter ',';
I hope it helps.
If there is some different error please update your question.
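When the column list is built programmatically, the quoting can be automated. The sketch below is hypothetical Python (the quote_column helper and the RESERVED set, a small sample of Redshift's reserved words, are assumptions for illustration):

```python
# Hypothetical helper: double-quote column names that collide with
# (a small sample of) Redshift reserved words when building a COPY column list.
RESERVED = {"begin", "end", "select", "from", "where", "table"}

def quote_column(name: str) -> str:
    """Wrap reserved identifiers in double quotes, as COPY requires."""
    return f'"{name}"' if name.lower() in RESERVED else name

columns = ["col1", "begin", "end", "col2"]
column_list = ",".join(quote_column(c) for c in columns)

print(f"copy test1({column_list}) from 's3://example/file/data1.csv' ...")
```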

Calling stored proc with exec statement with CRLF in parameter

I want to call a stored proc with a parameter that contains a string that includes a CRLF.
I am actually calling the SP from an MS Access pass-through query. I build the exec statement and the query passes it through to SQL Server.
Any ideas?
Malcolm
EDIT: The parameter is to be the body of an email used in the SP.
After "Lester employee:" is a newline.
wit_data_driven_subscription_2parameters 'Invoice', _
'malcolm-smith#bigpond.com', _
'payroll#lester.com.au', _
"Invoice Reprint As Requested", _
"Attached is the invoice reprint you requested for Lester employee:
Jeff Rogers", 'CapturedInvoice', '120744'
T-SQL accepts line-break characters within string literals. In other words, this is valid:
EXEC storedProcedure 'My parameter
with a line break'
It depends on how you generate this string in MS Access. From VBA you can build the query like this:
Dim query As String
query = "EXEC storedProcedure 'My parameter" & vbCrLf & "with a line break'"
I don't remember a great deal about Access, but if you can pass in values using parameters rather than string literals, then you avoid this problem, as well as plenty of other problems such as SQL injection.
In your exec yourSP statement, have something like:
exec yourStoredProcedure @yourParam = replace(@yourString, char(13) + char(10), '')
I don't know what the CRLF looks like in your string (\r\n maybe?), but you can substitute the actual CRLF character/string in the replace call.
I think this might be an email issue and not actually a SQL problem. I had this problem with HTML emails in the past. Replace your CRLF with an HTML break tag <BR>.
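Either substitution is a one-line string operation. Here is a hedged Python sketch of both options (the email body is a placeholder based on the question's example):

```python
# Placeholder parameter value containing a Windows line break (CRLF).
body = ("Attached is the invoice reprint you requested for Lester employee:"
        "\r\nJeff Rogers")

# Option 1: replace the CRLF with a space, for a single-line parameter.
flat_body = body.replace("\r\n", " ")

# Option 2: replace the CRLF with an HTML break tag, for an HTML email body.
html_body = body.replace("\r\n", "<BR>")

print(flat_body)
print(html_body)
```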