How to save Data factory stored procedure output - azure-data-factory

Whenever I execute a stored procedure in the ADFv2, it gives me an output as
{
"effectiveIntegrationRuntime": "DefaultIntegrationRuntime (Australia Southeast)",
"executionDuration": 34
}
even though I have set two output variables in the procedure. Is there any way to map the output of a stored procedure in ADFv2? So far I have been able to map the output of all the other activities, but not of stored procedures.

You could use a Lookup activity to get the result.
See this related post: https://social.msdn.microsoft.com/Forums/azure/en-US/82e84ec4-fc40-4bd3-b6d5-b742f3cd1a33/adf-v2-how-to-check-if-stored-procedure-output-is-empty?forum=AzureDataFactory
Update by Gagan:
Instead of reading the output of the stored procedure directly (which is not possible in ADFv2 right now), I stored the output in a table and then applied a Lookup plus ForEach over that table to get the values.
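A minimal sketch of that workaround, with illustrative table, procedure and column names:
-- Hypothetical staging table for the procedure's output values
CREATE TABLE dbo.SpOutputStage (RunDate datetime, RowsProcessed int);

CREATE PROCEDURE dbo.MyProc
AS
BEGIN
    DECLARE @RowsProcessed int;
    -- ... the procedure's real work would set @RowsProcessed ...
    SET @RowsProcessed = 42;
    -- Persist the values so a later Lookup activity can read them
    INSERT INTO dbo.SpOutputStage (RunDate, RowsProcessed)
    VALUES (GETDATE(), @RowsProcessed);
END
A Lookup activity pointed at dbo.SpOutputStage then exposes the stored values to later activities.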

The Stored Procedure activity in Data Factory (v2) does not capture the result set, so you cannot use it to get a result set and reference it in later activities.
The workaround is to use a Lookup activity to call the exact same stored procedure, as the Lookup will capture the result set the procedure returns. Replace your Stored Procedure activity with a Lookup and it will work.
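For the Lookup to capture anything, the stored procedure must return its values as a result set (a SELECT) rather than only as output parameters. A minimal sketch, with illustrative names:
CREATE PROCEDURE dbo.GetCounts
AS
BEGIN
    -- The Lookup activity captures this result set
    SELECT 42 AS SourceCount, 40 AS TargetCount;
END
Later activities can then reference the values with an expression such as @activity('Lookup1').output.firstRow.SourceCount (assuming the Lookup is named Lookup1).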

Related

ADF add ActivityRunID to Sink table

I have a sink table that I would like to populate with the ActivityRunID of the Copy Data iteration within the Until loop.
I understand that I cannot map ActivityRunID within the Copy Data task until that task is completed.
Is there an easy way to populate my sink with the RunID once the Copy Data task has finished? I was thinking of populating the sink with a dummy GUID and then using a Lookup task to update it in a subsequent task.
You could use a Lookup activity just to pull your dummy GUID.
Alternatively, you can generate a GUID using a Data Factory dynamic expression, store it in a variable with a Set Variable activity, and use that variable directly in later activities.
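For example, a Set Variable activity whose value is the dynamic expression below stores a fresh GUID in the variable:
@guid()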
If you want to use the Copy Data activity's RunID, create a stored procedure that updates the sink from an input parameter, and pass the activity RunID as that parameter from the Stored Procedure activity.
Parameter value: @activity('Copy data1').ActivityRunId
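A minimal sketch of such a procedure, with illustrative table and column names:
CREATE PROCEDURE dbo.UpdateSinkRunId (@RunId varchar(100))
AS
BEGIN
    -- Overwrite the dummy GUID written during the Copy Data iteration
    UPDATE dbo.SinkTable
    SET ActivityRunID = @RunId
    WHERE ActivityRunID = '00000000-0000-0000-0000-000000000000';
END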

Azure Data Factory - Insert Sql Row for Each File Found

I need a data factory that will:
check an Azure blob container for csv files
for each csv file
insert a row into an Azure SQL table, giving the filename as a column value
There's just a single csv file in the blob container and this file contains five rows.
So far I have a pipeline with a for-each action. Within the for-each action I have a copy action, whose source is a dynamic dataset with the filename set as a parameter from @item().name. However, 5 rows were inserted into the target table, whereas I was expecting just one.
The for-each loop executes just once, but I don't know how to use a data source that is a variable (or variables) holding the filename and timestamp.
You are headed in the right direction, but within the For Each you just need a Stored Procedure activity that will insert the FileName (and whatever other metadata you have available) into the Azure DB table.
Here is an example of the stored procedure in the DB:
CREATE PROCEDURE Log.PopulateFileLog (@FileName varchar(100))
AS
BEGIN
    INSERT INTO Log.CvsRxFileLog
    SELECT
        @FileName AS FileName,
        GETDATE() AS ETL_Timestamp
END
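In the Stored Procedure activity inside the For Each, pass the current file name to the @FileName parameter using the expression @item().name.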
EDIT:
You could also execute the insert directly with a Lookup activity within the For Each.
EDIT 2:
Here is how to do it without a For Each.
NOTE: This is the most cost-effective method, especially when dealing with hundreds or thousands of files on a recurring basis!
1st, copy the output JSON array from your Lookup/Get Metadata activity using a Copy Data activity with a source of Azure SQL DB and a sink of a Blob Storage CSV file.
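Since the original screenshots are gone, one plausible source configuration for this first Copy Data activity is a SQL query that embeds the lookup output via string interpolation (the activity name Lookup1 is illustrative):
SELECT '@{string(activity('Lookup1').output.value)}' AS JsonData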
2nd, create another Copy Data activity with a source of the Blob Storage JSON file and a sink of Azure SQL DB.
In essence, you save the entire JSON output to a file in Blob Storage, then copy that file into Azure SQL DB using a JSON file type. This way you only run 3 activities, even if you are inserting from a dataset that has 500 items in it.
Of course there is always more than one way to do things, but I don't think you need a For Each activity for this task. Activities like Lookup, Get Metadata and Filter output their results as JSON, which can be passed around. This JSON can contain one or many items and can be passed to a stored procedure. An example pattern is described below.
This is the sort of ELT pattern that was common with early ADF gen 2 (prior to Mapping Data Flows) and which makes use of resources already present in your architecture. Remember that you are charged per activity execution in ADF (e.g. multiple iterations in an unnecessary For Each loop), and that in Azure compute is generally expensive while storage is cheap, so bear this in mind when implementing patterns in ADF. In the pattern above you have two types of compute: the compute behind your Azure SQL DB and the Azure Integration Runtime. If you add a Data Flow, you introduce a third type of compute operating concurrently with the other two, so personally I only add one under certain conditions.
Note the expression I am passing into my example logging proc:
@string(activity('Filter1').output.Value)
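The logging proc itself can be as simple as the sketch below, assuming the dbo.myLog table queried further down (the procedure name is illustrative):
CREATE TABLE dbo.myLog (
    logId int IDENTITY(1,1) PRIMARY KEY,
    logRecord nvarchar(max)
);

CREATE PROCEDURE dbo.LogRecord (@logRecord nvarchar(max))
AS
BEGIN
    INSERT INTO dbo.myLog ( logRecord ) VALUES ( @logRecord );
END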
Data Flows are perfectly fine if you want a low-code approach and do not have compute resources already available to do this processing. In your case you already have an Azure SQL DB, which is quite capable of JSON processing, e.g. via the OPENJSON, JSON_VALUE and JSON_QUERY functions.
You mention not wanting to deploy additional code, which I understand, but then where did your original SQL table come from? If you are absolutely against deploying additional code, you could simply call the sp_executesql stored proc via the Stored Procedure activity with a dynamic SQL statement that inserts your record, something like this:
@concat( 'INSERT INTO dbo.myLog ( logRecord ) SELECT ''', string(activity('Filter1').output), ''' ')
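When that expression is supplied as the statement parameter of sp_executesql, the call that actually runs looks something like this (the JSON value is illustrative):
EXEC sp_executesql N'INSERT INTO dbo.myLog ( logRecord ) SELECT ''[{"name":"file1.csv","type":"File"}]'' '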
Shred the JSON either in your stored proc or later, e.g.:
SELECT y.[key] AS name, y.[value] AS [fileName]
FROM dbo.myLog
CROSS APPLY OPENJSON( logRecord ) x
CROSS APPLY OPENJSON( x.[value] ) y
WHERE logId = 16
AND y.[key] = 'name';

Calling stored procedure which does not return a list using Entity Framework

I understand that if my stored procedure returns a data set I can do this:
_context.Database.SqlQuery<Product>(query, parameters).ToList<Product>()
But if I have a stored procedure which does not return anything, how can I call it from Entity Framework? What goes in place of the "?" below?
_context.Database.SqlQuery<?>(query, parameters).ToList<?>()
Have you mapped the stored procedure to a function in the Model Browser yet?
If you go to your edmx file and right-click, you can choose Add -> Function Import and then specify the details of the stored procedure you want to import.
This will map it to a function; you can then effectively call your stored procedure using
_context.NameYouGaveFunction(parameters)

Retrieving output values from stored procedure - GetColumn

I need to invoke a stored procedure that has no OUT parameters; instead it writes to the buffer manager by calling putcolumn. How can I retrieve the data set written via putcolumn from a Java application?
My stored procedure is an Oracle one. Can someone provide inputs to sort out my issue?

How to set DTS variable value as stored procedure name in Execute Sql Task

I have an Execute SQL Task control which is intended to execute a stored procedure. How can I pass a DTS variable value as the stored procedure name in the Execute SQL Task?
Change SQLSourceType from "Direct input" to "Variable" and then select the variable you want from the SourceVariable dropdown.
I believe you can write "exec ?" in the SQLStatement field and map the variable in the Parameter Mapping section. This assumes you are not passing extra parameters to the stored procedure, though.
Also, this method works for OLE DB connections; I believe it is different for ADO.NET connections.
To pass parameters over an ADO.NET connection, you use @VariableName instead of the "?".
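For example, the statement text for each connection type (the procedure name is illustrative):
EXEC dbo.usp_MyProc ?            -- OLE DB: positional parameter marker
EXEC dbo.usp_MyProc @FileDate    -- ADO.NET: named parameter mapped to a variable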
http://msdn.microsoft.com/en-us/library/cc280502.aspx
In the link above, scroll down to the section "Passing parameters to a stored procedure"; it describes the process in detail.