Azure Data Factory Data Flow using Cached Lookup into select task

I'm trying to use Data Flow cached lookups to implement a dynamic column mapping from a SQL source to a SQL destination. The idea is to persist a map in a metadata table, read it as a source into the Data Flow and store it as a cached lookup. It looks like this:
(The Key Columns option is set to sourceField)
Now we just need to use this map in a rule-based mapping inside a Select transformation, in order to keep only the mapped columns and apply the target naming. The expression is the following:
This configuration results in a runtime error on the Select transformation:
Do you have any idea why? The error message is not helpful.
Edit: below is the full script definition.
parameters{
    sourceSchema as string ("dbo"),
    sourceTable as string ("RiepilogoSocieta")
}
source(allowSchemaDrift: true,
    validateSchema: false,
    inferDriftedColumnTypes: true,
    isolationLevel: 'READ_UNCOMMITTED',
    format: 'table') ~> DatabookSource
source(output(
        {_id} as integer,
        sourceSchema as string,
        sourceTable as string,
        sourceField as string,
        targetField as string,
        targetType as string,
        targetSchema as string,
        targetTable as string
    ),
    allowSchemaDrift: true,
    validateSchema: false,
    isolationLevel: 'READ_UNCOMMITTED',
    format: 'table') ~> Objmetadata
DatabookSource select(mapColumn(
        each(match(!isNull(CacheFieldsMap#lookup(name).targetField)),
            CacheFieldsMap#lookup($$).targetField = $$)
    ),
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> MapColumns
Objmetadata filter(sourceSchema == $sourceSchema && sourceTable == $sourceTable) ~> FilterForSourceTable
FilterForSourceTable select(mapColumn(
        sourceField,
        targetField
    ),
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> SelectFieldsMap
MapColumns derive({_createdAt} = currentTimestamp(),
    {_updatedAt} = currentTimestamp()) ~> AddMetaColums
AddMetaColums sink(allowSchemaDrift: true,
    validateSchema: false,
    deletable: false,
    insertable: true,
    updateable: false,
    upsertable: false,
    truncate: true,
    format: 'table',
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true,
    errorHandlingOption: 'stopOnFirstError') ~> AnalyticsSink
SelectFieldsMap sink(skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true,
    keys: ['sourceField'],
    store: 'cache',
    format: 'inline',
    output: false,
    saveOrder: 1) ~> CacheFieldsMap

Post the DSL script (click on the Script button). I can't make out what patterns $$ is mapping to. Make sure that it is datatype compatible and that you are not passing integer types for string parameters to lookup().
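As a hedged illustration of that advice (not a confirmed fix): assuming sourceField really is the string key column of the cache sink, as declared in the script above, an explicit toString() cast on whatever is handed to lookup() rules out a key type mismatch in the rule-based mapping:

each(match(!isNull(CacheFieldsMap#lookup(toString(name)).targetField)),
    CacheFieldsMap#lookup(toString($$)).targetField = $$)

If the error persists with the cast in place, the mismatch is likely elsewhere, for example in the type declared for the cache sink's key column.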

What is the correct dataURL including filters for sapui5 spreadsheet control

I am using the export-to-spreadsheet control in SAPUI5 ("sap/ui/export/Spreadsheet", described under https://sapui5.hana.ondemand.com/#/api/sap.ui.export.Spreadsheet).
When calling the OData service without filters it works fine. When adding filters I get an 'uncaught in promise Unexpected Server Response' error in the frontend. What is the correct URL, including filters, that I have to specify under dataUrl?
Thank you for the help!
Christophe
So far I am trying with:
dataUrl: "/sap/opu/odata/sap//?$filter=Export eq 'X'"
oSettings = {
    workbook: {
        columns: aCols,
        hierarchyLevel: "Level",
    },
    dataSource: {
        type: "odata",
        dataUrl: "/sap/opu/odata/sap/<myService>/<myEntitySet>?$filter=Export eq 'X'",
        serviceUrl: oModelInterface.sServiceUrl,
        headers: oModelInterface.getHeaders ? oModelInterface.getHeaders() : null,
        count: oBinding.getLength ? oBinding.getLength() : null,
        useBatch: true,
        sizeLimit: oModelInterface.iSizeLimit
    },
    worker: false,
};
var oSpreadsheet = new Spreadsheet(oSettings);
oSpreadsheet.build();
There is an API for the download URL.
Just use oBinding.getDownloadUrl()
Just to be safe:
dataUrl: oRowBinding.getDownloadUrl ? oRowBinding.getDownloadUrl() : null
Reference: https://openui5.hana.ondemand.com/#/api/sap.ui.model.odata.v2.ODataListBinding
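Applied to the settings object from the question, that looks like the following sketch (oTable and its "rows" binding are assumptions here; any sap.ui.model.odata.v2.ODataListBinding will do):

// The list binding already carries the active filters and sorters, so its
// download URL is exactly the dataUrl the export needs; no hand-built
// $filter string required. "rows" fits sap.ui.table.Table; use "items"
// for sap.m.Table.
var oRowBinding = oTable.getBinding("rows");
oSettings.dataSource.dataUrl = oRowBinding.getDownloadUrl
    ? oRowBinding.getDownloadUrl()
    : null;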

Input definition optional keys

I generated an action with sails generate action task/update-task. I am now trying to create an input parameter that should be an object with optional keys:
inputs: {
    fields: {
        type: {
            body: 'string?',
            rruleSetStr: 'string?',
        },
        required: true,
        description: 'All keys are not required, but at least one is'
    }
}
However, I keep getting this error:
The action `task/update-task` could not be registered. It looks like a machine definition (actions2), but it could not be used to build an action.
Details: ImplementationError: Sorry, could not interpret "task/update-task.js" because its underlying implementation has a problem:
------------------------------------------------------
• Invalid input definition ("fields"). Unrecognized `type`. (Must be 'string', 'number', 'boolean', 'json' or 'ref'. Or set it to a type schema like `[{id:'number', name: {givenName: 'Lisa'}}]`.)
------------------------------------------------------
If you are the maintainer of "task/update-task.js", then you can change its implementation to solve the problem above. Otherwise, please file a bug report with the maintainer, or fork your own copy and fix that.
[?] See https://sailsjs.com/support for help.
at machineAsAction (C:\Users\Mercurius\Documents\GitHub\Homie-Web\node_modules\machine-as-action\lib\machine-as-action.js:271:28)
at helpRegisterAction (C:\Users\Mercurius\Documents\GitHub\Homie-Web\node_modules\sails\lib\app\private\controller\help-register-action.js:63:27)
at C:\Users\Mercurius\Documents\GitHub\Homie-Web\node_modules\sails\lib\app\private\controller\load-action-modules.js:146:13
Does anyone know where the documentation is on how to make optional keys in this? I tried here - http://node-machine.org/spec/machine#inputs - but no luck.
The type must be 'string', 'number', 'boolean', 'json' or 'ref', as the error says.
So you need to set the type to 'ref' (object or array), and you can use a custom function for validation.
inputs: {
    fields: {
        type: 'ref',
        custom: function (data) {
            // some logic
            // example
            if (typeof data.body !== "string") {
                return false;
                // or you can throw new Error('Body is not a string')
            }
            return true;
        },
        required: true,
        description: 'All keys are not required, but at least one is'
    }
}
Now the input is of type object, and in the custom function returning false or throwing Error('Some problem') fails validation.
If you use a schema type, just remove the ? from your example:
inputs: {
    fields: {
        type: {
            body: 'string',
            rruleSetStr: 'string'
        },
        required: true,
        description: 'All keys are not required, but at least one is'
    }
}
This uses runtime (recursive) type-checking for JavaScript; please check its documentation for how to write rules.
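A hedged sketch combining both suggestions: 'ref' plus a custom validator that actually enforces the "at least one key" rule from the description (the key names body and rruleSetStr are taken from the question):

inputs: {
    fields: {
        type: 'ref',
        custom: function (data) {
            // Must be a plain object.
            if (typeof data !== 'object' || data === null) { return false; }
            // Each key is optional, but must be a string when present.
            if (data.body !== undefined && typeof data.body !== 'string') { return false; }
            if (data.rruleSetStr !== undefined && typeof data.rruleSetStr !== 'string') { return false; }
            // ...and at least one of the two keys must be present.
            return data.body !== undefined || data.rruleSetStr !== undefined;
        },
        required: true,
        description: 'All keys are not required, but at least one is'
    }
}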

Node Opcua / QtOpcUa - Method Calls

I have a Node OPC UA server which I connect to with a Qt application using the QtOpcUa client library.
On my server I define a method that's basically a crude historic-access request, as HDA support is not yet available. It takes a start_date and an end_date, then queries a database for the relevant values, which it returns in an array.
It looks a bit like this:
const deviceTrends = namespace.addObject({
    organizedBy: deviceObject,
    browseName: strings.TREND_NODE
})
const method = namespace.addMethod(deviceTrends, {
    nodeId: strings.NSI + part.name + "-Trend",
    browseName: part.name + "-Trend",
    inputArguments: [
        {
            name: "start_date",
            description: { text: "Trend Start Date" },
            dataType: opcua.DataType.DateTime
        }, {
            name: "end_date",
            description: { text: "Trend End Date" },
            dataType: opcua.DataType.DateTime
        }
    ],
    outputArguments: [{
        name: "Trend",
        description: { text: "Trend Data from start_date to end_date" },
        dataType: opcua.DataType.String,
        valueRank: 1
    }]
});
method.bindMethod(function (inputArguments, context, callback) {
    console.log("called");
    const start = inputArguments[0].value;
    const end = inputArguments[1].value;
    console.log("Start: ", start);
    console.log("End: ", end);
    let sql = `SELECT Date date,
                      Name name,
                      Value value
               FROM Trends
               WHERE DateTime >= ? AND DateTime <= ?`;
    var result = [];
    // db.each() is asynchronous: rows arrive in the row callback, so the
    // method result must be sent from the completion callback; otherwise
    // the response goes out while result is still empty.
    db.each(sql, [start, end], (err, row) => {
        result.push(`${row.date}: ${row.name} - ${row.value}`);
    }, () => {
        console.log(result);
        const callMethodResult = {
            statusCode: opcua.StatusCodes.Good,
            outputArguments: [{
                dataType: opcua.DataType.String,
                arrayType: opcua.VariantArrayType.Array,
                value: result
            }]
        };
        callback(null, callMethodResult);
    });
});
I can see this in a client such as Prosys and call the method, which works okay:
However, I can't seem to call this method from Qt. I've cut out the packaging of arguments and the result handler (it just lists the received params):
QOpcUaNode* n = devices[deviceName].client->node("ns=1;s=Speed-Trend");
connect(n, &QOpcUaNode::methodCallFinished,
        [this, deviceName](QString methodNodeId, QVariant result, QOpcUa::UaStatusCode status)
{
    qDebug() << " Response received ";
    this->handleNodeTrendResponse(deviceName, methodNodeId, result, status);
});
n->callMethod(n->nodeId(), args);
Trace:
Requesting Trend: From QDateTime(2018-10-07 13:13:56.766 BST Qt::TimeSpec(LocalTime)) TO QDateTime(2018-10-07 13:14:05.390 BST Qt::TimeSpec(LocalTime))
qt.opcua.plugins.open62541: Could not call method: BadNodeIdInvalid
Response received [Output from method result handler]
Device Name: "speed-device"
Method Node Id: "ns=1;s=Speed-Trend"
Result: QVariant(Invalid)
Result to List: << ()
Status: QOpcUa::UaStatusCode(BadNodeIdInvalid)
I also can't seem to find the method from other clients; this is from an OPC UA client application on my phone, which shows nothing under the Trends object:
Everything else seems accessible: I can request variables and set up monitoring just fine.
Is there something I'm just missing here, or is it an issue with QtOpcUa and other clients?
I can work around this by creating variables to capture the input and output arguments plus a boolean to represent a method call, but it's a lot neater to tie everything up in a single method.
Thanks
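One thing worth checking (a hedged sketch, not a confirmed fix): in OPC UA a method call names both an object and a method, and QOpcUaNode::callMethod() takes the method's node id as its argument while the node it is invoked on supplies the object id. Calling the method node on itself makes the object id equal to the method id, which a server can reject with BadNodeIdInvalid. The parent object's id below ("ns=1;s=TrendNode") is a placeholder assumption:

// Request the *object* node that owns the method (this id is hypothetical;
// use whatever node id the Trends object actually has), then pass the
// method's own node id to callMethod().
QOpcUaNode *obj = devices[deviceName].client->node("ns=1;s=TrendNode");
connect(obj, &QOpcUaNode::methodCallFinished,
        [this, deviceName](QString methodNodeId, QVariant result, QOpcUa::UaStatusCode status)
{
    this->handleNodeTrendResponse(deviceName, methodNodeId, result, status);
});
obj->callMethod("ns=1;s=Speed-Trend", args);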

Postgres: create a new jsonb object with matched values

So I have this table questions that has a settings jsonb column:
{
    id: 'question-id-1',
    settings: {
        foo1: true,
        foo2: true,
        bar: false
    }
},
{
    id: 'question-id-2',
    settings: {
        bar: true
    }
}
Now I want to write a Postgres update migration script that results in:
{
    id: 'question-id-1',
    settings: {
        foo1: true,
        foo2: true,
        bar: false,
        opts: ['foo1', 'foo2']
    }
},
{
    id: 'question-id-2',
    settings: {
        bar: true,
        opts: ['bar']
    }
}
So only the keys whose values are true are added to a new opts array inside settings.
This is for Postgres 9.5.
Thank you in advance.
Create a function to update the column:
create or replace function add_opts(jsonb)
returns jsonb language sql as $$
    select $1 || jsonb_build_object('opts', jsonb_agg(key))
    from jsonb_each_text($1)
    where value::bool;
$$;
Test:
with questions(settings) as (
    values (
        '{
            "foo1": true,
            "foo2": true,
            "bar": false
        }'::jsonb
    )
)
select add_opts(settings)
from questions;

                               add_opts
----------------------------------------------------------------------
 {"bar": false, "foo1": true, "foo2": true, "opts": ["foo1", "foo2"]}
(1 row)
Your update query should look like this:
update questions
set settings = add_opts(settings);
The variant which eliminates the opts array when there are no set options:
create or replace function add_opts(jsonb)
returns jsonb language sql as $$
    select case
        when jsonb_agg(key) is null then $1
        else $1 || jsonb_build_object('opts', jsonb_agg(key))
    end
    from jsonb_each_text($1)
    where value::bool;
$$;
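A quick sanity check of the variant (a sketch under the same ad-hoc test setup as above): a settings object with no true values should come back unchanged, because jsonb_agg(key) aggregates over zero rows and yields null, so the case branch returns the input untouched:

select add_opts('{"bar": false}'::jsonb) as add_opts;
-- expected output:
--     add_opts
-- ----------------
--  {"bar": false}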

doctrine:build --forms --model doesn't generate specific filter form

I have a schema.yml file with a variety of tables. For one single table, the doctrine:build command doesn't build a form filter class.
The schema.yml definition of the table is
InactiveReason:
  actAs:
    Sortable: ~
  columns:
    id: { type: integer(4), notnull: true, unique: true, primary: true, autoincrement: true }
    name: { type: string(100), notnull: true }
Both the model and form classes are generated; however, I can't get it to generate the form filter classes (both InactiveReasonFormFilter and BaseInactiveReasonFormFilter).
I have tried moving the definition around in schema.yml, which didn't help either. Am I using some sort of reserved keyword here?
You are not telling the build task to generate the filter classes:
doctrine:build --forms --model
Add the --filters option, or replace all the options with --all-classes:
doctrine:build --forms --model --filters
doctrine:build --all-classes