How can I pass context params using the Talend API?

I'm trying to automate Talend job executions using the Talend API, but I'm getting an error when I try to pass the context params through the API.
The JSON I'm base64-encoding is the following:
JSON='{ "actionName":"runTask", "authPass": "TalendPass", "authUser": "name#example.com", "jvmParams": [ "-Xmx256m" , "-Xms64m" ], "contextParams": ["host_mysql_db01": "failed", "database_analytics": "testing.it"],"mode": "synchronous", "taskId": 43}'
Error message:
{"error":"Expected a ',' or ']' at character 172","returnCode":2}
I found another Stack Overflow question, Add context parameters to Talend job in Tac via API without actually running it, but the author doesn't say how they passed the parameters, and I cannot reply with a comment to ask how it was done.
The actual Talend API call is:
wget -O file http://localhost:8080/org.talend.administrator/metaServlet?$JSON_ENCODED
Can I get some help?

Actually, the JSON you are passing to the metaServlet is not valid JSON. You can check it with an online validator like http://jsonlint.com.
You are specifying the contextParams attribute as an array, but that syntax is not valid in JSON: an array can contain either a list of values (like jvmParams) or objects (which can themselves contain arrays), not bare key/value pairs. Here's an example.
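For instance, this is valid JSON (the keys and values are purely illustrative): an array of plain values next to an array of objects:
{
  "values": ["a", "b", "c"],
  "objects": [
    { "name": "first", "tags": ["x", "y"] },
    { "name": "second", "tags": [] }
  ]
}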
Moreover, according to the Talend reference, the attribute should be called "context" and must be an object instead of an array, like so:
"context":{"varname1": "varvalue", "varname2": "varvalue2"}

Related

Parameterize Parameter in API Body in dataflow

Below is the dynamic body in the Web activity, where I am leveraging a pipeline parameter:
With the above set of configuration and values, the web activity is successfully getting executed.
Now, for the same REST API, I am trying to use it as a source in a dataflow:
with the same configuration, and it is failing with the error below:
Dataflow param:
Error:
Failure to read most recent page request: DF-REST_001 - Error response from server: Some({"error":{"code":"DatasetExecuteQueriesError","pbi.error":{"code":"DatasetExecuteQueriesError","parameters":{},"details":[{"code":"DetailsMessage","detail":{"type":1,"value":"The JSON DDL request failed with the following error: Invalid JavaScript property identifier character: }. Path '', line 1, position 7.."}},{"code":"AnalysisServicesErrorCode","detail":{"type":1,"value":"3239182519"}}]}}}), Status code: 400. Please check your request url and body.
Can someone please help me understand what I am doing wrong?
To use a dataflow parameter within the expression builder, we have to concatenate the parameter with our data. Look at the following demonstration.
I am using a derived column to show how to use parameters within the expression builder. Using the same dynamic content as yours, I get the following output:
'{"queries":[{"query":"{$query}"}],"serializerSettings":{"includeNulls":true}}'
String interpolation does not work in the dataflow expression builder, so the {$query} token above is passed through literally (which is why the service rejects the body as invalid JSON). To get the desired result, concatenate the parameter using + instead:
'{"queries":[{"query":"'+$query+'"}],"serializerSettings":{"includeNulls":true}}'
Using the following content in expression builder also works (using concat() function).
concat('{"queries":[{"query":"',$query,'"}],"serializerSettings":{"includeNulls":true}}')

What is the appropriate way to build JSON within a Data Factory Pipeline

In my earlier post, SQL Server complains about invalid json, I was advised to use an 'appropriate methods' for building a json string, which is to be inserted into a SQL Server table for logging purposes. In the earlier post, I was using string concatenation to build a json string.
What is the appropriate tools/functions to build json within a Data Factory pipeline? I've looked into the json() and string() functions, but they would still rely on concatenation.
Clarification: I'm trying to generate a logging message that looks like the example below. Right now I'm using string concatenation to generate the logging JSON. Is there a better, more elegant (but lightweight) way to generate the JSON data?
{ "EventType": "DataFactoryPipelineRunActivity",
"DataFactoryName":"fa603ea7-f1bd-48c0-a690-73b92d12176c",
"DataFactoryPipelineName":"Import Blob Storage Account Key CSV file into generic SQL table using Data Flow Activity Logging to Target SQL Server",
"DataFactoryPipelineActivityName":"Copy Generic CSV Source to Generic SQL Sink",
"DataFactoryPipelineActivityOutput":"{runStatus:{computeAcquisitionDuration:316446,dsl: source() ~> ReadFromCSVInBlobStorage ReadFromCSVInBlobStorage derive() ~> EnrichWithDataFactoryMetadata EnrichWithDataFactoryMetadata sink() ~> WriteToTargetSqlTable,profile:{ReadFromCSVInBlobStorage:{computed:[],lineage:{},dropped:0,drifted:1,newer:1,total:1,updated:0},EnrichWithDataFactoryMetadata:{computed:[],lineage:{},dropped:0,drifted:1,newer:6,total:7,updated:0},WriteToTargetSqlTable:{computed:[],lineage:{__DataFactoryPipelineName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryPipelineName]}]},__DataFactoryPipelineRunId:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryPipelineRunId]}]},id:{mapped:true,from:[{source:ReadFromCSVInBlobStorage,columns:[id]}]},__InsertDateTimeUTC:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__InsertDateTimeUTC]}]},__DataFactoryName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryName]}]},__FileName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__FileName]}]},__StorageAccountName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__StorageAccountName]}]}},dropped:0,drifted:1,newer:0,total:7,updated:7}},metrics:{WriteToTargetSqlTable:{rowsWritten:4,sinkProcessingTime:1436,sources:{ReadFromCSVInBlobStorage:{rowsRead:4}},stages:[{stage:3,partitionTimes:[621],bytesWritten:0,bytesRead:24,streams:{WriteToTargetSqlTable:{type:sink,count:4,partitionCounts:[4],cached:false},EnrichWithDataFactoryMetadata:{type:derive,count:4,partitionCounts:[4],cached:false},ReadFromCSVInBlobStorage:{type:source,count:4,partitionCounts:[4],cached:false}},target:WriteToTargetSqlTable,time:811}]}}},effectiveIntegrationRuntime:DefaultIntegrationRuntime (East US)}",
"DataFactoryPipelineRunID":"63759585-4acb-48af-8536-ae953efdbbb0",
"DataFactoryPipelineTriggerName":"Manual",
"DataFactoryPipelineTriggerType":"Manual",
"DataFactoryPipelineTriggerTime":"2019-11-05T15:27:44.1568581Z",
"Parameters":{
"StorageAccountName":"fa603ea7",
"FileName":"0030_SourceData1.csv",
"TargetSQLServerName":"5a128a64-659d-4481-9440-4f377e30358c.database.windows.net",
"TargetSQLDatabaseName":"TargetDatabase",
"TargetSQLUsername":"demoadmin"
},
"InterimValues":{
"SchemaName":"utils",
"TableName":"vw_0030_SourceData1.csv-2019-11-05T15:27:57.643"
}
}
You can use Data Flow; it helps you build the JSON string within a pipeline in Data Factory.
Here's the Data Flow tutorial: Mapping data flow JSON handling.
It can help you with:
Creating JSON structures in Derived Column
Source format options
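As a minimal sketch of the Derived Column approach (all column and parameter names below are illustrative, not taken from the question): inside a Derived Column transformation you can declare a nested structure with the @() constructor and write it out through a JSON sink, instead of concatenating strings:

EventLog = @(
    EventType = 'DataFactoryPipelineRunActivity',
    DataFactoryPipelineRunID = $PipelineRunId,
    Parameters = @(
        StorageAccountName = $StorageAccountName,
        FileName = $FileName
    )
)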
Hope this helps.

How to send a list as parameter in databricks notebook task?

I am using the Databricks REST API to create a job with a notebook_task on an existing cluster and getting the job_id in return.
Then I am calling the run-now api to trigger the job.
In this step, I want to send a list as an argument via notebook_params, which throws an error saying "Expected non-array for field value".
Is there any way I can send a list as an argument to the job?
I have tried sending the list argument in base_params as well, with the same error.
user_json = {
    "name": job_name,
    "existing_cluster_id": cluster_id,
    "notebook_task": {
        "notebook_path": notebook_path
    },
    "email_notifications": {
        "on_failure": [email_id]
    },
    "max_retries": 0,
    "timeout_seconds": 3600
}
response=requests.post('https://<databricks_uri>/2.0/jobs/create',headers=head,json=user_json,timeout=5, verify=False)
job_id=response.json()['job_id']
json_job={"job_id":job_id,"notebook_params":{"name":"john doe","my_list":my_list}}
response = requests.post('https://<databricks_uri>/2.0/jobs/run-now', headers=head, json=json_job, timeout=200, verify=False)
I haven't found a native solution yet, but my workaround was to pass the list as a string and parse it back out on the other side:
json_job={"job_id":job_id,
"notebook_params":{
"name":"john doe",
"my_list":"spam,eggs"
}
}
Then in databricks:
my_list=dbutils.widgets.get("my_list")
my_list=my_list.split(",")
Take appropriate care around special characters and, for example, conversion back to numeric types.
If the objects in the list are more substantial, then sending them as a file to dbfs using the CLI or API before running the job may be another option to explore.
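For example, the upload step with the Databricks CLI could look like this (the local and DBFS paths are illustrative):
databricks fs cp my_list.json dbfs:/tmp/my_list.json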
I may be a bit late, but I found a better solution.
Step 1: Use JSON.stringify() in the console of any browser to convert your value (object, array, JSON, etc.) into a string.
Step 2: Use this string value in the body of the request.
Step 3: In the Databricks notebook, convert the string back to JSON using the Python json module.
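The same idea works without a browser; here is a minimal sketch in Python, reusing the job_id from the question (the widget name my_list is illustrative):

# Client side: serialize the list into a single string parameter
import json
json_job = {
    "job_id": job_id,
    "notebook_params": {"my_list": json.dumps(["spam", "eggs", 42])}
}

# Inside the Databricks notebook: parse the string back into a list
import json
my_list = json.loads(dbutils.widgets.get("my_list"))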
Hope this helps

how to get dictionary value from webrequest using sharepoint designer

I am trying to retrieve a value from an HTTP web service call in SharePoint Designer. This should be simple. The REST query is straightforward, and always returns only a single value:
https://Site.sharepoint.com/sites/aSiteName/_api/web/lists/getByTitle('MyListTitle')/items/?$select=Title&$top=1
In the SharePoint Designer workflow, I'm setting the required Accept and Content-Type headers to the value "application/json;odata=verbose".
I am unable to get the value of the "Title" field that is returned by the call.
When I execute the REST query in the browser, I get the following data returned:
{"d":{"results":[{"__metadata":{"id":"af9697fe-9340-4bb5-9c75-e43e1fe20d30","uri":"https://site.sharepoint.com/sites/aSiteName/_api/Web/Lists(guid'6228d484-4250-455c-904d-6b7096fee573')/Items(5)","etag":"\"1\"","type":"SP.Data.MyListName"},"Title":"John Doe"}]}}
I've tried dozens of variations of the dictionary 'query', but they always return blank.
I'm using the 'get an item from a dictionary' action in SP Designer, using item name or path values like:
d/results(0)/Title
d/Title
d/results/Title
and literally dozens of other variations - but it always returns blank.
I'm writing the raw response from the webRequest to the list for debugging, and it shows the value like this:
{"odata.metadata":"https:\/\/site.sharepoint.com\/sites\/aSiteName\/_api\/$metadata#SP.ListData.MyListTitle&$select=Title","value":[{"odata.type":"SP.Data.MyListTitle","odata.id":"616ed0ed-ef1d-405b-8ea5-2682d9662b0a","odata.etag":"\"1\"","odata.editLink":"Web\/Lists(guid'6228d484-4250-455c-904d-6b7096fee573')\/Items(5)","Title":"John Doe"}]}
I must be doing something simple wrong?
Using "d/results(0)/Title" is right. Check the steps in article below to create a workflow.
CALLING THE SHAREPOINT 2013 REST API FROM A SHAREPOINT DESIGNER WORKFLOW
It's working fine in my test workflow.
I faced the exact same issue. In my case the reason was that the header of the API call was not set properly.
As you may have noticed, if you type the variables inline when creating the "Call HTTP Web Service" action, they might not get set properly. The surest way is to open the action's properties and set them there. In my case, when I opened the properties, I found that RequestHeaders was not set. Once I set it from there, I got the desired results. (You can see the same symptom in the question: without the odata=verbose Accept header, the service returns the flat shape with a top-level "value" array, as in the debugging output, instead of the "d/results" shape shown by the browser.)
Hope this helps someone in the future, since this question has been unanswered till now!
I tried dumping the JSON returned from a SharePoint workflow call into a list. Sometimes you'll get a different format than when you call it from the browser. I experienced this issue when calling the ProjectServer API (Project Online). When I called it using Servistate (a Chrome extension), it returned d/results, but when I dumped the value into the list I got value instead, even though I used the same request header value.
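For reference, the dictionary path has to match the shape that actually comes back. Both shapes appear in this question; the second path is my assumption, following the same SP Designer path syntax:
d/results(0)/Title for the verbose shape {"d":{"results":[{"Title":"John Doe"}]}}
value(0)/Title for the flat shape {"value":[{"Title":"John Doe"}]}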

Facebook batch operations custom object

I am trying to make some batch operations for custom objects that I have created on the Open Graph API. I have seen tons of examples for feeds or images, but I still do not know whether Facebook supports batch operations for custom objects. For example, I'd like to know if a batch operation for the following objects would work:
batch=[
{:method=>"post", :relative_url=>"/me/tfgtopquiz:win", "profile"=>"users/1/victories"},
{:method=>"post", :relative_url=>"/me/tfgtopquiz:guessed", "triviaquestion"=>"questions/1"} ]
Notice that I have custom types (triviaquestion). It seems that if I pass it as a parameter, Facebook ignores it, and I need this information. After the request, I get this error message:
"The action you're trying to publish is invalid because it does not specify any reference objects. At least one of the following properties must be specified: triviaquestion."
I really tried to define the type "triviaquestion", but it looks like Facebook ignored it inside the batch JSON.
Any thoughts?
Thanks!
After some time I realized that I should send the JSON in a different format:
{:access_token=>"MY_ACCESS_TOKEN",
:requests=>
[{:action=>"guessed", :object_type=>"triviaquestion", :object_url=>"URL"},
{:action=>"guessed", :object_type=>"triviaquestion", :object_url=>"URL"},
{:action=>"guessed", :object_type=>"triviaquestion", :object_url=>"URL"},
{:action=>"play", :object_type=>"correct_answer", :object_url=>"URL"}]}
Thanks!