GET a Salesforce batch request using Workbench REST Explorer

I am using the following syntax to get a batch request for a bulk data load job performed in our dev org:
https://instance_name-api.salesforce.com/services/async/APIversion/job/jobid/batch/batchId/request
In Workbench I went to the REST Explorer, selected GET, and used the following query:
/services/async/v29.0/job/7501j000000Lb31/batch/7501g000000l0Lf
When I click Execute, I get the following error message:
{"exceptionCode":"InvalidSessionId","exceptionMessage":"Unable to find session id"}
My end goal is to pull all of the "view request" CSVs from a bulk data load job instead of having to download each one manually.
Thanks
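For reference, a minimal sketch (Python with the requests package; the instance host and session id are placeholders) that pulls every request CSV for a job in one go. Two things worth checking: the Bulk API reads the session id from an X-SFDC-Session header rather than the standard Authorization header (which is a common cause of "InvalidSessionId"), and the async path takes the bare version number ("29.0", not "v29.0").

import requests
import xml.etree.ElementTree as ET

instance = "https://yourInstance.salesforce.com"  # placeholder: your pod's host
session_id = "<session id>"                       # e.g. from Workbench's Session Information
job_id = "7501j000000Lb31"
headers = {"X-SFDC-Session": session_id}

# List the batches in the job (XML response).
resp = requests.get(f"{instance}/services/async/29.0/job/{job_id}/batch", headers=headers)
ns = {"a": "http://www.force.com/2009/06/asyncapi/dataload"}
for info in ET.fromstring(resp.content).findall("a:batchInfo", ns):
    batch_id = info.find("a:id", ns).text
    # Download the original request CSV for this batch.
    req = requests.get(f"{instance}/services/async/29.0/job/{job_id}/batch/{batch_id}/request",
                       headers=headers)
    with open(f"{batch_id}_request.csv", "wb") as f:
        f.write(req.content)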

Related

Can't read REST API with an XML response using Synapse Pipeline's Copy Activity

Trying to read REST API endpoints through a Synapse pipeline and sink the data in JSON format. The API response is XML, and the run errors out.
--------Error---------
{
"errorCode": "2200",
"message": "Failure happened on 'Source' side. ErrorCode=JsonInvalidDataFormat,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error occurred when deserializing source JSON file ''. Check if the data is in valid JSON object format.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Newtonsoft.Json.JsonReaderException,Message=Unexpected character encountered while parsing value: <. Path '', line 0, position 0.,Source=Newtonsoft.Json,'",
"failureType": "UserError",
"target": "Copy REST API Data",
"details": []
}
--------Error---------
I do not want to go back to the existing C# script-based code, which currently runs through SSIS packages.
Any assistance will be appreciated.
I tried to repro this using the REST connector in ADF and got the same error. The REST connector supports only JSON; refer to the Microsoft documentation on the supported capabilities of the REST connector.
Instead, use the HTTP connector with an XML dataset. Below is the approach:
1) Select HTTP in the linked service.
2) Enter the base URL and authentication type, then click Create.
3) Create a new dataset for the HTTP linked service: select HTTP, then Continue.
4) Select the XML format, then Continue.
5) Give the linked service and relative URL, then click OK.
6) Use this dataset as the source dataset in the copy activity. Once the pipeline runs, the data is copied to the sink.
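As a quick sanity check outside ADF, a minimal sketch (Python; the URL and credentials are placeholders) showing that the endpoint really returns XML. The copy activity error ("Unexpected character encountered while parsing value: <") is the JSON deserializer choking on an XML document, which is exactly why the JSON-only REST connector cannot read it:

import requests
import xml.etree.ElementTree as ET

resp = requests.get("https://api.example.com/endpoint", auth=("user", "password"))
print(resp.headers.get("Content-Type"))  # typically application/xml or text/xml
root = ET.fromstring(resp.text)          # parses as XML; json.loads() would fail here
print(root.tag)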

Error while accessing SAP data using Azure data factory CDC connector

We are trying to read data from SAP using the Azure Data Factory change data capture (CDC) connector. We get the error below when trying to access the data. The connector works fine for a full load but fails for a delta load.
Error Message: DF-SAPODP-012 - SapOdp copy activity failure with run id: XXXXXXXX-XXXX-4444-826e-XXXXX, error code: 2200 and error message: ErrorCode=SapOdpOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Sap Odp operation 'OpenOdpRead' failed. Error Number: '013', error message: 'Error while accessing',Source=Microsoft.DataTransfer.Runtime.SapRfcHelper,', Exception: com.microsoft.dataflow.Utils$.failure(Utils.scala:76)
com.microsoft.dataflow.store.sapodp.SapOdpAdmsRequestConstructor$.executeAndMonitorCopyActivity(SapOdpAdmsRequestConstructor.scala:206)
com.microsoft.dataflow.store.sapodp.SapOdpAdmsRequestConstructor$.com$microsoft$dataflow$store$sapodp$SapOdpAdmsRequestConstructor$$executeSapCDCCopyInternal(SapOdpAdmsReque
The issue was due to additional privileges needed for the user to read data through the SAP Operational Data Provisioning (ODP) framework. The full load works because there is no need to track changes. To solve this, we added the authorization objects S_DHCDCACT, S_DHCDCCDS, and S_DHCDCSTP to the profile of the user that reads data from SAP.

Swagger call fails - net::ERR_EMPTY_RESPONSE

I have defined Swagger for REST APIs. The Swagger explorer is working fine. One of the APIs does a bulk update of DynamoDB records, which takes around 3-4 minutes to complete. For this API, the Swagger call runs for some time and then it says
Response Code
0
Response Headers
{
"error": "no response from server"
}
When I checked the logs, there was no exception from Swagger. The update operation runs in the background and completes in 3-4 minutes; I have verified this using logs and metric datapoints. I want Swagger to keep waiting on the active call instead of timing out.
After checking the Inspect part of the page, it says
net::ERR_EMPTY_RESPONSE
Is there any way to update the Swagger timeout, or anything else that would help?
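One common way around this kind of timeout (a sketch, not from this thread; Flask with hypothetical endpoint names) is to return 202 Accepted immediately and let clients poll a status endpoint, so neither Swagger UI nor any intermediate proxy has to hold a connection open for 3-4 minutes:

import threading
import uuid
from flask import Flask, jsonify

app = Flask(__name__)
jobs = {}  # job_id -> "running" | "done"

def bulk_update(job_id):
    # ... the 3-4 minute DynamoDB bulk update goes here ...
    jobs[job_id] = "done"

@app.route("/bulk-update", methods=["POST"])
def start_bulk_update():
    job_id = str(uuid.uuid4())
    jobs[job_id] = "running"
    threading.Thread(target=bulk_update, args=(job_id,), daemon=True).start()
    return jsonify({"jobId": job_id}), 202  # respond right away

@app.route("/bulk-update/<job_id>", methods=["GET"])
def job_status(job_id):
    return jsonify({"status": jobs.get(job_id, "unknown")})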

Not Able to Publish ADF Incremental Package

As posted earlier in a thread on syncing data from on-premises MySQL to Azure SQL (referring to this article), I found that the lookup component for watermark detection is available for SQL Server only.
So I tried a workaround: while using the Copy data flow task, pick the data greater than the last watermark stored from MySQL.
Issue:
I am able to validate the package successfully but not able to publish it.
Question :
In the Copy data flow task I'm using the query below to get the data from MySQL that is greater than the available watermark.
Can't we use a query like the one below on other relational sources like MySQL?
select * from #{item().TABLE_NAME} where #{item().WaterMark_Column} > '#{activity('LookupOldWaterMark').output.firstRow.WatermarkValue}'
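The watermark comparison itself is plain SQL that MySQL accepts; the #{...} expressions are resolved by ADF before the query reaches the source. A minimal sketch (Python with the third-party pymysql package; host, table, and column names are placeholders) of the query the source effectively runs:

import pymysql

conn = pymysql.connect(host="myhost", user="user", password="secret", database="mydb")
watermark = "2018-01-01 00:00:00"  # the value LookupOldWaterMark would return
with conn.cursor() as cur:
    # Table and column names are interpolated by ADF; only the watermark is a bind parameter here.
    cur.execute("SELECT * FROM project_table WHERE updated_at > %s", (watermark,))
    rows = cur.fetchall()
conn.close()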
CopyTask SQL query preview
Validates successfully
Error with no details
Debug runs successfully
Errors after following the steps mentioned by Franky:
Azure SQL linked service error (resolved by reconfiguring the connection / editing the credentials in the connection tab)
Source query went blank (resolved by re-selecting the source type and rewriting the query)
Could you verify whether you have access to create a template deployment in the Azure portal?
1) Export the ARM template: in the top-right of the ADFv2 portal, click ARM Template -> Export ARM Template, extract the zip file, and copy the content of the "arm_template.json" file.
2) Create the ARM template deployment: go to https://portal.azure.com/#create/Microsoft.Template and log in with the same credentials you use in the ADFv2 portal (you can also reach this page in the Azure portal by clicking "Create a resource" and searching for "Template deployment"). Now click "Build your own template in editor", paste the ARM template from the previous step into the editor, and Save.
3) Deploy the template: click "Existing resource group" and select the same resource group as the one your Data Factory is in. Fill out the missing parameters (for this test it doesn't really matter whether the values are valid); the factory name should already be there. Agree to the terms and click Purchase.
4) Verify that the deployment succeeded. If not, let me know the error; it might be an access issue, which would explain why your publish fails. (The ADF team is working on giving a better error for this issue.)
Did any of the objects publish into your Data Factory?
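If the portal route is awkward, the same access test can be scripted (a sketch assuming the azure-identity and azure-mgmt-resource packages; the subscription id, resource group, and parameter values are placeholders):

import json
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import (
    Deployment, DeploymentMode, DeploymentProperties,
)

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")
with open("arm_template.json") as f:
    template = json.load(f)

# Deploy the exported template into the factory's resource group; an authorization
# failure here would explain why publishing from the ADF portal fails.
poller = client.deployments.begin_create_or_update(
    "<resource-group>",
    "adf-publish-test",
    Deployment(properties=DeploymentProperties(
        mode=DeploymentMode.INCREMENTAL,
        template=template,
        parameters={"factoryName": {"value": "<your-factory-name>"}},
    )),
)
print(poller.result().properties.provisioning_state)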

Drools workbench test scenarios

Two-part question:
I need to set up a test case where one bean/fact has a collection of items. Is it possible to do this with the workbench editor? I picked Guided list, and for each item I tried new Item('sku', 'name'), but when it tries to compile it cannot find the Item class, even though the Item class is imported.
After deploying the artifact to the execution server and testing the rules with SoapUI, where I did not specify a session in the tag, it seems that by default the execution server uses a stateful session, which affects subsequent rule executions.
I went to project properties and created a "stateless" session:
ksession, default=yes, state=stateless, clock=realtime
Now, however, if I try to execute my test cases in the workbench I get:
Unable to complete your request. The following exception occurred: Cannot find a default KieSession.
Any ideas?
