How to read a local CSV file using Azure Data Factory and a self-hosted runtime?

I have a Windows Server VM with the ADF Integration Runtime installed, running under a local account called deploy. This account is a member of the local Administrators group. The server is not domain-joined.
I created a new linked service (File System) and pointed it at a CSV file in the root of the C: drive as a test. When I test the connection I get "Connection failed":
Error occurred when trying to access the file in Folder 'C:\etr.csv', File filter: ''. The directory name is invalid. Activity ID: 1b892702-7cc3-48d5-83c7-c680d6d15afd.
Any ideas on a fix?

The linked service needs to point to a folder on the target machine. In your screenshot, change C:\etr.csv to C:\ and then define a new dataset that uses the linked service and selects etr.csv.

The dataset represents the structure of the data within the linked data store, and the linked service defines the connection to the data source. So the linked service should point to the folder rather than the file: it should be C:\ instead of C:\etr.csv.
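For reference, the split looks roughly like this in ADF JSON (a minimal sketch; the names FileSystemLinkedService, EtrCsv and SelfHostedIR are placeholders, and the credential block is illustrative). First, the linked service pointing at the folder:
{
    "name": "FileSystemLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "C:\\",
            "userId": "deploy",
            "password": { "type": "SecureString", "value": "<password>" }
        },
        "connectVia": { "referenceName": "SelfHostedIR", "type": "IntegrationRuntimeReference" }
    }
}
Then a DelimitedText dataset that selects the file itself:
{
    "name": "EtrCsv",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": { "referenceName": "FileSystemLinkedService", "type": "LinkedServiceReference" },
        "typeProperties": {
            "location": { "type": "FileServerLocation", "fileName": "etr.csv" },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}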

Related

How to use multiple client certificates stored in keystore.p12 and set jmeter's system.properties while setting test in Azure Load Testing service?

I am trying to set up a test (manual/YAML) in the Azure Load Testing service. My test uses client certificates, so I uploaded the JMX file, the keystore (.p12), and a CSV (which holds the aliases of the certificates in the keystore) to the test plan.
In Azure Load Testing, where can I set the javax.net.ssl.keyStoreType, javax.net.ssl.keyStore, javax.net.ssl.keyStorePassword, https.use.cached.ssl.context, https.keyStoreStartIndex and https.keyStoreEndIndex properties?
With standalone JMeter I would set the above properties in JMeter's system.properties file, but with Azure Load Testing I am not sure how to get this working.
Please suggest. Thanks!
As per Configure a load test in YAML:
configurationFiles List of relevant configuration files or other files that you reference in the Apache JMeter script. For example, a CSV data set file, images, or any other data file. These files will be uploaded to the Azure Load Testing resource alongside the test script. If the files are in a subfolder on your local machine, use file paths that are relative to the location of the test script.
So my expectation is that if you upload a system.properties file along with the .jmx script and the CSV file with the certificate aliases, the Azure Load Testing engine should pick it up and apply it.
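For illustration, the uploaded system.properties could look like this (a sketch only; the keystore file name, password placeholder, and index values are assumptions based on the question):
javax.net.ssl.keyStoreType=pkcs12
javax.net.ssl.keyStore=keystore.p12
javax.net.ssl.keyStorePassword=<password>
https.use.cached.ssl.context=false
https.keyStoreStartIndex=0
https.keyStoreEndIndex=4
The YAML test configuration would then list it under configurationFiles, along these lines (testName and file names are again placeholders):
version: v0.1
testName: cert-load-test
testPlan: test.jmx
engineInstances: 1
configurationFiles:
  - system.properties
  - aliases.csv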
It should also be possible to do this via the GUI.
More information: How to Use Multiple Certificates When Load Testing Secure Websites

AWS MobileAnalyticsManager access to folder 'AWS Mobile Services\M4SP' is denied

I am trying to add the AWSSDK DLL to my C# code to collect my event data and pass the data to the AWS bucket. My C# code is created from a Visual Studio SharePoint template; the project contains WSP files. The following code shows how I use the AWSSDK:
using Amazon;
using Amazon.CognitoIdentity;
using Amazon.MobileAnalytics.MobileAnalyticsManager;

CognitoAWSCredentials credentials = new CognitoAWSCredentials(
    "us-east-1:xxxxxx", // Pool ID
    RegionEndpoint.USEast1
);
Amazon.AWSConfigs.ApplicationName = "M4SP";
AWSConfigs.LoggingConfig.LogMetrics = true;
AWSConfigs.LoggingConfig.LogResponses = ResponseLoggingOption.Always;
AWSConfigs.LoggingConfig.LogMetricsFormat = LogMetricsFormatOption.JSON;
MobileAnalyticsManager manager = MobileAnalyticsManager.GetOrCreateInstance(
    "xxxxxxxxxxxxxxxxxxx", // App ID
    credentials,
    RegionEndpoint.USEast1 // Region
);
CustomEvent customEvent = new CustomEvent("TestRecordEvent");
customEvent.AddAttribute("label", "M4SP");
customEvent.AddAttribute("action", "invoke");
customEvent.AddAttribute("details", "run the workflow test");
manager.RecordEvent(customEvent);
I found that the code inside the AWSSDK DLL tries to log the data to a local folder before passing it to the AWS database. The location of the folder is C:\Users\[userid]\AppData\Roaming\AWS Mobile Services.
There is no problem in a standalone project, since it always runs under the current user's identity and therefore has access to the folder. But because of the authentication mechanism of SharePoint solutions, the code runs under the Application Pool identity when it accesses the folder, gets an access-denied error, and the whole process fails.
Here is the error:
"Access to the path 'AWS Mobile Services\M4SP' is denied."
I modified the access rights for the SharePoint Application Pool identity (in my case, the "NETWORK SERVICE" account), but it still can't access the folder.
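For reference, the grant I applied was along these lines (an icacls sketch; the (OI)(CI) inheritance flags and the Modify level are illustrative):
icacls "C:\Users\[userid]\AppData\Roaming\AWS Mobile Services" /grant "NETWORK SERVICE:(OI)(CI)M" /T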
Does anyone have a solution for this issue? Thanks very much for the help!!

Local WebIDE not connecting to es4 service

I have installed SAP WebIDE locally on my machine and am trying to connect to the services below:
https://sapes4.sapdevcenter.com/sap/opu/odata/IWBEP/GWSAMPLE_BASIC/?sap-ds-debug=true
http://services.odata.org/v3/northwind/northwind.svc/
I am getting two errors, attached for reference.
Below is my destination file 1:
Description=es4
Type=HTTP
TrustAll=true
Authentication=NoAuthentication
WebIDEUsage=odata_abap
Name=es4
WebIDEEnabled=true
URL=https\://sapes4.sapdevcenter.com\:443
ProxyType=Internet
WebIDESystem=es4
File 2:
Description=es4
Type=HTTP
TrustAll=true
Authentication=NoAuthentication
WebIDEUsage=odata_gen
Name=es4
WebIDEEnabled=true
URL=https\://sapes4.sapdevcenter.com
ProxyType=Internet
WebIDESystem=es4
Is there any configuration needed in my local Cloud Connector?
First, you shouldn't have separate files for the same destination. Please put everything in one file and separate the WebIDEUsage values with commas (make sure there are no spaces), as in the merged sketch below. More information can be found in the documentation Hofit has added.
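Merged into a single file, your two destinations would look like this (a sketch that simply combines the two; it keeps the explicit \:443 port from file 1, which is optional since 443 is the HTTPS default):
Description=es4
Type=HTTP
TrustAll=true
Authentication=NoAuthentication
WebIDEUsage=odata_abap,odata_gen
Name=es4
WebIDEEnabled=true
URL=https\://sapes4.sapdevcenter.com\:443
ProxyType=Internet
WebIDESystem=es4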
Second, there's no need for a Cloud Connector, as there's no cloud here: if you install Web IDE locally, it runs on your local machine, and there's no connectivity to the cloud.
I'm sure you can find all the needed information in both the documentation and the SAP community.
I just tried to connect to es4 as you did in the first screenshot, and it is working fine. (The name in the service catalog dropdown should be es4, matching the Name in destination file 1, not es4123.)
Here is a link to the documentation.

How to copy a local directory with files to a remote server in Talend

In Talend (Data Integration) I am trying to copy a local directory to a remote directory, but when I run the job I can only copy the files, not the folders, from the directory. Please help me with this job.
In my Talend job I am using local connection and remote connection components:
tFileList -> tFileProperties (to store path and name in one table) -> tMSSqlInput (extracting the path from that table) -> iterate -> tSSH (to create the directory if it is not available) -> finally tFTPPut to connect and copy to the remote directory.
When I store the entries in one table using tFileProperties, files get a non-zero size but folders come through with a size of zero. I use this condition to create the directories with the tSSH component, but I am unable to create the folders. Please help me.
Do you get an error message?
I believe the output of the tMSSqlInput should be row-based rather than an iteration. That might be the source of the problem.
tMSSqlInput docs:
tMSSqlInput executes a DB query with a strictly defined order which must correspond to the schema definition. Then it passes on the field list to the next component via a Main row link.
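Once the rows flow through a Main link, the tSSH step can create any missing remote directory with a standard shell command, for example (a sketch; the actual path would come from your input row):
mkdir -p "/remote/target/directory"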

Database script encountered "AWKDBE018E Cannot access required JDBC Driver folder" in Workload Scheduler

I created a database script step that accesses the SQL Database service in the Workload Scheduler service. When I run the process, the step encounters the error below.
Error message:
AWKDBE018E Cannot access required JDBC Driver folder
Message information:
http://www-01.ibm.com/support/knowledgecenter/SSGSPN_9.2.0/com.ibm.tivoli.itws.doc_9.2/common/src_ms/awsmsawkdbe.htm?lang=en
AWKDBE018E Cannot access required JDBC Driver folder
Explanation
The job was not able to access a JDBC Driver folder, you might not
have enough permissions.
System action
The operation is not performed.
Operator response
Verify that you have enough permissions.
This message seems to ask me to grant the proper authority to the job user, but there is no property for specifying the job user of the Workload Automation Agent. I use a Workload Automation Agent provisioned automatically by Bluemix.
Could you tell me which parameters are needed?
Database script step information
JDBC driver class path info
I checked the path in the log of a step that runs the "ls -lR" command.
It seems to be a problem with the agent: I tried to replicate the same job type, but it fails with the same error message (even using different solutions for the JDBC driver path).
If you are using the Workload Automation Agent that was created for you, then you could open a support ticket to have the Workload team look at that agent.
Edit, after getting support from the service team:
In the JAR classpath field for a predefined Workload Scheduler process, you have to put only the path to the directory containing the JAR files, without the name of the JAR file to use.
So, according to the current Workload Scheduler documentation, you have to use the following value:
/home/wauser/utils
This way the database script works fine.
(screenshot added)
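In JSDL terms (using the driverPath element shown in the checks below), that corresponds to something like the following, with the directory above as an example value:
<jsdldatabase:driverPath>/home/wauser/utils</jsdldatabase:driverPath>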
It looks like it is having issues referencing the location of the JDBC classpath for DB2. Can you please double-check the classpath location for the DB2 driver?
Even though this is old, I wanted to add some quick checks.
These were tested on a 9.5 FP1 dynamic agent, part of the container delivery. The path values are the standard values for the container.
Try 1 - full path - SUCCESS
<jsdldatabase:driverPath>/opt/wa/TWS/jdbcdrivers/db2/</jsdldatabase:driverPath>
= Status Message: Success
= Exit Status : 0
Try 2 - relative path - FAIL
<jsdldatabase:driverPath>./jdbcdrivers/db2/</jsdldatabase:driverPath>
Job status : FAIL
===============================================================
AWKDBE018E Cannot access required JDBC Driver folder
===============================================================
Try 3 - variable in path - FAIL
<jsdldatabase:driverPath>${UNISONHOME}/jdbcdrivers/db2/</jsdldatabase:driverPath>
===============================================================
AWKDBE018E Cannot access required JDBC Driver folder
===============================================================
Try 4 - variable in path - FAIL
<jsdldatabase:driverPath>$UNISONHOME/jdbcdrivers/db2/</jsdldatabase:driverPath>
===============================================================
AWKDBE018E Cannot access required JDBC Driver folder
===============================================================
So, in short, you need an absolute path in that parameter.
BUT, you can set the path in a config file that is global to the agent.
Try 5 - path in agent config - SUCCESS
Inside the IWSDATA home, in wadata/JavaExt/cfg/DatabaseJobExecutor.properties, write the following line:
jdbcDriversPath=/opt/wa/TWS/jdbcdrivers
then remove the XML driverPath element from the job, so that there is no line
<jsdldatabase:driverPath>/opt/wa/TWS/jdbcdrivers/db2/</jsdldatabase:driverPath>
===============================================================
= Exit Status : 0
Note that in this case the db2 subdirectory is not needed in the path: the agent will search the subdirectories.