Talend Runtime: No datasource with alias: ds-mysql8 available

I'm using Talend Runtime to launch a web service that should be configurable.
The web service works fine until I use the alias feature of tDBConnection.
I tried to set this up by following the Talend documentation: https://help.talend.com/r/en-US/8.0/data-service-route-example/installing-mysql-driver-into-container
After following the tutorial exactly, I get this error: No datasource with alias: ds-mysql8 available!
The command service:list Datasource returns nothing.
EDIT: I can now get the datasource list.
The right command is service:list DataSource (note the capitalization).
I still don't understand why my data sources are not available when I make a request.
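For context, as far as I understand the tutorial registers the data source through a pax-jdbc configuration file in the container's etc folder, roughly like this (a sketch; the file name and credentials are illustrative, the important part is that dataSourceName matches the alias used in tDBConnection):

# etc/org.ops4j.datasource-mysql8.cfg  (file name illustrative)
osgi.jdbc.driver.name = mysql
url = jdbc:mysql://localhost:3306/mydb
user = myuser
password = mypassword
dataSourceName = ds-mysql8

With that file in place, service:list DataSource shows a javax.sql.DataSource service whose dataSourceName property is ds-mysql8, which, as far as I can tell, is what the Job looks up at runtime.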

Related

Dreamfactory - Why is my server event script not called?

I have a Bitnami instance that I use to run DF 2.12.0, on which I added a custom “Remote service” (an HTTP REST API). I would like to use the server-side event scripting functionality to pre-process request data before sending it. I have this pre-processing Node.js test script, which is linked to the “pre_process” event of my resource:
console.log("test");
But judging from the DF log file, it seems that this script is never executed.
However, all DF built-in functionalities, such as the user management service, do seem to work with event scripting: the same log file shows that a script linked to the user.session.get.pre_process event is indeed called.
Strangely, the complete path of my main event script is netwrixapi.search.post.pre_process, but the log only mentions a call to the event “netwrixapi.post.pre_process” (without my resource “search”).
I included the “X-DreamFactory-Api-Key” in my request header, which references an app with a full-access role to the API and script sources for all HTTP methods.
I also set APP_DEBUG=true and APP_LOG_LEVEL=debug in my .env file, without any luck.
Any ideas?
Problem finally solved: it seems the log_events variable wasn't set to true by default (even though the official documentation says it should be) in my {$HOME}/apps/dreamfactory/htdocs/vendor/dreamfactory/df-core/config/df.php file.
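For anyone hitting the same thing, the relevant entry in that file ends up looking roughly like this (an abbreviated sketch; log_events is the only key I changed):

<?php
// {$HOME}/apps/dreamfactory/htdocs/vendor/dreamfactory/df-core/config/df.php (abbreviated)
return [
    // ...
    // My custom event scripts were only fired once this was set to true
    'log_events' => true,
    // ...
];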

migrate.exe -connectionString argument does not work when running an EF migration on a CI server

I am trying to run a database update for a code-first migration on a build server.
I read about using migrate.exe from the EF 6 tools and passing various context and connection settings as arguments to the migrate.exe call.
I want to be able to specify the connection string, default catalog, and security myself, directly as arguments.
The problem is that when I specify my connection string and related settings like this:
migrate.exe Ef.Data.DLL /ConnectionString:"Data Source=myserver;Initial Catalog=MyCatalog;Integrated Security=true" /connectionProviderName:System.Data.SqlClient /verbose
Then migrate.exe will throw an error:
System.InvalidOperationException: No connection string named 'MyContext' could be found in the application config file
My context is defined in code like:
public MyContext(): base("name=MyContext")
So it expects a MyContext connection string, as if it were still trying to use an App.config or Web.config for this, but it shouldn't, since I'm passing this information in as arguments.
If I try to specify a connection string name as an argument (-connectionStringName:MyContext) along with the other arguments, then I get:
ERROR: Only one of /connectionStringName or /connectionString can be specified.
So I'm pretty stuck here. I can't seem to solve this one. Any ideas are highly appreciated.
I had the same issue today. For me, it was sorted by changing my context constructor, from this:
public RootContext() : base("name=MyContext")
to this:
public RootContext() : this("MyContext")
Having the name= in front of the context name forces the value to be found in the config; otherwise it throws an error. This is great when you're only deploying from Visual Studio, because it'll helpfully throw errors if your strings don't match, but not so great when trying to automate migrations in different environments from the command line.
So, if you make this change, be careful when running your migrations in VS: if your strings don't match, it'll happily continue and create a new database in LocalDB named "MyContext" without telling you that's what it did!
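Putting it together, the context ends up with a parameterless constructor chaining to a string-based one, something like this (a sketch; class and connection string names follow the answer above and are otherwise illustrative):

using System.Data.Entity;

public class RootContext : DbContext
{
    // Parameterless constructor used by migrate.exe and by EF tooling.
    // Per the answer above, dropping the name= prefix means EF no longer insists on a
    // matching entry in App.config, so the /connectionString passed on the command line can take effect.
    public RootContext() : this("MyContext") { }

    // DbContext(string) accepts either a connection string name or a full connection string.
    public RootContext(string nameOrConnectionString) : base(nameOrConnectionString) { }
}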

ADMG0007E: The configuration data type ConfigSynchronizationService is not valid

I'm trying to automate the WebSphere deployment process for zero downtime using the steps you can find in this link.
According to the documentation's first step, we should disable "Automatic Synchronization" for each node. To automate this, I'm using the script given in the documentation, but when I try to run the command below:
set syncServ [$AdminConfig list ConfigSynchronizationService $na_id]
I get the error: ADMG0007E: The configuration data type ConfigSynchronizationService is not valid
I checked the IBM documentation but couldn't find any resource that refers to this problem.
Does anyone have any workaround or direct solution for it?
Thanks in advance for your suggestions.
PS: I should mention that the document was written for z/OS, but I'm trying to apply the same methodology to WebSphere Application Server on Linux.
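One way I can think of to narrow this down is to list the configuration types wsadmin actually knows about in the cell I'm connected to (a sketch using the standard AdminConfig types command, Jacl syntax):

# List every configuration type known to this cell and keep the ones related to synchronization
set allTypes [split [$AdminConfig types] "\n"]
foreach t $allTypes {
    if {[string match "*Synchronization*" $t]} {
        puts $t
    }
}

If ConfigSynchronizationService does not show up in that list, my guess is that the profile I'm connected to has no node synchronization service at all (for example a standalone server rather than a node federated into a cell), which would explain the ADMG0007E.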

Function app binding issue - Value cannot be null. Parameter name: hostAccount

When I upload a zip file to an Azure Function App using the Kudu REST API, it throws an error when I try to view the C# code in the Function App editor in the browser. The error is:
"Error:
Function ($Source) Error: Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.Source'. Microsoft.Azure.WebJobs.Host: Value cannot be null.
Parameter name: hostAccount.
Session Id: xxxxxxxxxxx
Timestamp: 2016-12-02T18:35:00.083Z"
Please note that I have automated Application Insights end to end, from creating the resource group through exporting the multi-step web test results to our Splunk, all using PowerShell.
As part of this automation, I build a storage connection string, set it in the Function App's app settings, and then reference that key in my function.json binding.
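To illustrate the shape of it (the names here are placeholders, not my real ones), the binding looks something like this, with the connection property naming the app setting I create:

{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "myQueueItem",
      "queueName": "source-items",
      "connection": "MyStorageConnectionAppSetting"
    }
  ],
  "disabled": false
}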
But still I get this error.
Here is the issue I created in the Azure Function App - Git: https://github.com/Azure/azure-webjobs-sdk-script/issues/1020
The error points to missing host configuration (e.g. the host storage account).
If you're automating the creation process, you need to make sure the required environment variables are properly set. The recommended way would be to use an ARM template to manage that for you.
I have some details on how you can get the ARM template for a resource (which you could use to look at the required settings for your Function App) here.
We also have some sample templates you can use, linked here.
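For illustration, the app settings section of such a template typically carries the storage settings the Functions host needs, along these lines (a sketch; the account name and key are placeholders, and a null hostAccount is usually a sign these are missing or malformed):

"siteConfig": {
  "appSettings": [
    { "name": "AzureWebJobsStorage", "value": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>" },
    { "name": "AzureWebJobsDashboard", "value": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>" },
    { "name": "FUNCTIONS_EXTENSION_VERSION", "value": "~1" }
  ]
}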
I hope this helps!

Error trying to load driver for generic database configuration in Mule

I am using Mule 3.6 and would like to use the bulk insert option on the Generic Database Configuration to load data into MongoDB 3.0.8.
I have entered the URL as:
jdbc:mongo://localhost:27017/test
and have tried a number of different Mongo and JDBC drivers but keep receiving the message "Test connection failed. Error trying to load driver..."
How can I configure the Generic Database Connector in Mule to connect to Mongo?
As already stated in this post, there is no official JDBC driver for MongoDB, but one of the suggested alternatives is UnityJDBC.
If you decide to follow the UnityJDBC approach, then:
Download and install the driver by executing the following command:
java -jar UnityJDBC_Trial_Install.jar
Go to the installation folder and copy mongodb_unityjdbc_full.jar to the classpath of your Mule app.
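For example, with a standalone Mule 3.x runtime you can drop the jar on the runtime's shared classpath (path assumed; you can also bundle the driver inside the application instead):

cp mongodb_unityjdbc_full.jar $MULE_HOME/lib/user/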
Configure the URL and Driver in the Global Element of your Generic Database component (the values you provided are OK):
URL: jdbc:mongo://<host>:<port>/<database>
Driver Class Name: mongodb.jdbc.MongoDriver
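Put together, the global element looks roughly like this in the XML view (a sketch; the host, port and database are the values from the question):

<!-- Generic Database config pointing at the UnityJDBC Mongo driver -->
<db:generic-config name="MongoDB_Generic_Config"
                   url="jdbc:mongo://localhost:27017/test"
                   driverClassName="mongodb.jdbc.MongoDriver"
                   doc:name="Generic Database Configuration" />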
If not, use the MongoDB Connector as suggested by @JoostD.
You need to use the MongoDB connector; it should be included in Studio.
Otherwise install it from the Anypoint Exchange:
https://www.mulesoft.com/exchange/#!/mongodb-integration-connector
Also see some examples using it:
https://www.mulesoft.com/exchange/#!/importing-csv-into-mongodb