MATLAB error when connecting to PostgreSQL database

I'm trying to connect to a PostgreSQL database with the following command:
connection = database( ...
options.getDatabaseName(), ...
options.getUsername(), ...
options.getPassword(), ...
"org.postgresql.Driver", ...
"jdbc:postgresql://" + options.getHostname() + ":" + options.getPort() + "/" + options.getDatabaseName() ...
);
It returns the following error:
Error using database (line 59)
Unmatched parameter name 'org.postgresql.Driver' must be a string scalar or character vector that can represent a field name.
I've seen other questions about this, like this one, but the error message is different.
What am I doing wrong?

I found the solution myself, and it's tricky (in my opinion it may be related to a bug).
To test the database connection, I first created a connection with the Database Explorer. It worked, and I saved this connection under the same name as the database.
When I inspected the source code of the database command, I saw that the first thing it does is check whether a data source with that name already exists and, only if not, look for the database. The problem was that, since my saved connection had the same name as the database, database assumed I wanted the data-source form of the command instead of the database one. It tried to use this signature:
conn = database(datasource,username,password)
instead of this one:
conn = database(databasename,username,password,driver,url)
since wtrade is the name of both the database and the data source. In the data-source form, the fourth argument must be a parameter name such as "Vendor" or "PortNumber", per the MATLAB documentation; since the driver string does not match any parameter name, I got the error.
I removed the data source with the same name as the database and everything started to work.
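As a quick sanity check, something like this sketch can detect the clash before connecting (listDataSources is a Database Toolbox function available in newer releases; treat its availability as an assumption):
% List saved data sources and warn if one shadows the database name.
sources = listDataSources();
dbName = options.getDatabaseName();
if any(strcmp(sources.Name, dbName))
    warning("Data source '%s' shadows the database name; " + ...
        "database() will parse the driver argument as a parameter name.", dbName);
end
% With no shadowing data source, the five-argument form works as intended:
connection = database( ...
    dbName, ...
    options.getUsername(), ...
    options.getPassword(), ...
    "org.postgresql.Driver", ...
    "jdbc:postgresql://" + options.getHostname() + ":" + options.getPort() + "/" + dbName ...
);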
I've reported this to MathWorks, since in my opinion a database sharing its name with a data source should cause no problem: the signatures are different, so the database command should be able to handle this case as well.

Related

SQL Developer and DB2 errors

I'd like to use SQL Developer with DB2. I was able to connect and execute my queries, but when an error occurs I cannot tell which error it is: SQL Developer shows me only the error code, not the message. Is there a way to see the full error?
EDIT:
For example, launching this query:
Select * from WrongTable
other programs say:
ERROR[42704][IBM][DB2/NT64] SQL0204N "USERNAME.WRONGTABLE" è un nome non definito ("is an undefined name")
SQL Developer limits its report to the error number only:
Error at command line: 1, column: 1
Error report -
SQL Error: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=USERNAME.WRONGTABLE, DRIVER=4.19.49
Thank you.
The URL syntax for connecting to Db2 with type-4 JDBC drivers is documented here.
The property that controls how much information is returned by getMessage() is called retrieveMessagesFromServerOnGetMessage, and its default value is disabled (false, 0). Set it to 1 (or YES, or true) to enable more detail on errors.
You can append many properties after the database name in the Database field of the Oracle SQL Developer connection properties. Express each property in the form x=y; separate the x=y pairs with semicolons, terminate the final one with a semicolon, and prefix the first property with a colon immediately after the database name.
For example, suppose the database name is sample and I want three additional properties; then I would put this in the Database field in Oracle SQL Developer:
sample:useJDBC4ColumnNameAndLabelSemantics=No;securityMechanism=11;retrieveMessagesFromServerOnGetMessage=1;
If the value 1 does not give the expected result, use YES, although they should be equivalent. Remember to save the setting change, disconnect from the database, and reconnect before retrying your queries to assess the change.
Many other properties are available; see the related pages in the documentation. Some properties are common to all target Db2 platforms, while others are specific to Db2-LUW, Db2 for z/OS, Informix, etc., so read the docs carefully. Some properties can also be set in code after the connection is established.
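Outside SQL Developer, the same property can go straight into the JDBC URL. A minimal Java sketch, assuming the IBM JCC driver (db2jcc4.jar) is on the classpath and using placeholder host and credentials:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class Db2MessageDemo {
    public static void main(String[] args) throws SQLException {
        // Properties follow the database name, with the same ':' prefix and
        // ';' terminators used in the SQL Developer Database field.
        String url = "jdbc:db2://dbhost:50000/sample"
                + ":retrieveMessagesFromServerOnGetMessage=true;";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            conn.createStatement().executeQuery("SELECT * FROM WRONGTABLE");
        } catch (SQLException e) {
            // With the property enabled, getMessage() includes the full
            // server-side message text instead of just SQLCODE/SQLSTATE.
            System.err.println(e.getMessage());
        }
    }
}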

Loopback (DB2) - Cannot create an instance of PersistedModel that uses a schema other than the userid

I am trying to define a model based on PersistedModel to access a table in DB2; call it MY_SCHEMA.MY_TABLE.
I created the model MY_TABLE, based on PersistedModel, with a Data Source (datasources.json) where the definition includes the attribute "schema": "MY_SCHEMA". The data source also contains the userid my_userid, used for the connection.
Current Behavior
When I try to call the API for this model, it tries to access the table my_userid.MY_TABLE.
Expected Behavior
It should access MY_SCHEMA.MY_TABLE.
The DB2 instance happens to be on a System Z. I have created a table called my_userid.MY_TABLE and that works; however, the solution we are trying to build requires multiple schemas.
Note that this only appears to be an issue with Db2 on System Z; I can change schemas on Db2 LUW.
What LoopBack connector are you using? What version? Can you also check what version of loopback-ibmdb is installed in your node_modules folder?
AFAICT, LoopBack's DB2-related connectors support the schema field; see https://github.com/strongloop/loopback-ibmdb/blob/master/lib/ibmdb.js#L96-L100
self.schema = this.username;
if (settings.schema) {
self.schema = settings.schema.toUpperCase();
}
self.connStr += ';CurrentSchema=' + self.schema;
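Given that code path, a datasources.json entry along these lines (hostname and credentials are placeholders) should end up sending CurrentSchema=MY_SCHEMA on the connection string:
{
  "db2": {
    "name": "db2",
    "connector": "db2",
    "hostname": "dbhost",
    "port": 50000,
    "database": "MYDB",
    "username": "my_userid",
    "password": "secret",
    "schema": "MY_SCHEMA"
  }
}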
Have you considered configuring the database connection using a DSN instead of individual fields like hostname and username?
In your datasource config JSON:
"dsn": "DATABASE={dbname};HOSTNAME={hostname};UID={username};PWD={password};CurrentSchema=MY_SCHEMA"

How to populate a table via Pentaho Data Integration's Table Output step?

I am performing an ETL job via Pentaho 7.1.
The job populates the table PRO_T_TICKETS in PostgreSQL 9.2 via Pentaho jobs and transformations.
I have mapped the table fields to the stream fields (see the "Mapped Fields" screenshot).
My table PRO_T_TICKETS has its column names in UPPERCASE.
Is this the reason I can't populate the table PRO_T_TICKETS with my ETL job?
I duplicated the Table Output step and changed the Target table field to 'PRO_T_TICKETS2'. Pentaho created a new table with lowercase column names and populated the data into it.
But I want this data to be loaded into the table PRO_T_TICKETS only, and with the UPPERCASE column names if possible.
I am attaching the whole job here along with the error thrown by Pentaho (see the "Pentaho Error" screenshot). I have also tried my query with double quotes added to the column names, as you can see in the error, but it didn't help.
What do you think I should do?
When you create (or modify) the connection, select Advanced in the left panel and tick Force to upper case or Force to lower case or, even better, Preserve case of reserved words.
To know which option to choose, copy the 4th line of your error log, the one starting with INSERT INTO "public"."PRO_T_TICKETS("OID"..., into your SQL development tool and change the connection's advanced parameters until it works.
Also, at debug time, don't use batch updates, don't use lazy conversion on previous steps, and try with one (1) field rather than all (25).
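The underlying behaviour is PostgreSQL's identifier folding: unquoted identifiers are folded to lowercase, while quoted identifiers are matched case-sensitively. A short illustration (table and column names taken from the question):
-- Created quoted, so the name is stored exactly as PRO_T_TICKETS
CREATE TABLE "PRO_T_TICKETS" ("OID" integer);

INSERT INTO pro_t_tickets (oid) VALUES (1);     -- fails: folded to lowercase, relation not found
INSERT INTO "PRO_T_TICKETS" ("OID") VALUES (1); -- works: quoted names match exactly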
Just as a complement: it worked for me following the tips from AlainD, with some specific configuration that I'd like to share. I have a transformation streaming data from MySQL to PostgreSQL using a Table Input and a Table Output. In both DBs I have uppercase objects.
These are the steps that made it work:
In the Table Input (MySQL) the objects are uppercase too, but I typed them in lowercase and it worked; I didn't set any special option on the DB connection.
In the Table Output (PostgreSQL) I typed everything in uppercase (schema, table name and columns) and I also ticked "Specify the database fields" (clicking on "Get fields").
In the target DB connection (PostgreSQL) I enabled the options (in the "Advanced" section) "Quote all in database" and "Preserve case of reserved words".
PS: the last option is because I found one more problem with my fields: a column called "Admin" (yes, they created a camel-case column using a reserved word!), so I had to enable "Preserve case of reserved words" and type it as Admin (without quotes, in camel case) in the Table Output.
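To illustrate that last point with plain SQL (USERS is a hypothetical table): once a mixed-case column exists, only the exactly quoted form matches:
CREATE TABLE "USERS" ("Admin" boolean);

SELECT admin   FROM "USERS";  -- fails: column "admin" does not exist
SELECT "Admin" FROM "USERS";  -- works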

How to keep PowerBuilder from prepending table owner to table name (Postgres / PB 10.5)

I am connecting to a PostgreSQL database from PowerBuilder 10.5, using ODBC on Windows 7, and I notice that PB prepends the table owner to the table name: e.g. if I am connected to the database as "user", it formats the query as "SELECT x, y, z FROM user.tablename".
This makes sense in Sybase, but does not work correctly in PostgreSQL, where schemas and users are separate things.
I tested by creating a PostgreSQL schema with the same name as the user, and then putting the tables within the schema. So when PB used "username.tablename", PostgreSQL interpreted it as "schemaname.tablename" and this worked, but it was just a test and not a usable solution.
The docs say that if the table owner is the same as the current user, PB will not prepend the owner, but if they don't match, it will. In my test program I see the opposite: if the UID is the same as the owner name, it DOES prepend; if they don't match, it doesn't.
Here's my connect code:
sqlca.DBMS = "ODBC"
sqlca.userid = "pblearn"
sqlca.dbpass = "pblearn"
string ls_DSN = "PBLEARN"
string ls_connect = "ConnectString='"
ls_connect += "DSN=" + ls_DSN + ";"
ls_connect += "UID=" + sqlca.userid + ";"
ls_connect += "PWD=" + sqlca.dbpass + "'"
sqlca.dbparm = ls_connect + ", SQLQualifiers=0"
connect;
My schemas are pblearn and public (the default), and there are two users, "pblearn" and "pblearn2". If I connect as pblearn, the prepend happens and I see the tables in the pblearn schema (the owner of the tables); if I use pblearn2, the username is not prepended and I see the tables in the public schema.
How can I get PB to either not prepend the username, or to prepend a consistent schema name regardless of user?
Thanks
In the database section of the PBODB105.INI file used by your installation, add the following property:
PBTableOwner='NO'
From the documentation:
;PBTableOwner='NO' - do not qualify table names, default is 'YES'
EDIT:
If no sections exist for a particular connection, then PowerBuilder runs as an ODBC-compliant client and extensions that might be available cannot be utilized. The search algorithm for the entries is:
IF a section and entry are present for the current datasource
THEN use the entry value
ELSE IF a section corresponding to DBMS_Name Driver_Name exists
THEN use the entry value if it exists
ELSE IF a section corresponding to DBMS_Name exists
THEN use the entry value if it exists
SECTION headings:
DataSource_Name (none are in the INI file by default, but if you need to override the more general setting of DBMS_Driver or DBMS_Name, you would put it in a data-source-specific section)
DBMS_Name Driver_Name (Driver_Name is stripped of the .dll extension)
DBMS_Name (the DBMS name returned by the SQLGetInfo call)
So the easiest way to add a section for your Postgres installation is to name the section after your current data source; or, if you prefer to use the DBMS_Name, see this example: http://www.rgagnon.com/pbdetails/pb-0061.html for the DBMS name returned by the ODBC driver.
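Putting the two together, a data-source-specific override might look like this sketch (using the PBLEARN data source name from the question's connect code):
; PBODB105.INI - a data-source-specific section overrides DBMS-level settings
[PBLEARN]
PBTableOwner='NO'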

T-SQL 2000: Four part table name

I don't usually work with linked servers, and so I'm not sure what I'm doing wrong here.
A query like this works against a linked FoxPro server from SQL 2000:
EXEC('Select * from openquery(linkedServer, ''select * from linkedTable'')')
However, from researching on the internet, something like this should also work:
Select * from linkedserver...linkedtable
but I receive this error:
Server: Msg 7313, Level 16, State 1, Line 1
Invalid schema or catalog specified for provider 'MSDASQL'.
OLE DB error trace [Non-interface error: Invalid schema or catalog specified for the provider.].
I realize it's supposed to be ServerAlias.Catalog.Schema.TableName, but if I run sp_tables_ex on the linked server, the catalog for every table is just the network path to where the data files are, and the schema is null.
Is this server set up incorrectly? Or is what I'm trying to do not possible?
From MSDN:
Always use fully qualified names when working with objects on linked servers. There is no support for implicit resolution to the dbo owner name for tables in linked servers.
You cannot rely on the implicit schema name resolution of the '..' notation for linked servers. For a FoxPro 'server' you're going to have to use the database and schema as they map to their FoxPro counterparts in the driver you use (I think they map to the folder and the file name, but I haven't used an ISAM file driver in more than 10 years now).
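For illustration only, a hedged sketch of what that mapping implies: use the catalog exactly as sp_tables_ex reports it (here a hypothetical folder path) and leave the NULL schema slot empty. MSDASQL may still reject this, and OPENQUERY remains the reliable fallback:
-- catalog = folder path reported by sp_tables_ex, schema omitted (NULL)
Select * from linkedServer.[\\fileserver\share\data]..linkedTable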
I think you need to be explicit about resources in the linked server part of the query, for example:
EXEC SomeLinkedServer.Database.dbo.SomeStoredProc
In other words just dotting them out doesn't work in this case, you have to be more specific.
It's actually:
ServerAlias.Catalog.Schema.LinkedTable
Catalog is the database that you're querying on the linked server, and Schema is the schema of the remote table. So a valid four-part name would look like this:
ServerAlias.AdventureWorks.HumanResources.Employee
or
ServerAlias.MyDB.dbo.MyTable