I'm using OLE and C#.NET to query the schema of a MS Access database. Specifically, I need to find out whether a particular column is an "identity" column or not. For SQL Server, I can use:
select COLUMNPROPERTY(object_id('dbo.tablename'),'columnname','IsIdentity')
... but when I invoke this SQL against Access, I get an OleDbException with the following message:
Undefined function 'COLUMNPROPERTY' in expression.
Searching the archives, it appears there are ways to do this with DAO, but I need to use OLE. Does anyone happen to know how I can do this with OLE?
You can get the schema from the connection, for example:
cn.GetOleDbSchemaTable(OleDbSchemaGuid.Indexes,
    new Object[] { null, null, null, null, "Table1" });
This returns the indexes for Table1. One of the columns in the returned table is PRIMARY_KEY.
See http://msdn.microsoft.com/en-us/library/system.data.oledb.oledbschemaguid.columns(v=vs.71)
The same information using the GetSchema method:
using (OleDbConnection con = new OleDbConnection(@"Provider=Microsoft.ACE.OLEDB.12.0;" +
    @"Data Source=C:\temp\db.mdb;" +
    "Persist Security Info=False;"))
{
    con.Open();
    var schema = con.GetSchema("Indexes");
    var col = schema.Select("TABLE_NAME = 'YourTableName' AND PRIMARY_KEY = True");
    Console.WriteLine(col[0]["COLUMN_NAME"].ToString());
}
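Note that PRIMARY_KEY only tells you whether a column is part of the primary key, not whether it is an AutoNumber (identity) column. If you specifically need the identity flag, one option (a sketch; the table name "Table1" and column name "ID" are placeholders) is to read the result schema of a SELECT through OleDbDataReader.GetSchemaTable(), which exposes an IsAutoIncrement column:
// Sketch: check whether a column is an AutoNumber (identity) column by reading
// the schema of a SELECT via OleDbDataReader.GetSchemaTable().
// Requires System.Data and System.Data.OleDb; "Table1" and "ID" are placeholders.
using (OleDbConnection con = new OleDbConnection(@"Provider=Microsoft.ACE.OLEDB.12.0;" +
    @"Data Source=C:\temp\db.mdb;" +
    "Persist Security Info=False;"))
using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM Table1", con))
{
    con.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader(CommandBehavior.SchemaOnly | CommandBehavior.KeyInfo))
    {
        DataTable schemaTable = reader.GetSchemaTable();
        foreach (DataRow row in schemaTable.Rows)
        {
            if ((string)row["ColumnName"] == "ID")
            {
                Console.WriteLine("IsAutoIncrement: " + row["IsAutoIncrement"]);
            }
        }
    }
}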
I have a Vert.x web application that needs to query an AWS RDS instance running Postgres 10.7. The Vert.x JDBC client is io.vertx:vertx-jdbc-client:3.8.4. I want to query a table with the constraint that a certain column's value is included in a set of values:
select from table where column in/any (?)
I followed the Vert.x documentation, which says to create a JsonArray and populate it with the values to inject into the query. The column is of type text and the list that I want to match on is a Java ArrayList<String>. My query code looks like:
String sql = "SELECT a FROM table WHERE col IN (?)";
List<String> values = someObject.someField();
sqlClient.getConnection(connectionResult -> {
    if (connectionResult.failed()) {
        // handle
    } else {
        SQLConnection connection = connectionResult.result();
        JsonArray params = new JsonArray()
            .add(values);
        connection.queryWithParams(sql, params, queryResult -> {
            if (queryResult.failed()) {
                // handle
            } else {
                // parse
            }
        });
    }
});
The query fails with the error: org.postgresql.util.PSQLException: Can't infer the SQL type to use for an instance of io.vertx.core.json.JsonArray. Use setObject() with an explicit Types value to specify the type to use.
I know that in the worst case, I can create a literal SQL string where col in (?, ?, ?, ..., ?) and add each String from the list to a JsonArray, but there must be a way to just add the ArrayList<String> as a parameter and keep the query simple. How can I specify a list of values to match against in my query?
The Vert.x JDBC Client does not support array parameters in queries.
However, it is possible with the Vert.x Pg Client, which does not depend on JDBC. You need to modify your query first:
SELECT a FROM table WHERE col = ANY(?)
Then:
pgClient.preparedQuery(query, Tuple.of(possibleValues), collector, handler);
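For context, here is a minimal sketch of the whole flow with the reactive Pg client (io.vertx:vertx-pg-client, 3.8.x API). The host, credentials, the vertx instance and the values list are placeholders, and note that this client expects $1-style placeholders, so the query becomes col = ANY($1):
// Sketch using the reactive Pg client (io.vertx:vertx-pg-client), not the JDBC client.
// Host, credentials, "vertx" and the "values" list are placeholders.
import io.vertx.pgclient.PgConnectOptions;
import io.vertx.pgclient.PgPool;
import io.vertx.sqlclient.PoolOptions;
import io.vertx.sqlclient.Row;
import io.vertx.sqlclient.Tuple;
import java.util.List;

PgConnectOptions connectOptions = new PgConnectOptions()
        .setHost("your-rds-endpoint")
        .setPort(5432)
        .setDatabase("yourdb")
        .setUser("user")
        .setPassword("pass");

PgPool pgClient = PgPool.pool(vertx, connectOptions, new PoolOptions().setMaxSize(5));

List<String> values = someObject.someField();
// The reactive Pg client uses $1, $2, ... placeholders rather than ?
String sql = "SELECT a FROM table WHERE col = ANY($1)";

pgClient.preparedQuery(sql, Tuple.of(values.toArray(new String[0])), ar -> {
    if (ar.succeeded()) {
        for (Row row : ar.result()) {
            // read row.getString("a")
        }
    } else {
        // handle ar.cause()
    }
});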
In my CRUD REST service I do an insert into a DB and want to respond to the caller with the newly created record. I am looking for a nice way to convert the map to JSON.
I am running Ballerina 0.991.0 and using PostgreSQL.
The return value of the update ("INSERT ...") is a map.
I tried with convert and stamp, but it did not work for me.
import ballerinax/jdbc;
...
jdbc:Client certificateDB = new({
url: "jdbc:postgresql://localhost:5432/certificatedb",
username: "USER",
password: "PASS",
poolOptions: { maximumPoolSize: 5 },
dbOptions: { useSSL: false }
}); ...
var ret = certificateDB->update("INSERT INTO certificates(certificate, typ, scope_) VALUES (?, ?, ?)", certificate, typ, scope_);
// here is the data, it is map<anydata>
ret.generatedKeys
The map should know which data type it is, right?
Then it should be easy to convert it to JSON, like this:
{"certificate":"{certificate:
"-----BEGIN
CERTIFICATE-----\nMIIFJjCCA...tox36A7HFmlYDQ1ozh+tLI=\n-----END
CERTIFICATE-----", typ: "mqttCertificate", scope_: "QARC", id_:
223}"}
Right now I do a foreach and build the JSON manually, which is quite ugly. Maybe somebody has some tips on how to do this in a nicer way.
It cannot be excluded that it is due to my lack of programming skills :-)
The return value of the JDBC update remote function is sql:UpdateResult|error.
The sql:UpdateResult is a record with two fields. (Refer to https://ballerina.io/learn/api-docs/ballerina/sql.html#UpdateResult)
updatedRowCount of type int - the number of rows that were affected/updated by the given statement execution.
generatedKeys of type map - a map of the auto-generated column values resulting from the update operation (only if the corresponding table has auto-generated columns). The data is given as key/value pairs of column name and column value, so this map contains only the auto-generated column values.
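For completeness, a minimal sketch (assuming Ballerina 0.991 and the sql:UpdateResult record described above; variable names follow the question's code) of unwrapping that union type before reading the two fields:
import ballerina/io;
import ballerina/sql;

// "ret" is the value returned by certificateDB->update(...)
var ret = certificateDB->update("INSERT INTO certificates(certificate, typ, scope_) VALUES (?, ?, ?)", certificate, typ, scope_);
if (ret is sql:UpdateResult) {
    int count = ret.updatedRowCount;
    map<anydata> keys = ret.generatedKeys;
    io:println("rows updated: ", count);
} else {
    // ret is an error here
    io:println("update failed: ", ret.reason());
}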
But your requirement is to get the entire row which was inserted by the given update function. It can't be returned by the update operation itself. To get that, you have to execute the JDBC select operation with the matching criteria. The select operation will return a table or an error. That table can be converted to JSON easily using the convert() function.
For example: let's say the certificates table has an auto-generated primary key column named 'cert_id'. Then you can retrieve that id value using the code below.
int generatedID = <int>updateRet.generatedKeys.CERT_ID;
Then use that generated id to query the data.
var ret = certificateDB->select("SELECT certificate, typ, scope_ FROM certificates where id = ?", (), generatedID);
json convertedJson = {};
if (ret is table<record {}>) {
    var jsonConversionResult = json.convert(ret);
    if (jsonConversionResult is json) {
        convertedJson = jsonConversionResult;
    }
}
Refer to the example https://ballerina.io/learn/by-example/jdbc-client-crud-operations.html for more details.
In my current project I am working with a database which has a very strange table structure (the Id fields in most tables are marked as not nullable and primary, but auto increment is not enabled for them, and those Id fields need to be unique as well).
Unfortunately there is no way I can modify the DB, so I have to find another way to handle my problem.
I have no issues while querying for data, but during insert what I want to do is:
Get the max Id from the table where the entity is about to be inserted and increment it by one, or even better, use a SELECT max(id) pattern during the insert.
I was hoping to use an Interceptor inside EF to achieve this, but it looks too difficult for me right now, and all I managed to do is to identify whether a command is an insert command or not.
Can someone help me on my way with this problem? How can I achieve this and set the IDs during insert, either by selecting the max ID first or by using SELECT max(id) inside the insert?
public void TreeCreated(DbCommandTreeInterceptionContext context)
{
    if (context.OriginalResult.CommandTreeKind != DbCommandTreeKind.Insert && context.OriginalResult.DataSpace != DataSpace.CSSpace) return;

    var insertCommand = context.Result as DbInsertCommandTree;
    var property = insertCommand?.Target.VariableType.EdmType.MetadataProperties.FirstOrDefault(x => x.Name == "TableName");
    if (property == null) return;
    var tableName = property?.Value as ReadOnlyCollection<EdmMember>;
    var variableReference = insertCommand.Target.VariableType.Variable(insertCommand.Target.VariableName);
    var tenantProperty = variableReference.Property("ID");
    var tenantSetClause = DbExpressionBuilder.SetClause(tenantProperty, DbExpression.FromString("(SELECT MAX(ID) FROM SOMEHOWGETTABLENAME)"));
    var filteredSetClauses = insertCommand.SetClauses.Cast<DbSetClause>().Where(sc => ((DbPropertyExpression)sc.Property).Property.Name != "ID");
    var finalSetClauses = new ReadOnlyCollection<DbModificationClause>(new List<DbModificationClause>(filteredSetClauses) { tenantSetClause });

    var newInsertCommand = new DbInsertCommandTree(
        insertCommand.MetadataWorkspace,
        insertCommand.DataSpace,
        insertCommand.Target,
        finalSetClauses,
        insertCommand.Returning);

    context.Result = newInsertCommand;
}
Unfortunately the Interceptor concept is a little bit new for me and I do not understand it completely.
UPDATE
I managed to dynamically build that expression so that the ID field is now included in the insert statement, but the problem is that I cannot use a SQL query inside it. Whenever I try, it always results in some wrong SQL query. So is there any way I can tweak the insert statement so that SELECT MAX(ID) FROM TABLE_NAME is executed during the insert?
Get the next id from the context, and then set the parameter of the insert command accordingly.
void NonQueryExecuting(DbCommand command, DbCommandInterceptionContext<int> interceptionContext)
{
    var context = interceptionContext.DbContexts.First() as WhateverYourEntityContainerNameIs;

    // get the next id from the database using the context
    var theNextId = (from foo in context...)

    // update the parameter on the command
    command.Parameters["YourIdField"].Value = theNextId;
}
Just bear in mind this is not terribly thread safe; if two users update the same table at exactly the same time, they could theoretically get the same id. This is going to be a problem no matter what update method you use if you manage keys in the application instead of the database. But it looks like that decision is out of your hands.
If this is going to be a problem, you might have to do something more drastic like alter the command.CommandText to replace the value in the values clause with a subquery, for example change
insert into ... values (@YourIdField, ...)
to
insert into ... values ((select max(id) from...), ...)
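A rough sketch of that approach inside the interceptor (the parameter name "YourIdField" and the table name "YourTable" are placeholders, and whether your database allows a scalar subquery in a VALUES list is something to verify):
// Sketch: rewrite the generated INSERT so the database computes the id itself.
// "@YourIdField" and "YourTable" are placeholders for the real names.
public void NonQueryExecuting(DbCommand command, DbCommandInterceptionContext<int> interceptionContext)
{
    if (!command.CommandText.TrimStart().StartsWith("INSERT", StringComparison.OrdinalIgnoreCase)) return;

    // Swap the id parameter for a scalar subquery evaluated by the database.
    command.CommandText = command.CommandText.Replace(
        "@YourIdField",
        "(SELECT MAX(ID) + 1 FROM YourTable)");

    // Remove the parameter that is no longer referenced by the command text.
    int index = command.Parameters.IndexOf("YourIdField");
    if (index >= 0)
    {
        command.Parameters.RemoveAt(index);
    }
}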
If I have a table in a WebSQL database with some data, can JayData work with it?
For example, I have a table like this:
var shortName = 'Del';
var version = '1.0';
var displayName = 'Del';
var maxSize = 65536;
db = openDatabase(shortName, version, displayName, maxSize);
db.transaction(
    function(transaction) {
        transaction.executeSql(
            'CREATE TABLE IF NOT EXISTS "main" ("name" VARCHAR NOT NULL , "last" DATETIME NOT NULL DEFAULT CURRENT_DATE);'
        );
    }
);
Disclaimer: I work for JayData
Yes, if you are following the naming conventions JayData uses, you are able to use existing databases to a limited extent. Foreign keys and relations are likely things that will not work.
Using OrientDB 1.7-rc and Scala, I would like to insert a document (ODocument), into a document (not graph) database, with connections to other documents. How should I do this?
I've tried the following, but it seems to insert an embedded list of documents into the Package document, rather than connect the package to a set of Version documents (which is what I want):
val doc = new ODocument("Package")
  .field("id", "MyPackage")
  .field("versions", List(new ODocument("Version").field("id", "MyVersion")))
EDIT:
I've tried inserting a Package with connections to Versions through SQL, and that seems to produce the desired result:
insert into Package(id, versions) values ('MyPackage', [#10:3, #10:4] )
However, I need to be able to do this from Scala, which has yet to produce the correct results when loading the ODocument back. How can I do it (from Scala)?
You need to create the individual documents first and then inter-link them using SQL commands like the ones below.
Some examples are given in the OrientDB documentation:
insert into Profile (name, friends) values ('Luca', [#10:3, #10:4] )
OR
insert into Profile SET name = 'Luca', friends = [#10:3, #10:4]
Check here for more details.
I tried posting this in the comments above, but the code was not readable there, so I am posting the response separately.
Here is an example of linking two documents in OrientDB. It is taken from the documentation. Here we are adding a new user to the DB and connecting it to a given role:
var db = orient.getDatabase();
var role = db.query("select from ORole where name = ?", roleName);
if (role == null) {
    response.send(404, "Role not found", "text/plain", "Error: role name not found");
} else {
    db.begin();
    try {
        var result = db.save({ "#class": "OUser", name: "Gaurav", password: "gauravpwd", roles: role });
        db.commit();
        return result;
    } catch (err) {
        db.rollback();
        response.send(500, "Error: Server", "text/plain", err.toString());
    }
}
Hope it helps you and others.
This is how to insert a Package with a linkset referring to an arbitrary number of Versions:
val version = new ODocument("Version")
  .field("id", "1.0")
version.save()

val versions = new java.util.HashSet[ODocument]()
versions.add(version)

// "package" is a reserved word in Scala, so use another name for the variable
val pkg = new ODocument("Package")
  .field("id", "MyPackage")
  .field("versions", versions)
pkg.save()
When inserting a Java Set into an ODocument field, OrientDB understands this to mean one wants to insert a linkset, which is an unordered, unique collection of references.
When reading the Package back out of the database, you should get hold of its Versions like this:
import scala.collection.JavaConverters._

val versions = doc.field[java.util.HashSet[ODocument]]("versions").asScala.toSeq
Just as a HashSet is used when the linkset of versions is saved, a HashSet should be used when loading the referenced ODocument instances back.
Optionally, to enforce that Package.versions is in fact a linkset of Versions, you may encode this in the database schema (in SQL):
create property Package.versions linkset Version