How to return a plain value from a Knex / Postgresql query?

I'm trying to return a simple, scalar string value from a Postgres DB using Knex. So far, everything I do returns a JSON object with a key (the column name) and the value, so I have to reach into the object to get the value. If I return multiple rows, then I get multiple JSON objects, each one repeating the key.
I could be returning multiple columns, in which case each row would at least need to be an array. I'm not looking for a special case where specifying a single column returns the value without the array -- I'm OK reaching into the array. I want to avoid the JSON object with the repetitive listing of column names as keys.
I've scoured the Knex docs but don't see how to control the output.
My table is a simple mapping table with two string columns:
CREATE TABLE public._suite
(
    piv_id character(18) NOT NULL,
    sf_id character(18) NOT NULL,
    CONSTRAINT _suite_pkey PRIMARY KEY (piv_id)
)
When I build a query using Knex methods like
let myId = 'foo', table = '_suite';
return db(table).where('piv_id', myId).first(['sf_id'])
    .then(function(id) { return id; });
I get {"sf_id":"a4T8A0000009PsfUAE"} ; what I want is just "a4T8A0000009PsfUAE"
If I use a raw query, like
return db.raw(`select sf_id from ${table} where piv_id = '${myId}'`);
I get a much larger JSON object describing the result:
{"command":"SELECT","rowCount":1,"oid":null,"rows":[{"sf_id":"a4T8A0000009Q9HUAU"}],"fields":[{"name":"sf_id","tableID":33799,"columnID":2,"dataTypeID":1042,"dataTypeSize":-1,"dataTypeModifier":22,"format":"text"}],"_parsers":[null],"RowCtor":null,"rowAsArray":false}
What do I have to do to just get the value itself? (Again, I'm OK if it's in an array -- I just don't want the column names.)

Take a look at the pluck method.
db(table).where('piv_id', myId).pluck('sf_id'); // => resolves to ["a4T8A0000009PsfUAE"]
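If you need the bare scalar rather than an array, take the first element of the plucked array. A minimal sketch (the getSfId helper name is just for illustration):
async function getSfId(db, myId) {
    // pluck('sf_id') resolves to an array of bare values, with no column-name keys
    const ids = await db('_suite').where('piv_id', myId).pluck('sf_id');
    return ids[0]; // e.g. "a4T8A0000009PsfUAE"
}
As for db.raw: the large object in the question is the pg driver's result object, so the rows sit under its rows property and result.rows[0].sf_id would get the same value.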

how to convert map<anydata> to json

In my CRUD Rest Service I do an insert into a DB and want to respond to the caller with the created new record. I am looking for a nice way to convert the map to json.
I am running on Ballerina 0.991.0 and using PostgreSQL.
The return of the update ("INSERT ...") is a map.
I tried with convert and stamp, but it did not work for me.
import ballerinax/jdbc;
...
jdbc:Client certificateDB = new({
    url: "jdbc:postgresql://localhost:5432/certificatedb",
    username: "USER",
    password: "PASS",
    poolOptions: { maximumPoolSize: 5 },
    dbOptions: { useSSL: false }
}); ...
var ret = certificateDB->update("INSERT INTO certificates(certificate, typ, scope_) VALUES (?, ?, ?)", certificate, typ, scope_);
// here is the data, it is map<anydata>
ret.generatedKeys
The map should know which data type it is, right?
Then it should be easy to convert it to JSON, like this:
{"certificate":"{certificate:
"-----BEGIN
CERTIFICATE-----\nMIIFJjCCA...tox36A7HFmlYDQ1ozh+tLI=\n-----END
CERTIFICATE-----", typ: "mqttCertificate", scope_: "QARC", id_:
223}"}
Right now I do a foreach and build the JSON manually. Quite ugly. Maybe somebody has some tips on how to do this in a nice way.
It cannot be ruled out that this is due to my lack of programming skills :-)
The return value of JDBC update remote function is sql:UpdateResult|error.
The sql:UpdateResult is a record with two fields. (Refer https://ballerina.io/learn/api-docs/ballerina/sql.html#UpdateResult)
updatedRowCount of type int - the number of rows which got affected/updated due to the given statement execution
generatedKeys of type map - a map of the auto-generated column values resulting from the update operation (only if the corresponding table has auto-generated columns). The data is given as key-value pairs of column name and column value, so this map contains only the auto-generated column values.
But your requirement is to get the entire row which is inserted by the given update function. It can't be returned by the update operation itself. To get that, you have to execute the JDBC select operation with the matching criteria. The select operation will return a table or an error. That table can easily be converted to JSON using the convert() function.
For example: let's say the certificates table has an auto-generated primary key column named 'cert_id'. Then you can retrieve that id value using the code below.
int generatedID = <int>updateRet.generatedKeys.CERT_ID;
Then use that generated id to query the data.
var ret = certificateDB->select("SELECT certificate, typ, scope_ FROM certificates WHERE cert_id = ?", (), generatedID);
json convertedJson = {};
if (ret is table<record {}>) {
    var jsonConversionResult = json.convert(ret);
    if (jsonConversionResult is json) {
        convertedJson = jsonConversionResult;
    }
}
Refer to the example https://ballerina.io/learn/by-example/jdbc-client-crud-operations.html for more details.

mybatis - Passing multiple parameters on @One annotation

I am trying to access a table in my secondary DB whose name I am obtaining from my primary DB. My difficulty is passing the "DB-Name" as a parameter into my secondary query (BTW, I am using MyBatis annotation-based mappers).
This is my Mapper
@SelectProvider(type = DealerQueryBuilder.class, method = "retrieveDealerListQuery")
@Results({
    @Result(property="dealerID", column="frm_dealer_master_id"),
    @Result(property="dealerTypeID", column="frm_dealer_type_id", one=@One(select="retrieveDealerTypeDAO")),
    @Result(property="dealerName", column="frm_dealer_name")
})
public List<Dealer> retrieveDealerListDAO(@Param("firmDBName") String firmDBName);

@Select("SELECT * from ${firmDBName}.frm_dealer_type where frm_dealer_type_id=#{frm_dealer_type_id}")
@Results({
    @Result(property="dealerTypeID", column="frm_dealer_type_id"),
    @Result(property="dealerType", column="frm_dealer_type")
})
public DealerType retrieveDealerTypeDAO(@Param("firmDBName") String firmDBName, @Param("frm_dealer_type_id") int frm_dealer_type_id);
The firmDBName I have is obtained from my "Primary DB".
If I omit ${firmDBName} in my second query, the query tries to access my primary database and throws Table "PrimaryDB.frm_dealer_type" not found. So it is basically searching for a table named "frm_dealer_type" in my primary DB.
If I try to re-write the @Result like
@Result(property="dealerTypeID", column="firmDBName=firmDBName, frm_dealer_type_id=frm_dealer_type_id", one=@One(select="retrieveDealerTypeDAO")),
It throws an error that Column "firmDBName" does not exist.
Changing ${firmDBName} to #{firmDBName} also did not help.
I did refer to this blog - here
I want a solution to pass my parameter firmDBName from my primary query into the secondary query.
The limitation here is that your column must be returned by the first @Select.
If you look at the test case here you will see that the parent_xxx values are returned by the first select.
Your DealerQueryBuilder must select firmDBName as a return value, and your column must map the name of the return column to that.
Your column definition is also wrong; it should be:
{frm_dealer_type_id=frm_dealer_type_id,firmDBName=firmDBName} or whatever it is returned as from your first select.
Again, you can refer to the test case above as well as the documentation here: http://www.mybatis.org/mybatis-3/sqlmap-xml.html#Nested_Select_for_Association
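For illustration, the provider method could alias the database name into the result set so the nested select can pick it up. A sketch under that assumption (the actual body of DealerQueryBuilder is not shown in the question):
public String retrieveDealerListQuery(@Param("firmDBName") String firmDBName) {
    // Hypothetical: expose firmDBName as a result column so that
    // column="{firmDBName=firmDBName, frm_dealer_type_id=frm_dealer_type_id}"
    // can hand it to the nested retrieveDealerTypeDAO select.
    return "SELECT d.*, '" + firmDBName + "' AS firmDBName"
         + " FROM " + firmDBName + ".frm_dealer_master d";
}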

How to insert an item at the top of a table

How do I insert an item at the top of a table in PostgreSQL? Is that possible? In the table I have only two fields, both text. The first is the primary key.
CREATE TABLE news_table (
    title text not null primary key,
    url text not null
);
I need a simple query for my Java program.
OK, this is my code:
get("/getnews", (request, response) -> {
List<News> getNews = newsService.getNews();
List<News> getAllNews = newsService.getAllNews();
try (Connection connection = DB.sql2o.open()) {
String sql = "INSERT INTO news_table(title, url) VALUES (:title, :url)";
for (News news : getNews) {
if (!getAllNews.contains(news)) {
connection.createQuery(sql, true)
.addParameter("title", news.getTitle())
.addParameter("url", news.getUrl())
.executeUpdate()
.getKey();
}
}
}
return newsService.getNews();
}, json());
The problem is that when the getnews method is called a second time, the new news is added at the end of the table, so the news is no longer in chronological order. How can I resolve this? I use Sql2o + sparkjava.
Probably I already know: do I need to reverse the getNews list before checking whether getAllNews contains the objects from getNews?
There is no start or end in a table. If you want to sort your data, just use an ORDER BY in your SELECT statements. Without ORDER BY, there is no order.
Relational theory, the mathematical foundation of relational databases, lays down certain conditions that relations (represented in real databases as tables) must obey. One of them is that they have no ordering (i.e., the rows will neither be stored nor retrieved in any particular order, since they are treated as a mathematical set). It's therefore completely under the control of the RDBMS where a new row is entered into a table.
Hence there is no way to ensure a particular ordering of the data without using an ORDER BY clause when you retrieve the data.
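If chronological retrieval is the goal, a common pattern is to record the insertion time and sort on it when reading. A minimal sketch (the created_at column is an assumption, not part of the original schema):
-- Hypothetical: record insertion time so reads can sort chronologically.
ALTER TABLE news_table ADD COLUMN created_at timestamptz NOT NULL DEFAULT now();

-- Newest news first, regardless of physical row placement:
SELECT title, url FROM news_table ORDER BY created_at DESC;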

Custom expression in JPA CriteriaBuilder

I have an entity with a String field (storing JSON) and need to compare a value from its database column with another value. The problem is that the type of this database column is TEXT, but in fact it contains JSON. So, is there a way to write something like this? I.e., I need to compare my value with some field of the JSON stored in the TEXT column.
criteriaBuilder.equal(root.get("json_column").customExpression(new Expression {
    Object handle(Object data) {
        return ((Object) data).get("json_field");
    }
}), value)
Assuming you have a MySQL server with version > 5.7.x.
I just had the same issue. I wanted to find all entities of a class that had a JSON field value inside a JSON object column.
The solution that worked for me was something along these lines (sorry, typing from a phone):
(root, query, builder) -> {
    return builder.equal(
        builder.function("JSON_EXTRACT", String.class, root.get("myEntityJsonAttribute"), builder.literal("$.json.path.to.json.field")),
        "searchedValueInJsonFieldOfJsonAttribute"
    );
}
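The (root, query, builder) lambda above has the shape of a Spring Data JPA Specification; assuming that context (the answer does not say so explicitly), it could be used like this:
import java.util.List;
import org.springframework.data.jpa.domain.Specification;

// Hypothetical usage; assumes the repository extends JpaSpecificationExecutor<MyEntity>.
Specification<MyEntity> hasJsonFieldValue = (root, query, builder) ->
    builder.equal(
        builder.function("JSON_EXTRACT", String.class,
            root.get("myEntityJsonAttribute"),
            builder.literal("$.json.path.to.json.field")),
        "searchedValueInJsonFieldOfJsonAttribute");

List<MyEntity> matches = myEntityRepository.findAll(hasJsonFieldValue);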

How to query a JSON element

Let's say I have a Postgres database (9.3) and there is a table called Resources. In the Resources table I have the fields id, which is an int, and data, which is a JSON type.
Let's say I have the following records in said table.
1, {'firstname':'Dave', 'lastname':'Gallant'}
2, {'firstname':'John', 'lastname':'Doe'}
What I want to do is write a query that would return all the records in which the data column has a json element with the lastname equal to "Doe"
I tried to write something like this:
records = db_session.query(Resource).filter(Resources.data->>'lastname' == "Doe").all()
Pycharm however is giving me a compile error on the "->>"
Does anyone know how I would write the filter clause to do what I need?
Try using astext
records = db_session.query(Resource).filter(
    Resources.data["lastname"].astext == "Doe"
).all()
Please note that the column MUST be of type JSONB; a regular JSON column will not work.
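For reference, a minimal model sketch with a JSONB-typed column (the class and attribute names follow the question; the table name is an assumption):
from sqlalchemy import Column, Integer
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Resource(Base):
    __tablename__ = "resources"  # hypothetical table name
    id = Column(Integer, primary_key=True)
    data = Column(JSONB)  # .astext comparisons require a JSONB-typed column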
Also, you could explicitly cast the string to JSON (see the Postgres JSON type doc).
from sqlalchemy.dialects.postgresql import JSON
from sqlalchemy.sql.expression import cast
db_session.query(Resource).filter(
    Resources.data["lastname"] == cast("Doe", JSON)
).all()
If you are using the JSON type (not JSONB), the following worked for me. Note the '"object"':
query = db.session.query(ProductSchema).filter(
    cast(ProductSchema.ProductJSON["type"], db.String) != '"object"'
)
I have some GeoJSON in a JSON (not JSONB) type column and none of the existing solutions worked, but as it turns out, in version 1.3.11 some new data casters were added, so now you can:
records = db_session.query(Resource).filter(Resources.data["lastname"].as_string() == "Doe").all()
Reference: https://docs.sqlalchemy.org/en/14/core/type_basics.html#sqlalchemy.types.JSON
Casting JSON Elements to Other Types
Index operations, i.e. those invoked by calling upon the expression using the Python bracket operator as in some_column['some key'], return an expression object whose type defaults to JSON by default, so that further JSON-oriented instructions may be called upon the result type. However, it is likely more common that an index operation is expected to return a specific scalar element, such as a string or integer. In order to provide access to these elements in a backend-agnostic way, a series of data casters are provided:
Comparator.as_string() - return the element as a string
Comparator.as_boolean() - return the element as a boolean
Comparator.as_float() - return the element as a float
Comparator.as_integer() - return the element as an integer
These data casters are implemented by supporting dialects in order to assure that comparisons to the above types will work as expected, such as:
# integer comparison
data_table.c.data["some_integer_key"].as_integer() == 5
# boolean comparison
data_table.c.data["some_boolean"].as_boolean() == True
According to sqlalchemy.types.JSON, you can do it like this:
from sqlalchemy import JSON
from sqlalchemy import cast
records = db_session.query(Resource).filter(Resources.data["lastname"] == cast("Doe", JSON)).all()
According to this, pre version 1.3.11, the most robust way should be like this, as it works for multiple database types, e.g. SQLite, MySQL, Postgres:
from sqlalchemy import cast, JSON, type_coerce, String
db_session.query(Resource).filter(
    cast(Resources.data["lastname"], String) == type_coerce("Doe", JSON)
).all()
From version 1.3.11 onward, type-specific casters are the new and neater way to handle this:
db_session.query(Resource).filter(
    Resources.data["lastname"].as_string() == "Doe"
).all()