Parse PGobject in Groovy - postgresql

I'm trying to parse a selected result from PostgreSQL in Groovy, but something goes wrong when I parse it. I've tried a lot of things without reaching the goal.
I have three columns in Postgres. One of them (document) is of jsonb type, and I want to parse it. Here is what I've tried:
// wrapper around the connection to Postgres
PgAdmin pg = new PgAdmin(/* connection string */)
List<GroovyRowResult> result = pg.getSelectResult(/* select script goes here */)
for (Map oneRow in result) {
    def document = oneRow.get("document")
    println document // works fine: prints the jsonb data stored in the document column
    def customerName = document.get("CustomerName") // fails with an error; CustomerName is a key inside the jsonb
}
I've also tried to use JsonSlurper, but it also fails.
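For reference, a minimal sketch of the usual fix, assuming the JDBC driver returns the jsonb column as an org.postgresql.util.PGobject (it prints like JSON but is not a Map): take its string value and parse it with JsonSlurper.
import groovy.json.JsonSlurper
import org.postgresql.util.PGobject

def slurper = new JsonSlurper()
// result as in the question: the List<GroovyRowResult> from the select
for (Map oneRow in result) {
    def raw = oneRow.get("document")
    // getValue() yields the raw JSON text stored in the jsonb column
    String jsonText = (raw instanceof PGobject) ? raw.getValue() : raw.toString()
    def document = slurper.parseText(jsonText) // now a real Map
    println document.get("CustomerName")
}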

how to convert map<anydata> to json

In my CRUD REST service I do an insert into a DB and want to respond to the caller with the newly created record. I am looking for a nice way to convert the map to JSON.
I am running Ballerina 0.991.0 and using PostgreSQL.
The return of the update ("INSERT ...") is a map.
I tried with convert() and stamp(), but it did not work for me.
import ballerinax/jdbc;
...
jdbc:Client certificateDB = new({
    url: "jdbc:postgresql://localhost:5432/certificatedb",
    username: "USER",
    password: "PASS",
    poolOptions: { maximumPoolSize: 5 },
    dbOptions: { useSSL: false }
}); ...
var ret = certificateDB->update("INSERT INTO certificates(certificate, typ, scope_) VALUES (?, ?, ?)", certificate, typ, scope_);
// here is the data, it is map<anydata>
ret.generatedKeys
The map should know which data type it is, right?
Then it should be easy to convert it to JSON, like this:
{"certificate":"{certificate:
"-----BEGIN
CERTIFICATE-----\nMIIFJjCCA...tox36A7HFmlYDQ1ozh+tLI=\n-----END
CERTIFICATE-----", typ: "mqttCertificate", scope_: "QARC", id_:
223}"}
Right now I do a foreach and build the JSON manually. Quite ugly. Maybe somebody has some tips on how to do this in a nice way.
It cannot be excluded that it is due to my lack of programming skills :-)
The return value of the JDBC update remote function is sql:UpdateResult|error.
The sql:UpdateResult is a record with two fields (refer to https://ballerina.io/learn/api-docs/ballerina/sql.html#UpdateResult):
updatedRowCount of type int - the number of rows that were affected/updated by the given statement execution
generatedKeys of type map - a map of the auto-generated column values resulting from the update operation (only if the corresponding table has auto-generated columns). The data is given as key-value pairs of column name and column value, so this map contains only the auto-generated column values.
But your requirement is to get the entire row which was inserted by the given update function. That can't be returned by the update operation itself. To get it, you have to execute a JDBC select operation with the matching criteria. The select operation will return a table or an error. That table can be converted to JSON easily using the convert() function.
For example, let's say the certificates table has an auto-generated primary key column named 'cert_id'. Then you can retrieve that id value using the code below.
int generatedID = <int>updateRet.generatedKeys.CERT_ID;
Then use that generated id to query the data.
var ret = certificateDB->select("SELECT certificate, typ, scope_ FROM certificates where id = ?", (), generatedID);
json convertedJson = {};
if (ret is table<record {}>) {
    var jsonConversionResult = json.convert(ret);
    if (jsonConversionResult is json) {
        convertedJson = jsonConversionResult;
    }
}
Refer to the example at https://ballerina.io/learn/by-example/jdbc-client-crud-operations.html for more details.

How to return a plain value from a Knex / Postgresql query?

I'm trying to return a simple, scalar string value from a Postgres DB using Knex. So far, everything I do returns a JSON object with a key (the column name) and the value, so I have to reach into the object to get the value. If I return multiple rows, then I get multiple JSON objects, each one repeating the key.
I could be returning multiple columns, in which case each row would at least need to be an array. I'm not looking for a special case where specifying a single column returns the value without the array -- I'm OK reaching into the array. I want to avoid the JSON object with the repetitive listing of column names as keys.
I've scoured the Knex docs but don't see how to control the output.
My table is a simple mapping table with two string columns:
CREATE TABLE public._suite
(
    piv_id character(18) NOT NULL,
    sf_id character(18) NOT NULL,
    CONSTRAINT _suite_pkey PRIMARY KEY (piv_id)
)
When I build a query using Knex methods like
let myId = 'foo', table = '_suite';
return db(table).where('piv_id', myId).first(['sf_id'])
    .then(function(id) { return id; });
I get {"sf_id":"a4T8A0000009PsfUAE"} ; what I want is just "a4T8A0000009PsfUAE"
If I use a raw query, like
return db.raw(`select sf_id from ${table} where piv_id = '${myId}'`);
I get a much larger JSON object describing the result:
{"command":"SELECT","rowCount":1,"oid":null,"rows":[{"sf_id":"a4T8A0000009Q9HUAU"}],"fields":[{"name":"sf_id","tableID":33799,"columnID":2,"dataTypeID":1042,"dataTypeSize":-1,"dataTypeModifier":22,"format":"text"}],"_parsers":[null],"RowCtor":null,"rowAsArray":false}
What do I have to do to just get the value itself? (Again, I'm OK if it's in an array -- I just don't want the column names.)
Take a look at the pluck method.
db(table).where('piv_id', myId).pluck('sf_id'); // => resolves to ["a4T8A0000009PsfUAE"]
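Note that pluck resolves to a plain array of the column's values, so when you expect a single row, taking the first element (e.g. rows[0] in the .then callback) gives you the bare scalar string itself.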

Custom expression in JPA CriteriaBuilder

I have an Entity with a String field (storing JSON), and I need to compare a value from its database column with another value. The problem is that the type of this database column is TEXT, though it actually contains JSON. So, is there a way to write something like the following? I.e., I need to compare my value with some field of the JSON inside the TEXT column.
criteriaBuilder.equal(root.get("json_column").customExpression(new Expression() {
    Object handle(Object data) {
        return ((Map) data).get("json_field");
    }
}), value)
Assuming you have a MySQL server with version > 5.7.x:
I just had the same issue. I wanted to find all entities of a class that had a JSON field value inside a JSON object column.
The solution that worked for me was something along these lines (sorry, typing from a phone):
(root, query, builder) -> {
    return builder.equal(
        builder.function("JSON_EXTRACT", String.class, root.get("myEntityJsonAttribute"), builder.literal("$.json.path.to.json.field")),
        "searchedValueInJsonFieldOfJsonAttribute"
    );
}
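For context: the (root, query, builder) lambda above has the shape of a Spring Data JPA Specification; with the plain Criteria API you would pass the same builder.function(...) expression directly to criteriaBuilder.equal(...). Note also that JSON_EXTRACT is MySQL's function; on PostgreSQL you would swap in something like jsonb_extract_path_text.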

grails-mongodb: findAllBy*InList not returning results in order

I'm using Grails 2.4.4 with MongoDB plugin version 3.0.3. I'm facing an issue with the order of the results for my domain objects. I'm using the code below.
My domain:
class Employee {
    ObjectId id
    String name
}
I have a list of ids and I'm using the code below to fetch employees. (Please note that the data below is just to illustrate my problem. In reality my ids are random, so I can't use sorting; I just want the results in the order of the input.)
def idsList = [new ObjectId("2001"), new ObjectId("2002"), new ObjectId("2003")]
def results = Employee.findAllByIdInList(idsList)
Expected result:
[Employee#2001,Employee#2002,Employee#2003]
Actual result (not in order):
[Employee#2002, Employee#2003, Employee#2001] or sometimes
[Employee#2003, Employee#2001, Employee#2002]
For now I'm doing it like this to get the output in the desired order:
def results = []
for (id in idsList) {
    def emp = Employee.findById(id)
    results << emp
}
But I want to do this with a single call (findAllBy*InList), without iterating over the objects. Can anyone advise how I can get the results in the order of the input ids?
Have you tried
Employee.findAllByIdInList(idsList, [sort: 'id', order: 'asc'])
? It should work as expected.
If it doesn't, then instead of the for loop you can use
def results = idsList.collect { id -> Employee.findById(id) }
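If you want both a single query and the input order, here is a small sketch (my own suggestion, assuming findAllByIdInList returns rows in arbitrary database order): fetch once, then reorder in memory against the input list.
def employees = Employee.findAllByIdInList(idsList) // one database call
def byId = employees.collectEntries { [(it.id): it] } // map id -> Employee
def ordered = idsList.collect { byId[it] } // restore the input order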

How to query a JSON element

Let's say I have a Postgres database (9.3) and there is a table called Resources. In the Resources table I have the fields id, which is an int, and data, which is a JSON type.
Let's say I have the following records in said table.
1, {'firstname':'Dave', 'lastname':'Gallant'}
2, {'firstname':'John', 'lastname':'Doe'}
What I want to do is write a query that would return all the records in which the data column has a JSON element with the lastname equal to "Doe".
I tried to write something like this:
records = db_session.query(Resource).filter(Resources.data->>'lastname' == "Doe").all()
PyCharm, however, is giving me a syntax error on the "->>".
Does anyone know how I would write the filter clause to do what I need?
Try using astext:
records = db_session.query(Resource).filter(
    Resources.data["lastname"].astext == "Doe"
).all()
Please note that the column MUST be of type JSONB. A regular JSON column will not work.
Also, you could explicitly cast the string to JSON (see the Postgres JSON type doc):
from sqlalchemy.dialects.postgresql import JSON
from sqlalchemy.sql.expression import cast
db_session.query(Resource).filter(
    Resources.data["lastname"] == cast("Doe", JSON)
).all()
If you are using the JSON type (not JSONB), the following worked for me. Note the '"object"':
query = db.session.query(ProductSchema).filter(
    cast(ProductSchema.ProductJSON["type"], db.String) != '"object"'
)
I have some GeoJSON in a JSON (not JSONB) type column and none of the existing solutions worked, but as it turns out, in version 1.3.11 some new data casters were added, so now you can:
records = db_session.query(Resource).filter(Resources.data["lastname"].as_string() == "Doe").all()
Reference: https://docs.sqlalchemy.org/en/14/core/type_basics.html#sqlalchemy.types.JSON
Casting JSON Elements to Other Types
Index operations, i.e. those invoked by calling upon the expression using the Python bracket operator as in some_column['some key'], return an expression object whose type defaults to JSON by default, so that further JSON-oriented instructions may be called upon the result type. However, it is likely more common that an index operation is expected to return a specific scalar element, such as a string or integer. In order to provide access to these elements in a backend-agnostic way, a series of data casters are provided:
Comparator.as_string() - return the element as a string
Comparator.as_boolean() - return the element as a boolean
Comparator.as_float() - return the element as a float
Comparator.as_integer() - return the element as an integer
These data casters are implemented by supporting dialects in order to assure that comparisons to the above types will work as expected, such as:
# integer comparison
data_table.c.data["some_integer_key"].as_integer() == 5
# boolean comparison
data_table.c.data["some_boolean"].as_boolean() == True
According to sqlalchemy.types.JSON, you can do it like this:
from sqlalchemy import JSON
from sqlalchemy import cast
records = db_session.query(Resource).filter(Resources.data["lastname"] == cast("Doe", JSON)).all()
According to this, pre version 1.3.11, the most robust way should be like this, as it works for multiple database types, e.g. SQLite, MySQL, Postgres:
from sqlalchemy import cast, JSON, type_coerce, String
db_session.query(Resource).filter(
    cast(Resources.data["lastname"], String) == type_coerce("Doe", JSON)
).all()
From version 1.3.11 onward, type-specific casters are the new and neater way to handle this:
db_session.query(Resource).filter(
    Resources.data["lastname"].as_string() == "Doe"
).all()