DB2 select JSON_ARRAYAGG

Using JSON_ARRAYAGG does not work for me. I get the following two errors:
1) [Code: -104, SQL State: 42601] An unexpected token "ORDER" was found following "CITY)
". Expected tokens may include: ")".. SQLCODE=-104, SQLSTATE=42601, DRIVER=4.28.11
2) [Code: -727, SQL State: 56098] An error occurred during implicit system action type "2". Information returned for the error includes SQLCODE "-104", SQLSTATE "42601" and message tokens "ORDER|CITY)
|)".. SQLCODE=-727, SQLSTATE=56098, DRIVER=4.28.11
I want an output like this:
{"id":901, "name": "Hansi", "addresses" :
[
{ "address":"A", "city":"B"},
{ "address":"C", "city":"D"}
]
}
I am using IBM DB2 11.1 for Linux, UNIX and Windows. This is the statement that fails:
values (
  json_array(
    select json_object ('ID' value ID,
                        'NAME' value NAME,
                        'ADDRESSES' VALUE JSON_ARRAYAGG(
                            JSON_OBJECT('ADDRESS' VALUE ADDRESS,
                                        'CITY' VALUE CITY)
                            ORDER BY ADDRESS)
                       )
    FROM CUSTOMER
    JOIN CUSTOMER_ADDRESS ON ADDRESS_CUSTOMER_ID = ID
    GROUP BY ID, NAME
    FORMAT JSON
  ));
Used tables are:
CUSTOMER - ID (INT), NAME (VARCHAR(64))
ADDRESS - ADDRESS (VARCHAR(64)), CITY (VARCHAR(64))
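The SQLCODE -104 points at the ORDER BY inside JSON_ARRAYAGG: at this DB2 11.1 level the parser expects ")" right after JSON_OBJECT(... 'CITY' VALUE CITY). A minimal, untested sketch is the same statement with that clause dropped; whether a sort specification inside JSON_ARRAYAGG is available at a later fix pack or release is something to check against the documentation for your level.

values (
  json_array(
    select json_object ('ID' value ID,
                        'NAME' value NAME,
                        'ADDRESSES' VALUE JSON_ARRAYAGG(
                            JSON_OBJECT('ADDRESS' VALUE ADDRESS,
                                        'CITY' VALUE CITY))
                       )
    FROM CUSTOMER
    JOIN CUSTOMER_ADDRESS ON ADDRESS_CUSTOMER_ID = ID
    GROUP BY ID, NAME
    FORMAT JSON
  ));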

Related

Symfony - configure Postgres entity manager

I would like to change the database in my existing project from MySQL to PostgreSQL.
I have configured the database and regenerated the migrations, which work, but the problem appears in the fixtures.
When trying to load the fixtures, an error appears like this:
An exception occurred while executing 'INSERT INTO user (nickname, password, id, created_at, updated_at, email) VALUES (?, ?, ?, ?, ?, ?)' with params ["user", "$2y$13$rgHtT56Vlk2avmf3gX2W7.QYcQ5d6AXRzr41ebRMGfxREqLZQfsTG", "017c4562-d487-ddff-c303-108c1916d6dd", "2021-10-03 11:01:16", "2021-10-03 11:01:16", "user#user.pl"]:
SQLSTATE[42601]: Syntax error: 7 ERROR: syntax error at or near "user"
LINE 1: INSERT INTO user (nickname, password, id, created_at, update...
This is possibly caused by the entity manager generating the MySQL dialect instead of the Postgres dialect.
A similar error occurs during a GET request on the user entity:
"hydra:description": "An exception occurred while executing 'SELECT u0_.nickname AS nickname_0, u0_.password AS password_1, u0_.id AS id_2, u0_.created_at AS created_at_3, u0_.updated_at AS updated_at_4, u0_.email AS email_5 FROM user u0_':\n\nSQLSTATE[42703]: Undefined column: 7 BŁĄD: kolumna u0_.nickname nie istnieje\nLINE 1: SELECT u0_.nickname AS nickname_0, u0_.password AS password_.
Here is my doctrine.yaml configuration:
doctrine:
    dbal:
        url: '%env(resolve:DATABASE_URL)%'
        driver: 'pdo_pgsql'
        charset: utf8
        # IMPORTANT: You MUST configure your server version,
        # either here or in the DATABASE_URL env var (see .env file)
        #server_version: '13'
    orm:
        auto_generate_proxy_classes: true
        naming_strategy: doctrine.orm.naming_strategy.underscore_number_aware
        auto_mapping: true
        mappings:
            App:
                is_bundle: false
                type: annotation
                dir: '%kernel.project_dir%/src/Entity'
                prefix: 'App\Entity'
                alias: App
Could someone help me get rid of the bug? :)
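One thing worth checking before blaming the dialect: user is a reserved word in PostgreSQL (MySQL accepts it unquoted), so any statement Doctrine generates against an unquoted user table fails exactly like this. A minimal SQL sketch against the same columns (values are illustrative):

-- fails: user is a reserved word in PostgreSQL
INSERT INTO user (nickname, email) VALUES ('user', 'user@example.com');

-- accepted by the parser: quoting turns it into an ordinary identifier
INSERT INTO "user" (nickname, email) VALUES ('user', 'user@example.com');

The usual ways out in Doctrine are renaming the table (e.g. app_user) or quoting the table name in the entity mapping so that Doctrine emits "user".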

How to insert 'NULL' values for 'int' column types in Aurora PostgreSQL db using Python boto3 client

I have a CSV file (an MS SQL Server table export) and I would like to import it into an Aurora Serverless PostgreSQL database table. I did a basic preprocessing pass over the CSV file to replace all of the empty values in it (i.e. '') with "NULL". The file looks like this:
CSV file:
ID,DRAW_WORKS
10000002,NULL
10000005,NULL
10000004,FLEXRIG3
10000003,FLEXRIG3
The PostgreSQL table has the following schema:
CREATE TABLE T_RIG_ACTIVITY_STATUS_DATE (
ID varchar(20) NOT NULL,
DRAW_WORKS_RATING int NULL
)
The code I am using to read and insert the CSV file is the following:
import boto3
import csv

rds_client = boto3.client('rds-data')

...

def batch_execute_statement(sql, sql_parameter_sets, transaction_id=None):
    parameters = {
        'secretArn': db_credentials_secrets_store_arn,
        'database': database_name,
        'resourceArn': db_cluster_arn,
        'sql': sql,
        'parameterSets': sql_parameter_sets
    }
    if transaction_id is not None:
        parameters['transactionId'] = transaction_id
    response = rds_client.batch_execute_statement(**parameters)
    return response

transaction = rds_client.begin_transaction(
    secretArn=db_credentials_secrets_store_arn,
    resourceArn=db_cluster_arn,
    database=database_name)

sql = 'INSERT INTO T_RIG_ACTIVITY_STATUS_DATE VALUES (:ID, :DRAW_WORKS);'

parameter_set = []

with open('test.csv', 'r') as file:
    reader = csv.DictReader(file, delimiter=',')
    for row in reader:
        entry = [
            {'name': 'ID', 'value': {'stringValue': row['RIG_ID']}},
            {'name': 'DRAW_WORKS', 'value': {'longValue': row['DRAW_WORKS']}}
        ]
        parameter_set.append(entry)

response = batch_execute_statement(
    sql, parameter_set, transaction['transactionId'])
However, the error that gets returned suggests that there is a type mismatch:
Invalid type for parameter parameterSets[0][5].value.longValue,
value: NULL, type: <class 'str'>, valid types: <class 'int'>"
Is there a way to configure Aurora to accept NULL values for types such as int?
Reading the boto3 documentation more carefully, I found that we can set isNull to True when a field is NULL. The code snippet below shows how to insert a null value into the database:
...
entry = [
    {'name': 'ID', 'value': {'stringValue': row['ID']}}
]
if row['DRAW_WORKS'] == 'NULL':
    # send an explicit SQL NULL for the :DRAW_WORKS parameter
    entry.append({'name': 'DRAW_WORKS', 'value': {'isNull': True}})
else:
    entry.append({'name': 'DRAW_WORKS', 'value': {'longValue': int(row['DRAW_WORKS'])}})
parameter_set.append(entry)
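For reference, the DRAW_WORKS_RATING column is declared nullable, so nothing needs to change on the PostgreSQL side; a direct SQL equivalent of one of the NULL rows (table and column names from the schema above) is simply:

INSERT INTO T_RIG_ACTIVITY_STATUS_DATE (ID, DRAW_WORKS_RATING) VALUES ('10000002', NULL);

The Data API just needs the explicit {'isNull': True} marker instead of a typed value.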

Knex cannot find table in Cloud SQL Postgres from Cloud Functions

I am trying to connect to a Postgres 12 DB running in Cloud SQL from a Cloud Function written in TypeScript.
I create the database client with the following:
import * as Knex from "knex"

const { username, password, instance } = ... // username, password, connection name (<app-name>:<region>:<database>)

const config = {
  client: 'pg',
  connection: {
    user: username,
    password: password,
    database: 'ingredients',
    host: `/cloudsql/${instance}`,
    pool: { min: 1, max: 1 }
  }
}

const knex = Knex(config as Knex.Config)
I am then querying the database using:
const query = ... // passed in as param
const result = await knex('tableName').where('name', 'ilike', query).select('*')
When I run this code, I get the following error in the Cloud Functions logs:
Unhandled error { error: select * from "tableName" where "name" ilike $1 - relation "tableName" does not exist
at Parser.parseErrorMessage (/workspace/node_modules/pg-protocol/dist/parser.js:278:15)
at Parser.handlePacket (/workspace/node_modules/pg-protocol/dist/parser.js:126:29)
at Parser.parse (/workspace/node_modules/pg-protocol/dist/parser.js:39:38)
at Socket.stream.on (/workspace/node_modules/pg-protocol/dist/index.js:10:42)
at Socket.emit (events.js:198:13)
at Socket.EventEmitter.emit (domain.js:448:20)
at addChunk (_stream_readable.js:288:12)
at readableAddChunk (_stream_readable.js:269:11)
at Socket.Readable.push (_stream_readable.js:224:10)
at Pipe.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
I created the table using the following commands in the GCP Cloud Shell (then populated it with data from a CSV):
\connect ingredients;
CREATE TABLE tableName (name VARCHAR(255), otherField VARCHAR(255), ... );
In that console, if I run the query SELECT * FROM tableName;, I see the correct data listed.
Why does Knex not see the table: tableName, but the GCP Cloud Shell does?
BTW, I am definitely connecting to the correct db, as I see the same error logs in the Cloud SQL logging interface.
It looks like you are creating the table tableName without quoting, so PostgreSQL folds the identifier to lower case (unquoted identifiers are case-insensitive). So when creating the schema, either quote the identifiers:
CREATE TABLE "tableName" ("name" VARCHAR(255), "otherField" VARCHAR(255), ... );
or use only lower-case table / column names.
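A quick way to see the folding behaviour in the same Cloud Shell psql session (sketch):

-- unquoted identifiers are folded to lower case, so this creates relation "tablename"
CREATE TABLE tableName (name VARCHAR(255), otherField VARCHAR(255));
SELECT * FROM tablename;     -- works
SELECT * FROM tableName;     -- also works, folded to tablename again
SELECT * FROM "tableName";   -- fails: relation "tableName" does not exist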

Golang Postgres pq failed scanning to *string

I'm trying to scan a PostgreSQL array into an empty slice of strings. However, I'm getting the error below:
Failed creating education: sql: Scan error on column index 14, name "descriptions": unsupported Scan, storing driver.Value type string into type *[]*string
Looks like I need to customize the scanner somehow, but how do I do that with Squirrel? Thanks.
Here's how I'm building the query:
squirrel.StatementBuilder.PlaceholderFormat(squirrel.Dollar).RunWith(db).Insert("educations").
	Columns("id", "school", "city", "state", "degree", "month_start", "year_start", "month_end", "year_end", "\"order\"", "logo_url", "created_at", "updated_at", "style", "descriptions").
	Values(
		uuid.Must(uuid.NewV4()).String(),
		education.School,
		education.City,
		education.State,
		education.Degree,
		education.MonthStart,
		education.YearStart,
		education.MonthEnd,
		education.YearEnd,
		education.Order,
		education.LogoURL,
		currentTime,
		currentTime,
		savedStyle.ID,
		pq.Array(education.Descriptions),
	).
	Suffix("RETURNING *").
	Scan(
		&savedEducation.ID,
		&savedEducation.School,
		&savedEducation.City,
		&savedEducation.State,
		&savedEducation.Degree,
		&savedEducation.MonthStart,
		&savedEducation.YearStart,
		&savedEducation.MonthEnd,
		&savedEducation.YearEnd,
		&savedEducation.Order,
		&savedEducation.LogoURL,
		&savedEducation.CreatedAt,
		&savedEducation.UpdatedAt,
		&ignored,
		&savedEducation.Descriptions,
	)
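The error says the driver is handing Scan a plain string for the descriptions array column. Assuming descriptions is a PostgreSQL array type (e.g. text[]) and the destination slice can be a []string, lib/pq's pq.Array (already used for the insert value above) also implements sql.Scanner, so it can wrap the Scan destination as well, i.e. pq.Array(&savedEducation.Descriptions) in the Squirrel chain. A minimal standalone sketch:

package example

import (
	"database/sql"

	"github.com/lib/pq"
)

// loadDescriptions reads the descriptions array for one education row.
// Table and column names are taken from the question; the rest is illustrative.
func loadDescriptions(db *sql.DB, id string) ([]string, error) {
	var descriptions []string
	// pq.Array returns a sql.Scanner for slice types, decoding the Postgres
	// array wire format that a bare *[]string (or *[]*string) cannot accept.
	err := db.QueryRow(
		`SELECT descriptions FROM educations WHERE id = $1`, id,
	).Scan(pq.Array(&descriptions))
	return descriptions, err
}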

Knex.js SQL syntax error near 'select'

I'm getting an odd error:
{ __cid: '__cid9',
method: 'insert',
options: undefined,
bindings:
[ 500,
'Dinner',
'10/02/2015 7:57 PM',
'09/29/2015 8:00 PM',
'Grand Plaza',
1 ],
sql: 'insert into "expense" ("amount", "description", "due_date", "payment_date", "vendor_id") values ($1, $2, $3, $4, select "vendor_id" from "vendor" where "name" = $5 limit $6)',
returning: undefined }
error: syntax error at or near "select"
at [object Object].Connection.parseE (/.../node_modules/pg/lib/connection.js:534:11)
at [object Object].Connection.parseMessage (/.../node_modules/pg/lib/connection.js:361:17)
at Socket.<anonymous> (/.../node_modules/pg/lib/connection.js:105:22)
at Socket.emit (events.js:107:17)
at readableAddChunk (_stream_readable.js:163:16)
at Socket.Readable.push (_stream_readable.js:126:10)
at TCP.onread (net.js:538:20)
I have run the raw SQL with those values cut and pasted, and it works just fine.
This is the code that's generating the error:
Promise.each subbudget.expenses, (expense) ->
  vendor.get(expense.vendor).then (vendor_id) ->
    knex('expense').insert(
      due_date: expense.dueDate
      vendor_id: (knex.first("vendor_id").from("vendor").where({name: vendor_id}))
      amount: expense.amount
      description: expense.description
      payment_date: expense.paidDate
    )
Edit (Partial Solution):
The issue seems to be parentheses missing around the SELECT statement. Knex offers .wrap(), which only works on raw, and .as(), which only works on nested statements; for some reason this does not qualify as a nested statement, so I can't get parentheses around it. Any ideas?
knex.raw("(" + knex.first("vendor_id").from("vendor").where({name: vendor_id}).toString() + ")")
Not the cleanest, but use .toString(), then wrap it in .raw()
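For comparison, the statement Postgres actually accepts is the one from the debug output above with the subselect parenthesized:

insert into "expense" ("amount", "description", "due_date", "payment_date", "vendor_id")
values ($1, $2, $3, $4, (select "vendor_id" from "vendor" where "name" = $5 limit $6))

One caveat with the .toString() workaround: the subquery's bindings are rendered inline into the SQL string rather than passed as parameters, which is part of why it is "not the cleanest".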