Knex.js SQL syntax error near 'select' - PostgreSQL

I'm getting an odd error:
{ __cid: '__cid9',
  method: 'insert',
  options: undefined,
  bindings:
   [ 500,
     'Dinner',
     '10/02/2015 7:57 PM',
     '09/29/2015 8:00 PM',
     'Grand Plaza',
     1 ],
  sql: 'insert into "expense" ("amount", "description", "due_date", "payment_date", "vendor_id") values ($1, $2, $3, $4, select "vendor_id" from "vendor" where "name" = $5 limit $6)',
  returning: undefined }
error: syntax error at or near "select"
at [object Object].Connection.parseE (/.../node_modules/pg/lib/connection.js:534:11)
at [object Object].Connection.parseMessage (/.../node_modules/pg/lib/connection.js:361:17)
at Socket.<anonymous> (/.../node_modules/pg/lib/connection.js:105:22)
at Socket.emit (events.js:107:17)
at readableAddChunk (_stream_readable.js:163:16)
at Socket.Readable.push (_stream_readable.js:126:10)
at TCP.onread (net.js:538:20)
I have run the raw SQL with those values cut and pasted, and it works just fine.
This is the code that's generating the error:
Promise.each subbudget.expenses, (expense) ->
  vendor.get(expense.vendor).then (vendor_id) ->
    knex('expense').insert(
      due_date: expense.dueDate
      vendor_id: (knex.first("vendor_id").from("vendor").where({name: vendor_id}))
      amount: expense.amount
      description: expense.description
      payment_date: expense.paidDate
    )
Edit (Partial Solution):
The issue seems to be parentheses missing around the SELECT statement. Knex offers .wrap(), which only works on raw, and .as(), which only works on nested statements; for some reason this does not qualify as a nested statement, so I can't get parentheses around it. Any ideas?

knex.raw("(" + knex.first("vendor_id").from("vendor").where({name: vendor_id}).toString() + ")")
Not the cleanest, but it works: render the subquery with .toString(), wrap it in parentheses, and pass it to knex.raw().
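If interpolating the bound values into the SQL string via .toString() is a concern, a variation that keeps the subquery parameterized is sketched below, written in plain JavaScript rather than the CoffeeScript above. It assumes the installed Knex version accepts a query builder as a binding to knex.raw(); if it does, the subquery is compiled in place and its values stay as bindings.

// Sketch (assumes query builders are accepted as raw bindings):
const vendorSubquery = knex.first('vendor_id').from('vendor').where({ name: vendor_id });

return knex('expense').insert({
  due_date: expense.dueDate,
  vendor_id: knex.raw('(?)', [vendorSubquery]),  // the raw template supplies the parentheses
  amount: expense.amount,
  description: expense.description,
  payment_date: expense.paidDate,
});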

Related

When querying WHERE 1 = 2 in rust-postgres, I get an invalid byte sequence error

It seems that when I try to query WHERE 1 = 2 from PostgreSQL in Rust, everything breaks because I pass a null. Below I have pasted my exact query and the arg params, and below that the code I use.
SELECT "spec", "id" FROM "ppm"."ppm_database_account" WHERE $1 = $2 AND "name" = $3 ORDER BY "id" DESC PostgresValues(
[
PostgresValue(
Int(
Some(
1,
),
),
),
PostgresValue(
Int(
Some(
2,
),
),
),
PostgresValue(
String(
Some(
"ppm",
),
),
),
],
)
thread 'rocket-worker-thread' panicked at 'called `Result::unwrap()` on an `Err` value: Error { kind: Db, cause: Some(DbError { severity: "ERROR", parsed_severity: Some(Error), code: SqlState(E22021), message: "invalid byte sequence for encoding \"UTF8\": 0x00", detail: None, hint: None, position: None, where_: Some("unnamed portal parameter $1"), schema: None, table: None, column: None, datatype: None, constraint: None, file: Some("mbutils.c"), line: Some(1665), routine: Some("report_invalid_encoding") }) }', src/models/database_account/v1_0_0.rs:15:60
stack backtrace:
Code used
let rows = trx.query(&query, &args.as_params()).await.unwrap();
Minimal reproducible example:
In this case, client is a tokio-postgres client. I queried a specific table, but this error occurs on all tables:
let rows = client.query("SELECT \"id\" FROM \"ppm\".\"ppm_database_account\" WHERE $1 = $2", &[&PostgresValue(sea_query::Value::Int(Some(1))), &PostgresValue(sea_query::Value::Int(Some(2)))]).await.unwrap();
Any idea why the values seem to get passed as null (0x00)?

DB2 select JSON_ARRAYAGG

Using JSON_ARRAYAGG is not working for me:
1) [Code: -104, SQL State: 42601] An unexpected token "ORDER" was found following "CITY)
". Expected tokens may include: ")".. SQLCODE=-104, SQLSTATE=42601, DRIVER=4.28.11
2) [Code: -727, SQL State: 56098] An error occurred during implicit system action type "2". Information returned for the error includes SQLCODE "-104", SQLSTATE "42601" and message tokens "ORDER|CITY)
|)".. SQLCODE=-727, SQLSTATE=56098, DRIVER=4.28.11
I want an output like this:
{"id": 901, "name": "Hansi", "addresses":
  [
    { "address": "A", "city": "B" },
    { "address": "C", "city": "D" }
  ]
}
I am using IBM DB2 11.1 for Linux, UNIX and Windows.
values (
  json_array(
    select json_object('ID' value ID,
                       'NAME' value NAME,
                       'ADDRESSES' VALUE JSON_ARRAYAGG(
                         JSON_OBJECT('ADDRESS' VALUE ADDRESS,
                                     'CITY' VALUE CITY)
                         ORDER BY ADDRESS)
           )
    FROM CUSTOMER
    JOIN CUSTOMER_ADDRESS ON ADDRESS_CUSTOMER_ID = ID
    GROUP BY ID, NAME
    FORMAT JSON
  )
);
Used tables are:
CUSTOMER - ID (INT), NAME (VARCHAR64)
ADDRESS - ADDRESS (VARCHAR64), CITY (VARCHAR64)

Knex cannot find table in Cloud SQL Postgres from Cloud Functions

I am trying to connect to a Postgres 12 DB running in Cloud SQL from a Cloud Function written in TypeScript.
I create the database connection with the following:
import * as Knex from "knex"
const { username, password, instance } = ... // username, password, connection name (<app-name>:<region>:<database>)
const config = {
  client: 'pg',
  connection: {
    user: username,
    password: password,
    database: 'ingredients',
    host: `/cloudsql/${instance}`,
    pool: { min: 1, max: 1 }
  }
}
const knex = Knex(config as Knex.Config)
I am then querying the database using:
const query = ... // passed in as param
const result = await knex('tableName').where('name', 'ilike', query).select('*')
When I run this code, I get the following error in the Cloud Functions logs:
Unhandled error { error: select * from "tableName" where "name" ilike $1 - relation "tableName" does not exist
at Parser.parseErrorMessage (/workspace/node_modules/pg-protocol/dist/parser.js:278:15)
at Parser.handlePacket (/workspace/node_modules/pg-protocol/dist/parser.js:126:29)
at Parser.parse (/workspace/node_modules/pg-protocol/dist/parser.js:39:38)
at Socket.stream.on (/workspace/node_modules/pg-protocol/dist/index.js:10:42)
at Socket.emit (events.js:198:13)
at Socket.EventEmitter.emit (domain.js:448:20)
at addChunk (_stream_readable.js:288:12)
at readableAddChunk (_stream_readable.js:269:11)
at Socket.Readable.push (_stream_readable.js:224:10)
at Pipe.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
I created the table using the following commands in the GCP Cloud Shell (then populated with a data from a CSV):
\connect ingredients;
CREATE TABLE tableName (name VARCHAR(255), otherField VARCHAR(255), ... );
In that console, if I run the query SELECT * FROM tableName;, I see the correct data listed.
Why does Knex not see the table: tableName, but the GCP Cloud Shell does?
BTW, I am definitely connecting to the correct db, as I see the same error logs in the Cloud SQL logging interface.
Looks like you are creating the table tableName without quoting, which makes the name effectively lower case (unquoted identifiers are case-insensitive in Postgres). So when creating the schema, do:
CREATE TABLE "tableName" ("name" VARCHAR(255), "otherField" VARCHAR(255), ... );
or use only lower-case table / column names.
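Since Knex double-quotes identifiers (the logged query is select * from "tableName"), the name it sends must match the stored name exactly. A minimal sketch of the lower-case route, with illustrative column definitions mirroring the VARCHAR(255) columns above:

// Postgres folded the unquoted CREATE TABLE name to "tablename",
// so query it with the same lower-case spelling:
const result = await knex('tablename')
  .where('name', 'ilike', query)
  .select('*');

// Or recreate the schema through Knex, which quotes identifiers and
// therefore preserves the camelCase name used by the queries:
await knex.schema.createTable('tableName', (table) => {
  table.string('name', 255);
  table.string('otherField', 255);
});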

sequelize.findOrCreate did not save even though the table is empty

My app saves socket-related information into a socketlist table. The idea is: if an entry with the socket_id does not exist, create one in the Postgres table; otherwise do nothing. Here is the code:
await SocketList.findOrCreate({ where: {socket_id : socket_id, active: true}, default: {event_id: event_id, user_id: user_id, server_id: server_id}});
Sequelize 4.42.0 is used. The socketlist table is empty, but the above code throws the error below:
Executing (325bc621-414d-4e3f-9f2b-230f66537631): START TRANSACTION;
Executing (325bc621-414d-4e3f-9f2b-230f66537631): SELECT "id", "user_id", "socket_id", "event_id", "server_id", "active" FROM "socketlists" AS "socketlist" WHERE "socketlist"."socket_id" = '2TPk6DpsxPwttaemAAAA' AND "socketlist"."active" = true LIMIT 1;
Executing (325bc621-414d-4e3f-9f2b-230f66537631): CREATE OR REPLACE FUNCTION pg_temp.testfunc(OUT response "socketlists", OUT sequelize_caught_exception text) RETURNS RECORD AS $func_4fe094a05f394fe8a0ec032506b86e21$ BEGIN INSERT INTO "socketlists" ("id","socket_id","active") VALUES (NULL,'2TPk6DpsxPwttaemAAAA',true) RETURNING * INTO response; EXCEPTION WHEN unique_violation THEN GET STACKED DIAGNOSTICS sequelize_caught_exception = PG_EXCEPTION_DETAIL; END $func_4fe094a05f394fe8a0ec032506b86e21$ LANGUAGE plpgsql; SELECT (testfunc.response).*, testfunc.sequelize_caught_exception FROM pg_temp.testfunc(); DROP FUNCTION IF EXISTS pg_temp.testfunc();
Executing (325bc621-414d-4e3f-9f2b-230f66537631): COMMIT;
Socket was not saved for userId: 1 { SequelizeDatabaseError: null value in column "id" violates not-null constraint
at Query.formatError (C:\d\code\js\emps_bbone\node_modules\sequelize\lib\dialects\postgres\query.js:363:16)
at query.catch.err (C:\d\code\js\emps_bbone\node_modules\sequelize\lib\dialects\postgres\query.js:86:18)
at tryCatcher (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\util.js:16:23)
at Promise._settlePromiseFromHandler (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\promise.js:512:31)
at Promise._settlePromise (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\promise.js:569:18)
at Promise._settlePromise0 (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\promise.js:614:10)
at Promise._settlePromises (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\promise.js:690:18)
at _drainQueueStep (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\async.js:138:12)
at _drainQueue (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\async.js:131:9)
at Async._drainQueues (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\async.js:147:5)
at Immediate.Async.drainQueues [as _onImmediate] (C:\d\code\js\emps_bbone\node_modules\bluebird\js\release\async.js:17:14)
at runCallback (timers.js:705:18)
at tryOnImmediate (timers.js:676:5)
at processImmediate (timers.js:658:5)
name: 'SequelizeDatabaseError',
parent:
{ error: null value in column "id" violates not-null constraint
at Connection.parseE (C:\d\code\js\emps_bbone\node_modules\pg\lib\connection.js:601:11)
at Connection.parseMessage (C:\d\code\js\emps_bbone\node_modules\pg\lib\connection.js:398:19)
at Socket.<anonymous> (C:\d\code\js\emps_bbone\node_modules\pg\lib\connection.js:120:22)
at Socket.emit (events.js:182:13)
at addChunk (_stream_readable.js:283:12)
at readableAddChunk (_stream_readable.js:264:11)
at Socket.Readable.push (_stream_readable.js:219:10)
at TCP.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
name: 'error',
length: 465,
severity: 'ERROR',
code: '23502',
detail:
'Failing row contains (null, null, 2TPk6DpsxPwttaemAAAA, null, null, t).',
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where:
'SQL statement "INSERT INTO "socketlists" ("id","socket_id","active") VALUES (NULL,\'2TPk6DpsxPwttaemAAAA\',true) RETURNING *"\nPL/pgSQL function pg_temp_3.testfunc() line 1 at SQL statement',
schema: 'public',
table: 'socketlists',
column: 'id',
dataType: undefined,
constraint: undefined,
file:
'd:\\pginstaller.auto\\postgres.windows-x64\\src\\backend\\executor\\execmain.c',
line: '2041',
routine: 'ExecConstraints',
sql:
'CREATE OR REPLACE FUNCTION pg_temp.testfunc(OUT response "socketlists", OUT sequelize_caught_exception text) RETURNS RECORD AS $func_4fe094a05f394fe8a0ec032506b86e21$ BEGIN INSERT INTO "socketlists" ("id","socket_id","active") VALUES (NULL,\'2TPk6DpsxPwttaemAAAA\',true) RETURNING * INTO response; EXCEPTION WHEN unique_violation THEN GET STACKED DIAGNOSTICS sequelize_caught_exception = PG_EXCEPTION_DETAIL; END $func_4fe094a05f394fe8a0ec032506b86e21$ LANGUAGE plpgsql; SELECT (testfunc.response).*, testfunc.sequelize_caught_exception FROM pg_temp.testfunc(); DROP FUNCTION IF EXISTS pg_temp.testfunc();' },
original:
{ error: null value in column "id" violates not-null constraint
at Connection.parseE (C:\d\code\js\emps_bbone\node_modules\pg\lib\connection.js:601:11)
at Connection.parseMessage (C:\d\code\js\emps_bbone\node_modules\pg\lib\connection.js:398:19)
at Socket.<anonymous> (C:\d\code\js\emps_bbone\node_modules\pg\lib\connection.js:120:22)
at Socket.emit (events.js:182:13)
at addChunk (_stream_readable.js:283:12)
at readableAddChunk (_stream_readable.js:264:11)
at Socket.Readable.push (_stream_readable.js:219:10)
at TCP.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
name: 'error',
length: 465,
severity: 'ERROR',
code: '23502',
detail:
'Failing row contains (null, null, 2TPk6DpsxPwttaemAAAA, null, null, t).',
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where:
'SQL statement "INSERT INTO "socketlists" ("id","socket_id","active") VALUES (NULL,\'2TPk6DpsxPwttaemAAAA\',true) RETURNING *"\nPL/pgSQL function pg_temp_3.testfunc() line 1 at SQL statement',
schema: 'public',
table: 'socketlists',
column: 'id',
dataType: undefined,
constraint: undefined,
file:
'd:\\pginstaller.auto\\postgres.windows-x64\\src\\backend\\executor\\execmain.c',
line: '2041',
routine: 'ExecConstraints',
sql:
'CREATE OR REPLACE FUNCTION pg_temp.testfunc(OUT response "socketlists", OUT sequelize_caught_exception text) RETURNS RECORD AS $func_4fe094a05f394fe8a0ec032506b86e21$ BEGIN INSERT INTO "socketlists" ("id","socket_id","active") VALUES (NULL,\'2TPk6DpsxPwttaemAAAA\',true) RETURNING * INTO response; EXCEPTION WHEN unique_violation THEN GET STACKED DIAGNOSTICS sequelize_caught_exception = PG_EXCEPTION_DETAIL; END $func_4fe094a05f394fe8a0ec032506b86e21$ LANGUAGE plpgsql; SELECT (testfunc.response).*, testfunc.sequelize_caught_exception FROM pg_temp.testfunc(); DROP FUNCTION IF EXISTS pg_temp.testfunc();' },
sql:
'CREATE OR REPLACE FUNCTION pg_temp.testfunc(OUT response "socketlists", OUT sequelize_caught_exception text) RETURNS RECORD AS $func_4fe094a05f394fe8a0ec032506b86e21$ BEGIN INSERT INTO "socketlists" ("id","socket_id","active") VALUES (NULL,\'2TPk6DpsxPwttaemAAAA\',true) RETURNING * INTO response; EXCEPTION WHEN unique_violation THEN GET STACKED DIAGNOSTICS sequelize_caught_exception = PG_EXCEPTION_DETAIL; END $func_4fe094a05f394fe8a0ec032506b86e21$ LANGUAGE plpgsql; SELECT (testfunc.response).*, testfunc.sequelize_caught_exception FROM pg_temp.testfunc(); DROP FUNCTION IF EXISTS pg_temp.testfunc();' }
Here is the model definition:
const SocketList = db.define('socketlist', {
  id: {
    type: Sql.INTEGER,
    primaryKey: true,
    min: 1
  },
  user_id: {
    type: Sql.INTEGER
  },
  socket_id: {
    type: Sql.STRING,
    unique: true,
    min: 1
  },
  event_id: {
    type: Sql.INTEGER,
    min: 1
  },
  server_id: {
    type: Sql.STRING
  },
  active: {
    type: Sql.BOOLEAN,
    defaultValue: true,
  },
}....
You have to either supply an id when creating the record or let the database generate it. What about trying this?
id: {
  type: Sql.INTEGER,
  primaryKey: true,
  autoIncrement: true
}
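Two related notes, sketched below under the assumption that the underlying socketlists.id column is (or is migrated to be) auto-generated, e.g. SERIAL. First, with autoIncrement Sequelize stops sending an explicit NULL for id. Second, findOrCreate reads the create-path values from a defaults option (plural), not default; using the wrong key would explain why event_id, user_id and server_id are all null in the failing row.

// Hedged sketch: auto-generated id plus the corrected findOrCreate options.
const SocketList = db.define('socketlist', {
  id: { type: Sql.INTEGER, primaryKey: true, autoIncrement: true },
  user_id: { type: Sql.INTEGER },
  socket_id: { type: Sql.STRING, unique: true },
  event_id: { type: Sql.INTEGER },
  server_id: { type: Sql.STRING },
  active: { type: Sql.BOOLEAN, defaultValue: true }
});

await SocketList.findOrCreate({
  where: { socket_id: socket_id, active: true },
  defaults: { event_id: event_id, user_id: user_id, server_id: server_id }
});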

Duplicate key value violates unique constraint with Postgres, Knex, and Promises

I'm having a very weird issue. When I insert five rows into my "repository" table with unique ids, the error below comes up multiple times (with the same id mentioned each time!). I'm not using autoincrement for the PK.
Error saving repo { error: duplicate key value violates unique constraint "repository_pkey"
at Connection.parseE (/Users/macintosh/node-projects/risingstack/node_modules/pg/lib/connection.js:554:11)
at Connection.parseMessage (/Users/macintosh/node-projects/risingstack/node_modules/pg/lib/connection.js:379:19)
at Socket.<anonymous> (/Users/macintosh/node-projects/risingstack/node_modules/pg/lib/connection.js:119:22)
at emitOne (events.js:116:13)
at Socket.emit (events.js:211:7)
at addChunk (_stream_readable.js:263:12)
at readableAddChunk (_stream_readable.js:250:11)
at Socket.Readable.push (_stream_readable.js:208:10)
at TCP.onread (net.js:601:20)
name: 'error',
length: 202,
severity: 'ERROR',
code: '23505',
detail: 'Key (id)=(80073079) already exists.',
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: 'public',
table: 'repository',
column: undefined,
dataType: undefined,
constraint: 'repository_pkey',
file: 'nbtinsert.c',
line: '434',
routine: '_bt_check_unique' }
Postgres SQL generated by Knex:
insert into "repository" ("description", "full_name", "html_url", "id", "language", "owner_id", "stargazers_count") values ('Node.js JavaScript runtime :sparkles::turtle::rocket::sparkles:', 'nodejs/node', 'https://github.com/nodejs/node', 27193779, 'JavaScript', 9950313, 56009)
insert into "repository" ("description", "full_name", "html_url", "id", "language", "owner_id", "stargazers_count") values (':closed_book:《Node.js 包教不包会》 by alsotang', 'alsotang/node-lessons', 'https://github.com/alsotang/node-lessons', 24812854, 'JavaScript', 1147375, 13989)
insert into "repository" ("description", "full_name", "html_url", "id", "language", "owner_id", "stargazers_count") values ('Node.js based forum software built for the modern web', 'NodeBB/NodeBB', 'https://github.com/NodeBB/NodeBB', 9603889, 'JavaScript', 4449608, 9399)
insert into "repository" ("description", "full_name", "html_url", "id", "language", "owner_id", "stargazers_count") values (':baby_chick:Nodeclub 是使用 Node.js 和 MongoDB 开发的社区系统', 'cnodejs/nodeclub', 'https://github.com/cnodejs/nodeclub', 3447593, 'JavaScript', 1455983, 7907)
insert into "repository" ("description", "full_name", "html_url", "id", "language", "owner_id", "stargazers_count") values ('Mysterium Node - VPN server and client for Mysterium Network', 'mysteriumnetwork/node', 'https://github.com/mysteriumnetwork/node', 80073079, 'Go', 23056638, 478)
Knex schema for repository:
return knex.schema.createTable('repository', (table) => {
  table.integer('id').primary();
  table.integer('owner_id');
  table.foreign('owner_id').references('user.id').onDelete('CASCADE').onUpdate('CASCADE');
  table.string('full_name');
  table.string('description');
  table.string('html_url');
  table.string('language');
  table.integer('stargazers_count');
})
Code run to insert Repository:
const fn = composeMany(withOwner, removeIrrelevantProperties, defaultLanguageAndDescToString, saveAndPublish);
const tRepos = r.map(fn);
return Promise.all(tRepos);

const saveAndPublish = (r) => {
  return User
    .insert(r.owner)
    .catch(e => console.log('Error saving User', e))
    .then(() => {
      const { owner, ...repo } = r;
      const q = Repository.insert(repo);
      console.log(q.toQuery());
      return q;
    })
    .catch(e => {
      console.log('Error saving repo', e);
    });
};
Sounds like your database already has a row with primary key id == 80073079.
To be sure, try querying the DB for that key just before inserting. I also wonder how those ids are generated, since you are clearly not using an id sequence for them.
It is possible that the input data the IDs were fetched from is corrupted and contains duplicate ids.
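A sketch of that "query before inserting" suggestion, written directly against the knex instance rather than the Repository wrapper above (which isn't shown here):

// Check for the primary key first and skip the insert if it already exists.
const saveRepo = async (repo) => {
  const existing = await knex('repository').where({ id: repo.id }).first();
  if (existing) {
    console.log('Repository already present, skipping insert for id', repo.id);
    return existing;
  }
  return knex('repository').insert(repo);
};

// If the installed Knex version supports it, Postgres can also skip duplicates
// atomically, avoiding the race between the check and the insert:
// knex('repository').insert(repo).onConflict('id').ignore()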