I have a procedure that performs inserts and updates on a number of tables. I want to have an exception block where I can store all the query errors.
I tried to find a solution and saw that it may not be possible to use DML statements in an exception block.
I need some help, please. Is it really possible, or can you suggest an alternative solution?
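For what it's worth, DML inside an exception handler does work in PL/pgSQL: the failed block's work is rolled back to an implicit savepoint, but DML executed in the handler itself is kept. A minimal sketch, assuming PostgreSQL (the `error_log` table and procedure body are illustrative, not from the original):

```sql
-- Table to collect errors caught by the handler (illustrative name).
CREATE TABLE IF NOT EXISTS error_log (
    logged_at   timestamptz DEFAULT now(),
    err_state   text,
    err_message text
);

CREATE OR REPLACE PROCEDURE load_data()
LANGUAGE plpgsql AS $$
BEGIN
    -- The inserts/updates that might fail:
    INSERT INTO companies (company_name) VALUES ('Acme, Inc.');
EXCEPTION
    WHEN OTHERS THEN
        -- DML is allowed here: the block's failed work is rolled back,
        -- but this insert survives if the outer transaction commits.
        INSERT INTO error_log (err_state, err_message)
        VALUES (SQLSTATE, SQLERRM);
END;
$$;
```

In Oracle the equivalent pattern usually logs through a procedure declared with `PRAGMA AUTONOMOUS_TRANSACTION`, so the log row persists even if the caller rolls back.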
I have an application that uses a library that sends simple INSERTs to Postgres.
Here is an illustrative example:
INSERT INTO COMPANIES (company_name)
VALUES ('Acme, Inc.');
However, I need to take this a bit further and embed the simple INSERT into a transaction that also inserts into other tables, like this:
BEGIN;
-- some insert to another table first
INSERT INTO COMPANIES (company_name)
VALUES ('Acme, Inc.');
-- some insert to yet another table last
COMMIT;
The simplest solution would be to modify the library's code to do what I need. However, I don't really want to do that as it makes patch management for the library very difficult.
An alternative would be to catch and modify the INSERT via the ORM just before it gets handed over to Postgres. However, this is also not a great approach as it requires dealing with undocumented internals of the ORM, which might change with new versions of the ORM.
I have therefore been looking for a way to do this inside Postgres.
My focus has been on a TRIGGER that would fire on INSERT to the relevant table, using the INSTEAD OF specifier.
And this is the point where I'm stuck, as INSTEAD OF triggers seem to work only on views.
Is what I'm trying to do feasible? Might somebody have a code example?
Thank you very much.
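One workaround that stays entirely inside Postgres is to rename the real table, put a view with the old name in front of it, and attach an INSTEAD OF INSERT trigger to the view; the library's INSERT then hits the view, and the trigger runs inside the library's own transaction. A hedged sketch (all names besides `companies` are illustrative):

```sql
-- Move the real table out of the way and front it with a view.
ALTER TABLE companies RENAME TO companies_base;

CREATE VIEW companies AS
SELECT * FROM companies_base;

CREATE OR REPLACE FUNCTION companies_insert_tg()
RETURNS trigger LANGUAGE plpgsql AS $$
BEGIN
    -- some insert to another table first
    INSERT INTO audit_log (note) VALUES ('before company insert');
    -- the original insert, redirected to the real table
    INSERT INTO companies_base (company_name) VALUES (NEW.company_name);
    -- some insert to yet another table last
    INSERT INTO audit_log (note) VALUES ('after company insert');
    RETURN NEW;
END;
$$;

CREATE TRIGGER companies_instead_ins
INSTEAD OF INSERT ON companies
FOR EACH ROW EXECUTE FUNCTION companies_insert_tg();
```

Because all three inserts happen in the trigger, they commit or roll back together with the library's statement, which gives you the transactional behavior of the BEGIN/COMMIT block without touching the library.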
According to the official documentation, it sounds as though we can insert into multiple tables from a task. That seems inaccurate, since:
Once consumed, the offsets of the stream are reset.
A task can execute only one SQL statement.
Am I missing something here? I want to be able to insert into 2 tables reading out of a stream through the task.
You can do this with a multi-table insert:
https://docs.snowflake.com/en/sql-reference/sql/insert-multi-table.html
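A sketch of what that looks like, reading the stream once and fanning out to two tables in a single statement (the table, column, and stream names are illustrative):

```sql
-- Snowflake unconditional multi-table insert: every row from the
-- stream is written to both targets in one statement, so the stream
-- offset advances exactly once.
INSERT ALL
    INTO target_a (id, payload) VALUES (id, payload)
    INTO target_b (id, payload) VALUES (id, payload)
SELECT id, payload
FROM my_stream;
```

Since this is a single SQL statement, it can be used directly as a task body.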
You can do this. Multi-table inserts are one way, but there is another.
The pointer in the stream is only advanced at the end of a transaction. Therefore, you can enclose multiple DML statements that read from the stream in a single transaction. Unfortunately, a task can only execute a single SQL statement, so you will have to wrap your queries in a stored procedure.
Hope this helps.
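A hedged sketch of that pattern in Snowflake Scripting (all object names are illustrative): both inserts read the stream inside one transaction, so they see the same rows and the offset advances only at COMMIT; the task then issues the single statement CALL.

```sql
CREATE OR REPLACE PROCEDURE consume_stream()
RETURNS varchar
LANGUAGE SQL
AS
$$
BEGIN
    BEGIN TRANSACTION;
    -- Both statements see the same stream contents; the offset
    -- only moves forward when the transaction commits.
    INSERT INTO target_a (id, payload)
        SELECT id, payload FROM my_stream;
    INSERT INTO target_b (id, payload)
        SELECT id, payload FROM my_stream;
    COMMIT;
    RETURN 'done';
END;
$$;

-- The task runs exactly one SQL statement: the CALL.
CREATE OR REPLACE TASK consume_task
    WAREHOUSE = my_wh
    SCHEDULE = '5 minute'
AS
    CALL consume_stream();
```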
Before I try to insert a row into a PostgreSQL table, should I query whether the insert would violate a constraint?
I do check when the insert would cause unwanted side-effects (e.g., auto-increment) upon an error.
But, if there are no possible side effects, is it OK to just blindly try to insert into a table? Or, is it better practice to prevent errors by anticipating them when possible (as advised in Objective-C)?
Also, when performing the insert inside an SQL function, will other queries (e.g., CTEs) inside the function get rolled back if the insert fails?
In general, testing beforehand is not a good idea, because it requires you to explicitly lock tables to prevent other clients from changing or inserting data between your test and your insert. Explicit locking is bad for concurrency.
Serials getting auto-incremented by failed inserts is in general not a problem. Just don't assume the values inserted into the database are consecutive.
A database and Objective-C are two completely different things. Let the database check for problems; it is much easier to add the appropriate constraints to your schema than it is to check everything in your client program.
The default is to roll back to the start of the transaction, but you can control this with savepoints and ROLLBACK TO SAVEPOINT. However, a CTE is part of the query, and queries are always rolled back completely when part of them fails. You might be able to work around that by splitting the CTE off into a separate query that creates a temp table, then using the temp table instead of the CTE.
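The savepoint approach above can be sketched like this (table names are illustrative):

```sql
BEGIN;
INSERT INTO audit (note) VALUES ('attempting company insert');
SAVEPOINT before_insert;
INSERT INTO companies (company_name) VALUES ('Acme, Inc.');
-- If the insert fails with a constraint violation, run:
--   ROLLBACK TO SAVEPOINT before_insert;
-- The audit row survives, and the transaction can still COMMIT.
COMMIT;
```

Only the work done after the savepoint is discarded on rollback; everything before it stays part of the transaction.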
I want to make a loop with one INSERT query for each fetched row from a previous query with prepared statements. In a more visual way:
MAKE QUERY
While Fetching
- Make new query
I can't close my statement, since I still need it to keep fetching rows.
The problem is that with prepared statements, we have to fetch all the data before calling prepare() again. So, how could I do this? I could do it without prepared statements, but that's not a nice solution.
You are going to kill your DB (and if you have a DBA, your DBA is going to kill you) if you try to do this. The problem is that you would be sending one insert request per row to the database, and you would have to create and dispose of all these commands over and over again for each row of data. That is expensive.
If you are hard-set on doing it, nothing prevents you from creating a second prepared statement (with a different name, of course) within a loop that reads from the first, but I highly advise against it. At the very least, buffer your incoming data and insert a few hundred rows at a time.
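The buffering suggestion amounts to sending one multi-row INSERT per batch instead of one statement per fetched row. A sketch (table and column names are illustrative):

```sql
-- One round-trip inserts a whole batch of buffered rows:
INSERT INTO target_table (id, name)
VALUES
    (1, 'first'),
    (2, 'second'),
    (3, 'third');

-- And if the rows come from the same database, a single
-- INSERT ... SELECT avoids the client-side loop entirely:
INSERT INTO target_table (id, name)
SELECT id, name FROM source_table;
```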
I need to write an update script that will check to see if certain tables, indexes, etc. exist in the database, and if not, create them. I've been unable to figure out how to do these checks, as I keep getting Syntax Error at IF messages when I type them into a query window in PgAdmin.
Do I have to do something like write a stored procedure in the public schema that does these updates using Pl/pgSQL and execute it to make the updates? Hopefully, I can just write a script that I can run without creating extra database objects to get the job done.
If you are on PostgreSQL 9.1, you can use CREATE TABLE ... IF NOT EXISTS
On 9.0 you can wrap your IF condition code into a DO block: http://www.postgresql.org/docs/current/static/sql-do.html
For anything before that, you will have to write a function to achieve what you want.
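A sketch of the DO-block approach on 9.0, guarding table and index creation with an existence check (the object names are illustrative):

```sql
DO $$
BEGIN
    IF NOT EXISTS (
        SELECT 1 FROM pg_tables
        WHERE schemaname = 'public' AND tablename = 'companies'
    ) THEN
        CREATE TABLE public.companies (
            id   serial PRIMARY KEY,
            name text NOT NULL
        );
        CREATE INDEX companies_name_idx ON public.companies (name);
    END IF;
END;
$$;
```

This runs as a plain script in PgAdmin and leaves no extra database objects behind, which avoids the `syntax error at or near "IF"` you get when IF is used outside a PL/pgSQL body.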
Have you looked into pg_tables?
select * from pg_tables;
This will return (among other things) the schemas and tables that exist in the database. Without knowing more about what you're looking for, this seems like a reasonable place to start.