Does PostgreSQL allow running stored procedures in parallel?

I'm working with an ETL tool, Business Objects Data Services, which has the capability of specifying parallel execution of functions. The documentation says that before you can do this, you have to make sure that your database, which in our case is Postgres, allows "a stored procedure to run in parallel". Can anyone tell me if Postgres does that?

Sure. Just run your queries in different connections, and they will run in parallel transactions. Beware of locking though.

You can also call different stored procedures from the same connection (and effectively still run them in parallel) by using DBLink.
See this SO answer for an example.
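As a rough sketch, dblink's asynchronous functions can be used to fire off several long-running calls from one session and let them execute in parallel backends. This assumes dblink is installed and that a loopback connection to the same database is permitted; the procedure names my_proc_a and my_proc_b are placeholders and are assumed to return a single text value:

    CREATE EXTENSION IF NOT EXISTS dblink;

    -- open two extra connections back to the same database
    SELECT dblink_connect('conn1', 'dbname=' || current_database());
    SELECT dblink_connect('conn2', 'dbname=' || current_database());

    -- both calls start immediately and run in separate backends, in parallel
    SELECT dblink_send_query('conn1', 'SELECT my_proc_a()');
    SELECT dblink_send_query('conn2', 'SELECT my_proc_b()');

    -- block until each one finishes and fetch its result
    SELECT * FROM dblink_get_result('conn1') AS t(result text);
    SELECT * FROM dblink_get_result('conn2') AS t(result text);

    SELECT dblink_disconnect('conn1');
    SELECT dblink_disconnect('conn2');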

Related

How to set up background worker for PostgreSQL 11 server?

Recently I've been assigned to migrate part of a database from Oracle to PostgreSQL as a testing experiment. During that process, the major drawback for me was the lack of a simple way to implement parallelism, which is required for several design reasons that aren't relevant here. I recently discovered the background worker facility (https://www.postgresql.org/docs/11/bgworker.html), which looked like a way to solve my problem.
Not quite, though, as I couldn't easily find any tutorial or example of how to implement it, even for a task as simple as writing debug messages to the logger while the process is running. I've tried some older approaches presented in plugin documentation from version 9.3, but they weren't much help.
I would like to know how to set up those workers properly. Any help would be appreciated.
PS: Also, if some good soul has found a workaround to implement bulk collect for cursors in PostgreSQL, it would be most kind of you to share it.
The documentation for bgworker that you linked to is for writing C code, which is probably not what you want. You can use the pg_background extension, which will do what you want. ora2pg will optionally use pg_background when converting Oracle procedures with the autonomous transaction pragma. The other option is to use dblink to open a connection to the current db.
Neither solution is great, but it's the only way to go if you need to store data in a table whether or not the enclosing transaction succeeds. If you can get by with just putting stuff into the logs, you can use RAISE NOTICE instead.
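For example (a minimal sketch, assuming pg_background is installed; the audit_log table is invented for the illustration), pg_background_launch runs the statement in its own background worker, so it commits independently of the calling transaction:

    CREATE EXTENSION IF NOT EXISTS pg_background;

    -- the INSERT runs in a separate background worker, in its own transaction,
    -- so the row is kept even if the calling transaction later rolls back
    SELECT pg_background_launch(
        'INSERT INTO audit_log(message) VALUES (''step 1 reached'')');

    -- the returned pid can be passed to pg_background_result() if you want to
    -- wait for the command to finish and read its result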
As far as bulk collect for cursors goes, I'm not sure exactly how you are using them, but set-returning functions may help you. Functions in Postgres can return multiple rows without fiddling with cursors.
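For instance, a set-returning function (sketch only; the orders table and its columns are made up for illustration) lets the caller consume the whole result set directly, where Oracle code might have used BULK COLLECT on a cursor:

    CREATE OR REPLACE FUNCTION recent_orders(since date)
    RETURNS TABLE (order_id integer, total numeric)
    LANGUAGE plpgsql AS $$
    BEGIN
        RETURN QUERY
            SELECT o.order_id, o.total
            FROM orders o
            WHERE o.created_at >= since;
    END;
    $$;

    -- callers get all the rows at once, no cursor loop needed
    SELECT * FROM recent_orders(current_date - 7);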

Giving access to execute SQL queries on a static database

I am working on a project where I want to give people the possibility to execute SQL queries on a PostgreSQL database. I then need to prevent people from hacking/attacking my database.
I thought that maybe a way to do that is by giving only view access on the database connection, and using EXPLAIN ANALYSE to calculate the cost of the SQL query.
Is EXPLAIN ANALYSE trustworthy enough to make sure there are no cheap ways to bring the website down?
Do you have suggestions?
EXPLAIN ANALYSE will execute the query, including any side-effects it may have. PostgreSQL also allows running arbitrary Perl and Python code if configured to do so, so be careful. You're likely better off running PostgreSQL instances in per-request VMs or in similar highly isolated environments.
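A quick illustration (using a throwaway table name) of why EXPLAIN ANALYSE alone isn't a safe sandbox: the statement it analyses really is executed, so its side effects persist:

    CREATE TABLE demo (id int);

    EXPLAIN ANALYSE INSERT INTO demo VALUES (1);

    SELECT count(*) FROM demo;   -- returns 1: the row was actually inserted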

PostgreSQL: no native scripting and transactions?

Sorry for potential FAQ, RTFM, etc. If I understand correctly, transactions cannot be used in native scripting units (functions, including anonymous DO blocks). What would the PostgreSQL guys recommend as the least "not natural" way to combine scripting and transactions?
I think you are talking about autonomous transactions.
If so, you are correct that PostgreSQL doesn't support true stored procedures with autonomous transactions yet. (Feel free to sponsor work or contribute time...)
Your options are:
Use dblink to make connections-to-self and do the discrete units of work that way (see the sketch after this list)
Use an external process that connects to Pg
Use an in-db script with pl/python, pl/perl, etc that connects to the DB using psycopg2 / DBD::Pg / etc, rather than using the SPI, and does the work that way. Essentially you code the script like an externally connecting script, but run it within the DB for convenience.
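A minimal sketch of the first option (dblink connection-to-self); the progress_log table is invented for the example, and it assumes dblink is installed and that a local connection with these parameters is allowed:

    CREATE EXTENSION IF NOT EXISTS dblink;

    DO $$
    BEGIN
        -- this INSERT runs over its own connection and commits immediately,
        -- independently of the surrounding block
        PERFORM dblink_exec('dbname=' || current_database(),
                            'INSERT INTO progress_log(note) VALUES (''step 1 done'')');

        RAISE EXCEPTION 'simulated failure';  -- the logged row still survives
    END;
    $$;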

run an execution plan directly in PostgreSQL

Is it possible to run an execution plan directly in PostgreSQL?
I did not find anything about it after quite a bit of searching in the PostgreSQL documentation and on the internet.
No, it is not possible to directly execute a query plan in PostgreSQL. You must run actual SQL.
In theory you could customise the PostgreSQL executor to accept plans without the corresponding SQL by feeding in plan trees. This would be a pretty big job and I'm sure there are many things that'd make it harder that I don't even know about.
You really need to just run SQL.
There is no reverse-compiler to turn an execution plan back into SQL.

postgresql procedures/triggers

Is it possible to write a stored procedure or trigger that will be executed automatically inside the database at a particular time, without any calls from the application? If yes, could anybody give me an example or a link to a resource where I can read how to do that?
Check out pgAgent. If that doesn't work for you, there's always cron in Unix/Linux and the Task Scheduler service in Windows.
I don't think there's anything built-in, but you might want to check out
pgjobs or pgAgent.
You can use stored procedures. A stored procedure is a set of statements that gives the programmer ease and flexibility, because it is easier to execute than reissuing a number of individual SQL statements that perform the same database operations. With a stored procedure, less information needs to be sent between the server and the client.
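For instance (PostgreSQL 11+ syntax; the procedure and table names are made up for this sketch), several statements can be bundled and run with a single CALL:

    CREATE PROCEDURE archive_old_events(cutoff date)
    LANGUAGE plpgsql AS $$
    BEGIN
        INSERT INTO events_archive SELECT * FROM events WHERE created_at < cutoff;
        DELETE FROM events WHERE created_at < cutoff;
    END;
    $$;

    -- one round trip instead of sending the individual statements from the client
    CALL archive_old_events(current_date - 30);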
You can visit these links:
Postgres Procedures
Best way to use stored Procedures