I want to run pgBadger for PostgreSQL logs in GCP Cloud SQL. I am unable to run a pgBadger report for all the logs, though I can do it with a single-line command.
Has anyone run pgBadger for GCP PostgreSQL?
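What I'd like to end up with is a single report over all the log files at once, something along these lines (the paths are placeholders, and logs exported from Cloud Logging may first need converting to a format pgBadger understands):

# one report over every downloaded log file
pgbadger --format stderr --outfile report.html /path/to/downloaded-logs/*.log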
I'm not sure that I actually have a problem, but I am confused. I wanted to move a PostgreSQL database from PythonAnywhere to an AWS RDS server. I connected to both servers with pgAdmin from my Windows PC (ssh tunnel with PuTTY). I then did a backup of the database from PythonAnywhere and then did a restore to a clean database on the RDS server.
The backup had no issues, but, while the restore seemed to run fine, pgAdmin showed the process "Failed". The database on the RDS server looks fine. I checked row counts on a few tables, and they matched what I had on PythonAnywhere. I don't see any messages in pgAdmin other than that the process failed. I don't see anything in the pgAdmin logs to indicate what might be wrong. Do I have a problem? Should I use a command line restore instead?
Thanks for any insights.
--Al
Using the CLI is a better way; the IDE may not show the full error messages. pg_dump and pg_restore will work for you. You could also spin up an EC2 instance for this job.
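For example, a command-line dump and restore might look like the following; the host names, user names, and database name are placeholders:

# dump the source database in custom format
pg_dump -h source-host -U source_user -Fc -f mydb.dump mydb

# restore into RDS; --no-owner avoids errors when roles differ between servers
pg_restore -h my-instance.xxxxxxxx.us-east-1.rds.amazonaws.com -U rds_user \
           -d mydb --no-owner --verbose mydb.dump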
I'm using DBeaver to connect to a Postgres database and want to access the logs via DBeaver.
I ran the command below to find the log destination and got stderr as the location.
show log_destination;
How can I reach that file from the DB client? FYI, I want all the logs, not just the ones coming from DBeaver.
On recent PostgreSQL versions, that is simple:
SELECT pg_current_logfile();
For old versions, proceed as follows:
Verify that the logging collector is started:
SHOW logging_collector;
If not, the location of the log depends on how PostgreSQL was started.
If yes, the log will be in log_directory:
SHOW log_directory;
If that is a relative path, it is relative to the PostgreSQL data directory.
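For example (the paths shown are only illustrative values):

SHOW data_directory;   -- e.g. /var/lib/postgresql/14/main
SHOW log_directory;    -- e.g. log, i.e. /var/lib/postgresql/14/main/log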
Since the log file is on the database server, you probably won't be able to access it with a client tool.
https://tableplus.com/blog/2018/10/how-to-show-queries-log-in-postgresql.html
On Ubuntu it is under /var/log/postgresql.
You can run pg_lsclusters.
> pg_lsclusters
Ver Cluster Port Status Owner Data directory Log file
11 main 5432 online postgres /var/lib/postgresql/11/main /var/log/postgresql/postgresql-11-main.log
I downloaded a PostgreSQL .dmp file from the ChEMBL database.
I want to import it into GCP Cloud SQL.
When I run the import from the console or with the gcloud command, I get the following error:
Importing data into Cloud SQL instance...failed.
ERROR: (gcloud.sql.import.sql) [ERROR_RDBMS] exit status 1
The input is a PostgreSQL custom-format dump.
Use the pg_restore command-line client to restore this dump to a database.
Can I import custom-format .dmp files without using the pg_restore command?
https://cloud.google.com/sql/docs/postgres/import-export/importing
There is a description of pg_restore in the documentation at that link, but I couldn't get it to work.
In the case of custom-format files, is it necessary to run pg_restore after uploading them to Cloud Shell?
According to the Cloud SQL docs:
Only plain SQL format is supported by the Cloud SQL Admin API.
The custom format is allowed if the dump file is intended for use with pg_restore.
If you cannot use pg_restore against Cloud SQL directly for some reason, I would spin up a local Postgres instance (e.g., on your laptop) and use pg_restore to restore the database there.
After loading it into your local database, you can use pg_dump to dump it to a file in plain-text format, then load that into Cloud SQL with the console or the gcloud command, as in the sketch below.
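A minimal sketch of that round trip; the database, bucket, and instance names are placeholders:

# restore the custom-format dump into a local Postgres
createdb -h localhost -U postgres chembl
pg_restore -h localhost -U postgres -d chembl --no-owner chembl.dmp

# dump it back out as plain SQL
pg_dump -h localhost -U postgres --no-owner --format=plain -f chembl.sql chembl

# upload and import into Cloud SQL
gsutil cp chembl.sql gs://my-bucket/chembl.sql
gcloud sql import sql my-instance gs://my-bucket/chembl.sql --database=chembl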
I am running a set of SQL statements sequentially in the AWS Redshift Query editor.
sql-1
sql-2
sql-3
....
sql-N
However, the Redshift Query editor cannot run multiple SQL statements at once, so currently I am running them one by one manually.
What is an alternative approach for me? It looks like I could use DBeaver.
Is there a more programmatic approach, e.g. just a simple bash script?
If you have a Linux instance that can access the cluster you can use the psql command line tool. For example:
yum install postgresql
psql -h my-cluster.cjmul6ivnpa4.us-east-2.redshift.amazonaws.com \
-p 5439 \
-d my_db \
-f my_sql_script.sql
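Since you asked about a bash script: if the statements live in separate files, a minimal loop (file names are placeholders) could run them in order and stop on the first failure:

for f in sql-1.sql sql-2.sql sql-3.sql; do
  psql -h my-cluster.cjmul6ivnpa4.us-east-2.redshift.amazonaws.com \
       -p 5439 -d my_db -v ON_ERROR_STOP=1 -f "$f" || break
done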
We recently announced a way to schedule queries: https://aws.amazon.com/about-aws/whats-new/2020/10/amazon-redshift-supports-scheduling-sql-queries-by-integrating-with-amazon-eventbridge/
Even more recently we published this blog post that walks you through all the steps using the Console or the CLI: https://aws.amazon.com/blogs/big-data/scheduling-sql-queries-on-your-amazon-redshift-data-warehouse/
Hope these links help.
I'd like to monitor the queries getting sent to my database from an application. To that end, I've found pg_stat_activity, but more often than not the rows which are returned read "idle in transaction". I'm either doing something wrong, not fast enough to see the queries come through, confused, or all of the above!
Can someone recommend the most idiot-proof way to monitor queries running against PostgreSQL? I'd prefer some sort of easy-to-use UI based solution (example: SQL Server's "Profiler"), but I'm not too choosy.
PgAdmin offers a pretty easy-to-use tool called server monitor
(Tools -> Server Status)
With PostgreSQL 8.4 or higher you can use the contrib module pg_stat_statements to gather query execution statistics for the database server.
Run the SQL script of this contrib module, pg_stat_statements.sql (on Ubuntu it can be found in /usr/share/postgresql/<version>/contrib), in your database and add this sample configuration to your postgresql.conf (requires a restart):
custom_variable_classes = 'pg_stat_statements'
pg_stat_statements.max = 1000
pg_stat_statements.track = top # top,all,none
pg_stat_statements.save = off
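Once it is collecting data, you can query the view; a sketch (column names such as total_time vary between PostgreSQL versions):

-- top queries by cumulative execution time
SELECT query, calls, total_time, rows
FROM pg_stat_statements
ORDER BY total_time DESC
LIMIT 10;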
To see what queries are executed in real time, you might want to just configure the server log to show all queries, or only queries with a minimum execution time. To do so, set the logging configuration parameters log_statement and log_min_duration_statement in your postgresql.conf accordingly, as sketched below.
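A sketch of the relevant postgresql.conf settings (the threshold value is just an example):

log_statement = 'all'                 # log every statement, or
log_min_duration_statement = 250      # log statements that take longer than 250 ms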
pg_activity is what we use.
https://github.com/dalibo/pg_activity
It's a great tool with a top-like interface.
You can install and run it on Ubuntu 21.10 with:
sudo apt install pg-activity
pg_activity
If you are using Docker Compose, you can add this line to your docker-compose.yaml file:
command: ["postgres", "-c", "log_statement=all"]
Now you can see the Postgres query logs in the Compose output with
docker-compose logs -f
or, if you want to see only the Postgres logs,
docker-compose logs -f [postgres-service-name]
https://stackoverflow.com/a/58806511/10053470
I haven't tried it myself, unfortunately, but I think pgFouine can show you some statistics.
It seems it does not show queries in real time, but rather generates a report of queries afterwards; perhaps that still satisfies your needs?
You can take a look at
http://pgfouine.projects.postgresql.org/