I would like to use Cloud SQL for PostgreSQL with an internal cron-like tool. In the extensions list I can't find anything like that.
Do you know of any solution/alternative which wouldn't involve having an extra Compute Engine instance from which we make the calls?
You can't run cron internally in Cloud SQL, as it is fully managed and you only get access to the database itself.
Workaround
If you need to run these inserts and selects every minute, you can use App Engine's cron service to send a scheduled request to an App Engine service that will do all that:
cron:
- description: "make some select/insert"
  url: /tasks/populate
  schedule: every 1 mins
  target: populate_postgres
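The service behind /tasks/populate could be a small Python app; here is a minimal sketch, assuming Flask and the pg8000 driver, where the connection settings and the stats_snapshot table are placeholders rather than anything from the original answer:

# main.py -- minimal App Engine service handling the cron request.
# pg8000 is assumed to be listed in requirements.txt; the environment
# variables and the stats_snapshot table are hypothetical.
import os

import pg8000.native
from flask import Flask

app = Flask(__name__)

@app.route("/tasks/populate")
def populate():
    """Run the periodic insert/select against Cloud SQL."""
    conn = pg8000.native.Connection(
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        host=os.environ["DB_HOST"],  # e.g. via the Cloud SQL Auth Proxy
        database=os.environ["DB_NAME"],
    )
    try:
        conn.run("INSERT INTO stats_snapshot SELECT now(), count(*) FROM events")
    finally:
        conn.close()
    return "ok"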
You have 28 free instance hours per day to use with App Engine.
You can now run cron internally in Cloud SQL, thanks to pg_cron extension support added on November 19, 2021.
pg_cron provides a cron-based job scheduler. This extension enables cron syntax to schedule PostgreSQL commands directly from the database. For more information, see the pg_cron section of the Cloud SQL extensions documentation.
The extension is available for Cloud SQL for PostgreSQL version 10 and later. To use it, enable the database flag and then create the extension:
gcloud sql instances patch INSTANCE_NAME --database-flags=cloudsql.enable_pg_cron=on
CREATE EXTENSION pg_cron;
-- Delete old data on Saturday at 3:30am (GMT)
SELECT cron.schedule(
  'delete outdated events',
  '30 3 * * 6',
  $$ DELETE FROM events WHERE event_time < now() - '1 week'::interval $$
);
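For the per-minute inserts and selects asked about in the original question, a minimal sketch might look like this; the job name, table, and statement are hypothetical:

-- Run a hypothetical insert every minute.
SELECT cron.schedule(
  'populate stats',
  '* * * * *',
  $$ INSERT INTO stats_snapshot SELECT now(), count(*) FROM events $$
);

-- Inspect and remove scheduled jobs (the jobname column assumes pg_cron 1.3+).
SELECT jobid, jobname, schedule FROM cron.job;
SELECT cron.unschedule(jobid) FROM cron.job WHERE jobname = 'populate stats';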
Related
Does anyone know a way of updating the auto-pause details for a SQL Azure Serverless database using T-SQL? We have a process that deploys new databases from a database copy command, and we would like to set the auto-pause delay (including the ability to turn it off) using T-SQL.
I have scoured the documentation, and I can find a reference for doing this via the Azure CLI, however our pipeline is built using T-SQL.
I'm currently using the following T-SQL to create a copy of the database:
CREATE DATABASE "TemplateDb"
AS COPY OF "NewDb" (SERVICE_OBJECTIVE = 'GP_S_Gen5_2');
When using T-SQL to create or alter a serverless database, default values are applied for the min vcores and auto-pause delay.
The default values are:
Min vCores: 0.5 vCores
Auto-pause delay: 60 minutes
These default values can later be changed from the portal or via other management APIs (PowerShell, Azure CLI, REST API).
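For example, the auto-pause delay could be changed with the Azure CLI once the copy completes; a sketch, where the resource group and server names are placeholders and the --auto-pause-delay behavior should be checked against the current CLI docs:

# Set the auto-pause delay to 120 minutes (use -1 to disable auto-pause).
az sql db update \
  --resource-group my-rg \
  --server my-server \
  --name TemplateDb \
  --auto-pause-delay 120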
Reference: Using T-SQL
I'm trying to schedule a function to run periodically and delete records from my Google Cloud SQL (PostgreSQL) database. I want this to run a couple of times a day, and each run will take under 10 minutes. What options do I have to schedule this function?
Thanks
Ravi
Your best option will be to use Cloud Scheduler to schedule a job that publishes to a Pub/Sub topic. Then, have a Cloud Function subscribed to this topic so it gets triggered by the message sent.
You can configure this job to run as a daily routine x times a day.
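A minimal sketch of the Cloud Function side, assuming a Pub/Sub-triggered Python function and the pg8000 driver; the environment variables and the table are placeholders:

# main.py -- Pub/Sub-triggered Cloud Function.
# pg8000 is assumed to be in requirements.txt; names are hypothetical.
import os

import pg8000.native

def cleanup_old_records(event, context):
    """Triggered by the scheduled Pub/Sub message; deletes old rows."""
    conn = pg8000.native.Connection(
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        host=os.environ["DB_HOST"],
        database=os.environ["DB_NAME"],
    )
    try:
        conn.run("DELETE FROM events WHERE created_at < now() - interval '30 days'")
    finally:
        conn.close()

A Cloud Scheduler cron such as 0 */8 * * * would then publish to the topic three times a day.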
Try pgAgent
pgAgent is a job scheduling agent for Postgres databases, capable of running multi-step batch or shell scripts and SQL tasks on complex schedules.
pgAgent is distributed independently of pgAdmin. You can download pgAgent from the download area of the pgAdmin website.
I tried to install pgAgent, but since it is not supported on Amazon RDS, I don't know how to schedule Postgres jobs without going with cron jobs and psql directly. Here is what I got on Amazon RDS:
The following command gave the same result:
CREATE EXTENSION pg_cron;
I can think of a total of three options for this right now:
1.) AWS Lambda
2.) AWS Glue
3.) Any small EC2 instance (Linux/Windows)
1.) AWS Lambda:
You can use a Postgres connectivity Python module like pg8000 or psycopg2 to connect and create a cursor to your target RDS.
You can pass your SQL job code / SQL statements as input to the Lambda. If they are very few, you can just code the whole job in your Lambda; if not, you can pass them to the Lambda as input using DynamoDB.
You can have a cron schedule using a CloudWatch Events rule, so that it triggers the Lambda whenever you need.
Required tools: DynamoDB, AWS Lambda, Python, a Postgres Python connectivity module.
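A minimal sketch of such a handler, assuming psycopg2 is packaged with the function and the connection details come from environment variables (all names and the statement are placeholders):

# lambda_function.py -- invoked on a CloudWatch Events cron schedule.
import os

import psycopg2

def lambda_handler(event, context):
    """Run a small SQL job against the target RDS instance."""
    conn = psycopg2.connect(
        host=os.environ["DB_HOST"],
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
    )
    try:
        with conn.cursor() as cur:
            cur.execute("DELETE FROM events WHERE event_time < now() - interval '1 week'")
        conn.commit()
    finally:
        conn.close()
    return {"status": "ok"}

A CloudWatch Events rule with a schedule expression like cron(0 3 ? * SAT *) would then invoke it weekly.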
2.) AWS Glue:
AWS Glue works in almost the same way. You have the option there to connect to your RDS DB directly, and you can schedule your jobs there.
3.) EC2 instance:
Create any small EC2 instance, either Windows or Linux, and set up your cron/batch jobs on it.
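On a Linux instance, for example, a crontab entry like the following would run a SQL file against RDS nightly; the endpoint, user, and paths are placeholders, with the password supplied via ~/.pgpass:

# Run cleanup.sql against the RDS instance every day at 03:00.
0 3 * * * psql "host=MY_RDS_ENDPOINT dbname=mydb user=etl" -f /home/ec2-user/cleanup.sql >> /var/log/cleanup.log 2>&1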
On October 10th, 2018, AWS Lambda launched support for long running functions. Customers can now configure their AWS Lambda functions to run up to 15 minutes per execution. Previously, the maximum execution time (timeout) for a Lambda function was 5 minutes. Using longer running functions, a highly requested feature, customers can perform big data analysis, bulk data transformation, batch event processing, and statistical computations more efficiently.
You could use Amazon CloudWatch Events to trigger a Lambda function on a schedule, but it can only run for a maximum of 15 minutes (https://aws.amazon.com/about-aws/whats-new/2018/10/aws-lambda-supports-functions-that-can-run-up-to-15-minutes/?nc1=h_ls).
You could also run a t2.nano Amazon EC2 instance (about $50/year On-Demand, or $34/year as a Reserved Instance) to run regular cron jobs.
I need to execute a query on a Teradata database on a daily basis (select + insert).
Can this be done within the (Teradata) database, or should I consider external means (e.g. a cron job)?
Teradata doesn't have a built-in scheduler to run jobs. You will need to leverage something like cron or Tivoli Workload Scheduler to manage your job schedule(s).
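As a sketch of the cron route, a crontab entry could feed a script file to Teradata's BTEQ utility; the paths and the script itself are placeholders, with the logon command kept inside the script:

# Run the daily select/insert script via BTEQ at 02:00.
0 2 * * * /usr/bin/bteq < /home/etl/daily_job.bteq >> /var/log/daily_job.log 2>&1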
I have been using the PG Backups add-on recently and everything has worked fine; however, this morning the backup process was triggered at 10:00 A.M., causing some blocking and timeouts in my application.
Is there a way to specify the schedule of the backups made with this add-on? I've been searching and haven't found anything specific.
Use Cron for Manual Backup Scheduling
Heroku gives you two types of backups: automated and user-initiated. Each plan has a different number of daily, weekly, and manual backups that are retained. You can't control when the automated backups occur with PG Backups Auto, but you can use cron to trigger a "manual" backup at any time.
For example:
# Trigger a "manual" backup every four hours.
0 */4 * * * source $HOME/database_credentials; heroku pgbackups:capture
See Creating a Backup for more information about using the pgbackups command.
No, there is no way to do it currently, aside from using an external process to fire the calls.
An email to support might reveal more.
While the original question is old, Heroku does have a schedule option for PGBackups now:
https://devcenter.heroku.com/articles/heroku-postgres-backups#scheduling-backups