I'm using GKE (Google Kubernetes Engine) on Google Cloud, and I have a Postgres container.
I want to configure Postgres to send its logs to Stackdriver in JSON format.
I couldn't find any documentation for this, and I'm new to Postgres. How can I do this?
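From what I've gathered so far, GKE ships everything a container writes to stdout/stderr to Cloud Logging (Stackdriver), and log lines that are valid JSON are parsed into structured jsonPayload entries. So I'm guessing the direction is something like the sketch below, assuming PostgreSQL 15+ (which adds the jsonlog destination); the paths and sidecar layout are made up for illustration, not an official recipe:

```sh
# Hedged sketch: get JSON-formatted Postgres logs onto container stdout, where
# GKE's logging agent ships them to Cloud Logging as structured jsonPayload.
# Assumes PostgreSQL 15+; file paths and the sidecar layout are illustrative.

# 1. Extra server settings (included from postgresql.conf or passed as -c flags):
cat <<'EOF' > /tmp/json-logging.conf
logging_collector = on           # required for jsonlog
log_destination   = 'jsonlog'    # write each log entry as a JSON object
log_directory     = '/var/log/postgresql'
EOF

# 2. A sidecar container in the same pod, sharing /var/log/postgresql via an
#    emptyDir volume, tails the JSON log files to its own stdout:
tail -n +1 -F /var/log/postgresql/*.json
```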
Related
I host a PostgreSQL database on Heroku, but most of our operations are focused on Google Cloud, for instance Logging and Monitoring.
Heroku exposes some useful information in its dashboard, as well as via add-ons, but I would love to get those statistics into Google Cloud Monitoring, which is not supported.
I know there's a way to install the Ops Agent and configure it to collect PostgreSQL logs, but it's aimed at Google Cloud VMs.
Is there a way to connect it to a PostgreSQL instance on Heroku? Can I install it in a Heroku dyno? Maybe there's some other way to pipe Heroku's PostgreSQL diagnostics to Google Cloud?
I've been exploring possible solutions for exporting certain metrics from Prometheus to Postgres for analytical purposes.
I came across prometheus-postgresql-adapter; unfortunately it stores the metrics in its own Postgres, i.e. a StatefulSet in Kubernetes, and doesn't support an external Postgres like AWS RDS. There's an open issue for this: https://github.com/timescale/prometheus-postgresql-adapter/issues/10
Are there any other alternatives, or should we write our own adapter?
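For context, whichever adapter we end up with (existing or home-grown), I assume the Prometheus side would just be a remote_write/remote_read hookup along these lines; the service name, port, and paths below are guesses for illustration:

```sh
# Sketch of the Prometheus side of any such adapter: point remote_write
# (and optionally remote_read) at the adapter's HTTP endpoint.
# Service name, port, and URL paths are assumptions, not confirmed values.
cat <<'EOF' >> prometheus.yml
remote_write:
  - url: "http://prometheus-postgresql-adapter:9201/write"
remote_read:
  - url: "http://prometheus-postgresql-adapter:9201/read"
EOF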
I used this tutorial to install WordPress on Kubernetes.
https://kubernetes.io/docs/tutorials/stateful-application/mysql-wordpress-persistent-volume/
It is working as expected, but I would prefer to use Amazon RDS instead of the MySQL pod. I am not sure what changes are required.
In the WordPress deployment you just need to update the database host and credentials to point at your Amazon RDS instance; you don't need to deploy any of the MySQL resources from the tutorial.
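Roughly something like this (a sketch assuming the resource names from that tutorial, i.e. a `wordpress` Deployment reading its password from a `mysql-pass` Secret; the RDS endpoint and user are made up, so adjust to your own values):

```sh
# Store the RDS credentials in the Secret the Deployment already references
# (names taken from the tutorial; values are placeholders):
kubectl create secret generic mysql-pass \
  --from-literal=password='YOUR_RDS_PASSWORD' \
  --dry-run=client -o yaml | kubectl apply -f -

# Point WordPress at the RDS endpoint instead of the in-cluster MySQL service:
kubectl set env deployment/wordpress \
  WORDPRESS_DB_HOST=mydb.abc123xyz.us-east-1.rds.amazonaws.com:3306 \
  WORDPRESS_DB_USER=admin \
  WORDPRESS_DB_NAME=wordpress

# The MySQL Deployment, Service, and PersistentVolumeClaim from the tutorial
# can then be removed.
```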
I am a new Dataproc user and I am trying to run a PySpark job that is supposed to use the MongoDB connector to retrieve data from a MongoDB replica set hosted within a Google Kubernetes Engine cluster.
Is there a way to achieve this, given that my replica set is not supposed to be accessible from the outside without using a port-forward or something similar?
In this case I assume that by "outside" you mean the internet or networks other than your GKE cluster's. If you deploy your Dataproc cluster on the same network as your GKE cluster, and expose the MongoDB service to the internal network, you should be able to connect to the database from your Dataproc job without exposing it outside of the network.
You can find more information in this link on how to create a Cloud Dataproc cluster with internal IP addresses.
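Roughly along these lines (region, subnet, and cluster names are placeholders; internal-IP-only Dataproc nodes need Private Google Access enabled on their subnet):

```sh
# Enable Private Google Access on the subnet the cluster will use:
gcloud compute networks subnets update default \
  --region=us-central1 \
  --enable-private-ip-google-access

# Create the Dataproc cluster on the same VPC/subnet as the GKE cluster,
# with internal IP addresses only:
gcloud dataproc clusters create my-dataproc-cluster \
  --region=us-central1 \
  --subnet=default \
  --no-address
```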
Just expose your MongoDB service in GKE and you should be able to reach it from within the same VPC network.
Take a look at this post for reference.
You should also be able to automate the service exposure through an init script.
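A possible sketch of that exposure: a Service backed by an internal TCP load balancer, which keeps MongoDB reachable only from within the VPC (so the Dataproc VMs can connect without any public exposure). The name, pod labels, and annotation form below are assumptions, so adapt them to your replica set and GKE version:

```sh
# Expose MongoDB inside the VPC via an internal load balancer (sketch only).
kubectl apply -f - <<'EOF'
apiVersion: v1
kind: Service
metadata:
  name: mongodb-internal
  annotations:
    networking.gke.io/load-balancer-type: "Internal"   # internal LB on newer GKE
spec:
  type: LoadBalancer
  selector:
    app: mongodb          # adjust to your replica set's pod labels
  ports:
    - port: 27017
      targetPort: 27017
EOF
```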
I have a Kubernetes cluster in Google Cloud created with the gcloud container clusters create command. I want to use Elasticsearch logging. How should I install the "fluentd-elasticsearch" addon? Or is there another way?
When launching the cluster you can disable the default logging to Cloud Logging by passing the --no-enable-cloud-logging flag. Once that is done, you can install the fluentd-elasticsearch cluster addon from the Kubernetes repo.
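Roughly (the cluster name is a placeholder; the addon manifests live under cluster/addons/fluentd-elasticsearch in the kubernetes/kubernetes repository):

```sh
# Create the cluster without the default Cloud Logging integration:
gcloud container clusters create my-cluster --no-enable-cloud-logging

# Install the fluentd-elasticsearch addon manifests from the Kubernetes repo:
git clone --depth=1 https://github.com/kubernetes/kubernetes.git
kubectl apply -f kubernetes/cluster/addons/fluentd-elasticsearch/
```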