How to transfer a MongoDB database from a GCE instance to an OpenShift gear

I have an application running on Google Compute Engine. It's a Node.js application connected to a MongoDB instance. This instance holds documents: the accounts of people registered on my website. I want to transfer this data to my OpenShift gear. It's the same application; I just want to migrate the data to MongoDB on OpenShift.
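The usual route is to dump the database on the GCE side with mongodump, then restore it into the gear's MongoDB over an rhc port-forward. A sketch, assuming the MongoDB tools and the OpenShift rhc client are installed locally; the host, app, database, and credential values are placeholders to replace with your own:

```shell
# Placeholders: adjust to your setup.
SRC_HOST=12.34.56.78      # GCE instance running mongod
APP=myapp                 # OpenShift application name
DB=mydb                   # database to migrate
DB_USER=admin             # gear MongoDB user (see `rhc app show -a $APP`)
DB_PASS=secret            # gear MongoDB password

if command -v mongodump >/dev/null 2>&1 && command -v rhc >/dev/null 2>&1; then
  # 1) Dump the database from the GCE instance into ./dump
  mongodump --host "$SRC_HOST" --port 27017 --db "$DB" --out ./dump

  # 2) Forward the gear's internal MongoDB port to localhost
  rhc port-forward -a "$APP" &
  sleep 5

  # 3) Restore through the forwarded port with the gear's credentials
  mongorestore --host 127.0.0.1 --port 27017 \
    -u "$DB_USER" -p "$DB_PASS" --db "$DB" ./dump/"$DB"
fi
```

If the GCE firewall blocks port 27017 externally, run the mongodump step on the instance itself and copy the dump directory down with scp first.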

Related

Integrating Google Cloud Monitoring with Heroku PostgreSQL

I host a PostgreSQL database on Heroku, but most of our operations are focused on Google Cloud, for instance Logging and Monitoring.
Heroku exposes some useful information in its dashboard, as well as via add-ons, but I would love to get the statistics into Google Cloud Monitoring, which is not supported.
I know there's a way to install the Ops Agent and configure it to collect PostgreSQL logs, but it's aimed at Google Cloud VMs.
Is there a way to connect it to a PostgreSQL instance on Heroku? Can I install it in a Heroku dyno? Maybe there's some other way to pipe Heroku's PostgreSQL diagnostics to Google Cloud?
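You can't install the Ops Agent in a dyno, but the agent's PostgreSQL metrics receiver takes an arbitrary endpoint, so one workaround is to run the agent on a small GCE VM and point the receiver at the Heroku database host. A sketch of the agent config, untested against Heroku specifically (Heroku Postgres requires TLS, so the receiver's TLS settings may need adjusting); the host and credentials below are placeholders taken from your `DATABASE_URL`:

```yaml
# /etc/google-cloud-ops-agent/config.yaml on a small GCE VM
metrics:
  receivers:
    heroku_pg:
      type: postgresql
      # Placeholder values from your Heroku DATABASE_URL:
      endpoint: your-heroku-host.compute-1.amazonaws.com:5432
      username: heroku_user
      password: heroku_password
  service:
    pipelines:
      heroku_pg_pipeline:
        receivers: [heroku_pg]
```

Metrics then land in Cloud Monitoring under the VM's resource, not as a managed-database resource, which may or may not be acceptable for your dashboards.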

Google Cloud Run is not able to connect to MongoDB instance in Compute Engine

This is the infrastructure:
Cloud Run: Container with a NodeJS API.
Compute Engine: MongoDB instance.
Network: Both in the same region.
Network: VPC Connector created on default network provided by Google.
What I have seen:
Compute Engine's MongoDB authenticates the user coming from Google Cloud, but then starts killing the connections.
I have a public API and I am able to connect from my local machine. The database, users, and so on are properly created.
I get a 503 error. See the complete error at the very bottom.
Mongo Configuration
Compute Engine on Ubuntu 18...
No special configuration other than updating the mongo.conf.
URI mongodb://<user>:<pass>@12.34.56.78:27017/?authSource=cubicup-int&maxPoolSize=300&waitQueueTimeoutMS=750000&maxConnecting=300
Driver
"doc":{"driver":{"name":"nodejs","version":"3.6.11"}}
Mongo.conf
503 error
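Connection churn like this often comes from creating a new MongoClient per request: each Cloud Run instance should connect once at module scope and reuse that client. A minimal sketch of the pattern; the connect function is injected here so the caching logic is visible, but in the real app it would be `require('mongodb').MongoClient.connect` from the 3.6 driver:

```javascript
// Sketch: cache the client promise at module scope so each Cloud Run
// container instance opens ONE connection pool, not one per request.
let clientPromise = null;

function getClient(connect, uri) {
  if (!clientPromise) {
    // Note: the URI must separate credentials from the host with '@',
    // e.g. mongodb://user:pass@12.34.56.78:27017/?authSource=...
    // Also consider lowering maxPoolSize in the URI: 300 pooled
    // connections per container instance will exhaust mongod quickly
    // as Cloud Run scales out.
    clientPromise = connect(uri, { useUnifiedTopology: true });
  }
  return clientPromise;
}

module.exports = { getClient };
```

With this in place, a sane per-instance pool size (for example 10) multiplied by your max instance count should stay below mongod's connection limit.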

Is it possible to recover a database in GKE after a service stop (Google Cloud)?

My account in Google Cloud does not have support, and Google directs me to ask here for any issue...
I had a GKE cluster with 2 Node.js apps and a MongoDB, each in a separate Docker container, the Mongo one with a persistent disk, of course.
There was an issue with the credit card and the payment, and Google stopped my GKE service.
When I entered a correct credit card and paid the bill, I was able to see Compute Engine (VM instances and storage disks), but not GKE.
Is there a way to export/download the previous MongoDB data in Google Cloud, from the storage disk, to recover the information?
Thanks in advance
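If the persistent disk still appears under Compute Engine, the usual recovery path is to attach it to any running VM and copy the data files off. A sketch with gcloud; the zone, disk, and VM names are placeholders, and the device name on the VM should be verified with lsblk before mounting:

```shell
# Placeholders: substitute your own zone, disk, and VM names.
ZONE=us-central1-a
DISK=mongo-data-disk      # the GKE persistent disk that held /data/db
VM=recovery-vm            # any running Compute Engine VM

if command -v gcloud >/dev/null 2>&1; then
  # Attach the orphaned persistent disk to the VM (read-only is safest)
  gcloud compute instances attach-disk "$VM" --disk "$DISK" \
    --zone "$ZONE" --mode ro

  # On the VM: mount it (device is often /dev/sdb; verify with lsblk)
  # and archive MongoDB's data directory so you can start a local
  # mongod with --dbpath against the copy.
  gcloud compute ssh "$VM" --zone "$ZONE" --command \
    'sudo mkdir -p /mnt/recover && sudo mount -o ro /dev/sdb /mnt/recover \
     && sudo tar czf ~/mongo-data.tgz -C /mnt/recover .'
fi
```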

Connect Google Cloud Run to MongoDB Atlas

I'm evaluating a move from Google Kubernetes Engine to Google Cloud Run, to improve cost and resource efficiency within our company. I'm also in the process of transitioning our workflows from monolithic PHP and Ruby apps to a more nimble Node.js setup, using MongoDB.
For a small organization like ours, I like the idea of managed services such as Google Cloud Run and MongoDB Atlas; however, I'm concerned about security. In MongoDB Atlas, it seems the only real security measure is IP whitelisting, and with Google Cloud Run I obviously don't have a static IP to whitelist.
I'm definitely not a network expert, so I'm wondering if anyone has any ideas for securely connecting Cloud Run to MongoDB Atlas, while still maintaining scalability. If I have to remain on GKE, so be it, I just want to know all of my options before I move forward.
IP whitelist - by its very nature, Google Cloud Run would seem to be anti-static-IP, so this seems to be a non-starter.
I evaluated items such as Cloud NAT and Cloud VPC Peering, but from what I can tell Cloud Run does not have access to the VPC, so it seems like this wouldn't help either.
Cloud Run and Cloud Functions share the same underlying infrastructure. Cloud Functions can already be connected to a VPC, so Cloud Run should support this capability one day; I hope by the end of 2019.
If you can, I simply recommend that you wait!
Update (October 2020): Cloud Run has now launched the VPC egress feature, which lets you configure a static IP for outbound requests through Cloud NAT. You can follow the step-by-step guide in the documentation to configure a static IP for connecting to MongoDB Atlas.

Amazon Web Services: Deploy one Application to Several EC2 Instances

I have recently started using the AWS EC2 service and have deployed my application to a single EC2 instance. The EC2 instance and the load balancer were created automatically by Eclipse. I want to deploy the same application to multiple instances at the same time; does anyone know how I could do that?
I think you are after Elastic Beanstalk.
You can either upload the application via an S3 bucket or push just the changes with Git (the aws.push command):
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/command-reference-branch-environment.html
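With the EB CLI the whole flow is a few commands. A sketch, assuming `eb` is installed and the app/environment names are placeholders:

```shell
APP=my-app   # placeholder application name

if command -v eb >/dev/null 2>&1; then
  eb init "$APP" --platform node.js --region us-east-1  # one-time project setup
  eb create "$APP"-env                                  # creates env + load balancer
  eb scale 3                                            # run on 3 EC2 instances
  eb deploy                                             # push subsequent versions
fi
```

Beanstalk then handles rolling the same application version out to every instance behind the load balancer.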