Can we use Google Cloud SQL from outside Google App Engine?
I'm also looking for examples of its use.
Yes, but you have to use the custom JDBC driver:
https://developers.google.com/cloud-sql/docs/external
You should expect high latency, since your client will not be co-located with the mysqld process.
You can also use the JDBC driver for admin tools:
https://developers.google.com/cloud-sql/docs/admin_tools
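A minimal sketch of such an external connection, assuming the Cloud SQL instance has an assigned IP address and accepts standard MySQL wire connections (the IP `203.0.113.10`, database `guestbook`, and credentials below are placeholders, not values from the docs):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class Main {
    // Plain MySQL-style JDBC URL pointing at the Cloud SQL instance's IP.
    static String jdbcUrl(String host, String db) {
        return "jdbc:mysql://" + host + ":3306/" + db + "?useSSL=true";
    }

    // Opening the connection needs a MySQL JDBC driver on the classpath;
    // expect noticeably higher latency than from inside Google's network.
    static Connection open(String url, String user, String pass) throws SQLException {
        return DriverManager.getConnection(url, user, pass);
    }

    public static void main(String[] args) {
        System.out.println(jdbcUrl("203.0.113.10", "guestbook"));
    }
}
```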
I haven't been able to find out how to take a Postgres instance on Google Cloud SQL (on GCP) and hook it up to a Grafana dashboard to visualize the data in the DB. Is there an accepted, easy way to do this? I'm a complete newbie to Grafana and have limited experience with GCP (I've used the Cloud SQL proxy to connect to a Postgres instance).
Grafana displays the data; Google Cloud Monitoring stores the data to be displayed. So you have to make the link between the two. And, conveniently, a plug-in exists for exactly that!
Note: when you know what you're searching for, it's much easier to find. Understanding your architecture helps you reach the next level!
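As a sketch of that link, using Grafana's data source provisioning format: the plug-in registers the data source type `stackdriver`, and the `gce` authentication type assumes Grafana itself runs on a GCE VM whose service account has the monitoring scope (the names below come from those assumptions, not from the question):

```yaml
apiVersion: 1
datasources:
  - name: Google Cloud Monitoring
    type: stackdriver          # the Cloud Monitoring plug-in's data source type
    access: proxy
    jsonData:
      authenticationType: gce  # reuse the VM's service account; no key file needed
```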
I would like to create an automated data pull from our PostgreSQL database into a Google Sheet. I've tried the JDBC service, but it doesn't work, maybe because of incorrect variables/config. Has anyone already tried doing this? I'd also like to schedule the extraction to run every hour.
According to the documentation, only Google Cloud SQL MySQL, MySQL, Microsoft SQL Server, and Oracle databases are supported by Apps Script's JDBC service. You may have to either move to a different database or develop your own API service to handle the connection.
As for scheduling by the hour, you can use Apps Script's installable triggers.
I installed Oracle SQL Developer, but I don't have any connection string to connect to a database.
How can I connect for the first time?
SQL Developer is a desktop application. It serves as a client for working with an Oracle database.
But, it’s NOT a database, nor does it come with one.
And you’re going to need a database.
You have lots of options going forward. I've tried to detail all of them here.
TL;DR version:
Oracle XE, free to use for any purpose (works just fine in Docker)
Our VirtualBox Appliance - very easy to get going, free to use for learning purposes
LiveSQL - a website you can use to learn Oracle, with nothing to install
And something my article doesn't address, Always Free Oracle Autonomous Database Cloud Service.
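For that first connection (say, against a local XE install), the values SQL Developer asks for are host, port, service name, and the password you chose at install time. The same values form a JDBC URL; in this sketch, `localhost:1521` and the service name `XEPDB1` are the XE defaults (XE 18c and later), and the credentials are placeholders:

```java
public class Main {
    // Oracle "thin" JDBC URL built from host, port, and service name.
    static String url(String host, int port, String service) {
        return "jdbc:oracle:thin:@//" + host + ":" + port + "/" + service;
    }

    public static void main(String[] args) {
        // XE defaults; XEPDB1 is the default pluggable database on XE 18c+.
        System.out.println(url("localhost", 1521, "XEPDB1"));
        // With the ojdbc driver on the classpath you could then do:
        // Connection c = DriverManager.getConnection(url("localhost", 1521, "XEPDB1"),
        //                                            "system", "yourPassword");
    }
}
```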
As part of my thesis project, I have been given a MongoDB dump of size 240 GB, which is on my external hard drive. I'll have to use this data to run my Python scripts for a short duration. However, since my dataset is huge and I cannot mongoimport it into my local MongoDB server (I don't have enough internal storage), my professor gave me a $100 Google Cloud Platform coupon so I can use Google's cloud computing resources.
So far I have researched that I can do it this way:
Create a compute engine in GCP and install mongodb on remote engine. Transfer the MongoDB dump to remote instance and run the scripts to get the output.
This method works well, but I'm looking for a way to create a remote database server in GCP so that I can run my scripts locally, using one of the following approaches.
Creating a remote MongoDB server on GCP so that I can establish a remote mongo connection and run my scripts locally.
Transferring the MongoDB dump to Google's Datastore so that I can use the Datastore API to connect remotely and run my scripts locally.
I have given some thought to using MongoDB Atlas, but because of the size of the data I would be billed heavily, and I cannot use my GCP coupon there.
Any help or suggestions on how either of the two methods can be implemented would be appreciated.
There are two parts to your question.
First, you can create a Compute Engine VM with MongoDB installed and load your backup onto it. Then open the right firewall rule to allow connections from your local environment to the Compute Engine VM. The connection is authenticated with a simple login/password.
You can use a static IP on your VM. That way, if the VM reboots, it keeps the same IP (which makes your local connection easier).
Second, BE CAREFUL with Datastore. It's a good product, a serverless, document-oriented NoSQL database, but it's absolutely not a MongoDB equivalent. You can't perform aggregations, you are limited in search capabilities, and so on. It's designed for specific use cases (I don't know yours, but don't assume it's a MongoDB equivalent!).
Anyway, if you use Datastore, you will have to use a service account or install the Google Cloud SDK in your local environment to be authenticated and able to call the Datastore API. There is no login/password in this case.
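For the first option, once the firewall rule is in place, your local scripts only need a connection string pointing at the VM's static external IP. A sketch (the IP `203.0.113.25`, database name, and credentials are placeholders; actually opening the connection requires a MongoDB driver):

```java
public class Main {
    // mongodb:// URI for a MongoDB server exposed on a GCE VM's static IP,
    // authenticated with the simple login/password set up on the server.
    static String mongoUri(String user, String pass, String host, String db) {
        return "mongodb://" + user + ":" + pass + "@" + host + ":27017/" + db;
    }

    public static void main(String[] args) {
        System.out.println(mongoUri("thesisUser", "s3cret", "203.0.113.25", "thesis"));
        // e.g. with the MongoDB Java driver on the classpath:
        // MongoClient client = MongoClients.create(mongoUri(...));
    }
}
```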
I'm looking at Google App Engine for deploying my Java web app, which connects to a MongoDB database through the MongoDB Java driver.
Plan 1:
Deploy the MongoDB database in Google Compute Engine
Deploy the Java web app in Google App Engine and pass the address of the MongoDB instance (in GCE) to the MongoDB driver
Plan 2:
Deploy both MongoDB and the Java web app on Google Compute Engine instances
Question 1: Will Plan 1 work?
Question 2: I think Plan 2 will probably work. But is it the most efficient method? If not, can you suggest a more efficient one?
Note: I'm not planning on using Google Datastore.
Your Plan 1 is feasible; the only thing you need to ensure is that your App Engine app and the Compute Engine instance (running MongoDB) are in the same virtual network, as stated here.
Plan 2 won't be cost-effective because it requires the instance(s) running your app to be up 24x7; even if there is no traffic, you will have to pay for them. Google App Engine gives you a free quota.
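One way to "pass the address of the MongoDB (in GCE) to the driver" in Plan 1 is to inject it as configuration on the App Engine side rather than hard-coding it. A sketch, where the variable name `MONGO_HOST`, the fallback internal IP, and the database name are all illustrative:

```java
public class Main {
    public static void main(String[] args) {
        // Address of the GCE VM running mongod, supplied via configuration
        // so the deployed app needs no code change when the VM moves.
        String host = System.getenv().getOrDefault("MONGO_HOST", "10.128.0.2");
        String uri = "mongodb://" + host + ":27017/webapp";
        System.out.println(uri);
        // Handed to the MongoDB Java driver:
        // MongoClient client = MongoClients.create(uri);
    }
}
```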