I have a question: when someone finishes their Flutter app with a Node.js backend, how can they make the application available online, so that when queries are stored in the database, everybody can reach the DB?
Like Firebase, but on other platforms.
I think you are asking how to set up a backend. You can use a server from cloud providers like Google Cloud Platform, AWS, or Oracle Cloud, or you can use a hosting provider like Hostinger. Then you can upload your backend (Node.js) code there.
If you plan to use a cloud provider like GCP, AWS, or Oracle Cloud, there are free compute machines available for startups. At first there is a bit of setup to do, which you can easily do by watching a YouTube tutorial.
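To make this concrete, here is a minimal sketch of what such a Node.js backend could look like, written in TypeScript with Express; the /items route and the in-memory store are hypothetical placeholders for whatever database you actually use. Once this runs on a cloud machine, any Flutter client can reach it over HTTP:

// Minimal Express backend sketch (TypeScript). The /items route and the
// in-memory array are placeholders for a real database.
import express from "express";

const app = express();
app.use(express.json());

const items: { id: number; name: string }[] = []; // stand-in for a real DB

app.get("/items", (_req, res) => {
  res.json(items); // any client that can reach the server can read this
});

app.post("/items", (req, res) => {
  const item = { id: items.length + 1, name: req.body.name };
  items.push(item);
  res.status(201).json(item);
});

// Bind to the port the hosting environment provides.
const port = Number(process.env.PORT) || 8080;
app.listen(port, () => console.log(`API listening on port ${port}`));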
I'm making a Flutter app with a login page, and I've been trying to query data from a Google Cloud MySQL server and serve it to my app. I was able to replicate the result I want in Python using BigQuery, but I need something like that in a Dart/Flutter version (any other way works too, as long as the result is the same).
Any help would be really appreciated!
It is not recommended to connect to a Cloud SQL database directly over the internet from a client application, so you should avoid connecting to the database straight from the Flutter app. Instead, it is a good idea to create an API endpoint that performs the operations on Cloud SQL and to hit that endpoint from the Flutter application.
To achieve the above, we can follow these steps:
Deploy an HTTP Cloud Function which connects to the Cloud SQL database
In this step we deploy an HTTP Cloud Function, and from inside the function we connect to the Cloud SQL database and perform the required operations on it. This document explains how to connect to Cloud SQL from Cloud Functions.
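As a rough sketch (not a definitive implementation), a Node.js HTTP Cloud Function written in TypeScript that queries Cloud SQL over the Unix socket the platform exposes might look like the following; the instance connection name, credentials, and the users table are placeholders:

// Hypothetical HTTP Cloud Function querying Cloud SQL (MySQL) via the
// Unix socket that Cloud Functions provides at /cloudsql/<instance>.
import * as functions from "@google-cloud/functions-framework";
import mysql from "mysql2/promise";

const pool = mysql.createPool({
  socketPath: "/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME", // placeholder
  user: process.env.DB_USER,
  password: process.env.DB_PASS,
  database: process.env.DB_NAME,
});

functions.http("getUsers", async (_req, res) => {
  // Example query against a hypothetical users table.
  const [rows] = await pool.query("SELECT id, name FROM users LIMIT 100");
  res.json(rows);
});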
Call the HTTP Cloud Function URL from the Flutter application
After deploying the Cloud Function, we get a unique URL for it, which looks like this: https://GCP_REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME. We have to call this URL from the Flutter application. This blog explains how to call the Cloud Functions URL from the Flutter application.
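Before wiring the call into Flutter, it can be useful to verify the deployed endpoint with a small script, since it is just a plain HTTPS GET. A quick check in Node.js (18+, which ships a global fetch) might look like this, using the placeholder URL pattern from above:

// Sanity-check the deployed function URL (Node.js 18+ global fetch).
const url = "https://GCP_REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME";

async function main(): Promise<void> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  console.log(await res.json()); // rows returned by the Cloud Function
}

main();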
I need to sync (not mirror) files between a local disk and a cloud bucket. I'm thinking of something like the Google Drive app, which also works in offline mode (and automatically syncs data when the local PC goes back online). This is useful for the app I'm going to develop, because of offline usage.
I dug a lot into the documentation but didn't find any useful resource.
I could use gsutil rsync in combination with a Cloud Function that listens to cloud bucket events,
and a custom local trigger for events on the local hard disk (let's assume I'm developing a local Node.js app); a rough sketch of that local side is below.
But then I'd have to handle edge cases like going offline, concurrent operations, very long transfers, permissions, etc.
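To make the idea concrete, the local trigger could start out as simple as this TypeScript sketch (the bucket name and folder are placeholders, and real code would still need debouncing, deletions, retries, and offline queuing):

// Naive local trigger: watch a folder and upload changed files to a bucket.
// Bucket name and folder path are placeholders.
import { watch } from "fs";
import { join } from "path";
import { Storage } from "@google-cloud/storage";

const bucket = new Storage().bucket("my-sync-bucket");
const localDir = "/path/to/synced/folder";

watch(localDir, (_event, filename) => {
  if (!filename) return;
  // Re-upload on every change event; no conflict handling yet.
  bucket
    .upload(join(localDir, filename), { destination: filename })
    .then(() => console.log(`synced ${filename}`))
    .catch((err) => console.error(`failed to sync ${filename}:`, err));
});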
I don't want to re-invent the wheel and I think this is a common pattern, like the previously mentioned GDrive app.
Also, Firestore Native Mode implements something really close to this, although it deals with documents rather than files.
Do Google Cloud Platform and/or Firebase make it easy to synchronize a local folder with a cloud bucket?
What do you think about my approach?
As you mentioned, this functionality is implemented in Google Drive / Google One, and for those products it is the main intention: to be a cloud drive that is basically in sync with your local devices all the time.
Google Cloud Storage, on the other hand, is a service with a different approach: it is object-based storage and was designed as part of the cloud architecture to interact with cloud services (always-online services). At this time Google does not offer a similar client (as Google Drive does) for syncing local and cloud folders.
I found this third-party software (not supported by Google) that allows syncing between local folders and Cloud Storage.
I also reviewed the pricing for Google One and Cloud Storage, and Google One is significantly cheaper. For example:
2 TB / month, Google One: $10 USD
2 TB / month, Cloud Storage: $40 USD
On top of this, you should also add the price of additional services, for example:
Pub/Sub service
Cloud Functions service
outgoing network traffic
Your approach sounds good (it takes a lot of effort, but that's okay), but unfortunately you are trying to use the service in a scenario it wasn't designed for.
If you want to keep code in the cloud, you can use Cloud Source Repositories, which basically works like GitHub but has the advantage of easy integration with CI/CD services like Google Cloud Build.
Google Apps Script JDBC doesn't support a connection to PostgreSQL directly but Google Data Studio supports a connection to PostgreSQL to pull data and build reports. I've also heard they support a low-key export to .csv option. Is it then possible to exploit the Data Studio Service in Google Apps Script to populate Google Sheets with that data, effectively creating a workaround?
All I need is a one-way access from PostgreSQL into Google Sheets by means of Google Apps Script, I do NOT expect to import anything back into my database.
Looking at the reference documentation, the built-in Apps Script service for Data Studio does not allow you to pull data from a connected data source. It can be used to create connectors, but it does not give direct access to connected data sources.
However, you can try creating a custom API or serverless micro-service in a language that supports PostgreSQL, and then expose that service as HTTP endpoints that you can call via UrlFetchApp. You can leverage Google Cloud Functions to do this and write the micro-service in back-end JavaScript (Node.js), Python, or Go. This approach will take you well outside the bounds of a typical GAS script, but it is a viable option.
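As an illustration of that suggestion, a minimal serverless micro-service in Node.js/TypeScript could look like the sketch below; the table name and connection string are placeholders, and the Apps Script side would then call the function's URL with UrlFetchApp.fetch() and write the parsed rows into a Sheet:

// Hypothetical Cloud Function exposing PostgreSQL data over HTTP for
// Apps Script's UrlFetchApp to consume. Table and connection are placeholders.
import * as functions from "@google-cloud/functions-framework";
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

functions.http("exportRows", async (_req, res) => {
  const { rows } = await pool.query("SELECT * FROM my_table LIMIT 1000");
  res.json(rows); // Apps Script can JSON.parse() this response
});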
My Java backend server has to upload files to the Google Cloud Storage (GCS).
Right now I just run:
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.io.IOException;
import java.util.Objects;
import org.springframework.web.multipart.MultipartFile;

public void store(MultipartFile multipartFile) throws IOException {
    // Uses Application Default Credentials, picked up from the
    // GOOGLE_APPLICATION_CREDENTIALS environment variable.
    Storage storage = StorageOptions.getDefaultInstance().getService();
    storage.create(
        BlobInfo.newBuilder(
                BUCKET_NAME, // constant defined elsewhere in the class
                Objects.requireNonNull(multipartFile.getOriginalFilename()))
            .build(),
        multipartFile.getBytes());
}
I have GOOGLE_APPLICATION_CREDENTIALS=$PROJECT_DIR$/project-1234-abcdefg.json set in my environment.
However, this makes things complicated for my deployment setup. I don't know how I would go about making this file available to my service.
Is there another way to get access to GCS for my service account?
Background
I am deploying my server to Heroku as a compiled jar file and I don't know how to make the credentials available to my server during deployment.
You need a Google account to access GCS, either personal or technical; "technical" here means a service account.
However, there is another solution, though it is not really easy to implement. I wrote an article about securing a serverless product with Cloud Endpoints and an API key. Here your serverless solution can be Cloud Storage, but that implies calling GCS with the REST API instead of the Java library, which is not very fun. It also implies additional cost for the hosting and processing time of Cloud Endpoints.
Note: you can upgrade the authorization from an API key to Firebase Auth or something else if you prefer; check the Cloud Endpoints authentication capabilities.
Note 2: Google is working on another authentication mechanism, but I don't know what stage the development is at, or whether it's planned for 2020. In any case, your constraint is known and being addressed by Google.
Does anybody know if we can maintain DNS entries using the Google Cloud console or any other UI? I couldn't find any place in the Cloud console for DNS administration.
Is the REST API the only way to maintain zones and DNS entries?
Update: There is now a UI for Cloud DNS in the Networking tab of the Developers Console.
Click here to check it out: https://console.developers.google.com/project/_/dns/zones
We have written one in Rails:
https://github.com/mainio/gcdns
It's not perfect (and might be buggy) but we're using it ourselves.