I have data being stored in Google Firebase (i.e., the output of the Google Vision API). I need the same data to be stored in MongoDB, which is running on AWS. Is there a way to connect Google Cloud and AWS for this data migration?
There is no out-of-the-box solution for what you're trying to accomplish. Currently Firestore supports exporting its data as documented here, though the format of the export is probably not something MongoDB could import right away. Even if the format were compatible, you would need some kind of data processing pipeline to handle the flow from one side to the other.
Depending on how you're handling the ingestion of Vision API results, you might be able to include code that also sends that data to MongoDB (see the sketch below). If that's not the case, you might need to design a custom solution for this particular use case.
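As a rough illustration, here is a minimal sketch of that approach, assuming the Vision results pass through an HTTP Cloud Function you control; the connection string, database, and collection names are placeholders, not anything from your project:

```typescript
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";
import { MongoClient } from "mongodb";

admin.initializeApp();

// Placeholder connection string for the MongoDB instance on AWS;
// in practice keep it in environment configuration, not in source.
const mongo = new MongoClient(process.env.MONGODB_URI ?? "mongodb://localhost:27017");

export const storeVisionResult = functions.https.onRequest(async (req, res) => {
  const result = req.body; // the Vision API annotation payload

  // Write to Firestore as before.
  await admin.firestore().collection("visionResults").add(result);

  // Mirror the same document to MongoDB on AWS.
  await mongo.connect();
  await mongo.db("vision").collection("results").insertOne(result);

  res.status(200).send("stored");
});
```

A Firestore onCreate trigger that forwards each newly written document would work just as well, and keeps the mirroring decoupled from your ingestion path.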
Related
I'm working on a project that has a REST API deployed in Firebase Cloud Functions. This API manages all the Firestore reads and writes of the end users. Since the API is in Cloud Functions, we use the Firebase Admin SDK, which bypasses the security rules.
My goal is to run the queries using the API but as a user so that the reads and writes are checked by the security rules.
I found a solution using the Firestore REST API coupled with the firestore-parser library to get a JSON result similar to what I'm used to with the SDK. It was okay until I had to make queries with filters. To do so I need to use a structured query, but the syntax is really complex and I would have to do a lot of mapping to translate a where clause from the JS side; a single filter already ends up looking like the sketch below.
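For example, this is roughly what one filtered query looks like through the REST API (the project, collection, and field names are invented, just to show the shape; requires Node 18+ for the global fetch):

```typescript
// A single "age > 21" filter expressed as a Firestore REST structuredQuery.
async function runAsUser(userIdToken: string) {
  const body = {
    structuredQuery: {
      from: [{ collectionId: "users" }],
      where: {
        fieldFilter: {
          field: { fieldPath: "age" },
          op: "GREATER_THAN",
          value: { integerValue: "21" }, // every value must be wrapped in a typed object
        },
      },
      limit: 10,
    },
  };

  const response = await fetch(
    "https://firestore.googleapis.com/v1/projects/my-project/databases/(default)/documents:runQuery",
    {
      method: "POST",
      headers: {
        // Sending the user's ID token makes Firestore evaluate security rules as that user.
        Authorization: `Bearer ${userIdToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(body),
    }
  );
  return response.json();
}
```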
So my questions are:
Is there a way to use the Firestore client SDK directly from Cloud Functions?
If not, is there a simpler way to use the Firestore REST API?
I have a Flutter app (still in development) that currently uses Firebase for the backend. More specifically, I use Firebase Authentication, Storage, Cloud Functions, and Firestore, and in the future I plan to use Remote Config, Dynamic Links, Cloud Messaging, and more of Firebase's features.
I have gotten to a point where Firestore is no longer enough for my purposes: full-text search, geographical querying, and advanced queries in general. I know that I can use third-party services like Algolia for this, but it's too expensive and I wanted something already integrated with my database.
I was thinking of starting to use MongoDB as my database (while keeping all the other Firebase services), but before I do that I need to understand the best way to go about it.
Can I host MongoDB on Firebase Hosting (I don't know if this is possible at all?) or just use MongoDB Atlas and access it directly (see my next question) from my application?
What is the best way to connect my application to MongoDB? From the app directly (using a REST API) or through Firebase Cloud Functions (so I won't expose my database)?
Can I use Firebase Authentication tokens to access MongoDB, or do I have to use MongoDB's own authentication service?
If there are more things I need to consider before switching to MongoDB, please point them out to me.
Firebase Hosting is a CDN for hosting static websites, so it is not possible to host an application like a MongoDB server on it. You can't host MongoDB on any Firebase service; you have to deploy it somewhere else. There are several options. You can get a VPS and install a MongoDB server on it, but then you have to manage your own DB, which can be difficult and can take quite some time. Another option is a cloud database like MongoDB Atlas, which is a faster and more secure solution; however, pricing can be high. You have to decide depending on your needs.
Once you have a running MongoDB server, you need to write an API for client apps to communicate with it securely. Client apps should never talk to a DB instance directly. In this case you can use Firebase Cloud Functions to create that API.
You can use the Firebase Auth service with Firebase Cloud Functions. Have a look at Firebase Callable Functions, which pass the auth context into the function body. There you can ensure the user is authenticated, or perform whatever access-control logic your authorization needs require.
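A minimal sketch of such a callable function, assuming the MongoDB connection string lives in environment configuration (the database and collection names are invented):

```typescript
import * as functions from "firebase-functions";
import { MongoClient } from "mongodb";

// Placeholder URI; store the real one in environment configuration.
const mongo = new MongoClient(process.env.MONGODB_URI ?? "mongodb://localhost:27017");

export const getMyOrders = functions.https.onCall(async (data, context) => {
  // For callable functions, Firebase verifies the client's auth token
  // and exposes it as context.auth.
  if (!context.auth) {
    throw new functions.https.HttpsError("unauthenticated", "Sign-in required.");
  }

  await mongo.connect();
  const orders = await mongo
    .db("appdata")
    .collection("orders")
    .find({ userId: context.auth.uid }) // scope reads to the calling user
    .limit(20)
    .toArray();

  return { orders };
});
```

On the client side, the Firebase SDK's httpsCallable helper attaches the user's ID token automatically, so no extra token plumbing is needed.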
Overall, you are going to add another layer to your architecture. It is possible, but it will take time to set things up, and you will lose some Firestore benefits like offline persistence.
I see no way in the Cloud Firestore database interface in my dashboard to (say) import a bunch of "documents" into a collection via JSON (or similar). Am I missing something? I have no problem creating some sideband code in Go to preload/refresh the database. Is this the intended method?
Historically I've done this with SQL files - this is my first foray into NoSQL...
There is no option in the Firebase console to import a JSON into Firestore.
But you can use the gcloud command-line tool (gcloud firestore export and gcloud firestore import) to both export and import data with Firestore. For a step-by-step walkthrough, see the documentation on exporting and importing data.
A quick search also gave me this firestore-backup-restore npm module. While I'm not sure how up to date it is, its code might be a good starting point in case you need something more custom than what the gcloud CLI gives you.
Alternatively, you can of course read the JSON yourself and call the Firestore API to write the documents. That's pretty much what both tools above do under the hood.
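For example, a small one-off Node script with the Admin SDK could look roughly like this (the file and collection names are made up; note that Firestore caps batched writes at 500 operations):

```typescript
import * as admin from "firebase-admin";
import { readFileSync } from "fs";

admin.initializeApp(); // uses GOOGLE_APPLICATION_CREDENTIALS locally
const db = admin.firestore();

// Assumed input: a JSON array of plain objects, one per document.
const docs: Record<string, unknown>[] = JSON.parse(
  readFileSync("seed-data.json", "utf8")
);

async function load() {
  // A batched write keeps each chunk atomic; Firestore allows
  // at most 500 operations per batch.
  for (let i = 0; i < docs.length; i += 500) {
    const batch = db.batch();
    for (const doc of docs.slice(i, i + 500)) {
      batch.set(db.collection("products").doc(), doc); // auto-generated IDs
    }
    await batch.commit();
  }
}

load().catch(console.error);
```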
I'm creating a pipeline in Google Cloud Data Fusion to export my Bing Ads data into BigQuery using my Bing Ads developer token. I couldn't find any suitable data sources to add to my pipeline in Data Fusion. Is fetching data from API calls even supported in Google Data Fusion, and if so, how can it be done?
HTTP-based sources for Cloud Data Fusion are currently in development and will be released by Q3. Could you elaborate on your use case a little more, so we can make sure your requirements will be covered by those plugins? For example, are you looking to build a batch or a real-time pipeline?
In the meantime, you have the following two, more immediate options/workarounds:
If you are OK with storing the data in a staging area in GCS before loading it into BigQuery, you can use the HTTPToHDFS plugin that is available in the Hub. Use a path of the form gs://<bucket>/path/to/file.
Alternatively, we also welcome contributions, so you can build the plugin yourself using the Cloud Data Fusion APIs. We are happy to guide you and can point you to documentation and samples.
I will try to explain the context of the project I'm working on and the problem I'm currently trying to overcome. I would like to collect mobile data for analytics purposes, e.g. using Firebase from Google. After collecting that data, I'd like to store it in a local database such as PostgreSQL or MongoDB. The thing is, mobile data collection platforms such as Firebase mostly don't support connecting to a local database. I've found out that there is a possibility to export raw data from Firebase and import it into a local database, but I have no detailed information about this. I've searched through many docs and couldn't find anything. Has anyone ever had this kind of problem and can give clear guidance about exporting data from Firebase and importing it into a local database? Thanks in advance.
Google Analytics for Firebase collects an enormous amount of data and sends it to the servers in the form of batches or bundles. Three things to note about your design:
The SDK collects that data and does not expose it for you to parse in the app and store in a local DB.
By storing this much data locally, you would be reinventing the wheel. In addition, it could cause performance issues.
If you need all the data collected by Firebase in raw form, just link Firebase to BigQuery, and all the data will be available to you (see the sketch below for pulling it back out).
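Once linked, the exported events land in a BigQuery dataset named analytics_<property_id>, sharded into daily events_YYYYMMDD tables. A rough sketch of reading them back with the BigQuery Node client, from where you could load them into PostgreSQL or MongoDB (the project and dataset IDs are placeholders):

```typescript
import { BigQuery } from "@google-cloud/bigquery";

const bq = new BigQuery();

async function fetchEvents() {
  // Firebase exports Analytics events into analytics_<property_id>;
  // the wildcard plus _TABLE_SUFFIX selects a date range of daily tables.
  const [rows] = await bq.query({
    query: `
      SELECT event_name, event_timestamp, user_pseudo_id
      FROM \`my-project.analytics_123456.events_*\`
      WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
    `,
  });

  // From here the rows can be inserted into PostgreSQL, MongoDB, etc.
  console.log(`fetched ${rows.length} events`);
}

fetchEvents().catch(console.error);
```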