I see no way in the Cloud Firestore database interface in my dashboard to (say) import a bunch of "documents" into a collection via JSON (or similar). Am I missing something? I have no problem creating some sideband code in Go to preload/refresh the database. Is this the intended method?
Historically I've done this with SQL files - this is my first foray into NoSQL...
There is no option in the Firebase console to import JSON into Firestore.
You can, however, use the gcloud command-line tool to both export and import Firestore data (gcloud firestore export and gcloud firestore import). For a step-by-step walkthrough, see the documentation on exporting and importing data.
A quick search also gave me this firestore-backup-restore npm module. While I'm not sure how up to date it is, its code might be a good starting point in case you need something more custom than what the gcloud CLI gives you.
Alternatively you can of course read the JSON yourself, and call the Firestore API to write the documents. That's pretty much what both tools above do under the hood.
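For illustration, here's a minimal Python sketch of that last approach, assuming a JSON file keyed by collection name and then by document ID (the file name and data shape are placeholders for the example):

```python
# Minimal sketch: load a JSON file of seed data and write it to Firestore.
# The file name and shape ({"collection": {"docId": {fields...}}}) are
# assumptions for the example -- adjust them to whatever your data looks like.
import json

from google.cloud import firestore  # pip install google-cloud-firestore

db = firestore.Client()  # picks up GOOGLE_APPLICATION_CREDENTIALS for auth

with open("seed_data.json") as f:
    data = json.load(f)

for collection_name, documents in data.items():
    collection = db.collection(collection_name)
    for doc_id, fields in documents.items():
        collection.document(doc_id).set(fields)  # batch these if the volume is large
```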
Related
I have data being stored in Google Firebase (the output of the Google Vision API). I need the same data stored in MongoDB, which is running on AWS. Is there any connectivity possible between Google Cloud and AWS for data migration?
There is no out-of-the-box solution for what you're trying to accomplish. Currently Firestore supports exporting its data as documented here, though the format of the export is probably not something MongoDB could import right away. Even if the format were compatible, you would need some kind of data processing pipeline to handle the flow from one side to the other.
Depending on how you're handling the ingestion of Vision API results, you might be able to include code that also sends that data to MongoDB. If that's not possible, you'll need to design a custom solution for this particular use case.
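As a rough sketch of that dual-write idea in Python (the collection names, connection string, and result shape are assumptions for the example):

```python
# Hedged sketch: after receiving a Vision API result, write it both to
# Firestore and to a MongoDB instance running on AWS. Names are illustrative.
from google.cloud import firestore   # pip install google-cloud-firestore
from pymongo import MongoClient      # pip install pymongo

db = firestore.Client()
mongo = MongoClient("mongodb://my-aws-host:27017")  # assumed connection string

def store_vision_result(image_id: str, vision_result: dict) -> None:
    """Persist one Vision API result in both databases."""
    db.collection("vision_results").document(image_id).set(vision_result)
    mongo["analytics"]["vision_results"].replace_one(
        {"_id": image_id}, {"_id": image_id, **vision_result}, upsert=True
    )
```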
I use Node-RED to create an API on a server. I want to read and send data via HTTP, using the browser-based programming method. I want to send data from a PostgreSQL database, so I installed the package node-red-contrib-postgres-multi. I don't know how to put data into the database or how to read data from it, because I cannot find any examples.
Does anybody know how I can do that?
You can use the postgrestor package, which lets you define a query or data-modification statement directly in the node window. It also allows parametrized queries via a template engine.
The problem I've encountered is working with more than one DB instance in the same flow, but if you only need a connection to a single DB instance, it should work for you.
I've done quite a bit of searching, but haven't been able to find anything within this community that fits my problem.
I have a MongoDB collection that I would like to normalize and upload to Google BigQuery. Unfortunately, I don't even know where to start with this project.
What would be the best approach to normalize the data? From there, what is recommended when it comes to loading that data to BQ?
I realize I'm not giving much detail here... but any help would be appreciated. Please let me know if I can provide any additional information.
If you're using Python, an easy way is to read the collection in chunks and use pandas' to_gbq method. It's easy and quite fast to implement, but it would be better to have more details about your data.
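Something along these lines, as a hedged sketch (the database, collection, dataset and project names are placeholders, and to_gbq needs the pandas-gbq package installed):

```python
# Sketch: read a MongoDB collection in chunks and push it to BigQuery with pandas.
import pandas as pd              # pip install pandas pandas-gbq
from pymongo import MongoClient  # pip install pymongo

client = MongoClient("mongodb://localhost:27017")
collection = client["mydb"]["mycollection"]

CHUNK = 10_000
buffer = []

# Drop the ObjectId, since BigQuery can't load it as-is.
for doc in collection.find({}, {"_id": 0}):
    buffer.append(doc)
    if len(buffer) == CHUNK:
        df = pd.json_normalize(buffer)  # flattens nested documents into columns
        df.to_gbq("my_dataset.my_table", project_id="my-project", if_exists="append")
        buffer = []

if buffer:
    pd.json_normalize(buffer).to_gbq(
        "my_dataset.my_table", project_id="my-project", if_exists="append"
    )
```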
In addition to the answer provided by SirJ, you have multiple options for loading data into BigQuery, including loading it from Cloud Storage, from your local machine, with Dataflow, and more, as mentioned here. Cloud Storage supports data in multiple formats such as CSV, JSON, Avro, Parquet and more. You also have various ways to load the data: the web UI, the command line, the API, or the client libraries, which support C#, Go, Java, Node.js, PHP, Python and Ruby.
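For example, the Cloud Storage route with the Python client library might look roughly like this (the bucket, dataset and table names are made up for the sketch):

```python
# Sketch: load newline-delimited JSON from a Cloud Storage bucket into BigQuery.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer the schema
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/normalized/*.json",
    "my-project.my_dataset.my_table",
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish
```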
I can't find any option to import my SQL database into a Firestore database.
Realtime Database has this function, for example.
Is this expected to be available in Firestore after the beta?
I don't think there is a direct import right now.
You have a few options though:
Use the Firebase libraries to manually export data from your SQL application into Firestore (alternatively, write a Cloud Function and get the data via an API); see the sketch after this list. This can also be a good option if you need to keep both DBs in sync for some transition time.
If you need to get your user accounts over to Firebase Auth, read this article
If you already have your data in a Firestore database, gcloud now has an import in beta (https://firebase.google.com/docs/firestore/manage-data/export-import)
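As a rough illustration of the first option, here is a hedged Python sketch that copies rows from a SQL table into a Firestore collection (sqlite3 is used only for brevity, and the table and column names are placeholders; swap in your own driver and schema):

```python
# Hedged sketch: copy rows from a SQL table into a Firestore collection.
import sqlite3

from google.cloud import firestore  # pip install google-cloud-firestore

conn = sqlite3.connect("legacy.db")
conn.row_factory = sqlite3.Row  # rows become accessible by column name

db = firestore.Client()
users = db.collection("users")

for row in conn.execute("SELECT id, name, email FROM users"):
    users.document(str(row["id"])).set({"name": row["name"], "email": row["email"]})
```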
From your description, I fear the first option is the way to go for you. Unfortunately, Firestore is still in beta and we have to build a lot of functionality ourselves (SQL imports, data migrations, test mocks, …).
I will try to explain the context of the project I'm working on and the problem I'm currently trying to overcome. I would like to collect mobile data for analytics purposes, e.g. using Google's Firebase. After collecting that data, I'd like to store it in a local database such as PostgreSQL or MongoDB. The thing is, mobile data collection platforms such as Firebase mostly don't support connecting to a local database. I've found out that it's possible to export raw data from Firebase and import it into a local database, but I have no detailed information about this, and I've searched through a lot of documentation without finding anything. Has anyone ever had this kind of problem and can give clear guidance on exporting data from Firebase and importing it into a local database? Thanks in advance.
Google Analytics for Firebase collects an enormous amount of data and sends it to the servers in the form of batches or bundles. Three things to note about your design:
The SDK collects that data and does not expose it for you to parse in the app and store in a local DB.
By storing this much data locally, you'd be re-inventing the wheel. In addition, it could cause performance issues.
If you need all the data collected by Firebase in raw form, just link Firebase to BigQuery and all of the data will be available to you.
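Once linked, the raw events land in BigQuery as daily tables that you can query from anywhere, including a script that feeds your local PostgreSQL or MongoDB instance. A rough Python sketch (the analytics_<property_id> dataset name and the date suffix are assumptions; check your own project for the exact names):

```python
# Sketch: pull one day's Firebase Analytics events out of the linked BigQuery export.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()
query = """
    SELECT event_name, COUNT(*) AS occurrences
    FROM `my-project.analytics_123456789.events_20240101`
    GROUP BY event_name
    ORDER BY occurrences DESC
"""
for row in client.query(query).result():
    print(row.event_name, row.occurrences)  # or insert into your local database here
```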