Firestore: Copy/Import data from local emulator to Cloud DB - google-cloud-firestore

I need to copy Firestore DB data from my local Firebase emulator to the cloud instance. I can move data from the Cloud DB to the local DB fine, using the EXPORT functionality in the Firebase admin console. We have been working on the local database instance for 3-4 months, and now we need to move that data back to the Cloud. I have tried moving the local "--export-on-exit" files back to my storage bucket and then IMPORTing from there to the Cloud DB, but it fails every time.
I have seen a comment by Doug at https://stackoverflow.com/a/71819566/20390759 saying this isn't possible, and that the best option is to write a program that copies from local to cloud. I've started working on that, but I can't find a way to have both projects/databases open at the same time, local and cloud, since they share the same project ID, app key, etc.
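For reference, here's the kind of copy script I've started on. It's an untested sketch: it assumes the Python google-cloud-firestore client reads FIRESTORE_EMULATOR_HOST when each client is constructed, so one client can point at the emulator and a second at the cloud, and the project ID, host, and collection name are just placeholders.

```python
# Untested sketch: one client bound to the emulator, one to the cloud project.
# Assumes the Python client picks up FIRESTORE_EMULATOR_HOST at construction time.
import os

from google.cloud import firestore

PROJECT_ID = "my-project-id"      # placeholder: your real project ID
EMULATOR_HOST = "localhost:8080"  # placeholder: your emulator host:port

# 1) Client that talks to the local emulator.
os.environ["FIRESTORE_EMULATOR_HOST"] = EMULATOR_HOST
local_db = firestore.Client(project=PROJECT_ID)

# 2) Unset the variable so the next client uses normal credentials and the cloud DB.
del os.environ["FIRESTORE_EMULATOR_HOST"]
cloud_db = firestore.Client(project=PROJECT_ID)

def copy_collection(name: str, chunk: int = 400) -> None:
    """Copy every document in a top-level collection from local to cloud.

    Subcollections are not handled; they would need recursion.
    """
    batch = cloud_db.batch()
    count = 0
    for snap in local_db.collection(name).stream():
        batch.set(cloud_db.collection(name).document(snap.id), snap.to_dict())
        count += 1
        if count % chunk == 0:  # commit in chunks to stay under the per-batch write limit
            batch.commit()
            batch = cloud_db.batch()
    if count % chunk:
        batch.commit()

copy_collection("users")  # placeholder collection name
```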
Attempted IMPORT: I copied the files created by "--export-on-exit" in the emulator to my cloud storage bucket. Then, I selected IMPORT and chose the file I copied up to the bucket. I get this error message:
Google Cloud Storage file does not exist: /xxxxxx.appspot.com/2022-12-05 from local/2022-12-05 from local.overall_export_metadata
So, I renamed the metadata file to match the directory name from the local system, and the IMPORT claims to start successfully, but then fails with no error message.
I've been using Firestore for several years, but we just started using the emulator this year. Overall I like it, but if I can't easily move data back to the Cloud, I can't use it. Thanks for any ideas/help.

Related

How to do local and remote file storage in a Flutter app

I'm building a Flutter app that needs to open a binary file, display the content to the user, and allow them to edit and save it. File size would be between 10 KB and 10 MB. The file also needs to be in the cloud for sharing and for access from other devices. To minimise remote network egress charges and the user's local mobile data charges, I'm envisaging that when the user saves the file it is saved locally rather than written to the cloud, and only written to the cloud when the user closes the file, at regular intervals, or after a period of no activity. To minimise network data charges further, I would like to keep a permanent local copy, and the remote copy would have a small supporting file that identifies who last wrote the remote file and when. When the app starts, it checks whether its local copy is up to date by reading the supporting file. The data does not need high security.
The app will run on Android, iOS, the web, and preferably on the desktop, though I know the Google Firebase SDK for Windows is incomplete/unavailable.
Is Google Firebase Cloud Storage the easiest and best way to do this? If not, what is the easiest way?
Are there any cloud storage providers that don't charge for network egress, just for storage?
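To illustrate the "is my local copy up to date?" check I have in mind, here is a rough sketch. It is in Python rather than Dart, and it uses the cloud object's own updated metadata instead of a separate supporting file; the bucket and file names are placeholders.

```python
# Rough sketch of the "is my local copy current?" check, using the object's
# updated timestamp rather than a separate supporting file. Placeholder names.
from pathlib import Path

from google.cloud import storage

BUCKET = "my-app-files"          # placeholder
BLOB_NAME = "docs/file.bin"      # placeholder
LOCAL = Path("cache/file.bin")   # placeholder

def local_copy_is_current() -> bool:
    client = storage.Client()
    blob = client.bucket(BUCKET).get_blob(BLOB_NAME)  # fetches metadata only
    if blob is None or not LOCAL.exists():
        return False
    local_mtime = LOCAL.stat().st_mtime
    remote_mtime = blob.updated.timestamp()  # last write time of the remote copy
    return local_mtime >= remote_mtime
```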

How to export MongoDB data from Google Cloud Compute Engine>Storage>Disk?

I had a GKE cluster, with several web apps and a MongoDB database, that was deleted due to some billing problems. The GKE cluster just disappeared, but I still have the source code in another repo, so redeploying it is no problem.
If I go to the "Compute Engine > Storage > Disks" section, I can see my MongoDB disk, but I cannot figure out how to export the data so I can use it to populate a new Mongo database in another cloud. I have tried creating an image, but I'm not sure how to work with the image to extract the Mongo data.
I have not found any guide or tutorial about this. Any help, please?
Thanks very much.
As @guillaume blaquiere mentioned:
1) Create a VM with a boot disk.
2) Add an additional disk containing your MongoDB data, and create an image from that disk.
3) On the Export image page for that image, choose the Cloud Storage location to export your data to by clicking Browse.
4) Once you choose a Cloud Storage location, choose a filename for the exported data. You can use the default filename or choose your own.
5) After choosing a Cloud Storage location and entering a filename, click Select.
6) From the Export image page, click Export. The Cloud Console then displays the image export history, where you can follow the export process.
7) Go to the Storage page to access your exported data.
Check out the link on Exporting an image for more information.
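A rough command-line equivalent of those console steps, wrapped in Python for scripting. The image, disk, zone, and bucket names are placeholders; the gcloud CLI is assumed to be installed and authenticated, and image export may require the Cloud Build API to be enabled.

```python
# Sketch: create an image from the MongoDB disk and export it to Cloud Storage
# by shelling out to the gcloud CLI. All names below are placeholders.
import subprocess

PROJECT = "my-project"                        # placeholder
ZONE = "europe-west1-b"                       # placeholder
DISK = "mongodb-data-disk"                    # placeholder
IMAGE = "mongodb-data-image"                  # placeholder
DEST = "gs://my-bucket/mongodb-data.tar.gz"   # placeholder

def run(args):
    print("+", " ".join(args))
    subprocess.run(args, check=True)

# 1) Create an image from the existing disk.
run(["gcloud", "compute", "images", "create", IMAGE,
     "--project", PROJECT, "--source-disk", DISK, "--source-disk-zone", ZONE])

# 2) Export the image to Cloud Storage.
run(["gcloud", "compute", "images", "export",
     "--project", PROJECT, "--image", IMAGE, "--destination-uri", DEST])
```

Note that this produces a disk image, not a Mongo dump; to get the data into another cloud you would still attach a disk created from that image to a VM, mount it, and run mongodump against it.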

MLflow storing artifacts (Google Cloud Storage) but not displaying them in the MLflow UI

I am working in a Docker environment (docker-compose) with a Jupyter notebook image and a Postgres image for running ML models, using Google Cloud Storage to store the model artifacts. Storing the models in cloud storage works fine, but I can't get them to show up in the MLflow UI. I have seen similar problems, but none of the solutions used Google Cloud Storage as the artifact store. The error message says the following: "Unable to list artifacts stored under <gs-location> for the current run. Please contact your tracking server administrator to notify them of this error, which can happen when the tracking server lacks permission to list artifacts under the current run's root artifact directory." What could possibly be causing this problem?
I had exactly the same issue. Keywords: docker-compose, Google Cloud Storage, success storing to GCS, but failure listing artifacts in the UI.
In my case, it turned out that if the docker-compose file assigns the env vars by reading from a .env file (e.g. GOOGLE_APPLICATION_CREDENTIALS), the server might start before the assignment happens. The quick fix is to assign the env var directly under the environment: key instead of using the env_file: key.
For sensitive data that you still need to keep in a .env file, you can add a wait time for the server, and add depends_on: in the docker-compose file to make sure the database container starts before the MLflow server if you are using a database-backed store.
I faced the same issue when running MLflow locally. The issue was resolved after adding GOOGLE_APPLICATION_CREDENTIALS to the environment variables.
https://googleapis.dev/python/google-api-core/latest/auth.html
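For the local case, here is a minimal sketch of what that looks like. The tracking URI and the service-account path are placeholders; the key point is that whichever process performs the listing (your script or the tracking server container) needs GOOGLE_APPLICATION_CREDENTIALS set. The MlflowClient.list_artifacts call exercises the same kind of listing that the UI fails on.

```python
# Sketch: make sure the process that talks to GCS has credentials before MLflow
# tries to list artifacts. Paths and URIs below are placeholders.
import os

import mlflow
from mlflow.tracking import MlflowClient

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"  # placeholder
mlflow.set_tracking_uri("http://localhost:5000")  # placeholder tracking server

with mlflow.start_run() as run:
    with open("model.txt", "w") as f:  # dummy artifact just for the test
        f.write("hello")
    mlflow.log_artifact("model.txt")

# If the credentials are picked up correctly, listing the run's artifacts works;
# if this fails with the same credentials, the tracking server will fail too.
client = MlflowClient()
for artifact in client.list_artifacts(run.info.run_id):
    print(artifact.path)
```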

Google Cloud SQL export failing with error "Could not complete the operation"

I have a Google Cloud Storage bucket to which I want to export my Google Cloud SQL database. I go to the Export tab, select a location in my bucket, give a filename, and choose the database I want to export, but I'm always greeted with "Could not complete the operation".
This has been happening for the last two days. The same flow worked a couple of weeks ago, and I haven't tweaked the settings since then.
Is there a way to get a more descriptive response so I can identify the error? Also, how do I export my Cloud SQL DB in the meantime? Do I connect with the psql client and work something out from there?
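As a stopgap, here is a rough sketch of two things that may help: the Cloud SQL operations list, which can carry more detail about the failed export than the console message, and a client-side pg_dump along the lines of the psql fallback mentioned above. It wraps the gcloud CLI and pg_dump in Python; the instance, database, user, and IP are placeholders, and both binaries are assumed to be installed and authenticated.

```python
# Rough sketch: check recent Cloud SQL operations for error details, then fall
# back to a client-side pg_dump. All names below are placeholders.
import subprocess

INSTANCE = "my-instance"       # placeholder Cloud SQL instance name
DATABASE = "my-database"       # placeholder
DB_USER = "postgres"           # placeholder
INSTANCE_IP = "203.0.113.10"   # placeholder public IP of the instance

# Recent operations (including the failed export) with their status and errors.
subprocess.run(["gcloud", "sql", "operations", "list",
                "--instance", INSTANCE, "--limit", "10"], check=True)

# Fallback export: dump the database directly with pg_dump (prompts for a password).
subprocess.run(["pg_dump", "-h", INSTANCE_IP, "-U", DB_USER,
                "-d", DATABASE, "-f", "backup.sql"], check=True)
```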

Best way to stage a file from cloud storage to a Windows machine

I want to store a QuickBooks data file in the cloud. I understand that the data file is more of a database-in-a-file, so I know I don't want to simply keep the data file itself in a cloud directory.
When I say 'cloud', I mean something like Google Drive or Box.com.
What I see working is a script (a bat file, or is there something new and improved for Windows XP, like some .NET thing?).
The script would:
1) Download the latest copy of the data file from cloud storage and put it in a directory on the local machine
2) Launch QuickBooks with that data file
3) When the user exits QuickBooks, copy the data file back up to cloud storage.
4) Rejoice.
So, my question(s)... Is there something that already does this? Is there an easily scriptable interface to work with the cloud storage options? In my ideal world, I'd be able to say 'scp google-drive://blah/blah.dat localdir' and have it copy the file down, and do the opposite after running QB. I'm guessing I'm not going to get that.
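For concreteness, here's roughly the flow I have in mind, assuming the cloud side is exposed as a locally synced folder (for example the Google Drive desktop client); all paths and the QuickBooks executable location are made up.

```python
# Rough sketch of the stage-run-sync flow, assuming the cloud storage is
# exposed as a locally synced folder. All paths below are placeholders.
import shutil
import subprocess
from pathlib import Path

SYNCED_COPY = Path(r"C:\Users\me\Google Drive\qb\company.qbw")  # placeholder
LOCAL_COPY = Path(r"C:\qb-work\company.qbw")                    # placeholder
QUICKBOOKS = r"C:\Program Files\Intuit\QuickBooks\QBW32.EXE"    # placeholder

# 1) Pull the latest copy down from the synced cloud folder.
LOCAL_COPY.parent.mkdir(parents=True, exist_ok=True)
shutil.copy2(SYNCED_COPY, LOCAL_COPY)

# 2) Launch QuickBooks on the local copy and wait for the process to exit
#    (behaviour may differ if QuickBooks hands off to an already-running instance).
subprocess.run([QUICKBOOKS, str(LOCAL_COPY)], check=False)

# 3) Copy the data file back up into the synced cloud folder.
shutil.copy2(LOCAL_COPY, SYNCED_COPY)

# 4) Rejoice.
```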
Intuit already provides a product to do this. It is called Intuit Data Protect, and it backs up your QuickBooks company file to the cloud for you.
http://appcenter.intuit.com/intuitdataprotect
regards,
Jarred