Auto upload remote files into Google Cloud Storage via FTP? - google-cloud-storage

I download a lot of csv files via ftp from different sources on a daily basis. I then upload these files into Google Cloud Storage.
Are there any programs/api/tools to automate this?
I'm looking for the best way, if possible, to load these files directly into Google Cloud Storage without having to download them locally. Ideally it would be something I can deploy on Google Compute Engine, so I don't need to run local programs like FileZilla/CrossFTP. The program/tool would keep checking the remote location on a regular basis, load new files into Google Cloud Storage, and verify that the checksums match.
I apologize in advance if this is too vague/generic a question.

Sorry, no. Automatically importing objects from a remote FTP server is not currently a feature of GCS.
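That said, if you're willing to run your own poller on a Compute Engine VM, a small script can pull files over FTP and stream them straight into a bucket without touching local disk. Below is a minimal sketch using Python's ftplib and the google-cloud-storage client; the FTP host, credentials, and bucket name are placeholders, and the integrity check compares a locally computed MD5 against the md5_hash that GCS reports for non-composite objects. Run it from cron (or a scheduler of your choice) to keep checking the remote location on a regular basis.

# Minimal sketch of an FTP-to-GCS poller that could run on a Compute Engine VM.
# The host, credentials, and bucket name below are placeholders, not real values.
import base64
import hashlib
import io
from ftplib import FTP

from google.cloud import storage  # pip install google-cloud-storage

FTP_HOST = "ftp.example.com"   # placeholder
FTP_USER = "user"              # placeholder
FTP_PASS = "password"          # placeholder
BUCKET_NAME = "my-bucket"      # placeholder

def sync_ftp_to_gcs():
    client = storage.Client()
    bucket = client.bucket(BUCKET_NAME)

    ftp = FTP(FTP_HOST)
    ftp.login(FTP_USER, FTP_PASS)

    for name in ftp.nlst():
        if not name.endswith(".csv"):
            continue
        blob = bucket.blob(name)
        if blob.exists():  # skip files that were already uploaded
            continue

        # Pull the file into memory instead of writing it to local disk.
        buf = io.BytesIO()
        ftp.retrbinary(f"RETR {name}", buf.write)
        data = buf.getvalue()

        blob.upload_from_string(data)

        # Verify the upload: GCS exposes a base64-encoded MD5 for the object.
        blob.reload()
        local_md5 = base64.b64encode(hashlib.md5(data).digest()).decode()
        if blob.md5_hash != local_md5:
            raise RuntimeError(f"Checksum mismatch for {name}")

    ftp.quit()

if __name__ == "__main__":
    sync_ftp_to_gcs()  # invoke from cron or a scheduler to poll regularly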

Related

Local Database File Associated with .dwg File

I am planning on building an AutoCAD extension application that will require custom data to be stored outside of the standard .dwg file for AutoCAD drawings. I would like there to be a local file that this custom data is stored in so that the data can be read into AutoCAD or saved from AutoCAD while offline. I have been imagining that each .dwg file would have its own separate database file associated with it, but I am also open to the idea of having a single data file locally stored in order to allow for offline reading/writing of my custom data. Does MongoDB support this type of local data storage? There will be a cloud-based database where the data can be read from/written to, but I want there to be a local storage system to allow for offline read/write and also improved speed. I am just a bit confused about this because most resources online seem to address cloud storage and I am having a hard time understanding how to use MongoDB to implement a reliable local storage system.
It's possible to install the MongoDB Community Server edition locally on your machine.
You can download the installer here.
Installation instructions can be found here.
This post addresses where the data is stored. Basically it's one storage location per machine (where you can put all your databases).
You may want a GUI to browse your databases. The community edition installer will prompt you to install Compass; I'm using a different program called Robo 3T for that.
Something like nedb-promises may be of interest for creating a database local to the application.
(I've also been looking into how to use MongoDB locally, so the above is a summary of what I've found so far.)
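As a rough illustration of what offline read/write against a locally installed MongoDB Community Server could look like, here is a minimal sketch using pymongo; the database, collection, and field names are made up for the example, and you would sync with your cloud-based database separately.

# Minimal sketch: reading/writing custom drawing data against a locally
# installed MongoDB Community Server (default localhost:27017).
# Database, collection, and field names are illustrative only.
from pymongo import MongoClient  # pip install pymongo

client = MongoClient("mongodb://localhost:27017/")
db = client["autocad_ext"]        # one local database for the extension
drawings = db["drawing_data"]     # e.g. one document per .dwg file

# Write (or update) the custom data associated with a drawing.
drawings.update_one(
    {"dwg_path": "C:/drawings/site-plan.dwg"},
    {"$set": {"custom": {"layer_notes": "foundation rev B"}}},
    upsert=True,
)

# Read it back while offline; sync to the cloud database later.
doc = drawings.find_one({"dwg_path": "C:/drawings/site-plan.dwg"})
print(doc["custom"])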

Get files from a Google Cloud virtual machine to a Google Colaboratory notebook

I have files in a virtual machine hosted and created in Google Cloud, and I want to be able to access them in Google Colab to run Selenium.
Should I send the files to Google Cloud Storage? I've found a tutorial that shows how to access Google Cloud Storage files in Colab notebooks, so it seems that would let me reach them from there.
You can use many methods to upload the files from your Compute Engine machine, for example uploading them to Google Drive or Google Cloud Storage. This document explains it well. Note that your Compute Engine machine wouldn't be strictly "local" to you, but it would act the same way in that context.
With Cloud Storage, you can use gsutil to upload the files, with this command:
gsutil cp $file_to_upload gs://$bucket_name
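On the Colab side, one way to reach the uploaded files is to authenticate the notebook and then read from the bucket with the Cloud Storage client. A minimal sketch, assuming the file was copied to a bucket as above; the project ID, bucket, and object names are placeholders:

# Minimal sketch for the Colab side: authenticate, then pull the uploaded
# file down from the bucket. Project, bucket, and file names are placeholders.
from google.colab import auth
from google.cloud import storage

auth.authenticate_user()                       # browser-based auth in Colab

client = storage.Client(project="my-project")  # placeholder project ID
bucket = client.bucket("my-bucket")            # placeholder bucket
bucket.blob("data/file_to_upload.csv").download_to_filename("file_to_upload.csv")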

How to edit files directly on Google Cloud Storage using VS Code?

Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
We have a GCS bucket with about 20 config files that we edit for various reasons. We would really just like to load the bucket into VS Code and then browse between updating the files and saving edits.
I have tried the vscode-bucket-explorer extension for VS Code but this just seems to provide viewing capability with no editing/saving capability. Unless I am missing something?
Is there a way to mount a bucket as a drive on a Mac? With read/write ability?
Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
No, objects in Google Cloud Storage cannot be edited in place.
As with buckets, existing objects cannot be directly renamed. Instead, you can copy an object, give the copied version the desired name, and delete the original version of the object. See Renaming an object for a step-by-step guide, including instructions for tools like gsutil and the Google Cloud Console, which handle the renaming process automatically.
Is there a way to mount a bucket as a drive on a Mac? With read/write ability?
You can use Cloud Storage FUSE where the mounted bucket will behave similarly to a persistent disk.
Cloud Storage FUSE is an open source FUSE adapter that allows you to mount Cloud Storage buckets as file systems on Linux or macOS systems. It also provides a way for applications to upload and download Cloud Storage objects using standard file system semantics. Cloud Storage FUSE can be run anywhere with connectivity to Cloud Storage, including Google Compute Engine VMs or on-premises systems.
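If mounting with Cloud Storage FUSE isn't an option, the download/edit/upload round trip can at least be scripted rather than done by hand in VS Code. A minimal sketch with the Python client, where the bucket name, object name, and the edit itself are placeholders:

# Minimal sketch of the unavoidable round trip (download, edit, re-upload)
# scripted with the Python client. Bucket and object names are placeholders.
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-config-bucket").blob("app/config.yaml")

text = blob.download_as_text()                               # fetch current contents
text = text.replace("log_level: info", "log_level: debug")   # edit in memory
blob.upload_from_string(text)                                 # write the new version back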

Compress / zip multiple files on Google Cloud Storage without downloading

I want to compress / zip multiple files in a Google Cloud Storage bucket into a single zip file without downloading them.
Is there any gsutil CLI method that takes multiple paths as input and copies a zipped/compressed version of all those input files?
Thank you in advance.
Nope, there's no functionality in GCS that supports this. And if the API doesn't support it, no tools or client libraries can, as they're simply making API calls under the hood.
Here is one option, though it's not native to GCS; you can host it on your own machine or on Google Cloud:
https://www.npmjs.com/package/zip-bucket
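If you'd rather not use the npm tool, a similar effect is possible with a short script that streams the objects into a zip archive in memory and writes the archive back to the bucket. As with zip-bucket, the object bytes still pass through whatever machine runs the script, but nothing is persisted to local disk. A minimal sketch with the Python client; the bucket name, prefix, and output path are placeholders:

# Minimal sketch: zip several GCS objects into one archive without writing
# anything to local disk (everything stays in memory), then upload the zip.
# Bucket name, prefix, and output path are placeholders.
import io
import zipfile

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")

archive = io.BytesIO()
with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for blob in client.list_blobs("my-bucket", prefix="reports/2023/"):
        # Note: each object is held in memory, so this suits modest file sizes.
        zf.writestr(blob.name, blob.download_as_bytes())

archive.seek(0)
bucket.blob("archives/reports-2023.zip").upload_from_file(
    archive, content_type="application/zip"
)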

How to connect to Google Storage with Bucket Explorer

I generated an Access & Secret Key and put them into Bucket Explorer just as I saw on their screenshots, but I get an error that the keys are not recognized on AWS, so it keeps trying to connect to AWS. I have the latest version for OS X.
Looking at the Bucket Explorer website, it sounds like you need a special version of their software -- however, the download link does not actually seem to have that version available. Have you contacted the developers of Bucket Explorer about it?
Alternatively, you can use the Google Cloud Console to upload and access your data in Google Cloud Storage without depending on a particular client application.