GCS bucket Premium File Transfer Task: no Pub/Sub notification

I am using the KingswaySoft Premium File Transfer task (docs here: https://www.kingswaysoft.com/products/ssis-productivity-pack/help-manual/premium-file-pack/premium-file-transfer-task) to copy local files to a GCS bucket. The upload is supposed to trigger a Pub/Sub notification, but no notification is sent even though the upload succeeds. If I then go into the GCS console, download the file, and re-upload it manually, I do get a Pub/Sub notification.
Any idea why this may be?

Related

Send Raspberry Pi data into Google Cloud Storage

I am working on a project where a Raspberry Pi generates data as CSV files on a laptop connected to it. My goal is to send these CSV files into GCS regularly (in real time, or every 15 minutes).
Then I will use Google Cloud Functions to load the data from GCS into BigQuery.
The Raspberry Pi is registered on the network (I am not sure whether that helps).
My question: how do I send CSV files from the laptop connected to the Raspberry Pi into Google Cloud Storage buckets?
You have to use the GCS client libraries:
https://cloud.google.com/storage/docs/reference/libraries
The Python one may be your best fit for the Raspberry Pi: https://pypi.org/project/google-cloud-storage/
You will need a GCP project; create a bucket and a service account with permission to upload files. Download the service account key file to your Raspberry Pi and use it as specified by the client library you chose: basically, specify the credentials file location in your script or as an environment variable.
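For illustration, a minimal upload sketch with the Python client library, assuming a service account key file; the bucket name, key path, and data directory below are placeholders:

```python
import glob
import os

from google.cloud import storage

# Assumptions: a bucket named "my-csv-bucket" exists and the service account
# key was downloaded to /home/pi/sa-key.json (both names are placeholders).
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/pi/sa-key.json"

client = storage.Client()
bucket = client.bucket("my-csv-bucket")

# Upload every CSV in the data directory; run this from cron every 15 minutes.
for path in glob.glob("/home/pi/data/*.csv"):
    blob = bucket.blob(os.path.basename(path))
    blob.upload_from_filename(path, content_type="text/csv")
    print(f"Uploaded {path} to gs://{bucket.name}/{blob.name}")
```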
By the way, to insert the file into BigQuery you could use Cloud Storage Pub/Sub notifications, which publish a message when a new file is uploaded; with a push subscription, your Cloud Function can then load the file into BigQuery with the BigQuery client library. Take a look at: https://cloud.google.com/storage/docs/pubsub-notifications?hl=es-419
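A rough sketch of that Cloud Function, assuming a Pub/Sub-triggered function (1st-gen signature) and an existing BigQuery table; the table ID is a placeholder:

```python
from google.cloud import bigquery

# Placeholder destination; the dataset and table are assumed to already exist.
TABLE_ID = "my-project.my_dataset.my_table"

def load_csv(event, context):
    """Pub/Sub-triggered function: load a newly uploaded CSV into BigQuery.

    GCS notification messages carry the bucket and object names as message
    attributes, so the CSV payload itself never passes through the function.
    """
    attrs = event.get("attributes", {})
    if attrs.get("eventType") != "OBJECT_FINALIZE":
        return  # ignore deletes, metadata updates, etc.

    uri = f"gs://{attrs['bucketId']}/{attrs['objectId']}"

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # assumes a header row
        autodetect=True,      # assumes the schema can be inferred
    )
    load_job = client.load_table_from_uri(uri, TABLE_ID, job_config=job_config)
    load_job.result()  # wait for the load to finish
```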

Data Transfer between Google Storage different Service Accounts

I have two Google service account credentials and a bucket in each account. I have to transfer files from one bucket to another. How can I do this programmatically?
Can I achieve this with two Storage objects, or by using the Cloud Storage Transfer Service?
Yes, with Storage Transfer Service you can create a transfer job and send the data to a destination bucket in another project. Keep in mind that it is documented that:
To access the data source and the data sink, this service account must have source permissions and sink permissions.
This means you can't use two different service accounts; you will need to grant access on both buckets to just one of the two service accounts you have.
If you want to transfer files from one bucket to another programmatically, you must first grant permission to the service account associated with the Storage Transfer Service so it can access the data sink (destination bucket); please follow these steps.
Please note that if you are not creating the transfer job in the same project where the source bucket is located, you must also grant permission to access the source bucket.
With Storage Transfer Service you can create a transfer job programmatically in Java or Python; the examples cover creating the transfer job and checking the transfer operation status. Full code examples can be found for Java and Python.
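As a rough Python sketch using the google-cloud-storage-transfer package (the project and bucket names are placeholders), a one-time bucket-to-bucket transfer job might look like this:

```python
from datetime import datetime, timezone

from google.cloud import storage_transfer

def create_one_time_transfer(project_id: str, source_bucket: str, sink_bucket: str):
    """Create a one-time GCS-to-GCS transfer job (placeholder names assumed)."""
    client = storage_transfer.StorageTransferServiceClient()

    # Scheduling the start and end on the same day makes the job run once.
    today = datetime.now(timezone.utc)
    one_time = {"day": today.day, "month": today.month, "year": today.year}

    transfer_job = {
        "project_id": project_id,
        "status": storage_transfer.TransferJob.Status.ENABLED,
        "schedule": {
            "schedule_start_date": one_time,
            "schedule_end_date": one_time,
        },
        "transfer_spec": {
            "gcs_data_source": {"bucket_name": source_bucket},
            "gcs_data_sink": {"bucket_name": sink_bucket},
        },
    }

    result = client.create_transfer_job({"transfer_job": transfer_job})
    print(f"Created transfer job: {result.name}")
```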

How to be notified of an object change in Cloud Storage using gsutil, without an application URL

I would like my script running on Linux to be notified of an object change in a bucket.
After reading the documentation, I see I can notify an application through a URL, but this is not what I am looking for.
Is there any way I can listen for an object change through gsutil in my script?
Cloud Pub/Sub is the recommended solution for getting notified of changes to a bucket. With the Cloud Pub/Sub integration, your script can subscribe to the topic that the bucket's notifications are published to.
If you want to receive the notifications from a command, you can use gcloud pubsub subscriptions pull.
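From a script, a minimal sketch with the Pub/Sub Python client, assuming a subscription already attached to the bucket's notification topic; the project and subscription names are placeholders:

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

# Placeholders: the subscription is assumed to be attached to the topic that
# the bucket's notification configuration publishes to.
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "my-bucket-changes-sub")

def callback(message):
    # GCS notifications carry the event details as message attributes.
    print(f"{message.attributes.get('eventType')}: "
          f"gs://{message.attributes.get('bucketId')}/{message.attributes.get('objectId')}")
    message.ack()

future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        future.result(timeout=60)  # listen for one minute, then stop
    except TimeoutError:
        future.cancel()
        future.result()
```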

Set all files in Google Cloud Storage bucket to gzip by default

I am trying to set a Google Cloud Storage bucket so that any files I upload are automatically gzip'd and "Content-Encoding: gzip" is set.
I tried "gsutil defacl set public-read gs://bucket" based upon Set all files in Google Cloud Storage Bucket to public by default but was unsuccessful.
Any ideas?
There's no way to configure a bucket so that files uploaded to it are automatically gzip'd. One possibility would be to configure object change notifications on the bucket and implement code that responds to each notification by reading the new object, writing a compressed equivalent, and then deleting the original.
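A rough sketch of such a notification handler with the Python client library; this is a hypothetical helper you would wire up yourself, not anything the bucket does for you:

```python
import gzip

from google.cloud import storage

def gzip_object_in_place(bucket_name: str, object_name: str) -> None:
    """Hypothetical object-change handler: rewrite the object gzip-compressed
    with Content-Encoding: gzip set."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.get_blob(object_name)

    if blob is None or blob.content_encoding == "gzip":
        return  # object is gone, or already compressed (avoids re-processing our own write)

    compressed = gzip.compress(blob.download_as_bytes())

    # Uploading under the same name replaces the original object.
    new_blob = bucket.blob(object_name)
    new_blob.content_encoding = "gzip"
    new_blob.upload_from_string(compressed, content_type=blob.content_type)
```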

Upload files to Google Cloud Storage without downloading them locally?

I want to upload 120 files, each around 1.2GB so about 150GB in total, from an HTTPS website onto my Google Cloud Storage.
I really, really don't want to have to download them all locally, and then upload them individually.
Is there any way around this? Surely I can just give Google Cloud Storage a URL to pull from? I don't control the HTTPS server.
It seems to be possible to upload from S3 to Google Cloud Storage, but S3 seems to suffer from the same problem.
If the website allows public access to the files, you can use the GCS Transfer Service to do it: https://cloud.google.com/storage/transfer/
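For illustration, a sketch of such a transfer using a URL list with the Python Storage Transfer client; the project, bucket, and list URL are placeholders, and the list itself is a publicly readable TSV file whose first line is "TsvHttpData-1.0" followed by one source URL per line:

```python
from datetime import datetime, timezone

from google.cloud import storage_transfer

def transfer_from_url_list(project_id: str, list_url: str, sink_bucket: str):
    """One-time transfer of publicly reachable HTTPS URLs into a GCS bucket.

    list_url must point to a publicly readable TSV manifest in
    TsvHttpData-1.0 format. All names here are placeholders.
    """
    client = storage_transfer.StorageTransferServiceClient()

    today = datetime.now(timezone.utc)
    one_time = {"day": today.day, "month": today.month, "year": today.year}

    transfer_job = {
        "project_id": project_id,
        "status": storage_transfer.TransferJob.Status.ENABLED,
        "schedule": {
            "schedule_start_date": one_time,
            "schedule_end_date": one_time,
        },
        "transfer_spec": {
            "http_data_source": {"list_url": list_url},
            "gcs_data_sink": {"bucket_name": sink_bucket},
        },
    }

    result = client.create_transfer_job({"transfer_job": transfer_job})
    print(f"Created transfer job: {result.name}")
```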