How to delete uploaded files from Google Cloud using the Slingshot package - google-cloud-storage

After successfully uploading files to Google Cloud Storage using the Slingshot package, I couldn't find documentation on how to delete files from Google Cloud through my application. Is there any workaround? Thanks.
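Slingshot only manages the upload itself; removing an object afterwards has to go through the storage API directly. As an illustration (not from Slingshot's docs), a minimal sketch of a server-side delete with Google's Python client library, assuming the bucket name is known and the object key was saved at upload time:

from google.cloud import storage

client = storage.Client()  # authenticates via GOOGLE_APPLICATION_CREDENTIALS
bucket = client.bucket("my-app-uploads")      # hypothetical bucket name
blob = bucket.blob("uploads/photo-1234.png")  # key Slingshot generated at upload time
blob.delete()  # permanently removes the object from the bucket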

Related

Get files from a Google Cloud virtual machine to a Google Colaboratory notebook

I have files in a virtual machine hosted in Google Cloud, and I want to be able to access them in Google Colab to run Selenium.
Should I send the files to Google Cloud Storage? I've found a tutorial that shows how to access Google Cloud Storage files in Colab notebooks, so it seems that would then be possible.
You can use many methods to upload the files from your Compute Engine machine, for example uploading them to Google Drive or Google Cloud Storage. This document explains it well. Note that your Compute Engine machine wouldn't be strictly "local" to you, but it would act the same way in that context.
With Cloud Storage, you can use gsutil to upload the files, with this command:
gsutil cp $file_to_upload gs://$bucket_name
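On the Colab side, a minimal sketch of reading the uploaded file back, assuming a bucket named my-bucket and a file data.csv (both placeholders):

from google.colab import auth
auth.authenticate_user()  # opens an OAuth prompt for your Google account

# copy the object into the Colab VM's local filesystem
!gsutil cp gs://my-bucket/data.csv /content/data.csv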

How to download a text-to-speech file created in Watson Studio

I am using Text to Speech in Watson Studio. The output file is '.wav'. Does anyone have any idea where the file is stored? I want to download it from IBM Cloud to my PC. How should I do this? I have searched the entire cloud storage but couldn't find the speech file.
When running the TTS API from within Watson Studio on Cloud notebooks, the files you write go to the underlying Python runtime container's filesystem, which is not persistent.
So, you would have to explicitly copy that file to Cloud Object Storage.
An easy way to do that in WSC is to use the project_lib API (see https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/project-lib-python.html), which will let you create a Data Asset in your project.
You could also use the COS Client API https://ibm.github.io/ibm-cos-sdk-python/reference/services/s3.html#S3.Client.upload_file to copy that file to an arbitrary bucket that you have access to.
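For instance, a minimal sketch of the project_lib route, assuming you have inserted a project token into the notebook (the project ID, token, and file name below are placeholders):

from project_lib import Project

# project_id and project_access_token come from the project's Settings page
project = Project(project_id='<project-id>', project_access_token='<access-token>')

# read the .wav file from the container's filesystem and save it as a Data Asset
with open('speech.wav', 'rb') as f:
    project.save_data('speech.wav', f.read(), overwrite=True)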
Regards,
Philippe Gregoire - IBM Ecosystem Advocacy Group - Data&AI Technical enablement

Auto-upload remote files into Google Cloud Storage via FTP?

I download a lot of CSV files via FTP from different sources on a daily basis. I then upload these files into Google Cloud Storage.
Are there any programs/APIs/tools to automate this?
I'm looking for the best way, if possible, to load these files directly into Google Cloud Storage without having to download them locally. Something I can deploy on Google Compute Engine, so I don't need to run local programs like FileZilla/CrossFTP. The program/tool would keep checking the remote location on a regular basis and load new files into Google Cloud Storage, ensuring a checksum match.
I apologize in advance if this is too vague/generic a question.
Sorry, no. Automatically importing objects from a remote FTP server is not currently a feature of GCS.
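That said, you can run your own poller on a Compute Engine VM so nothing touches your local machine. A rough sketch using Python's ftplib and the google-cloud-storage client; the host, credentials, bucket, and .csv filter are all placeholders, and GCS itself validates an MD5/CRC32C checksum on each upload:

import ftplib
import io

from google.cloud import storage

FTP_HOST = "ftp.example.com"  # placeholder remote source
REMOTE_DIR = "/exports"
BUCKET = "my-ingest-bucket"   # placeholder bucket

def sync():
    bucket = storage.Client().bucket(BUCKET)
    ftp = ftplib.FTP(FTP_HOST)
    ftp.login("user", "password")
    ftp.cwd(REMOTE_DIR)
    for name in ftp.nlst():
        if not name.endswith(".csv"):
            continue
        blob = bucket.blob(name)
        if blob.exists():  # skip files that were already copied
            continue
        buf = io.BytesIO()
        ftp.retrbinary("RETR " + name, buf.write)  # stream into memory, not disk
        buf.seek(0)
        blob.upload_from_file(buf)
    ftp.quit()

if __name__ == "__main__":
    sync()  # run from cron on the VM for regular polling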

How to add Google Cloud Storage to existing project

I am trying to get my brain wrapped around Google services. I have an existing Google App Engine app running and would like to add Google Cloud Storage to the project.
Any help appreciated.
Regards
Have a look at the following documentation and see if that helps:
Activate Google Cloud Storage
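Once it's activated, a minimal sketch of writing to the app's default bucket with the google-cloud-storage client; the bucket name follows App Engine's <project-id>.appspot.com pattern, and the object path is a placeholder:

from google.cloud import storage

def write_greeting():
    client = storage.Client()  # uses the App Engine service account by default
    bucket = client.bucket("your-project-id.appspot.com")  # default bucket
    blob = bucket.blob("greetings/hello.txt")              # placeholder path
    blob.upload_from_string("Hello from App Engine!")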

Can't get Google Cloud Storage working on CKAN 2.1a

I installed CKAN from source and am trying to activate the Cloud Filestore option, without success.
I double-checked my Google APIs console and activated interoperable access keys (GOOG...), to no avail. I keep getting "Unable to Upload File" when I try to upload.
I couldn't get it to work, so I just switched to S3, which was more straightforward. The recent S3 price drops helped too :)