How to download a text to speech file created in Watson Studio - ibm-cloud

I am using Text to Speech in Watson Studio. The output file is a '.wav'. Does anyone have any idea where the file is stored? I want to download it from IBM Cloud to my PC. How should I do this? I have searched the entire cloud storage, but couldn't find the speech file.

When running the TTS API from within Watson Studio on Cloud notebooks, the files you write go to the underlying Python runtime container's filesystem, which is not persistent.
So, you would have to explicitly copy that file to Cloud Object Storage.
An easy way to do that in WSC is to use the project_lib API (see https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/project-lib-python.html), which will let you create a Data Asset in your project.
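For illustration, a minimal sketch of that approach, assuming you have inserted a project access token into the notebook (which provides the project id and token used below) and that the synthesized audio was written locally as speech.wav (a placeholder name):

    # Sketch: store a locally written .wav file as a project Data Asset.
    # Assumes the notebook's "Insert project token" cell has been run,
    # providing `project_id` and `project_access_token`.
    from project_lib import Project

    project = Project(None, project_id, project_access_token)

    with open("speech.wav", "rb") as f:
        # save_data writes the bytes to the project's Cloud Object Storage
        # bucket and registers them as a Data Asset.
        project.save_data("speech.wav", f.read(), overwrite=True)

The asset then shows up under the project's Assets tab, from where it can be downloaded to your PC.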
You could also use the COS Client API https://ibm.github.io/ibm-cos-sdk-python/reference/services/s3.html#S3.Client.upload_file to copy that file to an arbitrary bucket that you have access to.
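A corresponding sketch using the COS client, where the API key, service instance ID, endpoint, and bucket name are placeholders for your own service credentials:

    # Sketch: copy the local .wav file into an arbitrary COS bucket you own.
    import ibm_boto3
    from ibm_botocore.client import Config

    cos = ibm_boto3.client(
        "s3",
        ibm_api_key_id="YOUR_COS_API_KEY",
        ibm_service_instance_id="YOUR_COS_RESOURCE_INSTANCE_CRN",
        config=Config(signature_version="oauth"),
        endpoint_url="https://s3.us-south.cloud-object-storage.appdomain.cloud",
    )

    # upload_file(Filename, Bucket, Key) streams the local file to the bucket.
    cos.upload_file("speech.wav", "my-bucket", "speech.wav")

From there the object can be downloaded through the COS console or via a presigned URL.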
Regards,
Philippe Gregoire - IBM Ecosystem Advocacy Group - Data&AI Technical enablement

Related

How to edit files directly on Google Cloud Storage using VS Code?

Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
We have a GCS bucket with about 20 config files that we edit for various reasons. We would really just like to load the bucket into VS Code and then browse between updating the files and saving edits.
I have tried the vscode-bucket-explorer extension for VS Code but this just seems to provide viewing capability with no editing/saving capability. Unless I am missing something?
Is there a way to mount a bucket as a drive on a Mac? With read/write ability?
Is there a way to directly load / edit / save files to a given bucket in Google Cloud Storage without having to download the file, edit it, and then upload it again?
No, objects (blobs) in Google Cloud Storage cannot be edited in place.
As with buckets, existing objects cannot be directly renamed. Instead, you can copy an object, give the copied version the desired name, and delete the original version of the object. See Renaming an object for a step-by-step guide, including instructions for tools like gsutil and the Google Cloud Console, which handle the renaming process automatically.
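For example, that copy-then-delete pattern looks roughly like this with the google-cloud-storage Python client (bucket and object names are placeholders):

    # Sketch: "rename" an object by copying it under a new name and
    # deleting the original; names are placeholders.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-config-bucket")

    blob = bucket.blob("old-name.yaml")
    bucket.copy_blob(blob, bucket, "new-name.yaml")  # copy under the new name
    blob.delete()                                    # remove the original

    # The client also wraps both steps in one call:
    # bucket.rename_blob(blob, "new-name.yaml")

Editing works the same way: download the object, change it locally, and upload the whole object again.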
Is there a way to mount a bucket as a drive on a Mac? With read/write ability?
You can use Cloud Storage FUSE where the mounted bucket will behave similarly to a persistent disk.
Cloud Storage FUSE is an open source FUSE adapter that allows you to mount Cloud Storage buckets as file systems on Linux or macOS systems. It also provides a way for applications to upload and download Cloud Storage objects using standard file system semantics. Cloud Storage FUSE can be run anywhere with connectivity to Cloud Storage, including Google Compute Engine VMs or on-premises systems.
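As a rough sketch of what the mounted bucket looks like from code (the bucket name and mount point are placeholders, and gcsfuse plus application-default credentials are assumed to be set up already):

    # Sketch: mount a bucket with gcsfuse, then use ordinary file I/O.
    import subprocess
    from pathlib import Path

    BUCKET = "my-config-bucket"            # placeholder
    MOUNT_POINT = Path.home() / "gcs"      # placeholder

    MOUNT_POINT.mkdir(parents=True, exist_ok=True)
    subprocess.run(["gcsfuse", BUCKET, str(MOUNT_POINT)], check=True)

    # Objects in the bucket now behave like regular files.
    config_path = MOUNT_POINT / "app.yaml"
    text = config_path.read_text()
    config_path.write_text(text.replace("env: prod", "env: staging"))

    # Unmount when finished (fusermount -u on Linux, umount on macOS).
    subprocess.run(["umount", str(MOUNT_POINT)], check=True)

Keep in mind that a write through gcsfuse still rewrites the whole object in the bucket.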

Google Cloud Container Builder from Cloud Source Repository

Is it possible to build a docker container using Google Cloud Container Builder from source code in Google Cloud Source Repository?
The docs say the code must be in Cloud Storage, so I assume the answer is no, but this seems crazy. Am I missing something? Is code in Google Cloud Source Repository accessible via Cloud Storage?
This feature is now publicly supported, see API docs at https://cloud.google.com/container-builder/docs/api/reference/rest/v1/projects.builds#RepoSource
Let us know if you have any problems or feature requests or use cases this doesn't cover.
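As a rough illustration of that RepoSource field, a build submitted from a Cloud Source Repository through the Python API client could look like this (project, repository, branch, and image names are placeholders, and application-default credentials are assumed):

    # Sketch: trigger a Container Builder (Cloud Build) job whose source
    # is a Cloud Source Repository, via the v1 REST API's RepoSource field.
    from googleapiclient.discovery import build

    cloudbuild = build("cloudbuild", "v1")  # uses application-default credentials

    build_body = {
        "source": {
            "repoSource": {
                "projectId": "my-project",
                "repoName": "my-repo",
                "branchName": "master",
            }
        },
        "steps": [
            {
                "name": "gcr.io/cloud-builders/docker",
                "args": ["build", "-t", "gcr.io/my-project/my-image", "."],
            }
        ],
        "images": ["gcr.io/my-project/my-image"],
    }

    op = cloudbuild.projects().builds().create(
        projectId="my-project", body=build_body
    ).execute()
    print(op["metadata"]["build"]["id"])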
Unfortunately the API is fairly low level, and so building from GSR or other source control systems directly is not currently possible.
However it is possible to write a service which can watch for source code changes from your favorite SCM, copy that source into a GCS bucket (handling your SCM auth as necessary) and then trigger the Container Builder API to build an image.
Google is running an Alpha program for additional tools that are built on top of this API. Those who are interested are encouraged to sign up here.

How to delete uploaded files from Google Cloud using the Slingshot package

After successfully uploading files to Google Cloud Storage using the Slingshot package, I didn't find documentation about how to delete files from Google Cloud through my application. Is there any idea for a workaround? Thanks.
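Slingshot only handles the direct upload itself, so as far as I know deletion has to go through the storage provider's own API. As a storage-side sketch (not Slingshot-specific), deleting an uploaded object with the google-cloud-storage Python client, with bucket and key as placeholders:

    # Sketch: delete an uploaded object server-side; the bucket name and
    # object key are placeholders for the values Slingshot uploaded to.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-uploads-bucket")
    bucket.blob("uploads/photo-123.png").delete()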

Auto upload remote files into Google cloud storage via FTP?

I download a lot of csv files via ftp from different sources on a daily basis. I then upload these files into Google Cloud Storage.
Are there any programs/api/tools to automate this?
I'm looking for the best way, if possible, to load these files directly into Google Cloud Storage without having to download them locally. Something that I can deploy on Google Compute Engine, so I don't need to run local programs like FileZilla/CrossFTP. The program/tool will keep checking the remote location on a regular basis and load new files into Google Cloud Storage, ensuring a checksum match.
I apologize in advance if this is too vague/generic question.
Sorry, no. Automatically importing objects from a remote FTP server is not currently a feature of GCS.
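That said, the workflow described in the question can be hand-rolled with a small script running on a Compute Engine VM (e.g. from cron). A rough sketch, with host, credentials, and bucket names as placeholders and the checksum verification left out:

    # Sketch: poll an FTP server and copy new CSV files into a GCS bucket
    # without touching local disk; run periodically on a Compute Engine VM.
    import io
    from ftplib import FTP

    from google.cloud import storage

    FTP_HOST, FTP_USER, FTP_PASS = "ftp.example.com", "user", "password"  # placeholders
    BUCKET = "my-ingest-bucket"                                           # placeholder

    bucket = storage.Client().bucket(BUCKET)

    ftp = FTP(FTP_HOST)
    ftp.login(FTP_USER, FTP_PASS)

    for name in ftp.nlst():
        if not name.endswith(".csv"):
            continue
        blob = bucket.blob(name)
        if blob.exists():                           # skip files already copied over
            continue
        buf = io.BytesIO()
        ftp.retrbinary(f"RETR {name}", buf.write)   # stream the file from FTP
        buf.seek(0)
        blob.upload_from_file(buf)                  # write it into the bucket

    ftp.quit()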

How to connect to Google Storage with Bucket Explorer

I generated an Access & Secret Key and put it into Bucket Explorer just as I saw on their screenshots, but I get an error that the keys are not recognized on AWS, so it keeps trying to connect to AWS. I have the latest version for OS X.
Looking at the Bucket Explorer website, it sounds like you need a special version of their software; however, the download link does not actually seem to have that version available. Have you contacted the developers of Bucket Explorer about it?
Alternatively, you can use the Google Cloud Console to upload and access your data in Google Cloud Storage without depending on a particular client application.