Cannot find audio file in google bucket with google speech API - google-cloud-storage

With the Google Speech API (using the Python sample code), audio files longer than one minute need to be stored in Google Cloud Storage. According to some sample code, you can use a path like
gs://python-docs-samples-tests/speech/audio.flac.
So I put my audio files in a bucket and use (I believe) the correct path (i.e. gs://bucket-name/foldername/myaudiofile.wav), yet I get an error:
NotFound: 404 No such object: bucket-name/foldername/myaudiofile.wav
Even if I make the file public (which I would rather not do), it cannot be found. I have the feeling I am forgetting something very trivial here, but I still haven't found it.

Go to the Cloud Console > select the project > go to Storage > browse the buckets, and make sure the file is actually there. That's the best way to tell, IMO.
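
As a quick sanity check from code, you can also ask the storage client directly whether an object exists at exactly that path. A minimal sketch, assuming the bucket/object names from the question, default application credentials, and the google-cloud-storage Python package:

# Check whether the object the Speech API is being pointed at actually exists.
# 'bucket-name' and 'foldername/myaudiofile.wav' are the names from the question.
from google.cloud import storage

client = storage.Client()
blob = client.bucket('bucket-name').blob('foldername/myaudiofile.wav')
print(blob.exists())  # False here would explain the 404 from the Speech API

If exists() returns True but the Speech API still reports 404, the credentials the Speech client runs under may not be the ones used here, which is worth ruling out.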

Related

Get names of files contained in Dropbox directory

I need to read a large number of files from a permanent Dropbox webpage.
The reading part is fine, but I'm having trouble finding a way to list all the files contained within.
To be more precise, I would need something like
files = dir([url_to_Dropbox_directory,'*.file_extension']);
returning the names of all the files.
I've found an example in PHP, but nothing for MATLAB. Using dir was just an example; I'm looking for any solution to this problem.
How can I get the file list from a permanent Dropbox webpage?
You should use the Dropbox API, where you can access that data via an HTTP request. files/list_folder is the specific endpoint you are looking for.
Here is the documentation for it:
https://www.dropbox.com/developers/documentation/http/documentation#files-list_folder
In addition, you could use the SDK for PHP (or other programming languages). I have used the JavaScript SDK, and it's easy to use and works well.
PHP SDK:
https://dropbox.github.io/dropbox-sdk-php/api-docs/v1.1.x/
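
If an SDK is more than you need, the endpoint can also be called with a plain HTTP POST; MATLAB's webwrite should be able to issue the same request. A minimal Python sketch, where the access token and folder path are placeholders:

# List a Dropbox folder via the v2 files/list_folder endpoint.
import json
import urllib.request

req = urllib.request.Request(
    'https://api.dropboxapi.com/2/files/list_folder',
    data=json.dumps({'path': '/my-folder'}).encode('utf-8'),
    headers={'Authorization': 'Bearer <ACCESS_TOKEN>',
             'Content-Type': 'application/json'})
with urllib.request.urlopen(req) as resp:
    entries = json.load(resp)['entries']

# Each entry carries a 'name' field, roughly what dir() would have returned.
names = [e['name'] for e in entries]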

gsutil no urls matched, yet appears in cloud storage browser

I'm attempting to retrieve a database backup we've put into Cloud Storage. To make a long story short, the URL is gs://servername/year/date/data.sql
It's a little more complicated than that, but for the sake of this question, it'll do.
Anyway, when I use the Storage Browser (Projects -> Storage -> Storage Browser), I can get into /servername/2014/2014-09-04/ - but this is where things get weird.
The subfolders/directories phase in and out of existence, but only for this date. I can go in and out of the 2014-09-04 subfolder all day, and it will show different results every time: sometimes the incremental data is there, sometimes only the schema data. Trying to download any file from the Storage Browser gives a blank "Not Found" error page - no links, no HTTP response codes, just "Not Found". All our older dated folders are fine.
If I use gsutil to attempt to retrieve the entire subfolder, it says:
CommandException: No URLs matched: gs://servername/2014/2014-09-04
The command I ran was:
gsutil.py mv gs://servername/2014/2014-09-04 c:\dbrestore\
Yet there it is in the Storage Browser, clear as day. (There is only one ACL, so I know that's not the problem.) To make sure I wasn't doing something funky, I copied the surrounding dates without trouble: 2014-09-03 and 2014-09-05 are both completely accessible from the Storage Browser and gsutil.
I am out of ideas as to what could be wrong. Frankly, something about the bucket looks stuffed. Has anyone run into this problem before, and if so, what did you do to go about correcting it?
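
For what it's worth, two diagnostics usually worth running first: list the prefix recursively to see what gsutil itself can enumerate, and retry the move with an explicit trailing slash, since gs:// has no real directories and a bare prefix is only treated as one when gsutil can infer it. Paths below are the ones from the question:

gsutil ls -r gs://servername/2014/2014-09-04/
gsutil.py mv gs://servername/2014/2014-09-04/ c:\dbrestore\

If the recursive listing comes back empty while the Storage Browser still shows files, that points at a listing inconsistency rather than at the command.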

Which is the best method to get a local file URI and save it online?

I'm working on a web project, but the scenario has some restrictions for a specific use case. We have been investigating both a web-only solution and a Dropbox-like native approach.
The main restriction is that we must not upload local files to the cloud. We can only track local URIs.
The use cases are:
As a developer, I should be able to link the URI of a local file to a web app element, so that clicking the element opens the local file.
As a user, I should be able to add a directory and see the same structure in the web app (clicking opens the file). The files are never uploaded.
Possible solutions:
We started with the FileSystem API, but once the specs were fully defined we realized that a local sandbox is not enough: we can't access the local URI due to security restrictions.
We are considering a Dropbox-like native app. The InVision Sync app is the closest to what we want.
The least optimal solution would be a complete native application.
The question:
Which is the most efficient way to achieve this? Any ideas on native libraries that would make this faster? Any web-only workaround?
Thanks in advance.

Download / upload file using the Add-On SDK

I am currently trying to download a small binary file from the web in order to upload it to another website, both via the API.
Previous versions seemed to have the "file" API module for such purposes, but I can't see anything similar in the latest version (1.14).
The file to be downloaded would be saved in some form of cache (browser cache, preferably) and its path stored somewhere, so it can then be uploaded to another URL via POST.
How would I go about it, when the process should happen completely in the background?
I checked out the how to download a file page, but I can't figure out where to download to.
Is there a variable URI for the "Downloads" directory, and does a regular add-on have write privileges in it?
This is important, because the add-on must be able to function properly on various platforms.
You can use the browser.download.lastDir pref, which should work for Windows/Mac since the value is saved in the OS's path format. However, the pref may not be set if the person has never downloaded anything before; in that case you'll have to build the directory path yourself.
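// May be undefined if the user has never downloaded anything before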
var dir = require("sdk/preferences/service").get('browser.download.lastDir');
To build the directory path yourself you're going to have to go a little deeper. Check the MDN article on File I/O, which has examples. The DfltDwnld key should give you the directory you want.
Your add-on will have write permissions to everything Firefox has write permission to.

google cloud storage sample not working

Is anyone else having an issue getting the Google Cloud Storage API sample to work?
https://developers.google.com/appengine/docs/python/googlestorage/overview#Complete_Sample_App
I followed all the directions, which are very straightforward: you simply paste in the code from the sample and it should work. However, you do have to update the bucket names. I am updating this line
# TODO: Change to a bucket your app can write to.
READ_PATH = '/gs/bucket/obj'
to
READ_PATH = 'gs://mybucketname'
but it does not work.
I updated it that way because that's how I access my bucket via gsutil.
Anyone got this to work?
In the Files API, the path does not follow the gs:// URL scheme. As the example states, you need to make it:
/gs/mybucketname/myobjectname
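
For completeness, a minimal sketch of the read side, using the (now long-deprecated) App Engine Files API that the linked sample is built on; the bucket and object names are placeholders:

from __future__ import with_statement
from google.appengine.api import files

# Note the /gs/ prefix: Files API paths, not gs:// URLs.
READ_PATH = '/gs/mybucketname/myobjectname'

with files.open(READ_PATH, 'r') as fp:
    data = fp.read(1000000)  # read up to 1,000,000 bytes of the object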