Audio encoding, sample rate, and re-encoding in Google Cloud

Is it possible to look up the audio metadata for a file stored in Google Cloud without having to download it? When building a Google Speech-to-Text API service you pass it a gs://bucket/file.flac, and I know the sox and ffmpeg bash and Python commands for looking up the metadata of locally stored files, but I can't seem to figure out a way to look up audio file metadata for a Google Cloud Storage file.
Additionally, if I have a gs://bucket/audio.wav, can I re-encode it using sox/py-sox and write the new audio.flac directly to gs://bucket/audio.flac? Or do I have to download the audio.wav to re-encode it?
Any thoughts or directions appreciated.

No, it is not possible to access the metadata you want directly in Google Cloud Storage. Running the command gsutil ls -L gs://[bucket_name]/[file_name] will print the metadata of that file within the bucket. You can modify some of this metadata, but not the audio properties you are referring to. You will need to download the files, re-encode them and upload them again.
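For reference, the same object metadata is also available programmatically. Here is a minimal sketch with the google-cloud-storage Python client (the bucket and file names are placeholders); note that it only exposes storage-level fields such as size and content type, never audio properties like the sample rate:

from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").get_blob("file.flac")

# Storage-level metadata only; audio properties are not recorded here.
print(blob.size)          # object size in bytes
print(blob.content_type)  # e.g. "audio/flac"
print(blob.metadata)      # custom key/value metadata, or None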
You cannot do that re-encoding operation inside Cloud Storage either; you will need to download the file and process it the way you want before uploading it again to your bucket. However, here is a workaround that may work for you:
Create a Cloud Function triggered when your file is uploaded. Then retrieve the file that you just uploaded and perform any operation you want on it (such as re-encoding it into .flac). After that, upload it again (careful: if you give the new file the same name and extension, it will overwrite the old one in the bucket).
As for your library: Cloud Functions run Python 3.7, which for the time being does not support the py-sox library, so you will need to find another one.
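A minimal sketch of such a function, assuming an ffmpeg binary is bundled with the function's source (the function name and the bucket layout here are illustrative, not the only way to do it):

import os
import subprocess
from google.cloud import storage

client = storage.Client()

def reencode_to_flac(event, context):
    # Background function triggered by google.storage.object.finalize.
    name = event["name"]
    if not name.endswith(".wav"):
        return  # skip the .flac we upload below, avoiding a re-trigger loop

    bucket = client.bucket(event["bucket"])
    local_wav = os.path.join("/tmp", os.path.basename(name))
    local_flac = os.path.splitext(local_wav)[0] + ".flac"

    # Download, re-encode locally, then upload the result.
    bucket.blob(name).download_to_filename(local_wav)
    subprocess.run(["./ffmpeg", "-y", "-i", local_wav, local_flac], check=True)
    bucket.blob(os.path.splitext(name)[0] + ".flac").upload_from_filename(local_flac)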

Related

Data Fusion: GCS create creating folders not object

I am trying to create a GCS object (file) with the GCS Create plugin of Data Fusion,
but it is creating a folder instead.
How can I have a file created instead of a folder?
It seems that the description of the plugin leads to a misunderstanding. Cloud Storage doesn't work like a conventional filesystem, so you cannot "strictly" create empty files. The gsutil command doesn't have an equivalent to the touch command (on Linux), and the "basic" operations in this product are limited to the cp command (uploading and downloading files).
Therefore, since there is no file when you specify the storage URL, it's expected that a folder will be created instead of a file.
Based on this, I would like to suggest two workarounds:
If you are using this plugin to create a file as a ‘flag’, you can keep using it, since the created folder also serves as a flag (to trigger a Cloud Function, for example; see the sketch after this list).
If you need to create an actual file, you can use the GCS plugin located in the ‘Sink’ plugins group to write records to one or more files in a directory on Google Cloud Storage. Files can be written in various formats such as csv, avro, parquet, and json.
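For the first workaround, here is a minimal sketch of a Cloud Function reacting to the created "folder" (Cloud Storage typically represents such folders as zero-byte placeholder objects whose names end in a slash; the function name is hypothetical):

def on_flag(event, context):
    # Triggered by google.storage.object.finalize on the bucket.
    # The "folder" arrives as a zero-byte object ending in "/".
    if event["name"].endswith("/"):
        print("Flag created: gs://{}/{}".format(event["bucket"], event["name"]))
        # ...start whatever work the flag is supposed to trigger here.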

Using Azure Blob Storage for installing iOS apps

I am trying to set up a private app store using Azure. I'm trying to use Azure Blob Storage for storing .ipa and .plist files.
Here is the reference: http://gknops.github.io/adHocGenerate/
I am able to upload files to Azure Blob Storage and the files are available in the container with container access (I am able to browse to the file on my mobile).
Now when I try to use itms-services and open the .plist file's hyperlink, it throws "cannot connect to xxxxxxx.blob.core.windows.net". But when I open the links to my ipa and plist directly, they work.
All the URLs (html, ipa and plist) are https only, and I am using a mobile network, so there are no network restrictions.
For example, I upload the files as myFile_v2-1-6.ipa and myFile_v2-1-6.plist, and I refer to the plist in my HTML as:
<a href="itms-services://?action=download-manifest&url=https://xxxx.blob.core.windows.net/xxxxxxx/myFile_v2-1-6.plist">
Can someone help me with this issue? Your help will be very much appreciated.
EDIT:
My HTML is hosted at https://xxxxxxxxx.azurewebsites.net
Azure Blob Storage was working fine; the problem was with my plist file.
I was editing it using XmlDocument and saving it, and XmlDocument was adding brackets [] to the DTD element, which corrupted the plist file and caused this error.
It took some time to understand this. I am now removing the [] using a string replace before using the plist file, and app installation is working fine.

Google Compute Startup Script PHP Files From Bucket

I'd like to automatically load a folder full of PHP files from a bucket when an instance starts up. My PHP files are normally located at /var/www/html.
How do I write a startup script for this?
I think this would be enormously useful for people such as myself who are trying to deploy autoscaling but don't want to create a new image with their PHP files every time they deploy changes. It would also be useful as a way of keeping a live backup on Cloud Storage.
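No answer was recorded for this question, but as a hedged sketch: a startup script usually just syncs the bucket into the web root (for example with gsutil rsync). If you would rather do it from Python, something like the following could be run at boot, assuming the google-cloud-storage client is installed on the image and using a hypothetical bucket name:

import os
from google.cloud import storage

client = storage.Client()
for blob in client.list_blobs("my-php-bucket"):
    if blob.name.endswith("/"):
        continue  # skip folder placeholder objects
    dest = os.path.join("/var/www/html", blob.name)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    blob.download_to_filename(dest)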

Fetching .kmz files from local disk and importing them into Google Earth via API

I need to import 3D models of every building in New York City into the Google Earth API. I fetched their .kmz files using google.earth.fetchKml. Since this command uses the URL of the kmz files and the number of files to import is large, it is very slow.
Is there any way I can fetch these files from my local disk?
Are there other formats I can use instead of .kmz? For example .dae files?
You cannot use a local file (non-HTTP) URL to fetch your KML data.
But you could run a local webserver and use that.
For example, if you have Python installed you could go to the directory with your KML files and run "python -m SimpleHTTPServer 8000" (or "python3 -m http.server 8000" on Python 3), at which point pointing to http://localhost:8000/myfile.kml would load them up.
That said, you should also note that the terms of use for the plugin require your site to be publicly available, amongst other things - so hopefully you are only using this setup for local testing :)

cURL ftp transfer scenario

I'm trying to automate uploading and downloading from an FTP site using cURL inside Matlab, but I'm having difficulties. Essentially I want one computer continuously uploading new files to the FTP site, yet since there is a disk quota on the site, I want another computer continuously downloading and removing those same files from it.
Easy enough, but my problem arises from wanting to make sure that I don't download a file that is still being uploaded, thereby resulting in an incomplete file.
First off, is there a way in cURL to make it so that the file wouldn't be available for download from the ftp site until the entire file has been uploaded?
One way around this is that I could upload files to one directory, and once they are finished uploading, then I could transfer them to a "Finished" directory on the ftp site. Then the download program would only look for files inside that "Finished" directory. However, I don't know how to transfer files within an ftp site using cURL.
Is it possible to transfer files between directories on an ftp site using cURL without having to download the file first?
And if anyone else has better ideas on how to perform this task, I'd love to hear em!
Thanks!
You can upload each file under a special name and rename it when done, and have the download client only download files with the special "upload completed" name style.
Or you can move them between directories, just as you say (which is essentially a rename as well, just changing the directory too).
With the command-line curl you can perform "raw" commands after the upload with the -Q option, and you can even find a tiny example in the curl FAQ: http://curl.haxx.se/docs/faq.html#Can_I_use_curl_to_delete_rename
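The answer above is about curl, but to illustrate the same upload-then-rename pattern, here is a minimal Python ftplib sketch (host, credentials and filenames are placeholders; Matlab could invoke this, or the equivalent curl -Q commands, through a system call):

from ftplib import FTP

ftp = FTP("ftp.example.com")
ftp.login("user", "password")

# Upload under a temporary name that the downloader is told to ignore...
with open("data.bin", "rb") as f:
    ftp.storbinary("STOR data.bin.part", f)

# ...then rename it only once the transfer has fully completed, so the
# downloader never sees a partial file.
ftp.rename("data.bin.part", "data.bin")
ftp.quit()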