Azure Media Player: how to get the source URL for the player - azure-media-services

I have an Azure Media Services account with some uploaded videos, but these videos only play in the browser with some additional query parameters appended (?sv=2017-04-17&sr=c&sig=QMSr...), which look like authentication keys (a SAS token).
I want a generic, permanent progressive video URL that can be played at any time. I tried to use the Azure Media Player with my video URLs ending in .ism/manifest and .mp4, but neither could be played, e.g.:
https://<MY_BLOBSTORAGE_ACCOUNT>.blob.core.windows.net/<Asset-ID>/<Video_FILE_NAME>.ism/manifest
https://<MY_BLOBSTORAGE_ACCOUNT>.blob.core.windows.net/<Asset-ID>/<Video_FILE_NAME>_1280x720_AACAudio.mp4
I have tried the player from the official Microsoft documentation:
http://amp.azure.net/libs/amp/latest/docs/index.html#full-setup-of-azure-media-player
Also note that the Azure Media Services v3 documentation and the AMS community itself are weak when it comes to explaining how to programmatically get video URLs for the player.

With AMS v3, you will have to create a streaming locator, and you can use a predefined streaming policy. There are policies for
- streaming only
- streaming and download
- download only
With the download policy, you will get a URL for each blob in the asset. For example:
https://myaccount-uswc.streaming.media.azure.net/1544fcae-a248-4f53-b653-cd02074b04b6/video_848x480_2200.mp4
With a streaming policy (recommended), you will get DASH, HLS, and Smooth Streaming URLs, like:
https://myaccount-uswc.streaming.media.azure.net/0eef6f88-47c6-4662-9111-60305d7c1000/video.ism/manifest(format=mpd-time-csf).mpd
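For reference, the same streaming locator is typically reachable over all three protocols through dynamic packaging; the format suffixes below are the commonly documented ones (host and locator ID are placeholders):
https://myaccount-uswc.streaming.media.azure.net/<locator-id>/video.ism/manifest (Smooth Streaming)
https://myaccount-uswc.streaming.media.azure.net/<locator-id>/video.ism/manifest(format=m3u8-aapl) (HLS)
https://myaccount-uswc.streaming.media.azure.net/<locator-id>/video.ism/manifest(format=mpd-time-csf) (DASH)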

It appears that you're mixing up progressive download and streaming. I wrote a blog post on the differences as they relate to Azure Media Services: https://blogs.msdn.microsoft.com/randomnumber/2016/03/23/progressive-download-and-streaming-differences-with-azure-media-services/. If you encoded the video into an adaptive-bitrate MP4 set, then more than likely you'll want to stream the video instead of using progressive download of a single MP4. This might help with the streaming side: https://learn.microsoft.com/en-us/azure/media-services/latest/dynamic-packaging-overview

I found the solution with the help of a friend: after creating the streaming locator, I have to make sure that the streaming endpoint is running, and then build the URLs by looping over the paths returned by StreamingLocators.ListPathsAsync. Below is the code snippet.
// Requires the AMS v3 .NET SDK: Microsoft.Azure.Management.Media
private async Task<StreamingLocator> CreateStreamingLocatorAsync(
    IAzureMediaServicesClient client,
    string resourceGroup,
    string accountName,
    string assetName,
    string locatorName)
{
    // Create a locator on the asset with the predefined clear (unencrypted) streaming policy.
    StreamingLocator locator = await client.StreamingLocators.CreateAsync(
        resourceGroup,
        accountName,
        locatorName,
        new StreamingLocator
        {
            AssetName = assetName,
            StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
        });

    return locator;
}

private async Task<IList<string>> GetStreamingUrlsAsync(
    IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string locatorName)
{
    const string DefaultStreamingEndpointName = "default";
    IList<string> streamingUrls = new List<string>();

    // The streaming endpoint must be running, otherwise the URLs won't resolve.
    StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(
        resourceGroupName, accountName, DefaultStreamingEndpointName);
    if (streamingEndpoint != null &&
        streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
    {
        await client.StreamingEndpoints.StartAsync(
            resourceGroupName, accountName, DefaultStreamingEndpointName);
    }

    // One streaming path is returned per protocol (Smooth, DASH, HLS).
    ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(
        resourceGroupName, accountName, locatorName);
    foreach (StreamingPath path in paths.StreamingPaths)
    {
        var uriBuilder = new UriBuilder
        {
            Scheme = "https",
            Host = streamingEndpoint.HostName,
            Path = path.Paths[0]
        };
        streamingUrls.Add(uriBuilder.ToString());
    }

    return streamingUrls;
}
and in my service method I do the following:
StreamingLocator locator = await CreateStreamingLocatorAsync(
    client, config.ResourceGroup, config.AccountName, outputAsset.Name, locatorName);
IList<string> streamingUrls = await GetStreamingUrlsAsync(
    client, config.ResourceGroup, config.AccountName, locator.Name);

var urls = new List<string>();
foreach (var url in streamingUrls)
{
    urls.Add(url);
    Console.WriteLine(url);
}
myModel.StreamingUrls = urls;
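To close the loop on the original question: any of these streaming URLs can then be handed to Azure Media Player. A minimal setup along the lines of the docs page linked above (the manifest URL below is a placeholder for one returned by GetStreamingUrlsAsync):
<link href="//amp.azure.net/libs/amp/latest/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet">
<script src="//amp.azure.net/libs/amp/latest/azuremediaplayer.min.js"></script>
<video id="azuremediaplayer" class="azuremediaplayer amp-default-skin" controls width="640" height="400" data-setup='{}'>
  <!-- Smooth manifest from the streaming locator; AMP selects Smooth/DASH/HLS as needed -->
  <source src="//myaccount-uswc.streaming.media.azure.net/<locator-id>/video.ism/manifest" type="application/vnd.ms-sstr+xml" />
</video>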

Related

Upload Blob Url to Firebase Storage | Flutter

I have already read about the topic, but I simply didn't understand the solutions on Stack Overflow.
I came up with the following. I'm saving a URL that looks like this:
final String myDataUrl = file.url;
print(myDataUrl);
// prints: blob:http://localhost:51947/2952a3b1-db6a-4882-a42a-8e1bf0a0ad73
Then I'm trying to add it to Firebase Storage with the putString method, which I guessed suited me best while reading the documentation. I thought that since I have a URL, I should be able to upload it like this:
FirebaseStorage.instance
    .ref()
    .child("bla")
    .putString(myDataUrl, format: PutStringFormat.dataUrl);
But it doesn't work; it says:
Error: Invalid argument (uri): Scheme must be 'data': Instance of '_Uri'
So I'm guessing that it somehow can't convert my URL into one that is accepted.
What can I do differently to upload a blob successfully to Firebase Storage?
Answer (found in the comments on the answer below):
You have to convert your Blob to a Uint8List and upload it like this:
import 'dart:html' as html;
import 'dart:typed_data';

Future<Uint8List> fileConverter() async {
  // Read the underlying blob (an html.File field on this class) into memory.
  final reader = html.FileReader();
  reader.readAsArrayBuffer(file!);
  await reader.onLoad.first;
  return reader.result as Uint8List;
}
and then put it into your Storage:
Future uploadFile(String uid) async {
  if (file == null) return;
  final path = "nachweise/$uid";
  final Uint8List fileConverted = await fileConverter();
  try {
    await FirebaseStorage.instance
        .ref()
        .child(path)
        .putData(fileConverted)
        .then((_) => print("success"));
  } on FirebaseException catch (e) {
    return null;
  }
}
The Firebase Storage SDKs can upload local data as either a File, an array of bytes, or a base64-encoded string. The only URLs they accept are so-called data URLs, which start with data: and contain the complete data of the object. They cannot upload data directly from the URLs you more commonly see, such as http:// or https://.
You'll need to first download the data from that URL to the local device, and then upload it from there.
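A minimal sketch of that download-then-upload flow, assuming the data is reachable over http(s) and using the http package (the URL and storage path are placeholders):
import 'package:firebase_storage/firebase_storage.dart';
import 'package:http/http.dart' as http;

Future<void> uploadFromUrl(String url, String storagePath) async {
  // Download the bytes to the client first...
  final response = await http.get(Uri.parse(url));
  // ...then upload them to Firebase Storage as raw data.
  await FirebaseStorage.instance.ref(storagePath).putData(response.bodyBytes);
}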

Flutter store list in Firestore

I want to store my uploaded image URLs in Firestore as a list.
I'm also trying to store multiple images for a profile. This is how I am storing images; if there is a better way to do that, please let me know.
await _firebaseFirestore.collection("noktalar").doc().set({
  'aciklama': aciklama,
  'lat': 123,
  'long': 123,
  'mailAdresi': mailAdresi,
  'noktaAdi': noktaAdi,
  'telefon': tel,
  'webAdresi': webAdresi,
  'yetkiliAdiSoyadi': yetkili,
  'resimler': FieldValue.arrayUnion(resimler)
});
and I also tried this:
'resimler': imagesUrls.map((value) => value.toJson()).toList(),
You can store the image URLs as a single String, and when you read it back in code, just use the split method to turn the String back into a list. That's a pretty easy way to solve this problem.
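For illustration, assuming the URLs were joined with commas when stored (the values here are placeholders):
final String stored = "https://example.com/1.jpg,https://example.com/2.jpg";
final List<String> urls = stored.split(','); // back to a list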
You can follow the steps below:
- First, upload your image to Firebase Storage and get the download URL.
- Now that you have the download URL, you can upload your object to Cloud Firestore with the URL.
You can use the function below to get the download URL of an uploaded image, and then store the URLs at a location in Firestore.
// Note: this uses the older firebase_storage API (StorageReference /
// StorageUploadTask); newer versions use Reference / UploadTask instead.
Future<String> uploadImage(var imageFile) async {
  StorageReference ref = storage.ref().child("/photo.jpg");
  StorageUploadTask uploadTask = ref.putFile(imageFile);
  var dowurl = await (await uploadTask.onComplete).ref.getDownloadURL();
  final url = dowurl.toString();
  return url;
}
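Putting the two steps together, a sketch of how the collected URLs could end up as a Firestore list (imageFiles is a hypothetical list of picked files; also note that uploadImage above always writes to /photo.jpg, so in practice you would vary the path per file):
final List<String> resimler = [];
for (final imageFile in imageFiles) {
  // Upload each image and collect its download URL.
  resimler.add(await uploadImage(imageFile));
}
// Store the collected URLs as an array field on the document.
await _firebaseFirestore.collection("noktalar").doc().set({
  'resimler': resimler,
});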
You may also refer to the blog.

How to pass ACL properties to flutter amplify while uploading file to S3?

Or: how do I upload an image to S3 with public access using Flutter Amplify?
In my current Flutter project, I can't pass the ACL:public-read property while uploading files to S3 using Amplify.
Because of this, whenever I upload a new file to S3, I need to make it public manually.
I just want to upload a new file with public read access for everyone.
I found some solutions for JavaScript projects, but none for Flutter.
Below is the method I'm using to upload.
Future<String> uploadFile(String fileName, File local) async {
  try {
    Map<String, String> metadata = <String, String>{};
    metadata['name'] = 'filename';
    metadata['desc'] = 'A file';
    S3UploadFileOptions options = S3UploadFileOptions(
      accessLevel: StorageAccessLevel.guest,
      metadata: metadata,
    );
    UploadFileResult result = await Amplify.Storage.uploadFile(
      key: fileName,
      local: local,
      options: options,
    );
    return result.key;
  } catch (e) {
    print('UploadFile Err: ' + e.toString());
  }
  return null;
}
I think you should use Dio to declare the client object that will be used for posting the request. You can find example code in the following answer.
So far, Flutter Amplify does not give any option to upload images with public access; it always uploads with private read access.
So I updated a few things in my project, as described below.
Before the Amplify integration, I was uploading images to S3 and storing the URL on my server; wherever I had to display an image, I fetched the URL from my server and loaded the image.
Now I instead store the key (the one Amplify used to upload the image to S3) on my server.
To display an image, I get the image URL from Amplify using that key (fetched from my server).
Amplify adds a token to the image URL, with a default validity of 7 days:
Future<String> getUrl(String key) async {
  try {
    S3GetUrlOptions options = S3GetUrlOptions(
      accessLevel: StorageAccessLevel.guest,
      expires: 10000,
    );
    GetUrlResult result = await Amplify.Storage.getUrl(key: key, options: options);
    String url = result.url;
    return url;
  } catch (e) {
    print('GetUrl Err: ' + e.toString());
  }
  return null;
}
The resulting URL can then be displayed by an image widget.
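For example, a minimal sketch of displaying it (key is assumed to be the stored S3 key; the exact widgets are just an illustration):
FutureBuilder<String>(
  future: getUrl(key), // fetch the tokenized URL from Amplify
  builder: (context, snapshot) => snapshot.hasData
      ? Image.network(snapshot.data!) // load the image over HTTPS
      : const CircularProgressIndicator(),
)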

MediaRecorder No Metadata on Download

I'm using MediaRecorder (along with the Web Audio API) to record and process audio and download the blob that it generates. The recording and downloading work great, but there is no metadata when the file is downloaded (length, sample rate, channels, etc.).
I'm using this to create the blob (I've also tried other MIME types with no luck):
const blob = new Blob(chunks, {
  'type': 'audio/wav'
});
chunks = [];
const audioURL = window.URL.createObjectURL(blob);
audio.src = audioURL;
console.log("recorder stopped");

var new_file = document.getElementById('downloadblob').src;
var download_link = document.getElementById("download_link");
download_link.href = new_file;
var name = generateFileName();
download_link.download = name;
How could I ensure the length of the recording, sample rate, and other metadata are included in the download?
I don't know of any browser which allows you to record something as audio/wav. You can get the mimeType of your recording from the instance of the MediaRecorder.
const blob = new Blob(chunks, {
  'type': mediaRecorder.mimeType
});
Please note that the length will only be correct if you omit the timeslice parameter when calling mediaRecorder.start(). Otherwise the browser doesn't know the final length of the file when generating the metadata.
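A minimal illustration of that caveat (mediaRecorder is assumed to be an existing MediaRecorder instance):
// Without a timeslice the browser buffers the whole recording and can
// write correct metadata (duration etc.) when the recording stops.
mediaRecorder.start();

// With a timeslice, data is emitted in chunks and the final duration
// is unknown when the header of the first chunk is produced:
// mediaRecorder.start(1000);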

416 Error when creating url from a Blob

I'm using the Web Audio API to record a stream of audio source nodes. My code looks like this:
var context,
    bufferLoader,
    destination,
    mediaRecorder,
    source,
    bufferList,
    chunks = [],
    sound_paths = [],
    audioRecordings = [];

context = new AudioContext();

// fill in sound paths
sound_paths = ['sound.mp3', 'sound2.mp3'];

bufferLoader = new BufferLoader(
  context,
  sound_paths,
  callback
);

// callback invoked by BufferLoader once all buffers are decoded
function callback(buffers) {
  bufferList = buffers;
}

// fill bufferList with buffer data
bufferLoader.load();

destination = context.createMediaStreamDestination();
mediaRecorder = new MediaRecorder(destination.stream);

mediaRecorder.ondataavailable = function (e) {
  chunks.push(e.data);
};

mediaRecorder.onstop = function (e) {
  var blob = new Blob(chunks, {'type': 'audio/ogg; codecs=opus'});
  var audio = document.createElement('audio');
  audio.src = URL.createObjectURL(blob);
  audioRecordings.push(audio);
  chunks = [];
};

function startRecording() {
  mediaRecorder.start();
  source = context.createBufferSource();
  source.buffer = bufferList[0];
  source.connect(destination);
}

function stopRecording() {
  mediaRecorder.stop();
}

// call startRecording(), then source.start(0) on user input
// call stopRecording(), then source.stop(0) on user input
I am using the BufferLoader defined here: http://middleearmedia.com/web-audio-api-bufferloader/
This works for the most part, but sometimes I get a 416 (Requested Range Not Satisfiable) when creating a Blob and creating a URL from it. This seems to happen more often when the web page begins to lag. I'm guessing the Blob is undefined when the URL is created, or something like that. Is there a safer way to handle the media recorder's onstop event? Would it be better to use srcObject and a MediaStream instead of a Blob?
For my website http://gtube.de (just an example, not commercial) I am using recorder.js => https://github.com/mattdiamond/Recorderjs. It works very well; perhaps you should give that a try to record the context.
If you load the MP3s into buffers with the Web Audio API and play them at the same time, it will definitely work => https://www.html5rocks.com/en/tutorials/webaudio/intro/
But that's already the way you're doing it. The loading code was missing from your example above, so I had to read the article; perhaps next time try to post a shorter, complete example.
Sorry, I don't know enough about the MediaStream API => I suppose it's broken ;-)
If something in Web Audio doesn't work, just try another way. It is still not very stable; the Mozilla implementation in particular is poorly supported.