MediaRecorder No Metadata on Download - web-audio-api

I'm using MediaRecorder (along with the Web Audio API) to record and process audio and download the blob that it generates. The recording and downloading work great, but the downloaded file has no metadata (length, sample rate, channels, etc.).
I'm using this to create the blob, and I've also tried changing the MIME type with no luck:
const blob = new Blob(chunks, {
  'type': 'audio/wav'
});
chunks = [];
const audioURL = window.URL.createObjectURL(blob);
audio.src = audioURL;
console.log("recorder stopped");

var new_file = document.getElementById('downloadblob').src;
var download_link = document.getElementById("download_link");
download_link.href = new_file;
var name = generateFileName();
download_link.download = name;
How could I ensure the length of the recording, sample rate, and other metadata are included in the download?

I don't know of any browser which allows you to record something as audio/wav. You can get the mimeType of your recording from the instance of the MediaRecorder.
const blob = new Blob(chunks, {
  'type': mediaRecorder.mimeType
});
Please note that the length will only be correct if you omit the timeslice parameter when calling mediaRecorder.start(). Otherwise the browser doesn't know the final length of the file when generating the metadata.
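As a side note, since the actual container depends on the browser, a tiny helper (the function name and extension mapping below are mine, not part of any browser API) can derive a matching file extension for the download from mediaRecorder.mimeType:

```javascript
// Hypothetical helper (not part of the MediaRecorder API): map the container
// part of the recorder's reported mimeType to a file extension, so the
// downloaded file's name matches what is actually inside it.
function extensionForMimeType(mimeType) {
  // Strip any ";codecs=..." parameter and normalize the container name.
  const container = (mimeType || '').split(';')[0].trim().toLowerCase();
  const extensions = {
    'audio/webm': 'webm',
    'audio/ogg': 'ogg',
    'audio/mp4': 'm4a',
  };
  return extensions[container] || 'bin';
}
```

You could then build the download name as generateFileName() + '.' + extensionForMimeType(mediaRecorder.mimeType).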

Related

Upload CSV file Flutter Web

I am using the file_picker plugin to pick a CSV file in Flutter Web. Although I am able to pick the file, it is converted into bytes (Uint8List). Is there any way I can get the original CSV file, or, if I can convert these bytes to CSV, maybe I can get the path of the file?
Code:
void pickCSV() async {
  FilePickerResult? result = await FilePicker.platform.pickFiles(
      type: FileType.custom, allowedExtensions: ['csv']);
  if (result != null) {
    var fileBytes = result.files.first.bytes;
    csfFileName.value = result.files.first.name;
  } else {
    // User canceled the picker
  }
}
I know it's a bit late, but you have a couple of choices, and maybe this helps others out as well.
Both choices require server-side processing, so you will need to read up on how to do that.
1. Get the content of the CSV file, send it to the server, and create a new file on the server with that content. You can use String.fromCharCodes to read the content in the browser after you select the file.
2. Convert the Uint8List into a base64 string using the base64Encode function, send it to the server, and process it there.
Alternatively, if you use Firebase Storage you can use putData like so:
final metaData = SettableMetadata(contentType: mimeType);
final task = await _storage.ref().child(cloudPath).putData(fileData, metaData);
/// Get the URL
await task.ref.getDownloadURL()
- Storing the mimeType ensures proper handling of the file when it is used after download.
- cloudPath is the location in Firebase Storage, such as storageDirectory/filename.extension.
- fileData is the Uint8List you provided.

How can I get a continuous stream of samples from the JavaScript AudioAPI

I'd like to get a continuous stream of samples in JavaScript from the audio API. The only way I've found to get samples is through the MediaRecorder object in the JavaScript Audio API.
I set up my recorder like this:
var options = {
  mimeType: "audio/webm;codec=raw",
};
this.mediaRecorder = new MediaRecorder(stream, options);
this.mediaRecorder.ondataavailable = function (e) {
  this.decodeChunk(e.data);
}.bind(this);
this.mediaRecorder.start(/*timeslice=*/ 100 /*ms*/);
This gives me a callback 10 times a second with new data. All good so far.
The data is encoded, so I use audioCtx.decodeAudioData to process it:
let fileReader = new FileReader();
fileReader.onloadend = () => {
  let encodedData = fileReader.result;
  // console.log("Encoded length: " + encodedData.byteLength);
  this.audioCtx.decodeAudioData(encodedData,
    (decodedSamples) => {
      let newSamples = decodedSamples.getChannelData(0)
        .slice(this.firstChunkSize, decodedSamples.length);
      // The callback which handles the decodedSamples goes here. All good.
      if (this.firstChunkSize == 0) {
        this.firstChunkSize = decodedSamples.length;
      }
    });
};
This all works fine too.
Setting up the data for the file reader is where it gets strange:
let blob;
if (!this.firstChunk) {
  this.firstChunk = chunk;
  blob = new Blob([chunk], { 'type': chunk.type });
} else {
  blob = new Blob([this.firstChunk, chunk], { 'type': chunk.type });
}
fileReader.readAsArrayBuffer(blob);
The first chunk works just fine, but the second and later chunks fail to decode unless I combine them with the first chunk. I'm guessing what is happening here is that the first chunk has a header that is required to decode the data. I remove the samples decoded from the first chunk after decoding them a second time. See this.firstChunkSize above.
This all executes without error, but the audio that I get back has a vibrato-like effect at 10 Hz. A few hypotheses:
- I have some simple mistake in my firstChunkSize and slice logic.
- The first chunk has some header which is causing the remaining data to be interpreted in a strange way.
- There is some strange interaction with some option when creating the audio source (noise cancellation?).
You want codecs=, not codec=.
var options = {
  mimeType: "audio/webm;codecs=pcm",
};
Though MediaRecorder.isTypeSupported will return true with codec=, that is only because the parameter is being ignored. For example:
MediaRecorder.isTypeSupported("audio/webm;codec=pcm")
true
MediaRecorder.isTypeSupported("audio/webm;codecs=pcm")
true
MediaRecorder.isTypeSupported("audio/webm;codecs=asdfasd")
false
MediaRecorder.isTypeSupported("audio/webm;codec=asdfasd")
true
The garbage codec name asdfasd is "supported" if you specify codec instead of codecs.
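To illustrate, here is a minimal sketch (the helper name is made up) that picks the first candidate type a support predicate accepts. The predicate is injected so the logic can be exercised outside a browser; in a page you would pass MediaRecorder.isTypeSupported.bind(MediaRecorder):

```javascript
// Hypothetical helper: return the first candidate mimeType the given
// predicate reports as supported, or '' so MediaRecorder falls back to
// its default when nothing on the list is available.
function pickSupportedMimeType(candidates, isTypeSupported) {
  for (const type of candidates) {
    if (isTypeSupported(type)) {
      return type;
    }
  }
  return '';
}
```

For example, pickSupportedMimeType(["audio/webm;codecs=pcm", "audio/webm;codecs=opus"], MediaRecorder.isTypeSupported.bind(MediaRecorder)) would prefer PCM where it exists and fall back to Opus elsewhere.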

Azure media player How to get the source url for the player

I have an Azure Media Services account with some uploaded videos, but these videos only play in the browser with some additional parameters (?sv=2017-04-17&sr=c&sig=QMSr...), something like authentication keys.
I want a generic, permanent, progressive video URL that can be played anytime. I tried the Azure Media Player with my video URLs, both .ism/manifest and .mp4, but neither could be played. For example:
https://<MY_BLOBSTORAGE_ACCOUNT>.blob.core.windows.net/<Asset-ID>/<Video_FILE_NAME>.ism/manifest
https://<MY_BLOBSTORAGE_ACCOUNT>.blob.core.windows.net/<Asset-ID>/<Video_FILE_NAME>_1280x720_AACAudio.mp4
I have tried the player from this official microsoft documentation:
http://amp.azure.net/libs/amp/latest/docs/index.html#full-setup-of-azure-media-player
Also note that the Azure Media Services v3 documentation and the AMS community are very weak in terms of explaining how to programmatically get the video URLs for the player.
With AMS v3, you will have to create a streaming locator, and you can use a prebuilt streaming policy. There are policies for
- streaming only
- streaming and download
- download only
With the download policy, you will get a URL for each blob in the asset. For example:
https://myaccount-uswc.streaming.media.azure.net/1544fcae-a248-4f53-b653-cd02074b04b6/video_848x480_2200.mp4
With a streaming policy (recommended), you will get DASH, HLS, and Smooth Streaming URLs like:
https://myaccount-uswc.streaming.media.azure.net/0eef6f88-47c6-4662-9111-60305d7c1000/video.ism/manifest(format=mpd-time-csf).mpd
It appears that you're mixing progressive download and streaming. I wrote a blog post on the differences as it relates to Azure Media Services at https://blogs.msdn.microsoft.com/randomnumber/2016/03/23/progressive-download-and-streaming-differences-with-azure-media-services/. If you encoded the video into an adaptive bitrate MP4 set then more than likely you'll want to stream the video instead of using progressive download on a single MP4. This might help with the streaming side: https://learn.microsoft.com/en-us/azure/media-services/latest/dynamic-packaging-overview
I found the solution with the help of a friend. After creating the streaming locator, I have to make sure that the streaming endpoint is running, and then build the URLs by looping over the paths, which I get using StreamingLocators.ListPathsAsync. Below is the code snippet.
private async Task<StreamingLocator> CreateStreamingLocatorAsync(
    IAzureMediaServicesClient client,
    string resourceGroup,
    string accountName,
    string assetName,
    string locatorName)
{
    StreamingLocator locator = await client.StreamingLocators.CreateAsync(
        resourceGroup,
        accountName,
        locatorName,
        new StreamingLocator
        {
            AssetName = assetName,
            StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
        });
    return locator;
}

private async Task<IList<string>> GetStreamingUrlsAsync(
    IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string locatorName)
{
    const string DefaultStreamingEndpointName = "default";
    IList<string> streamingUrls = new List<string>();

    StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(
        resourceGroupName, accountName, DefaultStreamingEndpointName);
    if (streamingEndpoint != null
        && streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
    {
        await client.StreamingEndpoints.StartAsync(
            resourceGroupName, accountName, DefaultStreamingEndpointName);
    }

    ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(
        resourceGroupName, accountName, locatorName);
    foreach (StreamingPath path in paths.StreamingPaths)
    {
        UriBuilder uriBuilder = new UriBuilder();
        uriBuilder.Scheme = "https";
        uriBuilder.Host = streamingEndpoint.HostName;
        uriBuilder.Path = path.Paths[0];
        streamingUrls.Add(uriBuilder.ToString());
    }
    return streamingUrls;
}
And in my service method I do the following:
StreamingLocator locator = await CreateStreamingLocatorAsync(
    client, config.ResourceGroup, config.AccountName, outputAsset.Name, locatorName);
IList<string> streamingUrls = await GetStreamingUrlsAsync(
    client, config.ResourceGroup, config.AccountName, locator.Name);
foreach (var url in streamingUrls)
{
    urls.Add(url);
    Console.WriteLine(url);
}
myModel.StreamingUrls = urls;

416 Error when creating url from a Blob

I'm using the Web Audio API to record a stream of audio source nodes. My code looks like this:
var context,
    bufferLoader,
    destination,
    mediaRecorder,
    source,
    bufferList,
    chunks = [],
    sound_paths = [],
    audioRecordings = [];

//fill in sound paths
sound_paths = ['sound.mp3', 'sound2.mp3'];
bufferLoader = new BufferLoader(
  context,
  sound_paths,
  callback
);
//fill bufferList with buffer data
bufferLoader.load();

destination = context.createMediaStreamDestination();
mediaRecorder = new MediaRecorder(destination);
mediaRecorder.ondataavailable = function (e) {
  chunks.push(e.data);
};
mediaRecorder.onstop = function (e) {
  var blob = new Blob(chunks, { 'type': 'audio/ogg; codecs=opus' });
  var audio = document.createElement('audio');
  audio.src = URL.createObjectURL(blob);
  audioRecordings.push(audio);
  chunks = [];
};

function startRecording() {
  mediaRecorder.start();
  source = context.createBufferSource();
  source.buffer = bufferList[0];
  source.connect(destination);
}

function stopRecording() {
  mediaRecorder.stop();
}

//call startRecording(), then source.start(0) on user input
//call stopRecording(), then source.stop(0) on user input
I am using the BufferLoader as defined here: http://middleearmedia.com/web-audio-api-bufferloader/
This works for the most part, but sometimes I get a 416 (Requested Range Not Satisfiable) when creating a Blob and creating a URL from it. This seems to happen more often when the web page begins to lag. I'm guessing the Blob is undefined when the URL is created, or something like that. Is there a safer way to handle the onstop event for the media recorder? Maybe it would be better to use srcObject and a MediaStream instead of a Blob?
For my website http://gtube.de (just an example, not commercial) I am using recorder.js => https://github.com/mattdiamond/Recorderjs. It works very well. Perhaps you should give it a try to record the context.
If you load the mp3s into buffers with the Web Audio API and play them at the same time, it will definitely work => https://www.html5rocks.com/en/tutorials/webaudio/intro/
But that's already the way you do it => that code is missing in your example above, so I had to read the article => perhaps next time try to make a shorter example.
Sorry, I don't know enough about the MediaStream API => I suppose it's broken ;-)
If something in Web Audio doesn't work, just try another way. It is still not very stable => especially the Mozilla people support it badly.
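If you stay with MediaRecorder rather than recorder.js, one defensive pattern is to guard the Blob creation. This is a sketch under the assumption that the 416 comes from creating an object URL for an empty or mistyped Blob; the helper is hypothetical, not from either answer:

```javascript
// Hypothetical guard: only build a Blob when the recorder actually produced
// data, and tag it with the mimeType the recorder reports rather than a
// hard-coded one. Returns null when there is nothing to play.
function buildRecordingBlob(chunks, mimeType) {
  if (!chunks || chunks.length === 0) {
    return null; // no data recorded; skip creating a URL for a 0-byte Blob
  }
  return new Blob(chunks, { type: mimeType || '' });
}
```

In the onstop handler you would then write const blob = buildRecordingBlob(chunks, mediaRecorder.mimeType); and only call URL.createObjectURL(blob) when blob is non-null.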

It can RecorderJs processing a record of a file without emit any sound in speakers meanwhile?

While reading about Recorder.js, I asked myself whether it is possible to record a sound without emitting any sound from the speakers, all in the background. Does anybody know if that is possible? I don't see anything similar in the Recorder.js repository.
If you really want to use recorder.js, I guess there is a way to feed it directly with a MediaStream, which you would get from streamNode.stream.
Reading quickly through the source code of this lib, it seems it only accepts AudioContext source nodes, not streams directly, and anyway, you just have to comment out line 38 of the recorder.js file:
this.node.connect(this.context.destination); //this should not be necessary
That comment is from the author, and indeed it is not necessary.
Otherwise, you can also achieve it vanilla style (except that it will save as ogg instead of wav) by using the official MediaRecorder API, available in the latest browsers.
The main key is the MediaStreamAudioDestinationNode (created with createMediaStreamDestination), which doesn't need to be connected to the AudioContext's destination.
var audio = new Audio();
audio.crossOrigin = 'anonymous';
audio.src = 'https://dl.dropboxusercontent.com/s/agepbh2agnduknz/camera.mp3';
audio.onloadedmetadata = startRecording;

var aCtx = new AudioContext();
var sourceNode = aCtx.createMediaElementSource(audio);
var streamNode = aCtx.createMediaStreamDestination();
sourceNode.connect(streamNode);

function startRecording() {
  var recorder = new MediaRecorder(streamNode.stream),
      chunks = [];

  recorder.ondataavailable = function (e) {
    chunks.push(e.data);
  };
  recorder.onstop = function () {
    var blob = new Blob(chunks);
    var url = URL.createObjectURL(blob);
    var a = new Audio(url);
    a.controls = true;
    document.body.appendChild(a);
  };
  audio.onended = function () {
    recorder.stop();
  };
  audio.play();
  recorder.start();
}