Audio recording with HTML5 Web Audio API - web-audio-api

Does anyone know if the Web Audio API provides the ability to save audio played using an AudioContext?

I actually wrote a small utility called RecorderJS that might help.

There is a startRendering function in Chrome at least (haven't checked Safari). I think it's undergoing some rework and thus isn't included in the spec, but might be added at a later stage (or not). If you want to check out the current implementation, have a look at the answer at Is there a way to use the Web Audio API to sample audio faster than real-time?
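For reference, a minimal sketch of offline (faster-than-real-time) rendering, assuming you already have a decoded AudioBuffer called decodedBuffer; current browsers expose startRendering() as a promise, while very old Chrome builds used an oncomplete callback instead:
// Render 10 seconds of 44.1 kHz stereo audio without playing it out loud.
var offlineCtx = new OfflineAudioContext(2, 44100 * 10, 44100);
// Build the graph you want to capture; a single buffer source as a placeholder.
var source = offlineCtx.createBufferSource();
source.buffer = decodedBuffer; // assumed: an AudioBuffer you already decoded
source.connect(offlineCtx.destination);
source.start(0);
offlineCtx.startRendering().then(function(renderedBuffer) {
  // renderedBuffer is an AudioBuffer holding the rendered audio;
  // encode it (e.g. to WAV with RecorderJS-style helpers) or play it back.
  console.log('Rendered', renderedBuffer.duration, 'seconds of audio');
});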

There is a W3C specification for a recording API (http://www.w3.org/TR/mediastream-recording/), but as of now it is implemented only in Firefox.
On the client side, only the ScriptProcessorNode hack is available (which is what Record.js is based on); a sketch follows below.
Alternatively, for some use cases it might make sense to stream the audio to a server using WebRTC and write a server-side recorder using libjingle.
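For completeness, a rough sketch of that ScriptProcessorNode hack (buffer size and channel handling simplified; sourceNode stands for whatever node you want to capture and is assumed to exist already):
var audioCtx = new AudioContext();
var processor = audioCtx.createScriptProcessor(4096, 2, 2);
var recordedChunks = []; // one entry per callback: [left samples, right samples]
processor.onaudioprocess = function(e) {
  // Copy the samples; the underlying buffers get reused by the browser.
  recordedChunks.push([
    new Float32Array(e.inputBuffer.getChannelData(0)),
    new Float32Array(e.inputBuffer.getChannelData(1))
  ]);
};
sourceNode.connect(processor);            // the node you want to record
processor.connect(audioCtx.destination);  // needed for onaudioprocess to keep firing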

This library works fine, but it is Web Audio API only (meaning no IE users):
https://github.com/higuma/web-audio-recorder-js
Still, support is broad enough that we can fairly use it now:
http://caniuse.com/#feat=audio-api
Anyway, as you said, your sound is already in an AudioContext, so I think you are looking for how to use the AudioDestinationNode, the final node of the Web Audio API. As soon as you can play your audio through a regular HTML audio player, you gain the record function on right-click, much like playDataUri does. You need to add the "controls" attribute to the player, or you can make a special link with the download attribute (see the sketch after the code below).
I made a small enhancement of the MDN script to send the data to a player; it should give you a good idea:
var audioCtx = new AudioContext();
// Create the media element first, then route it through the Web Audio graph.
var myMediaElement = document.createElement("audio");
myMediaElement.setAttribute("autoplay", true);
myMediaElement.setAttribute("src", uri);
myMediaElement.setAttribute("controls", "controls");
document.getElementById('player').appendChild(myMediaElement);
var source = audioCtx.createMediaElementSource(myMediaElement);
source.connect(audioCtx.destination);
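As mentioned above, instead of relying on right-click you can also offer a link with the download attribute; a minimal sketch, assuming uri is the same audio URI used for the player (the file name is just an example):
var link = document.createElement("a");
link.href = uri;                                 // same data URI / object URL as above
link.setAttribute("download", "recording.wav");  // suggested file name (example)
link.textContent = "Download recording";
document.getElementById('player').appendChild(link);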
The AudioDestinationNode interface represents the end destination of
an audio graph in a given context — usually the speakers of your
device. It can also be the node that will "record" the audio data when
used with an OfflineAudioContext.
https://developer.mozilla.org/en-US/docs/Web/API/AudioDestinationNode

This is now available in the latest browsers; it's called MediaRecorder, and you can find more information here.

The easiest way is to create a stream:
var dest = audioCtx.createMediaStreamDestination();
var options = {
  audioBitsPerSecond: 320000,
  sampleSize: 16,
  channelCount: 2,
  mimeType: 'audio/ogg'
};
var mediaRecorder = new MediaRecorder(dest.stream, options);
var chunks = [];                     // filled by the ondataavailable handler (see below)
var isrecording = "Not Recording";
function rec() {
  mediaRecorder.start();             // start recording
  isrecording = mediaRecorder.state; // becomes "recording" once started
}
You can check out an example of my sound-recording app here, although it's also a full-on multiband compressor and 5-band paragraphic EQ.
Oh, and if you check out my link, the most important thing to get your head around is the ondataavailable method, and saving as a Blob is a bit of a head-scratcher too.
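A minimal sketch of that ondataavailable/Blob handling, reusing the mediaRecorder and chunks variables from above (the MIME type is an assumption matching the options):
mediaRecorder.ondataavailable = function(e) {
  chunks.push(e.data); // each event delivers a chunk of encoded audio
};
mediaRecorder.onstop = function() {
  var blob = new Blob(chunks, { type: 'audio/ogg; codecs=opus' });
  chunks = [];
  // Point an <audio> element (or a download link) at the finished recording.
  var audioURL = URL.createObjectURL(blob);
  document.querySelector('audio').src = audioURL; // assumes an <audio> element exists
};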
PS: if anyone wants to help me get this working on Chrome, send me an email. Thanks.
It will only work in Firefox at the moment.
MultiBand Compressor

Related

How to stop tokbox screen sharing in SWIFT

I am working on an iOS application (Swift) in which I have used TokBox for screen sharing. I am able to share the screen but not able to stop screen sharing.
This is the code I have used for screensharing.
publisher?.videoType = .screen
publisher?.audioFallbackEnabled = false
let cap = ScreenCapturer(withView:view)
publisher?.videoCapture = cap
session?.publish(publisher, error: &error)
Can anyone guide me on how to stop screen sharing in Swift?
To stop screen sharing you will need to stop the publisher from streaming. To do that you can call:
[OTSession unpublish:error:]
More info is available on the Video API guides
For your case, where you are adding screen sharing to an existing call, you will need to create an additional publisher for the screen sharing rather than editing the existing one. Using the existing publisher would require it to be reinitialised to switch between publishing a camera feed and a screen, which would stop publishing audio too.
In addition to creating a new publisher, you need to create a new subscriber for the other user, you can do that in the subscriberDidConnect delegate function on the OTSubscriberDelegate.
Additionally, you will need to handle the destruction of both the new publisher and subscriber. This will be done in the delegate functions you are already using on the OTSessionDelegate and OTPublisherDelegate.
I have created a demo app which demonstrates this behaviour.

Voice message capture no longer working regardless of Secure Context / HTTPS on mobile

I am building an appointment scheduling web app (calendar-like) that allows users to leave voice messages (explanations) at certain time points that others can retrieve remotely at their convenience. It works well on PC, but I can't get it to work on mobiles anymore. Apparently new secure-context standards are in place (WebRTC?) and a Secure Context must be provided, but even though I am using HTTPS (SSL site/domain), I can't get it to work in any of the major browsers. How else can I provide such a Secure Context?
The code I attach is quite popular and used to work well (it still does on laptops!). What can I do to provide the right context on mobile phones and tablets (iOS, Android)? Thank you for any help.
navigator.mediaDevices.getUserMedia(constraints).then(function(stream) {
  //console.log("getUserMedia() success, stream created, initializing Speakit_recorder.js ...");
  audioContext = new AudioContext();
  //update the format
  /* assign to gumStream for later use */
  gumStream = stream;
  /* use the stream */
  input = audioContext.createMediaStreamSource(stream);
  /* Create the Recorder object. Recording 2 channels doubles the file size; use numChannels: 1 for mono. */
  rec = new Recorder(input, { numChannels: 2 });
  // start the recording process
  rec.record();
  //console.log("Recording started");
}).catch(function(err) {
  // do whatever is necessary if getUserMedia() fails
});
This might be due to the autoplay changes that landed in Chrome 71. See https://developers.google.com/web/updates/2017/09/autoplay-policy-changes as well as https://bugs.chromium.org/p/chromium/issues/detail?id=835767 for details.
You might want to check whether audioContext.state is 'running'. If not, you need to call audioContext.resume() from a user action (such as a button click).
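A minimal sketch of that check, assuming a button with the hypothetical id 'record-btn' exists and audioContext is the context created in your recording code:
document.getElementById('record-btn').addEventListener('click', function() {
  // Under the autoplay policy the AudioContext starts 'suspended' until a user gesture.
  if (audioContext.state !== 'running') {
    audioContext.resume().then(function() {
      console.log('AudioContext resumed, state:', audioContext.state);
    });
  }
  // ...then call getUserMedia / start the recorder as in the snippet above.
});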

Show Twitch.tv stream within a unity3D application

I would like to make a Unity3D application in which one can watch a current Twitch.tv live stream.
I am not sure if this is possible, for example with the Twitch API (https://github.com/justintv/twitch-api).
I know about video textures in Unity3D and I know how to use the default Twitch API basics, but I do not have an idea of how to integrate a running Twitch stream into my application.
Can someone please give me a hint if this is possible?
Thanks very much and best regards
Meph
Ultimately the stream coming from twitch is MPEG-4 (H264/M3U). So if you can render that in Unity, then you can render twitch streams.
There are a few steps to get the right URLs requested from Twitch, and that can change over time. You'll have to inspect a current Twitch page while a stream is playing to see how the JavaScript builds the request (URL and headers).
In the end, the JavaScript will build an access token and then use it to request a file called index-live.m3u8 from one of the Twitch edge servers. That file contains a list of file names for the last few seconds of the live stream (and some other metadata). It looks something like this:
https://video-edge-c61b44.lax01.hls.ttvnw.net/v0/[some-long-access-token]/index-live.m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#ID3-EQUIV-TDTG:2017-10-27T16:53:27
#EXT-X-MEDIA-SEQUENCE:8788
#EXT-X-TWITCH-ELAPSED-SECS:17576.000
#EXT-X-TWITCH-TOTAL-SECS:17589.870
#EXTINF:2.000,
index-0000008788-Y6OH.ts
#EXTINF:2.000,
index-0000008789-l0QY.ts
#EXTINF:2.000,
index-0000008790-gCUV.ts
#EXTINF:2.000,
index-0000008791-1ngg.ts
#EXTINF:2.000,
index-0000008792-wpQL.ts
#EXTINF:2.000,
index-0000008793-koO4.ts
You then swap out index-live.m3u8 with the name of a file in the list and request it to get that clip. Something like:
https://video-edge-c61b44.lax01.hls.ttvnw.net/v0/[the-same-long-access-token]/index-0000008793-koO4.ts
It will be an MPEG-4 stream about 1 second long. The list is about 6 files long, so if you request them all you can get about a 6 second buffer.
Every second or two, index-live.m3u8 is updated and older files roll off as new ones are added to the bottom. You need to re-request this file every few seconds as your buffer runs out, and request the new clips in it to keep your playback going.
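A rough JavaScript sketch of that polling loop (the playlist URL is a placeholder; in practice you also need the access-token request described above, and cross-origin rules may apply):
var playlistUrl = 'https://video-edge-example.hls.ttvnw.net/v0/[access-token]/index-live.m3u8'; // placeholder

function pollPlaylist() {
  fetch(playlistUrl)
    .then(function(res) { return res.text(); })
    .then(function(text) {
      // Keep only the segment file names (every line that is not a #-tag).
      var segments = text.split('\n').filter(function(line) {
        return line && line.charAt(0) !== '#';
      });
      segments.forEach(function(name) {
        var segmentUrl = playlistUrl.replace('index-live.m3u8', name);
        // Request segmentUrl here and feed the clip into your player/buffer.
      });
    });
}

// Re-request the playlist every couple of seconds as the buffer runs out.
setInterval(pollPlaylist, 2000);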
Using Ted Bigham's answer you can get the m3u8 stream of the video. You can then feed this stream into a plugin like AVPro Video from RenderHeads to play it directly in Unity on any shape or model you want, with a wide variety of customisation and settings.
However, this is not a free plugin, and there may be alternatives. I am in no way affiliated with them, but I have used this plugin in the past with good results.
P.S. I originally wanted to post this as just a comment, and believe Ted Bigham answered the brunt of your question, but am not allowed to due to reputation.
@Ted Bigham, is there any way to do this just using the default Unity video player?
Everything you've said so far has worked great! With just Chrome, I'm able to open the .m3u8 and download each 1-second .ts file. With any video player, I can view each video file just fine.
When I try to pass the URL through Video Player though, it says "Cannot read file." I even tried replacing .ts with .mp4 since that is a supported format. This works when I test it outside of Unity, but it gives me the same error.
Here's my code.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.Video;

public class TwitchStreamer : MonoBehaviour
{
    public string m3u8 = "https://video-weaver.atl01.hls.ttvnw.net/v1/playlist/Cq4EuEYkh5IqurC4gy3nHqGhPVtUwH_" +
        "5QcQbCatC5Fhit9qbivMQ2rdh2MS_m_2OuLd3VS2mF0eTrCKrz8YTuDO19mcbIBMJL3BUMz4jnNuU_t-e53V51TOtaN3vcCk9n3Qr" +
        "dP0WyLREnVR0n_30d4PlUjjW4_si5Wr2XuePQ0dPtxP6jnsenKnX56YTLohtCN2-FdfSvHQMdfd0aw68FA4h9wawHoIM9-U6YRmPa" +
        "fDsfdCiZr_iToSR6lZi81VoYPVjt7Ygf7xKwhjlNwrvA5SnsAnWQGIOVt4UjDkNLw-hmNMAr7RT0iiDghKXZY1VI6Tuc-umB1VXYE" +
        "7BH5hHbDfHgB3_IYNb0fjoudtSuaZxISyWoazPrw3AibEO7k1-quhdcjarBTGpIi_dlPEp-yZlQOy98_OZY_tqjk8ZWTBIaAAYEG_" +
        "miwqsgH4d6eIfkh3ehyMvPQH1C5dVG9tQcSWPUYU6D6hWhxvJhEr-UC0_BYWIVzX7z_Uf74FJGIEqSQc0d6igiowdMM_lyD8ZV9BE" +
        "7wqQs3RegMPqux-AOfF-_Q7Ki2MBv9u7D9ZRXMH_cm20bTx5-ShEDRnWMApSfXK-9bAGNXUcw8YlBbHYeSN5VxEZMC2oGjcivBsGs" +
        "RPMTQ_yNBSM1S6GxFRIR4nqA-mbdXg3rXMW3V6MNybBb1lrrQeEqF1tdYE0rfxe3Ki5WWkxeKmSjMGbMl1tHCwMaReTYkQnX5Qhjl" +
        "HXXtKLtEIEEhB3cXW3oF05-E_q87s68JQIGgyIEKPiQlTsANR9zRc.m3u8";

    public float updateClip = 1f, updateFile = 6f;
    public VideoPlayer vp;
    Queue<string> urls;

    void OnEnable()
    {
        StartCoroutine(UpdateFile());
        //InvokeRepeating("UpdateFile", 0f, updateFile);
        InvokeRepeating("UpdateClip", updateClip, updateClip);
        urls = new Queue<string>();
        vp.prepareCompleted += Vp_prepareCompleted;
        vp.errorReceived += Vp_errorReceived;
    }

    private void Vp_errorReceived(VideoPlayer source, string message)
    {
        Debug.Log("4567: Playback preparation failed. " + message, source);
    }

    private void Vp_prepareCompleted(VideoPlayer source)
    {
        Debug.Log("4567: Playback preparation complete.");
        source.Play();
    }

    void OnDisable()
    {
        CancelInvoke();
        StopAllCoroutines();
        vp.prepareCompleted -= Vp_prepareCompleted;
        vp.errorReceived -= Vp_errorReceived;
    }

    void UpdateClip()
    {
        //vp.Stop();
        if (urls.Count > 0)
        {
            vp.url = urls.Dequeue();
            vp.Prepare();
            Debug.Log("4567: Prepare playback for " + vp.url);
        }
    }

    IEnumerator UpdateFile()
    {
        UnityWebRequest www = UnityWebRequest.Get(m3u8);
        yield return www.SendWebRequest();
        if (www.isNetworkError || www.isHttpError)
        {
            Debug.Log(www.error);
        }
        else
        {
            // Show results as text
            //Debug.Log(www.downloadHandler.text);
            string[] raw = www.downloadHandler.text.Split('\n');
            foreach (string line in raw)
            {
                if (line.Contains("http"))
                {
                    string cnvrt = line.Replace(".ts", ".mp4");
                    urls.Enqueue(cnvrt);
                    Debug.Log(line + " has been added to stream queue.");
                }
            }
            // Or retrieve results as binary data
            //byte[] results = www.downloadHandler.data;
        }
        yield return new WaitForSeconds(updateFile);
    }
}
I believe what this person wants to know is whether or not the Twitch API actually retrieves a live video feed and passes the data through to the application. The short answer is, it doesn't. The Twitch API doesn't allow you to take a live video feed, only get the information about it.
You can get strings telling you the name of the channel, the game and links to take you to the channel. You can also get their channel background or previews as image files. What you cannot do is get video files or stream video data. The best you can do is get the URLs and link to them.
The only method I can think of to get a Live Stream in any game is for it to be the player's. Essentially you would code in a method by which they can stream their game within the game itself, taking the stream data and copying it into some variable or class somewhere, before it is uploaded to Twitch, so you can use the data at the same time it is being broadcast. But if you want to retrieve some random streamer's gameplay and display it on a computer screen in your game, I'm afraid you can't do that. Not in Unity, anyway.
You can embed a Twitch stream into a website.
There are web browser plugins for Unity (see).
So if you combine these ideas, you can show a Twitch stream in Unity by running an in-app browser with a custom web page that embeds the stream you want. I don't know what kind of performance you'll get out of this, but it sounds simple enough to do. It is probably not as simple as directly showing the stream on a texture, but it's at least simpler than learning C, network programming, video processing, etc. as Thomas suggested.
This is a tricky thing to do. You would need an API key from Twitch, a program that can establish not only a connection with your chosen API, but also a data stream. I would suggest starting with learning basic network programming in a less specialized language like C, and work from there. After you understand the basics, please come back and ask more specific questions. I can't do that much with this question in the state that it's in.
As with everything else, this is definitely possible, just maybe not with C#. For the sake of not destroying your game's performance, I would suggest writing a separate program that feeds the data into the game without the game directly grabbing data.
Once you gain an understanding of network programming, I would hop over to the twitch dev chat on the API, found here

Soundcloud API: how to play only a part of a track?

For an upcoming project I am investigating the possibility of playing only a certain length (for example 20 s) of a track, using the SoundCloud API.
Could anybody tell me if that is possible, or should a separate track with that limited length be created?
Many thanks!
Maarten (WebForDreams)
There are a few ways you can do this. Are you going to be using the JavaScript SDK or the player widget? For the player widget, you can just use seekTo().
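A minimal sketch of the widget route, assuming the page embeds the standard player iframe (with a hypothetical id of 'sc-player') and loads the Widget API script from https://w.soundcloud.com/player/api.js:
var widget = SC.Widget(document.getElementById('sc-player')); // wrap the embedded iframe

widget.bind(SC.Widget.Events.READY, function() {
  widget.seekTo(30 * 1000); // jump to 0:30 (position in milliseconds)
  widget.play();
});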
If you're using the JavaScript SDK you can use the setPosition() method:
SC.initialize({
  client_id: 'foo'
});

SC.whenStreamingReady(function() {
  var sound = SC.stream(52933447);
  sound.setPosition(2000); // position, measured in milliseconds
  sound.play();
});
If you want to stop at a particular point, you could use onPosition().
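For example, to play roughly a 20-second excerpt you could combine setPosition() with onPosition(), something like this sketch (the offsets are illustrative; onPosition() comes from the underlying SoundManager 2 sound object):
SC.whenStreamingReady(function() {
  var sound = SC.stream(52933447);
  sound.setPosition(10000);          // start 10 s into the track
  sound.onPosition(30000, function() {
    sound.stop();                    // stop once playback reaches the 30 s mark
  });
  sound.play();
});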
Hope that helps!

Seeking with Streaming API

I am trying to create an aggregator that pulls song links from many different blogs (YouTube, SoundCloud, etc.) into a central site. I was wondering where I can find more info about the JS streaming API. I'd like to be able to seek with it, and also have a callback fire when a song is done playing.
Two things you'll want to check out:
Our HTML5 Widget has an API that allows you to bind code to events and control the player widget (e.g. seek to a specific point in a track). This'll work if you want to use our player widget anyway.
The JavaScript SDK uses SoundManager 2 under the hood. Certain methods, such as SC.stream will return a soundObject which you can use to seek and bind code to events.
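For the SDK route, a rough sketch of what that soundObject gives you; the callback names (whileplaying, onfinish) come from SoundManager 2, and the client id, track path, and positions are placeholders:
SC.initialize({ client_id: 'YOUR_CLIENT_ID' });

SC.whenStreamingReady(function() {
  var sound = SC.stream('/tracks/293', {
    whileplaying: function() {
      console.log('position (ms):', this.position); // current playback position
    },
    onfinish: function() {
      console.log('track finished - load the next one here');
    }
  });
  sound.play();
  sound.setPosition(60000); // seek to 1:00
});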
Hope that helps. Comment if you have any questions and I'll edit my answer.
This doesn't really help me. I can't find a single example of stopping a stream anywhere. In fact, I can't seem to bind to any of the methods described in SoundManager. Neither of these works:
+++
SC.initialize({
  client_id: "xxx"
});
// Stream the audio
sound = SC.stream("/tracks/293");
sound.play();
+++
SC.stream("/tracks/293", {
  autoPlay: true,
  ontimedcomments: function(comments) {
    console.log(comments[0].body);
  },
  onplay: function(sound) {
    alert(sound);
  }
});
+++
Really, I just want to create a custom streaming player where I can change the music, and stop and hit play (not in jQuery). I haven't found much on this topic.
Use the SC object to stop the playback. More information at
http://developers.soundcloud.com/docs/api/sdks#methods
SC.recordStop();
To play the song again:
SC.recordPlay();
Hope this helps!
Jesse, use this to play songs:
SC.get('/tracks/294', function(track) {
  console.log(track);
  // SC.oEmbed(track.permalink_url, document.getElementById('player'))
});
Make sure you have a div#player in your HTML somewhere. Then, if you uncomment that line, you'll see SoundCloud's branded music player.