How do I record streams in chunks on Flash Media Server

I want to record a stream which is published with Flash Live Encoder to FMS 3.5, but split the recording in files with predefined length. For example if a stream 'webcam' is published I want to record it in chunks of 10 minutes: 'webcam1.flv', 'webcam2.flv' ...
From what I can tell there's no facility to work with timers. The only solution I could think of was using stream.record() with a time limit parameter, but that seems like a hack because it triggers NetStream.Record.DiskQuotaExceeded on the stream when the recording should stop and another chunk should start.
Has anyone done something similar?

On the server side, why not just republish and record the stream under a timestamped name? Then run a timer that fires every ten minutes (or whatever interval) which stops the recording of that stream and creates a new server-side stream playing the client stream.
Something along the lines of:
// rotate the recording every 10 minutes (600000 ms)
setInterval(setNewStream, 600000);

function setNewStream() {
    var now = new Date();
    // stop recording the current chunk
    serverStream.record(false);
    // timestamped file name, e.g. "recording-14-30"
    var filename = "recording-" + now.getHours() + "-" + now.getMinutes();
    // switch to a new server-side stream and resume recording
    serverStream = Stream.get(filename);
    serverStream.play("clientStream");
    serverStream.record();
}
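If you want the exact webcam1.flv, webcam2.flv naming from the question instead of timestamps, the bookkeeping is trivial. A plain-JavaScript sketch (the helper name is made up for illustration):

```javascript
// Produces sequential chunk names like "webcam1.flv", "webcam2.flv", ...
// as described in the question; purely illustrative bookkeeping that
// could replace the timestamped filename above.
function chunkNamer(base) {
  let n = 0;
  return () => `${base}${++n}.flv`;
}
```

Each call to the returned function hands back the next chunk name, so the timer callback only has to ask for a new name when it rotates the recording.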

Related

How to skip ahead n seconds while playing track using web audio API?

Using Web Audio API, I'm trying to build an mp3 player with a "Skip ahead 15 seconds" feature.
I'm able to load an mp3 using a source buffer, and can get it to start playing. I want to do something like this, though I know currentTime is not a settable property:
context.currentTime += 15
How do you skip forward n seconds once the song is already playing?
Unfortunately there is no single API call to achieve the desired effect, but it's doable. Every AudioBufferSourceNode can only be used once, which is why we have to create a new one in order to change something.
Let's imagine we have two variables coming from somewhere called audioContext and audioBuffer. In addition, we define two more variables to store the initial startTime and the currently running AudioBufferSourceNode.
let audioBufferSourceNode;
let startTime;
The first time we play the audioBuffer we play it directly from the start. The only special thing here is that we keep a reference to the audioBufferSourceNode and that we remember the startTime.
audioBufferSourceNode = audioContext.createBufferSource();
audioBufferSourceNode.buffer = audioBuffer;
audioBufferSourceNode.connect(audioContext.destination);
// remember when playback began
startTime = audioContext.currentTime;
audioBufferSourceNode.start(startTime);
If you want to skip ahead some time later, the previously started audioBufferSourceNode needs to be stopped first.
const currentTime = audioContext.currentTime;
audioBufferSourceNode.stop(currentTime);
audioBufferSourceNode.disconnect();
In addition a new one needs to be created by reusing the same audioBuffer as before. The only difference here is that we apply an offset to make sure it skips 15 seconds ahead.
audioBufferSourceNode = audioContext.createBufferSource();
audioBufferSourceNode.buffer = audioBuffer;
audioBufferSourceNode.connect(audioContext.destination);
audioBufferSourceNode.start(currentTime, currentTime - startTime + 15);
To be prepared to skip another time it's necessary to update the startTime.
startTime -= 15;
This is of course an oversimplified example. In reality there should be a check to make sure that there is enough audio data left to skip ahead. You could also apply a little fade-in/out when skipping to avoid click sounds. ... This is only meant to illustrate the general idea.
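For what it's worth, the offset arithmetic above can be isolated into a tiny clock-agnostic helper, which makes it easy to sanity-check. The class name and the injected now() function are purely illustrative; in a browser you would pass () =&gt; audioContext.currentTime:

```javascript
// Tracks where playback is inside the buffer so that each skip can
// compute the correct offset for the next AudioBufferSourceNode.
// `now()` is injected so the logic can be exercised without an AudioContext.
class SkipTracker {
  constructor(now) {
    this.now = now;
    this.startTime = null;
  }

  // Called when playback begins; mirrors `startTime = audioContext.currentTime`.
  start() {
    this.startTime = this.now();
  }

  // Returns the offset (in seconds) to pass as the second argument of
  // AudioBufferSourceNode.start(), and updates startTime as in the answer.
  skip(seconds) {
    const currentTime = this.now();
    const offset = currentTime - this.startTime + seconds;
    this.startTime -= seconds; // mirrors `startTime -= 15`
    return offset;
  }
}
```

Starting at t = 0 and skipping 15 seconds at t = 10 yields an offset of 25 (we were 10 seconds in, and jump 15 further), and the adjusted startTime keeps later skips consistent.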

Is it possible to change MediaRecorder's stream?

getUserMedia(constraints).then(stream => {
    const recorder = new MediaRecorder(stream);
    recorder.start();
    recorder.pause();
    // get new stream: getUserMedia(constraints_new)
    // how to update the recorder's stream here?
    recorder.resume();
});
Is it possible? I've tried to create a MediaStream and use the addTrack and removeTrack methods to change the stream's tracks, but with no success (the recorder stops when I try to resume it with the updated stream).
Any ideas?
The short answer is no, it's not possible. The MediaStream recording spec explicitly describes this behavior: https://w3c.github.io/mediacapture-record/#dom-mediarecorder-start. It's bullet point 15.3 of that algorithm which says "If at any point, a track is added to or removed from stream’s track set, the UA MUST immediately stop gathering data ...".
But in case you only want to record audio you can probably use an AudioContext to proxy your streams. Create a MediaStreamAudioDestinationNode and use the stream that it provides for recording. Then you can feed your streams with MediaStreamAudioSourceNodes and/or MediaStreamTrackAudioSourceNodes into the audio graph and mix them in any way you desire.
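A rough sketch of that proxy wiring (createMediaStreamDestination and createMediaStreamSource are the real Web Audio API calls; the function name and return shape are just one possible arrangement):

```javascript
// Builds a recording proxy: all input streams are mixed into one
// MediaStreamAudioDestinationNode, and the destination's stream is what
// you hand to MediaRecorder. Because that stream's track set never
// changes, the recorder keeps running even when you swap inputs later.
function createRecordingProxy(audioContext, inputStreams) {
  const destination = audioContext.createMediaStreamDestination();
  const sources = inputStreams.map((stream) => {
    const source = audioContext.createMediaStreamSource(stream);
    source.connect(destination);
    return source;
  });
  // destination.stream stays the same object for the recorder's lifetime
  return { stream: destination.stream, sources, destination };
}

// In a browser you would then do something like:
//   const proxy = createRecordingProxy(new AudioContext(), [micStream]);
//   const recorder = new MediaRecorder(proxy.stream);
```

To switch inputs mid-recording you would disconnect one of the returned source nodes and connect a new MediaStreamAudioSourceNode to the same destination; the recorder never notices.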
Last but not least there are currently plans to add the functionality you are looking for to the spec. Maybe you just have to wait a bit. Or maybe a bit longer depending on the browser you are using. :-)
https://github.com/w3c/mediacapture-record/issues/167
https://github.com/w3c/mediacapture-record/pull/186

Why is socket slowing down after a number of reads?

I've written a simple test client (VB.Net) that uses the TcpClient class to send a small "command" byte array (6 or 7 bytes) to a hardware device, across the company network. The device responds with a byte array (approx. 48 KB) which my client app reads. I know the exact length of the returned array, so I repeatedly read the socket in a loop until I have it all:
Dim read_data() As Byte
Dim stream As NetworkStream = tcpClient.GetStream()

' Send the "command"
stream.Write(commandByteArray, 0, commandByteArray.Length)

' Read the response
Using ms As New MemoryStream()
    Dim sw As Stopwatch = New Stopwatch()
    sw.Start()
    Dim temp_read(16384) As Byte
    Try
        Do
            Dim bytes_read = stream.Read(temp_read, 0, temp_read.Length)
            If bytes_read = 0 Then Exit Do ' remote end closed the connection
            ms.Write(temp_read, 0, bytes_read)
        Loop While ms.Length < expectedResponseSize
    Catch ex As IOException
        ' No more data
    End Try
    read_data = ms.ToArray()
    Debug.WriteLine(sw.ElapsedMilliseconds)
End Using
This code is wired up to a button click event. Each time I run it, the stopwatch shows it takes an average of 5ms to read the 48kb byte array back. However if I repeatedly click the button several times per second, after a few seconds the time reported by the stopwatch starts to increase, and eventually settles at around 50ms. If I disconnect, reconnect and try again then the time returns to 5ms (until I start rapidly clicking the button again).
Any idea what's causing this? I'm assuming it's something in the network protocol or network hardware, e.g. something adapting the connection to the increased data throughput?
Edit: as mentioned in the comment below, I've tried disabling Nagle's algorithm (socket.NoDelay = true) but it had no effect, although my understanding is that this would be more likely to improve throughput of very small messages, not large ones like this.

Red5 Pause/Play/Rewind Live Stream

Can someone tell me how it is possible to rewind a live stream via the Red5 server? Is it possible or not? A code snippet would help.
Also, I know pause is handled by the Flash player, but I want to know from which position the stream starts playing after a pause (does it resume from where it was stopped?).
I think you cannot rewind a live stream. As I understand it, a live stream is sent directly to all connected clients; the frames are not saved on the server, so the server isn't able to "go back".
You need to record the stream if you want to be able to rewind.
If you pause the stream, the last frame is frozen on your screen. The server continues the broadcast, and you miss the frames that are broadcast during that time. If you resume playing, the next frame is the LIVE frame being broadcast at that moment, so you miss some frames.
That's the nature of a live stream: it is "live". If you pause or rewind, it isn't live any more; that's a recorded stream.
Create a custom application or modify an existing one (for example oflaDemo).
Create the server stream in your class in appStart():
private IServerStream serverStream;
...
public boolean appStart( IScope app ) {
    serverStream = StreamUtils.createServerStream( app, "MyOwnTVChannel" );
    ...
}
Add .flv files from /streams/ (as in the oflaDemo example) to play:
serverStream.addItem( SimplePlayItem.build( "prometheus", 0, 20000 ) );
serverStream.addItem( SimplePlayItem.build( "someOtherFLVMovie", 0, 20000 ) );
Here 20000 means 20 seconds of playing; you can use setRepeat(true) after starting.
Start your stream:
serverStream.start();
Now, Flash clients can watch your own TV channel with NetStream.play( "MyOwnTVChannel" );
Remember that if you do not set repeating, your channel will end after 40 seconds in this example.

Limiting gstreamer pipeline throughput to simulate live source

I'm developing an RTSP server that should emulate a live source, while streaming the data from a file.
What I currently have is mostly based on gst-rtsp-server example test-readme.c, only with the following pipeline:
gst_rtsp_media_factory_set_launch(factory, "( "
"filesrc location=stream.mkv ! matroskademux name=demuxer "
"demuxer. ! queue ! rtph264pay name=pay0 pt=96 "
"demuxer. ! queue ! rtpmp4gpay name=pay1 pt=97 "
")");
This works very well, except for one problem: when the RTSP client (which uses RTSP/TCP interleave transport) is not able to receive data, the whole pipeline locks up until the client is ready again, and then resumes at the original position without any jump.
Since I want to emulate live source which cannot buffer its video indefinitely, the desired behavior in this case is to continue playing the file, so when the client blocks for 5 seconds, it will lose 5 seconds of recording.
I've attempted to achieve this by limiting queue sizes and setting them as leaky (by setting them as queue max-size-bytes=1000000 max-size-time=1000000000 leaky=upstream, which should provide buffer to ~1 second of video, but no more). This did not work entirely as I hoped: the source and demuxer filled the queue and then completely emptied themselves in 0.1 sec.
I figured I need some way to throttle pipeline throughput before the queue, either by limiting the demuxer to real-time demuxing, or finding/making a gstreamer filter that will let through 1 second of data per 1 second of real time.
Do you have any hints on how to do this?
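The "let through 1 second of data per 1 second of real time" idea is, at its core, just pacing logic. A language-agnostic sketch in JavaScript with an injected clock (all names here are illustrative, not a GStreamer API):

```javascript
// Pacing sketch: admit at most `bytesPerSecond` bytes per one-second
// window, so a blocked reader loses data instead of stalling the source.
// The clock is injected for testability; in a real pipeline a leaky
// queue or a custom element would play this role.
class RealTimePacer {
  constructor(bytesPerSecond, now) {
    this.bytesPerSecond = bytesPerSecond;
    this.now = now;
    this.windowStart = now();
    this.sentInWindow = 0;
  }

  // Returns how many of `size` bytes may pass right now; the caller
  // drops the rest, emulating a leaky, real-time source.
  admit(size) {
    const elapsed = this.now() - this.windowStart;
    if (elapsed >= 1) {
      // start a new one-second window
      this.windowStart = this.now();
      this.sentInWindow = 0;
    }
    const budget = this.bytesPerSecond - this.sentInWindow;
    const admitted = Math.min(size, Math.max(budget, 0));
    this.sentInWindow += admitted;
    return admitted;
  }
}
```

Anything not admitted within the current window is dropped, which is exactly the "lose 5 seconds if the client blocks for 5 seconds" behavior described above.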
So it seems that while a leaky queue and a limiter can be implemented, they don't help much here, because the GStreamer RTSP implementation has its own queue for outgoing TCP data. What appears to work is keeping the pipeline unchanged and patching the gst-rtsp-server module to limit its queue length (to 1 MB in this case; recent versions also limit the message count to 100):
--- gst-rtsp-server-1.4.5/gst/rtsp-server/rtsp-client.c 2014-11-06 11:20:28.000000000 +0100
+++ gst-rtsp-server-1.4.5-r1/gst/rtsp-server/rtsp-client.c 2015-04-28 14:25:14.207888281 +0200
@@ -3435,11 +3435,11 @@
gst_rtsp_client_set_send_func (client, do_send_message, priv->watch,
(GDestroyNotify) gst_rtsp_watch_unref);
/* FIXME make this configurable. We don't want to do this yet because it will
* be superceeded by a cache object later */
- gst_rtsp_watch_set_send_backlog (priv->watch, 0, 100);
+ gst_rtsp_watch_set_send_backlog (priv->watch, 1000000, 100);
GST_INFO ("client %p: attaching to context %p", client, context);
res = gst_rtsp_watch_attach (priv->watch, context);
return res;