Re-establishing media link in Azure Live Video Analytics - azure-media-services

I am working with the Azure Live Video Analytics module, where I need to store video based on events, so I followed the documentation in which a signal gate node is used in the graph topology. It stores video whenever the signal gate node gets triggered.
I have implemented it the same way, triggering the gate processor node to record video, and I am using the RTSP simulator with an 8-minute video as input.
The issue is that after 30-40 seconds the media link with the RTSP server (the RTSP simulator) is re-established, the input video starts over again, and this happens repeatedly.
What could be the reason?
How can I resolve this?

The Live Video Analytics module was designed to process live video 24x7. When the RTSP simulator reaches the end of the file, it disconnects the RTSP session with the LVA module, which interprets this as a temporary glitch. LVA attempts to reconnect to the simulator which then restarts the stream.
Is your goal to try a PoC with the 8-minute video and then deploy to production with cameras? Or is your requirement to process files?
Currently, the only possible workaround is to detect the first MediaSessionEstablished event, count wall-clock time up to the duration of the video (e.g. 8 minutes), and then deactivate the graph instance, as sketched below.
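A minimal sketch of that workaround, assuming LVA on IoT Edge 1.0 and the azure-iot-hub Python SDK; the connection string, device/module IDs, graph instance name, and the way you detect MediaSessionEstablished are placeholders or assumptions, not part of the original answer:

    import time

    from azure.iot.hub import IoTHubRegistryManager
    from azure.iot.hub.models import CloudToDeviceMethod

    IOTHUB_CONNECTION_STRING = "<your IoT Hub connection string>"
    DEVICE_ID = "<your edge device id>"       # placeholder
    MODULE_ID = "lvaEdge"                     # assumed LVA module name
    GRAPH_INSTANCE_NAME = "<your graph instance name>"
    VIDEO_DURATION_SECONDS = 8 * 60           # length of the simulator's clip

    def deactivate_graph_instance():
        """Invoke the GraphInstanceDeactivate direct method on the LVA module."""
        registry_manager = IoTHubRegistryManager(IOTHUB_CONNECTION_STRING)
        method = CloudToDeviceMethod(
            method_name="GraphInstanceDeactivate",
            payload={"@apiVersion": "1.0", "name": GRAPH_INSTANCE_NAME},
            response_timeout_in_seconds=30,
        )
        response = registry_manager.invoke_device_module_method(
            DEVICE_ID, MODULE_ID, method
        )
        print("GraphInstanceDeactivate returned status:", response.status)

    # Assume your event listener (e.g. reading the IoT Hub built-in endpoint)
    # has just seen the first MediaSessionEstablished event: start the clock,
    # wait out the known clip duration, then deactivate the graph instance.
    time.sleep(VIDEO_DURATION_SECONDS)
    deactivate_graph_instance()

Once the instance is deactivated, the module stops reconnecting to the simulator, so the clip no longer restarts.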

Related

iOS - Develop iPhone app to stream camera video to a computer?

I'm looking for a way to create an app that will allow captured camera video to be streamed to a computer. For example, one person could be walking an iPhone around a room and another person could have that video streamed to their computer. Something kind of like a one-way FaceTime, except the receiver is on a computer. Also, I can't just use an existing app, as I would later like to change the program to do some computer vision processing on the incoming data.
At the moment, I've found that AV Foundation should be the correct option for the video capture (from this question). However, I'm having difficulty finding the method by which I can actually stream this data. In particular, searching for how to create the apps on the iPhone frequently results in existing apps that do the task, but not how to create the app.
Can anyone give me a pointer to the information on how to stream the video capture from the iPhone? Thank you much.
You can use Wowza Media Server for streaming purposes.
To download Wowza Media Server:
Wowza Download
After installing Wowza, you need to set up a live application; for that, see:
Setting Up Live Application
On the iOS side, there is a library that is useful for video streaming over an RTMP connection.
You can get the library at:
RTMP library for Streaming
For an example using the library, see:
RTMP library for Streaming example
This is a good example of streaming from the iOS side.
I had success with the ANGL lib and Wowza Media Server. It gives a smooth RTMP stream.

Get access to SoundCloud tracks

I'd like to build an application for analysis and classification of tracks (everyday sounds rather than speech or music) recorded and/or streamed via SoundCloud.
The idea is to use the existing SoundCloud infrastructure (database, recording, sharing, commenting...) and just add an analysis layer in between.
Is it possible, through the API, to access the track binary files? We'd like to process some of them.
Is there also a way to access the audio stream during recording? It's for a live classification task.
Thanks
Boris
Some users allow their tracks to be downloaded; others don't. For a downloadable track, the track information will include a download_url, and you can download and process the file however you like. As for accessing the stream during recording: at that point in time, the file doesn't exist on SoundCloud yet (it's only uploaded once the recording is complete). You could write your own Flash recorder or use the Web Audio API to get audio information during recording.
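As a minimal sketch of the download path, assuming the legacy public SoundCloud HTTP API and the requests library; the client_id and track id are placeholders, and the response fields (downloadable, download_url) should be verified against the current API:

    import requests

    CLIENT_ID = "<your registered app client_id>"  # placeholder
    TRACK_ID = 12345678                            # hypothetical track id

    # Fetch the track's metadata.
    track = requests.get(
        f"https://api.soundcloud.com/tracks/{TRACK_ID}",
        params={"client_id": CLIENT_ID},
    ).json()

    # Only tracks whose owners enabled downloads expose download_url.
    if track.get("downloadable") and track.get("download_url"):
        audio = requests.get(track["download_url"], params={"client_id": CLIENT_ID})
        with open("track_original", "wb") as f:
            f.write(audio.content)  # the original uploaded file, ready for analysis
    else:
        print("This track's owner has not enabled downloads.")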

Flash Media Server live streaming with multiple video files

I am using Flash Media Server 4.5, and I read in the tutorial that if I want to stream a live feed, I may need to use Flash Media Live Encoder. But what I found is that in Media Live Encoder I have to set everything up manually, and it only supports camera devices.
In my case, however, I keep receiving multiple video files from another program; my goal is to use Flash Media Server to perform a live broadcast of these video files one by one.
That means that when clients watch the stream, they will not notice that the server is playing mov1, then mov2, then mov4, then mov5... and so on.
Also, can FMS dynamically create a new streaming session (invoked by code), so that when client A uploads some video files to the server, FMS opens a new streaming session that streams only client A's video files?
Can FMS achieve this? Any tutorial would be very helpful!
Edit for Open Bounty
I basically want to deliver a live stream of video where a list of videos is the source. I am currently using Flash Media Server with the CloudFront CDN to deliver content. So if I have video1, video2, and video3, I want to play them back to back as a live stream (so no skipping ahead in the video). Is it possible to do this? The bounty goes to a clever workaround. Think of this as a television channel.
I have been working with live streaming technologies for the past year and a half. There is no option in Flash Media Live Encoder for any file encoding.
1. To encode your file, you can use a DVD player device (or something else that supports USB playback) and use the DVD player's output to broadcast using Flash Media Live Encoder.
2. The other option is to set up Windows Media Encoder, which supports file encoding (no DVD player needed), but it only works with Windows Media Services.
At present, I live-webcast video files this way for my company: http://www.malar.tv/live.php

Set buffer rate MPMoviePlayerController

My app was recently rejected because it is set to stream 60+ MB files from my web server and play them; the MPMoviePlayerController downloaded the entire file in 5-10 minutes while simultaneously playing it. From a testing perspective, the app worked great, but Apple limits audio streaming to 5 MB per 5 minutes.
How would I go about limiting the buffer rate so that it only buffers 5 MB per 5 minutes?
I have no idea which direction to go. I am willing to overhaul as long as the player can still stream the files from my web server.
All replies are appreciated.
Streaming is restricted because of cellular network limits, so the only way to do this is the following:
Google "HTTP Live Streaming"
Download the tools from Apple's Dev Center and install them
You will need to use Terminal to run mediafilesegmenter on the files:
mediafilesegmenter /Path\ to\ File/Name.mp3
Then upload the .m3u8 playlist and the segments to your server (same directory) and stream the .m3u8. Problem solved!
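For reference, the generated playlist is a plain text file that lists the segments; the names and durations below are illustrative (by default the tool writes a prog_index.m3u8 playlist and numbered segment files):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10,
    fileSequence0.ts
    #EXTINF:10,
    fileSequence1.ts
    #EXT-X-ENDLIST

The #EXT-X-ENDLIST tag marks the playlist as a complete (VOD-style) stream, which is what you get when segmenting a finished file.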

Record HTTP Live Streaming Video To File While Watching?

I am trying to create a streaming video DVR like functionality in an app I am developing. I have an HTTP Live Stream that I have successfully gotten to play on the iPad. I want the user to be able to push the "Record" button, and begin recording the video that is currently playing from that point. This video file will be accessible from the app or from the camera roll. Currently, I am using the MPMoviePlayerController object to play the video stream. I do not see any methods of accessing the data from the object in Apple's documentation. Here are some thoughts I had on ways of going about this.
1) Somehow access the video data from MPMoviePlayerController, and write this to a file. Or use another type of player object that will allow me to play the video and access the currently playing data.
2) Implement some sort of screen capture recording that gets a video capture of the iPad's screen. This would allow me to record the video in a "screenshot" sort of way.
3) Locate the HTTP Live Streaming video segments where they are stored by MPMoviePlayerController. Presumably they need to be stored somewhere on the iPad for playback. Is there a way of accessing these files?
4) Manually download the stream video segments over http while streaming the file. This seems like its not ideal since the stream would have to be downloaded twice.
5) This could work: periodically download the video segments to the iPhone, set up a local HTTP server on the iPhone, and serve the videos to the MPMoviePlayerController. This way the video segments could be marked for recording and assembled into a video.
6) I do have control of the streaming server. I could write some server side code to record the video on the server end, then send the video to the iPad after the fact. I would rather not do this.
Has anyone done any of these things? Ideally the iPhone would just be able to access the video data somehow and easily record it. I would rather not get into options 4, 5, or 6 (above) if I don't have to.
Thanks in advance.
DVR on the device is generally discouraged, due to the limited space available and other factors like battery life, processing power, cleanup procedures after the user stops the DVR, etc.
If you want to achieve DVR playback on iOS devices (or other devices using HLS), I suggest you keep the video server side. The live stream is already captured and segmented server side; all you would have to do is keep the segments a bit longer instead of deleting them. By using the EXT-X-PLAYLIST-TYPE and EXT-X-MEDIA-SEQUENCE tags, you can indicate to the player that it is opening a live stream which has DVR (earlier) video available.
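As a sketch, a playlist the server might maintain for such a stream could look like this (URIs and durations are illustrative). EXT-X-PLAYLIST-TYPE:EVENT tells the player that segments will only ever be appended, never removed, so the user can seek back to the beginning while the stream is still live; the playlist carries no #EXT-X-ENDLIST until the event finishes:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXT-X-PLAYLIST-TYPE:EVENT
    #EXTINF:10,
    segment0.ts
    #EXTINF:10,
    segment1.ts
    #EXTINF:10,
    segment2.ts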
Alternatively, you can use a server that does this out of the box, for example Wowza. Here's an article on how to achieve this with Wowza.