In-stream non-linear advertising with MPEG-DASH - overlay

Does MPEG-DASH provide the ability to add video overlays in the video stream itself (i.e. not added client-side like YouTube in-video advertising)?
E.g. having an animated character walking around the screen, in front of the video.

MPEG-DASH is essentially a mechanism for efficiently streaming media files from a server to clients - it does not really care what the media files are or what you use them for.
If what you wanted was more like a banner or text, then you could have a subtitle or captions track with your video, which MP4 containers support and DASH will allow you to stream.
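For example, a banner or caption track can be exposed in the DASH manifest (MPD) as its own text adaptation set; a rough sketch, with the file name and identifiers as placeholders, could look like this:

<AdaptationSet contentType="text" mimeType="text/vtt" lang="en">
  <Representation id="overlay-text" bandwidth="1000">
    <BaseURL>overlay-text.vtt</BaseURL>
  </Representation>
</AdaptationSet>

Note that rendering such a track on top of the video is still done by the client player. Something like an animated character walking in front of the video would have to be composited into the video frames themselves before encoding and DASH packaging.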

How to play video while it is downloading using AVPro video in unity3D?

I want to play the video simultaneously while it is downloading via UnityWebRequest. Will AVPro Video support this? If so, please give me some guidance, as I am new to Unity and AVPro Video. I am able to play a video that has been fully downloaded through the FullscreenVideo.prefab in the AVPro demo. Any help will be much appreciated.
There are two main options you could use for displaying the video while it is still downloading.
Through livestream
You can stream a video to AVPro Video using the "absolute path or URL" option on the media player component, then pointing this at a stream in RTSP, MPEG-DASH, HLS, or HTTP progressive streaming format. Depending on what platforms you are targeting, some of these options will work better than others.
A table of which streaming format is supported on which platform can be found in the AVProVideo user manual that is included with AVProVideo, from page 12 onwards.
If you want to use streaming you also need to set the "internet access" option to "required" in the player settings, as a video cannot stream without internet access.
A video that is being streamed will automatically start/resume playing when enough video is buffered.
This does, however, require a constant internet connection, which may not be ideal if you're targeting mobile devices, or may be unnecessary if you're planning to play videos in a loop.
HLS m3u8
HTTP Live Streaming (HLS) works by cutting the overall stream into shorter, manageable chunks of data. These chunks then get downloaded in sequence, regardless of how long the stream is. m3u8 is a playlist file format that keeps information on the location of multiple media files instead of one entire video; this playlist can then be fed into an HLS player, which plays the small media files in sequence as dictated by the m3u8 file.
Using this method is useful if you're planning to play smaller videos on repeat, as the user will only have to download each chunk of the video once, and you can then store the chunks for later use.
You can also make these chunks of video as long or short as you want, and set a buffer of how many chunks you want to have pre-loaded. If, for example, you set the chunk size to 5 seconds with a buffer of 5 chunks, the only loading time you'll have is for the first 25 seconds of the video. Once these first 5 chunks are loaded it will start playing the video and load the rest of the chunks in the background, without interrupting playback (given that your internet speed can handle it).
A downside is that you have to convert all your videos to the HLS/m3u8 format yourself; a tool such as FFmpeg can help with this, though.
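As a sketch of that conversion (file names and chunk length are placeholders, and it assumes the input is already H.264/AAC so the streams can be copied rather than re-encoded), FFmpeg can segment an MP4 into 5-second HLS chunks plus a playlist with something along these lines:

ffmpeg -i input.mp4 -c copy -hls_time 5 -hls_list_size 0 -hls_playlist_type vod output.m3u8

This produces output.m3u8 plus numbered .ts segment files next to it, which is the playlist-of-chunks layout described above.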
References
HLS
m3u8
AVPro documentation

Are there any tools/scripts for analyzing/retrieving Flash/HTML5 video information/metadata?

I want to play a YouTube video at a certain resolution, like 360p,
capture the packets, and then extract the video from the packets.
I then want to analyze/retrieve Flash/HTML5 video information/metadata from these videos.
By the way, do the videos keep the same resolution when they are extracted from the captured packets?
Note that these videos may not be complete.
Are there any good tools for analyzing/retrieving Flash/HTML5 video information/metadata,
like video bit rate, video resolution (360p, 480p, etc.), the audio/video codecs used, video size, and duration?
If the video is not complete, the information would ideally include the original video size, the actual video size, the original video length/duration, and the actual video length/duration.
I would prefer a script; if it is a tool, I hope it can be run from the shell on the command line, because I want automation.
A paper says Perl could do this, but I don't know how.
Thanks!
(long comment, not a complete answer)
IANAL, but your goals may not fit the YouTube Terms of Service:
Section 4. C
You agree not to access Content through any technology or means other than the video playback pages of the Service itself, the Embeddable Player, or other explicitly authorized means YouTube may designate.
Section 4. H
You agree not to use or launch any automated system, including without limitation, "robots," "spiders," or "offline readers," that […] sends more request messages to the YouTube servers […] than a human can reasonably produce in the same period by using a conventional on-line web browser. Notwithstanding the foregoing, YouTube grants the operators of public search engines permission to use spiders to copy materials from the site for the sole purpose of and solely to the extent necessary for creating publicly available searchable indices of the materials, but not caches or archives of such materials. […]
You may be able to access the required information directly using the YouTube Data API. Here is a reference, and here is a list of directly supported programming languages. Perl will work as well, as the underlying data format is plain XML or JSON.
You might also find these SO questions enlightening: "YouTube Player API: How to get duration of a loaded/cued video without playing it?" and "Youtube API get video duration from the XML".
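On the analysis side of the question: once you have a (possibly incomplete) video file, a command-line tool such as ffprobe, which ships with FFmpeg, reports bit rate, resolution, codecs and duration, and its JSON output is easy to post-process from a script (Perl, Python, etc.). A minimal invocation, with the file name as a placeholder, might be:

ffprobe -v error -print_format json -show_format -show_streams captured_video.flv

For a truncated file, the container metadata may still declare the original duration while the file size only reflects what was actually captured, so comparing the two can give a hint of how complete the download is.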

Record HTTP Live Streaming Video To File While Watching?

I am trying to create a streaming video DVR like functionality in an app I am developing. I have an HTTP Live Stream that I have successfully gotten to play on the iPad. I want the user to be able to push the "Record" button, and begin recording the video that is currently playing from that point. This video file will be accessible from the app or from the camera roll. Currently, I am using the MPMoviePlayerController object to play the video stream. I do not see any methods of accessing the data from the object in Apple's documentation. Here are some thoughts I had on ways of going about this.
1) Somehow access the video data from MPMoviePlayerController, and write this to a file. Or use another type of player object that will allow me to play the video and access the currently playing data.
2) Implement some sort of screen capture recording that gets a video capture of the iPad's screen. This would allow me to record the video in a "screenshot" sort of way.
3) Locate the HTTP Live Streaming video segments where they are stored by MPMoviePlayerController. Presumably they need to be stored somewhere on the iPad for playback. Is there a way of accessing these files?
4) Manually download the stream video segments over HTTP while streaming the file. This seems like it's not ideal, since the stream would have to be downloaded twice.
5) This could work. Periodically download the video segments to the iPhone. Set up a local HTTP server on the iPhone and serve the videos to the MPMoviePlayerController. This way the video segments could be marked for recording and assembled into a video.
6) I do have control of the streaming server. I could write some server side code to record the video on the server end, then send the video to the iPad after the fact. I would rather not do this.
Has anyone done any of these things? Ideally the iPhone would just be able to access the video data somehow and easily record it. I would rather not get into options 4, 5, or 6 (above) if I don't have to.
Thanks in advance.
DVR on the device is generally discouraged, due to the limited space available and other factors like battery life, processing power, cleanup procedures after the user stops the DVR, etc.
If you want to achieve DVR playback on iOS devices (or other devices using HLS), I suggest you keep the video on the server side. The live stream is already captured and segmented server side; all you would have to do is keep the segments a bit longer instead of deleting them. By using the EXT-X-PLAYLIST-TYPE and EXT-X-MEDIA-SEQUENCE tags, you can signal to the player that it is opening a live stream which has DVR (earlier) video available.
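As a sketch, a playlist kept server side for DVR might look roughly like this (segment names and durations are placeholders). With EXT-X-PLAYLIST-TYPE:EVENT the server only ever appends segments, so the player lets the user seek back into earlier video while the live edge keeps advancing; there is no EXT-X-ENDLIST tag while the event is still running:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:EVENT
#EXTINF:10.0,
segment00000.ts
#EXTINF:10.0,
segment00001.ts
#EXTINF:10.0,
segment00002.ts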
Alternatively, you can use a server that does this out of the box, for example Wowza. Here's an article on how to achieve this with Wowza.

How do I jump to specific time with duration of an IIS streamed video in iOS?

If I have a video streaming server (e.g. IIS Media Services) with a live video stream, I want the user to be able to select from a list of interesting parts of that video and then play just that bit.
For example, if I'm interested in a 25-second clip starting 20 minutes into the full length, I want to jump to that point and then close the video player at the end of the clip.
(Answering my own question until someone betters it.)
There is an AVPlayer in the AVFoundation framework that can be used when you're after more sophisticated playback functionality.
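A minimal Swift sketch of that approach, assuming an AVPlayer-based player, a placeholder stream URL, and a stream whose seekable range actually covers the clip: seek to the start of the clip, play, and use a boundary time observer to stop at the end of the 25-second window.

import AVFoundation

// Placeholder URL; the clip of interest starts 20 minutes in and lasts 25 seconds.
let url = URL(string: "https://example.com/live/stream.m3u8")!
let player = AVPlayer(url: url)

let clipStart = CMTime(seconds: 20 * 60, preferredTimescale: 600)
let clipEnd = CMTime(seconds: 20 * 60 + 25, preferredTimescale: 600)

// Jump to the interesting part and start playback.
player.seek(to: clipStart, toleranceBefore: .zero, toleranceAfter: .zero)
player.play()

// Pause at the end of the clip; keep the returned token so the observer can be removed later.
// (In production code, capture the player weakly to avoid a retain cycle.)
let boundaryToken = player.addBoundaryTimeObserver(forTimes: [NSValue(time: clipEnd)], queue: .main) {
    player.pause()
    // Dismiss the hosting player view controller here.
}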
On the IIS side, there is something called a PlayList that can be used to chunk up content, which may work as an alternative.

How to ensure YouTube API only returns videos that are streamable on iPhone?

I'm building some YouTube search functionality into an iPhone app and want to ensure that I only receive results that will be playable on the device. According to the Searching for videos section in the API reference doc this seems to be relatively straightforward:
The format parameter specifies that videos must be available in a particular video format. Your request can specify any of the following formats:
I've tried setting "format=1" to limit to:
RTSP streaming URL for mobile video playback. H.263 video (up to 176x144) and AMR audio.
This provides a high proportion of playable videos but some are still unplayable and I'm worried that it's not returning others that would be playable.
When I leave the format field blank I receive an even higher proportion of non-streamable URLs.
This does not sound appropriate. My understanding is that the iPhone does not stream RTSP; rather, it supports Apple's HTTP Live Streaming of segmented files for live content, and HTTP streaming of MPEG-4 video files via range requests. I'd also expect the video to be H.264 with AAC audio.
Your setting sounds appropriate for low-end cell phones; in particular, 176x144 is the QCIF resolution commonly used on non-smartphones.
When you look at the XML file returned by a call to
http://gdata.youtube.com/feeds/api/videos/<your video id>
then you will notice that videos which are not playable on the iPhone will have the following tag:
<yt:state name='restricted' reasonCode='limitedSyndication'>Syndication of this video was restricted by its owner.</yt:state>
Just make sure to look for the above tag and ignore the video if the tag is present.
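A minimal Swift sketch of that check against the feed URL from the answer (crude string matching for brevity; a real implementation would use XMLParser and handle errors):

import Foundation

// Hypothetical helper: fetch the GData entry for a video ID and report whether
// the <yt:state name='restricted' ...> tag is present before offering the video.
func isPlayableOniPhone(videoID: String, completion: @escaping (Bool) -> Void) {
    let url = URL(string: "http://gdata.youtube.com/feeds/api/videos/\(videoID)")!
    URLSession.shared.dataTask(with: url) { data, _, _ in
        guard let data = data, let xml = String(data: data, encoding: .utf8) else {
            completion(false)
            return
        }
        let restricted = xml.contains("yt:state") && xml.contains("name='restricted'")
        completion(!restricted)
    }.resume()
}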