Java HTTP Live Stream segmenter - iPhone

A few months ago, when I looked into HTTP Live Streaming, I thought I had found a Java library that could act as the segmenter to create an HTTP Live Stream. However, I cannot find it again. Does anybody know of a way to segment the files directly?
Or, with iOS 5, are there any libraries yet that can create the .m3u8 files from a set of encoded files without physically segmenting them?
I have a server running the (Java) Play Framework and will use FFmpeg (possibly through a Java wrapper) to encode; now I am looking for something to create the playlist files.

I am working with Xuggler at the moment and have a segmenter and encoder working, but the .m3u8 file is not accepted by iOS devices, so some work still needs to be done. It does look promising, though.
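For reference, the playlist format itself is small; here is a minimal sketch of writing an HLS media playlist in Java. The segment names, durations, and output path are assumptions for illustration, not taken from the setup above.

import java.io.IOException;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

// Minimal sketch: write a VOD HLS media playlist for already-segmented .ts files.
// Segment names and durations are hypothetical; use whatever your segmenter produced.
public class PlaylistWriter {
    public static void writePlaylist(String path, List<String> segments,
                                     List<Double> durations) throws IOException {
        double longest = durations.stream().mapToDouble(Double::doubleValue).max().orElse(10);
        try (PrintWriter out = new PrintWriter(
                Files.newBufferedWriter(Paths.get(path), StandardCharsets.UTF_8))) {
            out.println("#EXTM3U");
            out.println("#EXT-X-VERSION:3");
            out.println("#EXT-X-TARGETDURATION:" + (int) Math.ceil(longest));
            out.println("#EXT-X-MEDIA-SEQUENCE:0");
            for (int i = 0; i < segments.size(); i++) {
                out.printf("#EXTINF:%.3f,%n", durations.get(i)); // segment duration in seconds
                out.println(segments.get(i));                    // segment URI, e.g. seg0.ts
            }
            out.println("#EXT-X-ENDLIST"); // VOD playlist: no more segments will be appended
        }
    }
}

A common reason iOS devices reject a playlist is a missing #EXT-X-TARGETDURATION tag or #EXTINF values that exceed it, so those are worth checking first.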

Related

How to play all video formats in a Chrome Packaged App?

I'm writing a Chrome Packaged App that needs to be able to play a lot of local video files. I can use the <video> tag to play files encoded in H.264 and MP3, but not much else. I'll require playback of at least DivX video and AC3 audio. Is there any way to do this on the HTML5 platform, or otherwise using some kind of plugin?
There are alternatives, but in my opinion the final solution is not going to be very good.
1 - You can try to use a plug-in, for example:
VLC Web Plugin (sorry, I don't have enough reputation to post more than two links)
DivX Web Player (likewise, no link)
But then you need to rely on the user installing the plug-in. For VLC, the plug-in is not compatible with the latest versions of Mac OS X.
2 - Encode to H.264 or VP8 on a server with ffmpeg, or use a cloud video encoding provider (see the sketch after this list).
3 - Encode on the client side using JavaScript! There is a JavaScript port of ffmpeg (http://bgrins.github.io/videoconverter.js/). I haven't tried this method with large files.
4 - Encode on the client side using a Native Client component (https://developers.google.com/native-client/dev/). But that seems like a daunting task to me.
If you go with the first option, make sure your audience is willing to install and configure the plug-in and that their operating systems are supported.
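To make option 2 concrete, here is a rough server-side sketch that shells out to ffmpeg from Java to transcode an upload to H.264/AAC. The file names and encoder options are assumptions; adjust them for your source material.

import java.io.IOException;
import java.util.Arrays;

// Rough sketch for option 2: transcode a source file (e.g. DivX/AC3) to an H.264/AAC MP4
// by invoking the ffmpeg command-line tool. Paths and settings are hypothetical.
public class Transcoder {
    public static void toH264(String input, String output) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(Arrays.asList(
                "ffmpeg", "-y",        // overwrite the output if it exists
                "-i", input,           // source file
                "-c:v", "libx264",     // H.264 video
                "-c:a", "aac",         // AAC audio
                output))
                .inheritIO()           // show ffmpeg's progress in our console
                .start();
        int exit = p.waitFor();
        if (exit != 0) {
            throw new IOException("ffmpeg failed with exit code " + exit);
        }
    }
}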
VLC ported to NaCL would be a great first step.
According to a poster on https://forum.videolan.org/viewtopic.php?f=5&t=107178, libVLC has been ported to NaCL, but I am not familiar with VLC internals so I could not say how far this gets you in terms of being able to decode different streams.

Encoding audio files (mp3, mp4, m4a, ogg) for Smooth Streaming with Windows Azure Media Services

I want to encode audio files (mp3, mp4, m4a, ogg) for streaming and play the encoded files smoothly using an HTML5 player, but I am not sure an HTML5 player can handle that.
What I am doing now: I upload a file and encode it on Windows Azure Media Services using the "AAC Good Quality Audio" preset. It encodes the file into the .mp4 format. I then create a SAS locator to play the file, and it works well, but the problem is that the user can also download it, which I don't want to allow.
If I create an OnDemandOrigin locator for the same encoded asset, it gives me a 404 error, which means it cannot be played.
Below are the steps that I have used to upload the file on Azure Media Services:
Created an empty asset.
Uploaded the file into the asset.
Created a new job with a task to encode the audio file.
The file encodes successfully, and the origin URL is generated, but when I browse to that URL I get a 404 error.
My queries:
"AAC Good Quality Audio" preset is the right for my task?
How can I restrict the user to download the file, if I use sas locator.
Is it possible to play the encoded file using origin locator.
Can I encode audio files for smooth streaming ? If I can then which player I should use to run the encoded file for all browsers, IOS devices and android devices.
If you want further details please feel free to ask me.
Awaiting your response.
Thanks
If your user is able to listen to the audio you're publishing, they will also be able to download the file. You cannot prevent this; at best you can make it difficult, not impossible. More to the point, Media Services in its current incarnation has no way for you to do authorization of any kind, so the only tool you've got is time-bombed SAS locators.
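To make the "time-bombed SAS locator" idea concrete, here is a rough sketch of handing out a read-only link that expires after an hour, using the legacy azure-storage Java client. Media Services creates its locators through its own API, so the package, class names, and one-hour window here are illustrative assumptions.

import java.util.Calendar;
import java.util.EnumSet;
import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.CloudBlockBlob;
import com.microsoft.azure.storage.blob.SharedAccessBlobPermissions;
import com.microsoft.azure.storage.blob.SharedAccessBlobPolicy;

// Illustrative sketch only: a time-limited, read-only SAS URL on a blob,
// which is roughly what a Media Services SAS locator boils down to.
public class SasExample {
    public static String timeBombedUrl(String connectionString, String container,
                                       String blobName) throws Exception {
        CloudBlockBlob blob = CloudStorageAccount.parse(connectionString)
                .createCloudBlobClient()
                .getContainerReference(container)
                .getBlockBlobReference(blobName);

        SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
        policy.setPermissions(EnumSet.of(SharedAccessBlobPermissions.READ)); // read-only
        Calendar expiry = Calendar.getInstance();
        expiry.add(Calendar.HOUR, 1); // the link stops working after one hour
        policy.setSharedAccessExpiryTime(expiry.getTime());

        String sasToken = blob.generateSharedAccessSignature(policy, null);
        return blob.getUri() + "?" + sasToken; // hand this URL to the player
    }
}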
The typical solution for this problem is to use DRM. Media Services supports PlayReady encryption, but you need to either have a PlayReady server or purchase it as a service (there is currently a service in the Azure Marketplace that provides PlayReady for a monthly price).
See the following article on how to protect assets with Microsoft PlayReady technology.
Origin Locators are something you would use to publish a Smooth Stream or HLS asset. It is not useful for regular media files, as it is internally something equivalent to an IIS Media Services endpoint. For regular media files, you can just as well host them in Blob Storage -- and refer to them via the SAS locator.
There is currently no single format that will play across all devices and operating systems. You can get Smooth Streaming to work on most Windows and Mac computers (possibly Linux, too), either with Silverlight or with the Smooth Streaming Plugin for the Flash-based OSMF. For iOS devices you will need to encode to HLS and use the HTML5 video tag. Microsoft Media Platform will support MPEG-DASH, a recently ratified ISO/IEC standard for dynamic adaptive streaming over HTTP. More details on how to use the DASH preview feature can be found here.
If you want smooth streaming for audio only, it looks like you will have to create a video asset with an empty video stream -- although there is a Uservoice request to add support for audio only in the future.

Uploading & Storing audio files

We are in the stage of designing our audio application, and we need to support uploading audio files from desktop applications to a cloud server, and also playing those audio files in the desktop applications.
How should we process the files before uploading? Should we turn them into Base64, thus increasing their size by roughly a third?
Or should we upload it as a raw binary file?
What about the different audio formats, should we transcode it in the client / server into mp3 or something like that?
Does anybody know what is the SoundCloud approach in this case?
Thank you!
SoundCloud stores the original file, and it transcodes the original for streaming purposes. The original can be downloaded if the owner makes that option available.
If you use HTTP to communicate with your server, you do not need Base64: you can send the raw bytes in the request body (for example as a multipart/form-data upload or a plain PUT). Base64 only becomes necessary if you embed the audio inside a text format such as JSON or XML. If you choose the right framework, the upload handling should be easy and largely transparent to you.
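As a concrete illustration of sending the raw bytes over HTTP from a Java desktop client (the endpoint URL, file path, and PUT semantics are made-up assumptions for the example):

import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Sketch: upload an audio file as a raw binary PUT - no Base64 involved.
public class AudioUploader {
    public static int upload(Path audioFile, String targetUrl) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(targetUrl).openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        conn.setFixedLengthStreamingMode(Files.size(audioFile)); // stream, don't buffer in memory
        try (OutputStream out = conn.getOutputStream()) {
            Files.copy(audioFile, out); // raw bytes straight from disk to the socket
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) throws IOException {
        System.out.println("Server responded: "
                + upload(Paths.get("song.mp3"), "https://example.com/api/tracks"));
    }
}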

“Hook” libMMS to FFmpeg for iPhone Streaming

These days I have been researching the software architecture for iPhone streaming (based on the MMS protocol).
As we know, in order to play back an MMS audio stream, we call libMMS to read the WMA stream data from the remote media server, then call FFmpeg to decode that data from WMA into PCM buffers, and finally enqueue the PCM buffers into the iPhone's AudioQueue to produce actual sound.
The introduction above just describes how iPhone streaming works. If we only need this simple functionality, it is not difficult: follow the steps above to call libMMS, FFmpeg and AudioQueue one after another and the streaming works. I actually implemented that code last week.
But what I need is not just a simple streaming function! I need a software architecture that makes FFmpeg access libMMS just as it accesses the local filesystem.
Does anybody know how to hook libMMS interfaces like mms_read/mms_seek onto FFmpeg's I/O interfaces like av_read_frame/av_seek_frame?
I think I have to answer my own question again this time…
After several weeks of research and debugging, I finally got to the truth.
Actually, we don't need to "hook" libMMS onto FFmpeg at all, because FFmpeg already has its own native MMS protocol module, "mms_protocol" (see mms_protocol.c in FFmpeg).
All we need to do is configure FFmpeg to enable the MMS module like this (see config.h in FFmpeg):
#define ENABLE_MMS_PROTOCOL 1
#define CONFIG_MMS_PROTOCOL 1
After this configuration, FFmpeg adds the MMS protocol to its protocol list (which already contains the local-file protocol). As a result, FFmpeg can treat a "mms://hostserver/abc" media URL just like a local media file, so we can still open and read the MMS stream using:
AVFormatContext *ctx = NULL;
AVPacket pkt;
av_open_input_file(&ctx, "mms://hostserver/abc", NULL, 0, NULL); /* the mms:// URL is passed just like a local path */
av_read_frame(ctx, &pkt);
just as we did with local media files before!
By the way, in my FFmpeg version there are still many bugs in the libavformat MMS handling. It took me a week to debug them, but I think it will take much less time for someone as smart as you :-)

How do I stream an audio file from the server to iPhone?

I need to stream an audio file which is saved on my server. Is it possible for me to stream that file in order to play it on my iPhone? Or is there any other way to play an audio file from the server on an iPhone? Please help me.
Thanks,
Shibin
This link was useful to me : http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html
He's got a project linked from that page http://projectswithlove.com/projects/iPhoneStreamingPlayer.zip
In this project, the interesting lines are in iPhoneStreamingPlayerViewController.m: lines 82-89 start streaming the audio from a URL.
I've managed to get this running on my iPhone and tested it using an mp3 on another server, and it works fine. However, I've not picked through the code, so I can't help you any more than this, sorry!
Sam
NB: To get the project to compile I had to change the SDK to 3.0 - if you right-click on the project name and choose Get Info, then change the option called Base SDK to iPhone Device 3.0, it should work.
There are a couple of ways to get the file playing on the iPhone, but the first problem is that you need to decide how to serve the file from your server.
One great way is to share the file out via HTTP using a Web Server. If the server is Windows, look into 'IIS'. If it's a Mac or Linux, Apache is your friend.
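If you just want something quick to test against before setting up IIS or Apache, a minimal sketch using the HTTP server built into the JDK can serve the file too (the port, context path, file name, and MIME type below are assumptions):

import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Minimal sketch: serve a single audio file over HTTP so the iPhone can fetch and play it.
public class AudioServer {
    public static void main(String[] args) throws IOException {
        Path file = Paths.get("the-file.mp3");
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/audio", exchange -> {
            byte[] body = Files.readAllBytes(file);        // fine for a test; not for huge files
            exchange.getResponseHeaders().add("Content-Type", "audio/mpeg");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start(); // then open http://your-server:8080/audio from the iPhone
    }
}

A production setup would also want HTTP range-request support, which the full web servers above give you out of the box.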
Once you've got the serving going, here are the options on the iPhone:
1) Use iPhone Safari to navigate to http://your-server/your-folder/the-file.ext. If the serving is correct, it'll open the media player and stream it.
2) Write an iPhone application that plays the file using Apple's audio APIs (for example the MediaPlayer framework, or Audio Queue Services as in the Cocoa with Love sample above). Non-trivial, but there are plenty of samples.