I have a program on the server side that keeps generating a series of JPEG files, and I want to play these files in the client browser as a video stream at a desired frame rate (the video should play while new JPEG files are still being generated). Meanwhile, I have a WAV file on hand that I want to play on the client side while the video is streaming.
Is there any way to do this? I have done plenty of research but can't find a satisfactory solution -- everything I've found handles either video streaming or audio streaming, but not both.
I know mjpg-streamer at http://sourceforge.net/projects/mjpg-streamer/ can stream video in MJPG format from JPEG files, but it doesn't look like it can stream audio.
I am very new to this area, so a more detailed explanation would be greatly appreciated. Thank you!
P.S. A solution/library in C++ is preferred, but anything else would help as well. I am working on Linux.
The browser should be able to do this natively, no? Firefox certainly can, if you simply give it the correct URL of the streaming MJPEG source. The MJPEG stream does need to be properly formatted.
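For reference, "properly formatted" means a single HTTP response of type multipart/x-mixed-replace in which each part is one JPEG frame. Here is a minimal sketch of the byte layout (the boundary string and helper names are my own, illustrative choices; the same layout applies in any server language):

```swift
import Foundation

// One MJPEG connection = one endless HTTP response whose parts are JPEG frames.
let boundary = "frameboundary"

// Sent once, when the client connects.
func mjpegResponseHeader() -> Data {
    let header =
        "HTTP/1.1 200 OK\r\n" +
        "Content-Type: multipart/x-mixed-replace; boundary=\(boundary)\r\n" +
        "Cache-Control: no-cache\r\n" +
        "\r\n"
    return Data(header.utf8)
}

// Sent once per frame, paced at the desired frame rate.
func mjpegPart(for jpeg: Data) -> Data {
    var part = Data(("--\(boundary)\r\n" +
                     "Content-Type: image/jpeg\r\n" +
                     "Content-Length: \(jpeg.count)\r\n" +
                     "\r\n").utf8)
    part.append(jpeg)
    part.append(Data("\r\n".utf8))
    return part
}
```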
I figured it out. The proper way to do it is to use ffmpeg, libav, and an RTMP server such as red5.
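For anyone finding this later, a hedged sketch of that pipeline: spawn ffmpeg to read the generated JPEG sequence and the WAV file, encode to H.264/AAC, and publish to the RTMP server. The flags are standard ffmpeg options, but the paths, frame rate, and RTMP URL are placeholders for your setup:

```swift
import Foundation

// Launch ffmpeg to mux the JPEG sequence + WAV audio into an RTMP stream.
let ffmpeg = Process()
ffmpeg.executableURL = URL(fileURLWithPath: "/usr/bin/ffmpeg")
ffmpeg.arguments = [
    "-framerate", "25",                // the desired frame rate
    "-i", "frames/img%06d.jpg",        // the server-generated JPEG files
    "-i", "audio.wav",                 // the WAV file to play alongside
    "-c:v", "libx264", "-preset", "veryfast",
    "-c:a", "aac",
    "-f", "flv",                       // RTMP carries an FLV-muxed stream
    "rtmp://localhost/live/stream"     // the red5 (or other server) publish point
]
try ffmpeg.run()
ffmpeg.waitUntilExit()
```

One caveat: reading a numbered image sequence like this assumes the files already exist; for frames generated on the fly, you would pipe JPEGs into ffmpeg's stdin with -f image2pipe instead.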
I am trying to save an m3u8 stream playlist to disk as one complete video file, similar to what VLC does. I can create an AVAsset and play it in an AVPlayer fine; however, the m3u8 links I have tried all return false from asset.isExportable, so using AVAssetExportSession does not work. I thought it might be possible to open the link as an InputStream and write it to an OutputStream, but I was lost on how to do this. Is this a viable option, or will it only return the actual m3u8 file instead of the .ts video links? Any guidance in the right direction would be appreciated. I am fine doing the research on how to use the different classes; I'm just kinda lost on where to go from here.
Thank you,
Phil
Building a single video from all the streams in an m3u8 playlist may not actually give you what you want, depending on the m3u8 file.
This is because m3u8 playlists can contain multiple bit-rate versions of a single video -- so if you added them all together, you would get the same video at different quality levels (bit rates), one after another.
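To make that concrete: in a master playlist, each #EXT-X-STREAM-INF line is followed by the URI of one bit-rate rendition of the same video. A rough Swift sketch (the function name is mine) for listing those renditions so you can download just one of them:

```swift
import Foundation

// Extract the variant (bit-rate rendition) URIs from a master m3u8 playlist.
// Each #EXT-X-STREAM-INF tag line is followed by the URI of one rendition.
func variantURIs(inMasterPlaylist text: String) -> [String] {
    var uris: [String] = []
    var expectURI = false
    for rawLine in text.split(separator: "\n") {
        let line = rawLine.trimmingCharacters(in: .whitespaces)
        if line.hasPrefix("#EXT-X-STREAM-INF") {
            expectURI = true            // the next non-tag line is the rendition URI
        } else if expectURI && !line.isEmpty && !line.hasPrefix("#") {
            uris.append(line)
            expectURI = false
        }
    }
    return uris
}
```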
It's also worth noting that some video streams will be encrypted; in fact, most high-value streams, such as Netflix's, will be, so downloading them will not allow you to play them back unless you do it as part of the provider's own 'download and go' service.
Finally, some services may make it hard for you to access the streams by requiring some form of authentication in parallel with the video stream URL.
Assuming all the above is fine or does not apply in your case, the video files themselves can be downloaded using an HTTP download function. Good examples of these exist, such as: https://stackoverflow.com/a/35510812/334402
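In Swift terms, a rough equivalent of that download step might look like the sketch below: fetch each .ts segment of the chosen (single-rendition) media playlist in order and append it to one output file. segmentURLs and the output path are placeholders, and this simple concatenation only works for plain, unencrypted TS segments:

```swift
import Foundation

// Download each .ts segment in playlist order and append it to one file.
// Works only for plain MPEG-TS segments (no encryption, no fMP4).
func downloadSegments(_ segmentURLs: [URL], to output: URL) throws {
    _ = FileManager.default.createFile(atPath: output.path, contents: nil)
    let handle = try FileHandle(forWritingTo: output)
    defer { handle.closeFile() }

    for url in segmentURLs {
        let done = DispatchSemaphore(value: 0)
        URLSession.shared.dataTask(with: url) { data, _, _ in
            if let data = data { handle.write(data) }
            done.signal()
        }.resume()
        done.wait()   // serialize the downloads so segments stay in order
    }
}
```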
I am attempting to stream a video in a format Unity3D can access, like an MJPG. I have gone through several possible solutions, including GStreamer (only does the client side, as far as I could tell from the examples), Yawcam (I couldn't find a way to access the image directly), and Silverlight (I simply couldn't find out how webcam streaming was doable). I am currently just looking for any more methods of getting video over from one side to the other. Could I possibly just read the images into a byte array and send them over a socket? Maybe I missed something in the previous three possible solutions?
If you are looking to stream video from a server, then you can use Ogg encoding + WWW.movie to map it to a texture, assuming you have a Pro license, as I think this is a Pro-only feature. If it is a local file, either bundled with the app or in an external folder, we use the brilliant AVPro Windows Media or AVPro QuickTime. MJPEG does offer super smooth scrubbing with AVPro, but it generates enormous files. Definitely not ideal for streaming or even download!
Finally, RenderHeads also has a Live Camera capture plugin that could meet your needs.
I am new to iPhone development. I need to capture video, and while I'm capturing, the video should display on the server too. Something like live streaming.
Does anyone have an idea of where I should start for this functionality?
Thanks in advance.
Your question seems similar to this one:
Xcode ios: Streaming of video file while recording
First Half Solution
Using AVFoundation, you can get video buffers/frames while recording.
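A minimal sketch of that first half (class and queue names are my own), using AVCaptureVideoDataOutput to receive each frame as a CMSampleBuffer while the session runs:

```swift
import AVFoundation

// Delivers every captured video frame to the delegate callback below.
// Error and camera-permission handling omitted for brevity.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per captured frame while recording.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Encode and upload the buffer here (the "second half" below).
    }
}
```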
Second Half
But for uploading, I didn't find any solution.
There is an InputStream option in the iOS APIs, but it needs a file path, and since the video hasn't been recorded to a file yet, we don't have any path.
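One way around the missing file path, sketched below: open a socket-backed OutputStream to the server with Stream.getStreamsToHost and write each encoded chunk as soon as it arrives, so the video never needs to exist as a file. Host and port are placeholders:

```swift
import Foundation

// Open a TCP connection as a pair of streams; we only use the output side.
var inputStream: InputStream?
var outputStream: OutputStream?
Stream.getStreamsToHost(withName: "example.com", port: 8080,
                        inputStream: &inputStream, outputStream: &outputStream)
outputStream?.open()

// Write one encoded chunk (e.g. from the capture callback) to the socket.
func send(_ chunk: Data) {
    chunk.withUnsafeBytes { raw in
        let bytes = raw.bindMemory(to: UInt8.self)
        guard let base = bytes.baseAddress else { return }
        _ = outputStream?.write(base, maxLength: bytes.count)
    }
}
```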
Edit 1
Here is the best example for AVFoundation provided by Apple; you can start with that.
I recommend you use Wowza (https://www.wowza.com); it has all the features, from live streaming to video on demand, etc.
I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration, but I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, but that is not actually what I am searching for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save the video to a file; see the AV Foundation guide on how to do that. Once saved, you can then use the HTTP Live Streaming segmenter to generate the proper segments. Apple has applications for Mac OS X, but there's an open-source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments. There are lots of HTTP servers out there you could adapt.
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264. For that you want ffmpeg. Basically you shove the images into ffmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live-streaming H.264 device. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, you need an HTTP server serving that stream.
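If bundling ffmpeg in the app is unappealing, a swapped-in alternative for just the encoding step is Apple's VideoToolbox (iOS 8+), which hardware-encodes captured frames to H.264; you would still need the segmenter and server pieces described above. A minimal sketch, with arbitrary dimensions:

```swift
import CoreMedia
import VideoToolbox

// Create a hardware H.264 encoder session.
var session: VTCompressionSession?
VTCompressionSessionCreate(allocator: nil,
                           width: 1280, height: 720,
                           codecType: kCMVideoCodecType_H264,
                           encoderSpecification: nil,
                           imageBufferAttributes: nil,
                           compressedDataAllocator: nil,
                           outputCallback: { _, _, _, _, sampleBuffer in
                               // One encoded H.264 sample per call; hand it to
                               // the segmenter / packetizer from here.
                           },
                           refcon: nil,
                           compressionSessionOut: &session)

// Feed each captured pixel buffer (from the camera callback) to the encoder.
func encode(_ pixelBuffer: CVPixelBuffer, at time: CMTime) {
    guard let session = session else { return }
    VTCompressionSessionEncodeFrame(session, imageBuffer: pixelBuffer,
                                    presentationTimeStamp: time,
                                    duration: .invalid,
                                    frameProperties: nil,
                                    sourceFrameRefcon: nil,
                                    infoFlagsOut: nil)
}
```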
What might actually be easier would be to use an RTP/RTSP-based stream instead. That approach is covered by open-source RTP implementations, and ffmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.
I have a webservice returning a .flv file that has to be played in an iPhone application. How do I play a .flv (Flash) file on the iPhone?
Has anyone faced this scenario? Is it programmatically possible to convert it to some other format and play it on the iPhone?
Thanks.
The iPhone doesn't support Flash content and, judging by Apple's official statements, never will (or at least not in the foreseeable future).
Converting the content to another format on the server side should be easy to do and would allow content playback on an iDevice.
Since the video is probably already H.264-encoded inside the FLV container, you may want to try FLV Extract on the server to avoid recompression:
http://www.videohelp.com/tools/FLV_Extract
Basically you just need to run it once for each of the videos on the server and keep the results around.
I would recommend setting up your webservice to use something like ffmpeg (http://www.ffmpeg.org/) to convert the .flv file to an MP4 file, which can be played directly in the iPhone's web browser.
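As a sketch of that server-side conversion (file names are placeholders): when the FLV already carries H.264 video, ffmpeg can remux it into MP4 without re-encoding the video track; FLV audio is often MP3 or Speex, so the audio is re-encoded to AAC here:

```swift
import Foundation

// Remux FLV -> MP4: copy the H.264 video track, re-encode audio to AAC.
let convert = Process()
convert.executableURL = URL(fileURLWithPath: "/usr/bin/ffmpeg")
convert.arguments = ["-i", "input.flv",
                     "-c:v", "copy",    // keep the existing H.264 video as-is
                     "-c:a", "aac",     // MP4 players expect AAC audio
                     "output.mp4"]
try convert.run()
convert.waitUntilExit()
```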
Pioto and Josaih are on the right track in suggesting that you should convert the video server-side using a tool like FFmpeg. As far as I know there is zero support for FLV in any part of iOS, so you'd be unable to transcode it locally. Even if you could, it would make your users angry, since transcoding is a resource-intensive process that would kill their battery life and take a significant amount of time.
So, your solution is to transcode your videos to H.264 server-side. However, I'd caution against transcoding from FLV to H.264 if any other option is available. If you have the original, uncompressed (or at least less-compressed) source video, you'll get higher-quality output by transcoding that to H.264 instead. Each time lossy compression (e.g., Squeeze or H.264) is applied to a file, you lose some information and quality. If you've ever seen a 3rd- or 4th-generation copy of a VHS tape, you understand what I'm getting at.
Once you have an H.264-formatted video, you can play it on iOS. I'm not sure about the exact details of this.
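To fill in those details: the standard way is AVPlayer, which plays an H.264 MP4 (or an HLS playlist) directly from a URL. A minimal sketch with a placeholder URL:

```swift
import AVKit
import AVFoundation

// Play a remote H.264 MP4 with the system player UI.
let url = URL(string: "https://example.com/video.mp4")!
let playerController = AVPlayerViewController()
playerController.player = AVPlayer(url: url)

// From a presenting UIViewController:
// present(playerController, animated: true) {
//     playerController.player?.play()
// }
```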
You may be able to use ffmpeg or something on your server to transcode it to H.264. I'm not so sure you would really want to do that transcoding on the phone. Given Apple's current stance on Flash, this is probably your best option.
For FLV files, what I do is upload them to Google Drive and watch them from the Google Drive app.