I am using Red5 + RTMP in a client-server flash application.
There are no audio/video streams in my application; RTMP is only used to transfer messages from the app to the server and back.
Now I need to develop the application for iPhone:
is there any RTMP implementation on iPhone?
If not, how could I solve this problem? Is there any alternative to RTMP on iPhone?
And the most important question: could it be solved without rewriting the whole server part of the application (Red5 + RTMP)?
If all you need is to pass "messages" and no A/V, might I suggest using a REST service or something similar, such as GET or POST requests containing your "message" encoded in some form, such as JSON or Base64?
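For illustration, a minimal sketch of such a request from an iOS client, assuming a hypothetical `/api/messages` endpoint and a JSON payload (both are placeholders, not part of any real API):

```swift
import Foundation

// Minimal sketch: send a "message" to the server as JSON over HTTP POST.
// The endpoint URL and payload shape are illustrative assumptions.
func send(message: String) {
    var request = URLRequest(url: URL(string: "https://example.com/api/messages")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["message": message])

    URLSession.shared.dataTask(with: request) { data, response, error in
        if let error = error {
            print("send failed: \(error)")
        }
    }.resume()
}
```

The same shape works in reverse for polling the server with GET, which keeps the Red5 message-handling logic replaceable by any plain HTTP handler.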
I am working on a live streaming project and came across many services like Wowza, Dacast, Ant, etc. The one that suits my requirements uses the RTMP protocol, so I would have to use encoding software like OBS to publish the stream. I actually want to publish the stream from the browser/iOS/Android.
I came across this FB presentation, and it seems like they are using the RTMP protocol. FB is successfully broadcasting from the browser somehow.
Can I get an insight into how things work with FB / similar RTMP-based live streaming apps? Thanks in advance.
Facebook supports RTMP ingest of video (used by people who utilize the Live API), as well as WebRTC ingest for browser clients.
RTMP is not used as a distribution protocol. For that, there is DASH.
First, I wish to emphasize the keyword from. There are a lot of questions and answers on this topic, but I found that no answer provides a step-by-step road map to achieve this.
What I wish to achieve:
I wish to stream live video and audio recorded from the camera of an iPhone/iPad to my server. And that's it.
What I have figured out till now:
I guess that we can't use HTTP Live Streaming because it's meant for server-to-client, not client-to-server. The AVFoundation framework allows output only in the form of a .mov file.
What I am not able to figure out:
I don't know how to get individual frames (live) and send them to my server one by one.
PS: I really don't know anything about this... You are welcome to oversimplify things. I am writing the server in Node.js.
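For the frame-grabbing part of this question, a minimal AVFoundation sketch looks roughly like this; the class and queue names are illustrative, and the upload step is left to whatever transport you choose:

```swift
import AVFoundation

// Minimal sketch: capture live frames from the camera one by one.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        // Pick the default camera and wire it into a capture session.
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        // Deliver uncompressed frames to our delegate, one by one.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // sampleBuffer is a single raw frame; compress it (e.g. H.264)
        // before uploading -- raw frames are far too large to send as-is.
    }
}
```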
You may take a look at the Wowza GoCoder iOS app.
It requires Wowza as the media server though, so you'll be able to provide full-featured streaming to anyone you want.
Server-side setup is done easily via Wowza configs or via a third-party cloud control service.
I am trying to stream music/video on an iPhone using HTTP Live Streaming. I read the Apple docs on HTTP Live Streaming (http://developer.apple.com/library/mac/#documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html), and I get how it works.
What they don't say is how one would use the iPhone as a server. Do I have to add the tools (mediastreamsegmenter, variantplaylistcreator) to my iOS app and run them as an NSTask, or is there some kind of native support for streaming media files?
If you really want to stream from an iPhone app, you can't do it with the iPhone acting as a server. You need a separate server to which the iPhone app sends data. You can use the camera or the microphone in the app to capture live content and then send the data asynchronously to the server, which, using mediastreamsegmenter and variantplaylistcreator, will convert the data into .ts segments and append them to the end of the .m3u8 playlist. Meanwhile, another iPhone app can act as a client and watch the live content that you are streaming from the first app.
From my experience, this is the only way to achieve that. Hope that helps.
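The "send data asynchronously to the server" step above could look like the following sketch; the ingest URL and the `seq` parameter are illustrative assumptions, not part of any real API:

```swift
import Foundation

// Sketch: POST one captured/encoded media chunk to the server, which
// then runs it through mediastreamsegmenter on its side.
func upload(chunk: Data, sequence: Int) {
    var request = URLRequest(url: URL(string: "https://example.com/ingest?seq=\(sequence)")!)
    request.httpMethod = "POST"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    URLSession.shared.uploadTask(with: request, from: chunk) { _, _, error in
        if let error = error { print("upload failed: \(error)") }
    }.resume()
}
```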
So generally, I want to make an app which has video chat functionality for iPhone. But after much searching, I am still not able to find any successful results. Is there any public, or for that matter private, API available for doing this on iPhone? If the answer is yes, please help me.
Basically, what I want is to read the video streams on both of the devices connected for chatting. Thanks a lot in advance, and please help me if you can.
P.S. - I have already checked iDoubs, but it failed: it always shows some unknown problem and for that reason doesn't allow me to connect to anyone.
ALSO: The suggested method I have found is via HTTP Live Streaming. But I have multiple doubts about that too.
1.) I need to find out how to upload my video from the iPhone to the HTTP server from which I would be broadcasting.
2.) Can you please post something related to setting up the server? How do I feed the video to the FFmpeg server?
Mainly, I need to find the upload method. Right now I am simply sending hex code in the form of NSData to the server, and I am stuck there. The main problem is that it is live. How do I handle that?
It would be best if you could help me make iDoubs work properly.
Thank you so much for any kind of support!
Have a look at this on how to implement video chat on iPhone. But before starting, you must have an IMS server up and running.
Here is a live video chat framework that does what you are looking for. It's easy and simple to implement for face-to-face video chat. I have already tried it, and it works very well. The great thing about this framework is its multi-platform support.
Tokbox : https://tokbox.com/platform
https://tokbox.com/opentok/tutorials/
Sample Code:
https://github.com/opentok/opentok-ios-sdk-samples/
Edit:
Here is an article explaining how to use OpenTok together with Parse.
http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
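A rough sketch of the connect-and-publish flow those tutorials walk through; the API key, session ID, and token are placeholders from your TokBox account, and you should check the linked sample code for the current SDK surface:

```swift
import UIKit
import OpenTok

// Sketch of the basic OpenTok flow: connect to a session, then publish
// the local camera once connected. Error handling is kept minimal.
final class ChatViewController: UIViewController, OTSessionDelegate, OTPublisherKitDelegate {
    var session: OTSession?

    override func viewDidLoad() {
        super.viewDidLoad()
        session = OTSession(apiKey: "API_KEY", sessionId: "SESSION_ID", delegate: self)
        var error: OTError?
        session?.connect(withToken: "TOKEN", error: &error)
    }

    // OTSessionDelegate: once connected, publish the local camera stream.
    func sessionDidConnect(_ session: OTSession) {
        guard let publisher = OTPublisher(delegate: self) else { return }
        var error: OTError?
        session.publish(publisher, error: &error)
    }

    func sessionDidDisconnect(_ session: OTSession) {}
    func session(_ session: OTSession, didFailWithError error: OTError) {}
    func session(_ session: OTSession, streamCreated stream: OTStream) {
        // Subscribe here to render the other participant's video.
    }
    func session(_ session: OTSession, streamDestroyed stream: OTStream) {}
    func publisher(_ publisher: OTPublisherKit, didFailWithError error: OTError) {}
}
```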
HTTP Live Streaming is primarily an approach for adaptive streaming from server to client. For client-to-server, rather go for traditional streaming. There exists an open library for streaming; see this question.
Whilst it is possible to use FaceTime to do two-way chat, it is not certain that you will be able to do so using public iOS APIs. That said, I have implemented one-way live streaming for iPhone, and the difficult part was not the core streaming itself but encoding the payload. You will be able to do H.264 in hardware and AAC / iLBC in software.
How you want to feed this to FFmpeg depends on your transport, possibly changing from 'file' H.264 frames to 'streaming' H.264. Check out the H.264 frame types if you implement frame dropping; reconfiguring the H.264 encoder on the fly is not possible to my knowledge, but restarting it with fresh parameters typically does not take more than a second or so.
Did you attempt to play back a live resource while capturing? That is a good starting point. If you come across an open API for H.264 encoding, please post it here ;-)
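A rough sketch of hardware H.264 encoding via Apple's VideoToolbox framework; the dimensions are illustrative, and packaging the output for FFmpeg is left to your transport:

```swift
import VideoToolbox

// Sketch: create a hardware H.264 compression session and feed it
// camera frames (CVPixelBuffers) one at a time.
var session: VTCompressionSession?
VTCompressionSessionCreate(
    allocator: nil,
    width: 1280,
    height: 720,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,          // we use the per-frame outputHandler instead
    refcon: nil,
    compressionSessionOut: &session)

func encode(pixelBuffer: CVPixelBuffer, pts: CMTime) {
    guard let session = session else { return }
    VTCompressionSessionEncodeFrame(
        session,
        imageBuffer: pixelBuffer,
        presentationTimeStamp: pts,
        duration: .invalid,
        frameProperties: nil,
        infoFlagsOut: nil) { status, _, sampleBuffer in
            // sampleBuffer holds compressed H.264 NAL units (AVCC layout);
            // repackage to Annex B / your transport before sending onward.
        }
}
```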
I just started working on live streaming on iPhone, so any help on how to do live streaming on iPhone is appreciated. I think that if I add a video tag in HTML5 and then load that HTML in a UIWebView, it will work.
Am I right? If not, what is your suggestion for doing live streaming? I want to embed some news channels' live streaming links in the application, so where can I find those links?
You have to go through the HTTP Live Streaming document provided by Apple; there are some sample live streaming URLs there. The file extension will be .m3u8. If you want to configure your own web server, you have to set up an FFmpeg server on it. These links will help you:
1) Apple document
2) stackoverflow
3) stackoverflow
4) stackoverflow
If you're making a web app in HTML5, then the video tag is a good choice.
But if you're developing a native app, then MPMoviePlayerController would be a much better choice. There are many examples of how to use it online.
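A minimal sketch of that MPMoviePlayerController approach (an API of that era, since deprecated in iOS 9); the stream URL is a placeholder .m3u8 playlist:

```swift
import UIKit
import MediaPlayer

final class LiveViewController: UIViewController {
    // Placeholder: substitute a real HTTP Live Streaming playlist URL.
    let streamURL = URL(string: "https://example.com/live/stream.m3u8")!
    var player: MPMoviePlayerController?

    override func viewDidLoad() {
        super.viewDidLoad()
        let player = MPMoviePlayerController(contentURL: streamURL)
        player.view.frame = view.bounds
        view.addSubview(player.view)
        player.play()
        self.player = player  // keep a strong reference while playing
    }
}
```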
iOS doesn't support RTMP or RTSP, so your stream would need to be an HTTP Live Stream. From memory, the codec choice is very limited too; e.g. if you supply H.264+MP3, you won't get any sound, despite iOS supporting MP3.
Also remember that streams from other people (such as the BBC) will normally be protected by international copyright law, so unless you have prior permission to use their stream in your app, you may be breaking the law.
Apple has some nice resources on HTTP Live Streaming.