iPhone video streaming

So, I recently submitted my first iPhone app to Apple.
My videos are over 10 minutes long and I served them as plain downloads rather than streams, so my app was rejected because I did not use HTTP Live Streaming.
We broadcast live videos every week. Those files are stored somewhere, but I am a little unsure of where. I want the video files that I built a feed for to be converted into streamed videos, but I don't want to use Apple's HTTP Live Streaming tools, and I don't know how to write the streaming code myself.
Is there any way to figure out where my streamed files are stored, or is there software that will convert videos into streamed video? I'll take any suggestions.
Thanks

The main problem is that you must use HTTP Live Streaming if you want your app to be approved, and you must also be aware of Apple's restrictions: you have to provide multiple bitrates, including one stream at 64 kbps or lower.
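For reference, a variant playlist that satisfies that requirement looks something like this (a sketch; the URLs and bandwidth values are illustrative):

    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000
    http://example.com/audio-only/prog_index.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=240000
    http://example.com/low/prog_index.m3u8
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=440000
    http://example.com/high/prog_index.m3u8

The 64 kbps entry is typically an audio-only stream, which is what keeps playback alive on poor cellular connections.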
If you don't want to use Apple's tools, you can use ffmpeg. Take a look at this ioncannon.net article: http://www.ioncannon.net/programming/452/iphone-http-streaming-with-ffmpeg-and-an-open-source-segmenter/
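The pipeline from that article boils down to something like this (a sketch using old-style ffmpeg flags; the segmenter binary is the open source one from the article, and all paths and URLs are illustrative):

    # Transcode the source to an MPEG-TS file with H.264 video and AAC audio
    ffmpeg -i input.mp4 -acodec libfaac -ab 64k \
        -vcodec libx264 -s 320x240 -b 240k \
        -f mpegts sample.ts

    # Split the TS file into 10-second segments and write an .m3u8 index
    ./segmenter sample.ts 10 sample stream.m3u8 http://example.com/video/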
With Apple's tools it is easier: you just need mediafilesegmenter (for files) or mediastreamsegmenter (for live streams).
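Typical usage for an on-demand file looks something like this (a sketch; the options vary by version, so check the man page):

    # Cut video.ts into 10-second segments plus an .m3u8 index,
    # written under the web server's document root
    mediafilesegmenter -t 10 -f /Library/WebServer/Documents/stream video.ts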
There are also professional services out there, not free, that will take care of the whole process.
If you don't know where your files are, you could use a sniffer, or simply check which ports your machine is listening on.
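On a Mac, for example, you can list the ports the machine is serving on:

    # Show every TCP socket currently in the LISTEN state
    lsof -iTCP -sTCP:LISTEN

    # Or the same idea with netstat
    netstat -an | grep LISTEN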

The easiest solution is to simply require that your users be on WiFi in order to watch the videos. The 10 min. / 5 MB restriction only applies to video sent over cellular networks, not WiFi. See Apple's "Reachability" sample code for an example of how to test the user's network connection at run time.
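With the Reachability class from that sample added to your project, the check looks roughly like this:

    #import "Reachability.h"

    // Returns YES only when the device is on WiFi, so long videos
    // never play over the cellular network.
    - (BOOL)canPlayVideo {
        Reachability *reach = [Reachability reachabilityForInternetConnection];
        NetworkStatus status = [reach currentReachabilityStatus];
        return (status == ReachableViaWiFi);
    }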

Related

Turning an iPhone or iPod into a wireless webcam

I'd like to stream video from the camera on an iOS device to a receiver via WiFi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it out via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples (API link). Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back for each frame.
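A minimal sketch of that setup, with error handling omitted:

    #import <AVFoundation/AVFoundation.h>

    - (void)startCapture {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];

        // Use the default camera as the input.
        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
        [session addInput:input];

        // Deliver raw frames to the delegate callback below.
        AVCaptureVideoDataOutput *output =
            [[AVCaptureVideoDataOutput alloc] init];
        [output setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("camera", NULL)];
        [session addOutput:output];

        [session startRunning];
    }

    // Called once per captured frame with the CMSampleBufferRef mentioned above.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // Encode or forward the frame here (steps 2 and 3).
    }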
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming using FTP.
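A rough sketch of an FTP upload with CFFTPStream (the URL is a placeholder):

    #include <CFNetwork/CFNetwork.h>

    // Open a write stream that uploads to an FTP server.
    CFURLRef url = CFURLCreateWithString(kCFAllocatorDefault,
        CFSTR("ftp://user:pass@example.com/upload/frame.ts"), NULL);
    CFWriteStreamRef ftpStream =
        CFWriteStreamCreateWithFTPURL(kCFAllocatorDefault, url);
    CFWriteStreamOpen(ftpStream);

    // Write encoded bytes as they become available.
    UInt8 bytes[1024] = {0};
    CFWriteStreamWrite(ftpStream, bytes, sizeof(bytes));

    CFWriteStreamClose(ftpStream);
    CFRelease(ftpStream);
    CFRelease(url);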
If you are only building this for yourself, then you can always turn off the Auto-Lock feature in the settings. If, on the other hand, you would like to distribute this to other users, you could use a trick to play a mute sound every 10 seconds. This is more or less how all the alarm clocks in the App Store work. Here's a tutorial. =)
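A sketch of that trick, assuming a silent mute.caf in the app bundle and an AVAudioPlayer property named silentPlayer:

    #import <AVFoundation/AVFoundation.h>

    // Fire every 10 seconds so iOS never considers the app idle.
    - (void)keepAwake {
        NSURL *url = [[NSBundle mainBundle] URLForResource:@"mute"
                                             withExtension:@"caf"];
        self.silentPlayer =
            [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
        [NSTimer scheduledTimerWithTimeInterval:10.0
                                         target:self
                                       selector:@selector(playSilence)
                                       userInfo:nil
                                        repeats:YES];
    }

    - (void)playSilence {
        [self.silentPlayer play];
    }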
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
Last time I touched the code I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.

iPhone: HTTP live streaming without any server side processing

I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in another thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration. But I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, which is not what I am searching for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save it to a video file; see the AV Foundation Guide on how to do that. Once it's saved, you can use the HTTP Live Streaming segmenter to generate the proper segments. Apple has applications for Mac OS X, but there's an open source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments; there are lots of HTTP servers out there you could adapt.
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264, and for that you want ffmpeg. Basically you shove the images into ffmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live H.264 source. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, you need an HTTP server serving that stream.
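For the encoding step, the old-style libavcodec API looks roughly like this (a sketch; a real implementation has to convert each camera frame to YUV420P, check errors, and feed the output into a TS muxer):

    #include <string.h>
    #include <libavcodec/avcodec.h>

    void encode_one_frame(void) {
        avcodec_register_all();

        // Configure an H.264 encoder context.
        AVCodec *codec = avcodec_find_encoder(CODEC_ID_H264);
        AVCodecContext *ctx = avcodec_alloc_context();
        ctx->width = 320;
        ctx->height = 240;
        ctx->time_base = (AVRational){1, 30};   /* 30 fps */
        ctx->bit_rate = 240000;
        ctx->pix_fmt = PIX_FMT_YUV420P;
        avcodec_open(ctx, codec);

        // Wrap a pixel buffer in an AVFrame (which embeds an AVPicture).
        AVFrame *frame = avcodec_alloc_frame();
        int size = avpicture_get_size(PIX_FMT_YUV420P, ctx->width, ctx->height);
        uint8_t *pixels = av_malloc(size);
        memset(pixels, 0, size);                /* camera data goes here */
        avpicture_fill((AVPicture *)frame, pixels, PIX_FMT_YUV420P,
                       ctx->width, ctx->height);

        // One call per captured frame; 'out' receives H.264 bytes.
        uint8_t out[200000];
        int bytes = avcodec_encode_video(ctx, out, sizeof(out), frame);
        (void)bytes;                            /* hand this to the muxer */

        av_free(pixels);
        av_free(frame);
        avcodec_close(ctx);
        av_free(ctx);
    }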
What might actually be easier would be to use an RTP/RTSP based stream instead. That approach is covered by open source versions of RTP and ffmpeg supports that fully. It's not http live streaming, but it will work well enough.

Streaming live H.264 video via RTSP to the iPhone does work! w/example

Using FFMPEG, Live555, JSON
Not sure how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they are using a combination of open source projects like FFMPEG, Live555, JSON, etc. Using Wireshark to sniff the packets sent from one of the public cameras that's available to view with the free "Dropcam For iPhone" app in the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which looks like maybe some of the stream is tunneled?
Maybe someone could take a look at the open source files and explain how they got RTSP to work on the iPhone.
Thanks for the info TinC0ils. After digging a little deeper I've read that they have modified the Axis camera with custom firmware to limit the streaming to just a single 320x240 H.264 feed, to better provide consistent-quality video over different networks and, as you point out, to be less of a draw on the phone's hardware, etc. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the inherent latency involved with HTTP Live Streaming. I think Dropcam have done an excellent job with their hardware/software combo; I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: RTSP PROTOCOL DOES WORK ON THE IPHONE!
They are using open source projects to receive the frames and decode them in software instead of using the hardware decoder. This works, but it runs counter to Apple's requirement that you use their HTTP Live Streaming. It also requires greater CPU resources, so it may not decode video at the desired fps/resolution on older devices, and it will decrease battery life compared to HTTP streaming.

XML, images, streaming video and "excessive volumes of data"

My first application was submitted to the App Store and failed to be approved owing to "excessive volumes of data over the cellular network". I don't know how they test this, but since it's basically a news application that displays various articles, images and streamed videos, I would go ahead and blame the videos for the rejection. I can't test it for sure because there are no network stats on the iPod Touch, and that's the only device I can access.
So I'm curious:
1) Does anyone have any idea how Apple "runs" its bandwidth tests?
2) What are ways I can improve my XML loading, image displaying and video streaming to reduce bandwidth (in case the user is on a cellular network)? For images, I use asynchronous loading (maybe that can be a problem if lots of images are requested at the same time?). I'm looking at http://allseeing-i.com/ASIHTTPRequest/ which could help with XML and maybe image loading, but I don't understand whether I can use ASIHTTPRequest to stream a video. (A sketch of my loading code follows these questions.)
3) Is there any way to test network usage in iPhone simulator?
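For what it's worth, the asynchronous loading I mentioned looks roughly like this with ASIHTTPRequest (the URL is a placeholder):

    #import <UIKit/UIKit.h>
    #import "ASIHTTPRequest.h"

    - (void)loadImage {
        NSURL *url = [NSURL URLWithString:@"http://example.com/article.jpg"];
        ASIHTTPRequest *request = [ASIHTTPRequest requestWithURL:url];
        [request setDelegate:self];
        [request startAsynchronous];   // returns immediately
    }

    // Delegate callback when the download completes.
    - (void)requestFinished:(ASIHTTPRequest *)request {
        UIImage *image = [UIImage imageWithData:[request responseData]];
        // ...display the image...
    }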
I expect the streaming video is the problem. Apple want you to use HTTP Live Streaming if you want to stream video over the cellular network.
See question 1236788 for more information.
They run bandwidth tests by looking at byte counters for the network interface, I think. You can do the same in the simulator by making sure no other networking apps are running on your Mac and then looking at the output of the 'netstat -i -b' command. Or use a fancy utility to monitor bandwidth usage.
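For example, sample the counters before and after exercising the app:

    netstat -i -b    # note the Ibytes/Obytes columns for your interface
    # ...run the app in the simulator...
    netstat -i -b    # subtract the earlier numbers to estimate usage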
Not helpful, but I'd like to say this anyway: Apple's claims are silly in my opinion. Why do they care?

iPhone MP3 Streaming alternative to Segmenting

I have run into a bit of a problem. I built an iPhone app that streams my podcasts via MPMoviePlayerController. Apple will not approve it because it can use too much bandwidth over the carrier network, and their workaround is to use a stream segmenter. I am unable to install a stream segmenter on my server. Are there ANY other solutions people have come up with that can help me stream my podcast to iPhone devices? Even if I have to make it a web application as opposed to a native application.
Thanks,
John
You could use a simple service like Encoding.com to create segmented, on-demand iPhone versions of your files for multi-bitrate adaptive playback. You could also provide a high and a low quality version and only use the high one when the Reachability class shows that you're on WiFi. I had to do the second option to get one of my apps to pass approval. Hope this helps!
Well, if you don't want a native app, I think you can just put a video link on a webpage; when the user taps it, QuickTime will take over and play the file as it downloads.
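That can be as simple as a plain link (the URL is a placeholder):

    <a href="http://example.com/podcast.mp3">Listen to this week's episode</a>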
I don't have any experience streaming large files to the iPhone, so I can't guide you toward alternatives that would keep it a native app.