iPhone MP3 Streaming alternative to Segmenting

I have run into a bit of a problem. I built an iPhone app that streams my podcasts via MPMoviePlayerController. Apple will not approve it because it can use too much bandwidth over the carrier network, so their workaround is to use a stream segmenter. I am unable to install a stream segmenter on my server. Are there ANY other solutions people have come up with that can help me stream my podcast to iPhone devices? Even if I have to make it a web application as opposed to a native application.
Thanks,
John

You could use a simple service like Encoding.com to create iPhone-segmented, on-demand versions of your files for multi-bitrate adaptive playback. You could also provide a high- and a low-quality version and only offer the high-quality one when the Reachability class shows that you're on Wi-Fi. I had to do the second option to get one of my apps to pass approval. Hope this helps!
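A rough sketch of that second option in Swift, using the SCNetworkReachability API that Apple's Reachability sample class wraps; the host name and podcast URLs here are placeholders, not from the original answer:

```swift
import Foundation
import SystemConfiguration

// Rough check along the lines of Apple's Reachability sample: returns true when
// the current route to the given host is reachable without going over the
// cellular (WWAN) radio. The host name below is a placeholder.
func isOnWiFi(host: String = "www.example.com") -> Bool {
    guard let target = SCNetworkReachabilityCreateWithName(nil, host) else { return false }
    var flags = SCNetworkReachabilityFlags()
    guard SCNetworkReachabilityGetFlags(target, &flags) else { return false }
    return flags.contains(.reachable) && !flags.contains(.isWWAN)
}

// Hypothetical high/low-bitrate URLs; pick the stream based on the connection type.
let streamURL = isOnWiFi()
    ? URL(string: "https://example.com/podcast_high.mp3")!
    : URL(string: "https://example.com/podcast_low.mp3")!
```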

Well, if you don't want a native app, I think you can just put a video link on a webpage; when the user clicks it, QuickTime will take over and play the file, playing it as it downloads.
I don't have any experience streaming large files over the iPhone, so I can't help guide you on alternatives and keeping it a native app.

Related

Live streaming on iPhone

I just started working on live streaming on the iPhone, so any help on how to do live streaming on the iPhone would be appreciated. I think that if I add a video tag in HTML5 and then load that HTML in a UIWebView it will work.
Am I right? If not, what is your suggestion for doing live streaming? I want to embed some news channels' live streaming links in the application, so where can I find those links?
You have to go through the HTTP Live Streaming document provided by Apple. There are some sample live streaming URLs; the file extension will be .m3u8. If you want to configure your own web server, you have to configure an FFmpeg server on it. The links which will help you:
1) Apple document
2) Stack Overflow
3) Stack Overflow
4) Stack Overflow
If you're making a web app in HTML5, then the video tag is a good choice.
But if you're developing a native app, then MPMoviePlayerController would be a much better choice. There are many examples of how to use it online.
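For reference, a minimal sketch of the MPMoviePlayerController route; the .m3u8 URL is a placeholder, and note that the class has since been deprecated in favor of AVKit's AVPlayerViewController:

```swift
import UIKit
import MediaPlayer

// Minimal HLS playback sketch; the playlist URL below is a placeholder.
final class StreamViewController: UIViewController {
    private var player: MPMoviePlayerController?

    override func viewDidLoad() {
        super.viewDidLoad()
        let streamURL = URL(string: "https://example.com/live/stream.m3u8")!
        let player = MPMoviePlayerController(contentURL: streamURL)
        player?.view.frame = view.bounds
        if let playerView = player?.view { view.addSubview(playerView) }
        player?.prepareToPlay()
        player?.play()
        self.player = player   // keep a strong reference so playback continues
    }
}
```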
iOS doesn't support RTMP or RTSP, so your stream would need to be an HTTP Live stream. From memory the codec choice is very limited too; e.g. if you supply H.264 + MP3 you won't get any sound, despite iOS supporting MP3.
Also remember that streams from other people (such as the BBC) will normally be protected by international copyright law, so unless you have prior permission to use their stream in your app you may be breaking the law.
Apple has some nice resources on HTTP Live Streaming.

Turning an iPhone or iPod into a wireless webcam

I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS app and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back.
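A small Swift sketch of that first step, assuming AVFoundation's AVCaptureSession and AVCaptureVideoDataOutput; the class and queue names are mine, not from the answer:

```swift
import AVFoundation

// Sketch of step 1 only: deliver camera frames as CMSampleBuffer (CMSampleBufferRef)
// objects. Assumes camera permission has already been granted.
final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per captured frame with the CMSampleBufferRef the answer mentions.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Steps 2 and 3 would encode this buffer and hand it to the network layer.
    }
}
```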
For the second and third parts, you should check out the CFNetwork framework, especially CFFTPStream for streaming over FTP.
If you are only building this for yourself, then you can always turn off the Auto-Lock feature in the settings. If, on the other hand, you would like to distribute this to other users, you could use the trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clocks in the App Store work. Here's a tutorial. =)
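For the foreground-only case there is also a supported API that does this programmatically, which is worth naming as an alternative to the mute-sound trick:

```swift
import UIKit

// Keeps the screen from auto-locking while the app is in the foreground; the
// mute-sound trick is mainly needed if capture must keep running in the background.
UIApplication.shared.isIdleTimerDisabled = true
```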
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from the video input.
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available (see the sketch below).
Connect to the IP address of the phone and voilà! You've got live streaming video.
Last time I touched the code I was trying to debug why my live streaming wasn't working. I'll try and get my source code posted on GitHub this weekend, if you'd like to take a look.
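Not the poster's code, but a very rough sketch of step 3 (the on-device web server), assuming Apple's Network framework; the port, file path, and response handling are simplified placeholders:

```swift
import Foundation
import Network

// Toy HTTP server that answers every request with the HLS playlist. A real server
// would also serve the .ts segments referenced by the playlist.
func startPlaylistServer() throws -> NWListener {
    let listener = try NWListener(using: .tcp, on: 8080)
    listener.newConnectionHandler = { connection in
        connection.start(queue: .main)
        connection.receive(minimumIncompleteLength: 1, maximumLength: 64 * 1024) { _, _, _, _ in
            let playlistPath = NSTemporaryDirectory() + "stream.m3u8"   // placeholder path
            let playlist = (try? String(contentsOfFile: playlistPath, encoding: .utf8)) ?? ""
            let response = "HTTP/1.1 200 OK\r\n"
                + "Content-Type: application/vnd.apple.mpegurl\r\n\r\n"
                + playlist
            connection.send(content: response.data(using: .utf8),
                            completion: .contentProcessed { _ in connection.cancel() })
        }
    }
    listener.start(queue: .main)
    return listener   // the caller must keep a reference, or the listener is deallocated
}
```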

iphone video streaming

So, I recently submitted my first iPhone app to Apple.
I did not stream my videos, and they are over 10 minutes long, so my app was denied because I did not use HTTP Live Streaming.
We stream live videos every week. Those files are stored somewhere, but I am a little unsure of where. I want the video files that I made a feed for to be converted into streamed videos, but I don't want to use Apple's HTTP Live Streaming software, and I do not know how to encode video into a streamable form myself.
Is there any way to figure out where my streamed files are being stored, or is there software that will convert videos into streamed video? I will take any suggestions.
Thanks
The main problem is that you must use HTTP Live Streaming if you want your app to be approved, and you must also be aware of Apple's restrictions (you must provide different bitrates, including one of 64 kbps or lower).
If you don't want to use Apple's tools, you can use ffmpeg. Take a look at ioncannon.net: http://www.ioncannon.net/programming/452/iphone-http-streaming-with-ffmpeg-and-an-open-source-segmenter/
With Apple's tools it is easier; you just need mediafilesegmenter/mediastreamsegmenter.
There are also professional services out there, though not free, that will take care of the whole process.
If you don't know where your files are, maybe you can use a sniffer and check what your computer is "listening to".
The easiest solution is to simply require that your users be on Wi-Fi in order to watch the videos. The 10 min. / 5 MB restriction only applies to video sent over cellular networks, not Wi-Fi. See Apple's "Reachability" code for an example of how to test the user's network connection at run-time.

iPhone external video playback from an app

I know the iPhone can play video on an external screen if you have the Apple component output cable. I also know you can write an app that plays video. Is there a way to put those two things together and write an app that will play video specifically on an external screen?
This is currently not possible with the iPhone API. I have heard of apps that have done it on jailbroken phones, but there is no Apple-approved way of doing it at this time.
This can be done using private APIs, but it won't get into the store. This guy wrote a class to do it here: http://dragonforged.com/DFVideoOut.shtml Haven't used it myself, but it looks very simple.

How to show streaming videos in Flash in an iPhone application

I want to show live video streaming which is in the form of Flash. Can anybody tell me how to do this in an iPhone application?
Thanks
Ashwani
The iPhone doesn't support Flash, so using it isn't an option. You can, however, do HTTP streaming of video. Read Apple's documentation for more information.
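As a small sketch of that HTTP-streaming route, an HLS playlist can be embedded through the HTML5 video tag in a web view (WKWebView here; the era-appropriate equivalent was UIWebView). The .m3u8 URL is a placeholder:

```swift
import UIKit
import WebKit

// Embed an HTTP Live Stream via the HTML5 <video> tag;
// add webView to a view hierarchy to see it.
let html = """
<video src="https://example.com/live/stream.m3u8" controls autoplay
       width="320" height="240"></video>
"""
let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 320, height: 240))
webView.loadHTMLString(html, baseURL: nil)
```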
Adobe announced during their keynote at MAX yesterday that you will be able to compile Flash for iPhone apps. They have some sample projects available on the Adobe Labs site.
You can take those sample files and extend them to accomplish what you're looking for.
The native compile output from Flash Professional isn't going to be available until CS5.
Note: Flash is not native on the iPhone and will not run in the browser. Also, apps will not have access to the video camera or the microphone.
No Flash, but you can stream MP4 etc.
A good tutorial on how to stream video:
http://buildmobilesoftware.com/2010/08/09/how-to-stream-videos-on-the-iphone-or-ipad/