Live streaming from a WiFi H.264 camera directly to an iPhone

I have a standard WiFi H.264 camera that I use as a baby monitor, which, in practical terms, means I need it to be as realtime as possible. My initial goal was to encode the stream from the camera such that the native iPhone hardware decoder could be used, so that the result is a direct, clean, sharp, realtime video from my camera on my iPhone. I really want to avoid using FFmpeg, since it's a software decoder, which is slower than a hardware decoder.
I am finding that the iPhone will not accept anything from the camera's stream unless I use HLS as a middleman server. I am desperately trying to avoid introducing a server between the camera and the iPhone, since it means more work, more bandwidth, and more latency in the video.
So my question is: what do I need to do in order to get a direct H.264 stream from my WiFi camera to show up on my iPhone using its hardware decoding? I am currently using the baseline profile. If you need any more details, please let me know.
Again, your help means a lot since I have been beating myself up on this for over 6 months now.

/* Edit (January 24, 2012) */
I'm leaving this answer for the historical record, but I have a better answer now.
/* End Edit */
Depending on your brand of camera, the IP Vision app from the Apple App Store should work just fine to establish a direct connection.
See here: http://itunes.apple.com/us/app/ip-vision/id300593485?mt=8
There will of course be some latency, but I can tell you from 15 years' experience in surveillance that latency is just a fact of life.
Most IP cameras offer a reasonable degree of control over bitrate.
If you can sacrifice quality for speed, try a bitrate of around 32 kbps. With H.264 compression, this will give you around 1-2 FPS at QVGA resolution.
As for latency in the app, I cannot offer you any specific advice, but the app is free, and if it provides improved results, then you win!

EDIT: Does not work with stock iOS or Android 4.0. It may have some use on the web, so I will leave this for others.
Can you get an RTSP stream from your camera?
Here is a list of IP cameras and their RTSP streams:
http://www.soleratec.com/rtsp/
If you can make a web page, you can use this code to embed your RTSP stream. It works on iOS, and is fairly universal:
<div class="box">
  <OBJECT classid="clsid:9BE31822-FDAD-461B-AD51-BE1D1C159921"
          codebase="http://downloads.videolan.org/pub/videolan/vlc/latest/win32/axvlc.cab"
          width="320" height="240" id="vlc" events="True">
    <param name="Src" value="rtsp://76.23.103.200:1935/live/camera.stream" />
    <param name="ShowDisplay" value="True" />
    <param name="AutoLoop" value="False" />
    <param name="AutoPlay" value="True" />
    <embed id="vlcEmb" type="application/x-google-vlc-plugin"
           version="VideoLAN.VLCPlugin.2" autoplay="yes" loop="no"
           width="320" height="240"
           target="rtsp://76.23.103.200:1935/live/camera.stream"></embed>
  </OBJECT>
</div>
Style your box as required. For an iPhone 4, the screen width is going to be 320px, and a typical IP camera has a 4:3 aspect ratio, so you want a height of 240px, just as listed above. Style the containing div any way you want. I recommend centering it, in case you open the web page on a tablet, laptop, PC, etc. It just makes it easier to see.
Example CSS
.box {
  margin: 0 auto;
  width: 320px;
  height: 240px;
}
NOTE: The scope of this answer does NOT address any security concerns. Just like anything on the web, if you put it out there unsecured, anyone can get a hold of it.
Re-addressing your latency issues: this method will result in about 3 seconds of latency while streaming at 32 kbps. It may be a bit longer on a 3G wireless network. As I mentioned, latency is a fact of life with video. Even very expensive solutions will have 1-2 seconds of lag.
I hope this helps you some. If you don't have a website, just make a free WordPress site and stick this code into a static page.

Related

iPhone HE-AAC Streaming over Mobile Network (3G)

We developed an internet radio streamer using jPlayer, which utilizes the HTML5 audio tag with jQuery and has a Flash fallback for unsupported browsers. Upon testing the player on the iPhone (iOS 5.0.1), we ran into a very peculiar issue.
When the iPhone is connected to WiFi, it streams perfectly using the HE-AAC V2 stream at 64 kbps/44.1 kHz (the preferred codec for Apple products). However, when the iPhone is connected to the 3G mobile network, it "stutters" or stops streaming for 1-2 seconds every 1-2 minutes (it does not stop streaming completely). The troubling thing is that when the iPhone is forced to use a separate MP3 stream at the same bitrate, it does not have this issue and works very well on 3G.
UPDATE 5
We recently acquired a 3G/4G Sprint mobile hotspot device and tested this issue with it. When the iPhone is connected to the mobile hotspot, it shows as being connected to a WiFi device, and the issue does not occur even though the actual connection is via 3G/4G. This might point back to the issue being with the iPhone not handling HE-AAC via HTTP Live Streaming when directly connected to the mobile network.
UPDATE 4
Updated the iPhone to iOS 5.1, yet the issue persists.
UPDATE 3
Read here on SO about various issues of scripts not rendering correctly when connected to mobile networks. The finger seems to point at the mobile network carriers, which may be inserting a proxy to serve webpages, e.g. for downsizing images. They might also inject some JavaScript into pages. The test page can be found HERE. Note: this page is using HE-AAC, so it will only work on an iPhone...
UPDATE
According to Apple's HTTP Live Streaming doc for iOS devices, "Audio-only content can be either MPEG-2 transport or MPEG elementary audio streams, either in AAC format with ADTS headers or in MP3 format." Our music server is using the OddcastV3 encoder to send out three streams (MP3, HE-AAC V2, and Ogg Vorbis) to the IcecastV2 server. I am not sure if the encoder is inserting ADTS headers for the HE-AAC V2 stream. Is there a way to check for this?
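(A rough way to check this, as a sketch in Swift: dump a few seconds of the stream to a file, e.g. with curl, and look for the ADTS sync word in the first two bytes. The file name below is a placeholder.)
import Foundation

// Sketch: check whether a dumped chunk of the stream is ADTS-framed AAC.
// Every ADTS frame header starts with a 12-bit sync word (0xFFF), and the
// layer bits that follow must be 00. "sample.aac" is a placeholder for a
// few seconds of the stream saved to disk.
let data = try Data(contentsOf: URL(fileURLWithPath: "sample.aac"))
if data.count >= 2, data[0] == 0xFF, (data[1] & 0xF6) == 0xF0 {
    print("Looks like AAC with ADTS headers")
} else {
    print("No ADTS sync word at the start of the dump")
}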
Coming from a radio planning point of view, here are my two cents:
What you are describing sounds like bandwidth shaping, which is both a common and often necessary design feature of radio networks (like 3G networks). At most 3G operators I have worked with, you would typically optimize the network to favor high-speed bursts (think downloading an image, sending one email, or fetching one HTML page) over long-running high-bandwidth services.
This is due to the simple fact that this is what most users want/need.
On a typical 3GPP (GSM 3G) network, this shaping means that you will first get a RAB (radio access bearer) supporting 384 kbit/s, which is then downgraded as long as your device tolerates it. Typically you will get switched from 384 to 256 to 128, then to 64 kbit/s, at which point your device may start receiving data too slowly; the network then upgrades the bearer, and downgrades it again after a while.
So why does the MP3 stream not stutter? My guess is that the total bitrate might differ, so it stays comfortable in the 64 kbit/s RAB. This is a common phenomenon.
We have managed to get exactly the same thing working: 64 kbit/s HE-AAC V2 on mobile devices. We are streaming files rather than a steady stream, and I think Magnus is right when he explains how the network prioritizes traffic into bursts; in our case that means we have large parts of the file right away, and the player can keep playing until the next burst comes in. In your case, it means the stream pauses until the next burst arrives.
Could you either switch to larger chunks in your streaming (a larger buffer), or stream whole files instead?
We had a very strange phenomenon with iOS: we had to rename all files from .m4a to .aac to get them streaming on iOS. If we didn't rename them, iOS wouldn't play them.
Good luck.

Is it possible to read iPhone or Android display data through the audio jack?

I'm not very well versed in the iPhone and Android APIs, so please bear with me if this is a stupid question.
As I understand it, Square's card reader works by converting the magnetic information on the card stripe into an audio tone that its software can then process. [1]
In a similar way, is there a way to somehow read what exactly is being displayed on the device screen simply through a small device inserted into the audio jack on that device?
[1] http://www.quora.com/How-does-Squares-hardware-work
It's not quite clear what you wish to achieve. You can indeed make an app that would output a representation (perhaps audio frequency-shift keying?) of the screen's contents to the iPhone's audio jack.
The iPhone (and other iOS-based devices) use TRRS connectors for bi-directional audio (and hence arbitrary modulated data) communication and there are well-supported publicly-documented APIs for using these interfaces.
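For illustration, here is a minimal Swift sketch of the kind of modulation mentioned above: bytes FSK-modulated into PCM samples that could then be played out of the jack (e.g. via AVAudioEngine). The 1200/2200 Hz tone pair is an assumption borrowed from Bell 202-style modems, not anything iOS-specific.
import Foundation

// Sketch: FSK-modulate raw bytes into audio samples, least significant
// bit first. 0-bits map to 1200 Hz, 1-bits to 2200 Hz (illustrative tones).
func fskSamples(for bytes: [UInt8],
                sampleRate: Double = 44_100,
                bitDuration: Double = 0.01) -> [Float] {
    var samples: [Float] = []
    var phase = 0.0
    for byte in bytes {
        for bit in 0..<8 {
            let frequency = ((byte >> bit) & 1) == 1 ? 2200.0 : 1200.0
            for _ in 0..<Int(sampleRate * bitDuration) {
                samples.append(Float(sin(phase)))
                phase += 2 * Double.pi * frequency / sampleRate
            }
        }
    }
    return samples
}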
That said, if you're writing your own app: why would you want to output the contents of the screen? If you are developing the app in question, why not transmit the salient data in a more effective manner? That leads me to my next assumption:
You want to read what's being displayed on the device's screen at any time, not just when an app of your creation is open. In that case, the answer is that it is not possible, with the possible exception of a jailbroken solution. That said, I can't imagine a jailbroken solution being useful much longer, given that iOS 5 introduced "display mirroring" by means of AirPlay.
On Android, I have no idea. :-)
No. The screen is not connected to the audio jack.
I think you could make an app that takes a screenshot and then encodes that image as audio to play it.
It won't sound good though :)
For this kind of task, there is the built-in camera.

Realtime Audio/Video Streaming FROM iPhone to another device (Browser, or iPhone)

I'd like to get real-time video from the iPhone to another device (either desktop browser or another iPhone, e.g. point-to-point).
NOTE: It's not one-to-many, just one-to-one at the moment. Audio can be part of the stream or go over a telephone call on the iPhone.
There are four ways I can think of...
1. Capture frames on the iPhone, send the frames to a media server, and have the media server publish realtime video using a host webserver.
2. Capture frames on the iPhone, convert them to images, and send them to an HTTP server; have JavaScript/AJAX in the browser reload the images from the server as fast as possible.
3. Run an HTTP server on the iPhone, capture 1-second-duration movies on the iPhone, create M3U8 files on the iPhone, and have the other user connect directly to the HTTP server on the iPhone for live streaming.
4. Capture 1-second-duration movies on the iPhone, create M3U8 files on the iPhone, and send them to an HTTP server; have the other user connect to the HTTP server for live streaming. (This is a good answer; has anyone gotten it to work?)
Is there a better, more efficient option?
What's the fastest way to get data off the iPhone? Is it ASIHTTPRequest?
Thanks, everyone.
Sending raw frames or individual images will never work well enough for you (because of the amount of data and the number of frames). Nor can you reasonably serve anything from the phone (WWAN networks have all sorts of firewalls). You'll need to encode the video and stream it to a server, most likely over a standard streaming format (RTSP, RTMP). There is an H.264 encoder chip on the iPhone 3GS and later. The problem is that it is not stream-oriented; that is, it outputs the metadata required to parse the video last. This leaves you with a few options.
Get the raw data and use FFmpeg to encode on the phone (this will use a ton of CPU and battery).
Write your own parser for the H.264/AAC output (very hard).
Record and process in chunks (this will add latency equal to the chunk length, and drop around 1/4 second of video between chunks as you stop and restart the sessions).
"Record and process in chunks (will add latency equal to the length of the chunks, and drop around 1/4 second of video between each chunk as you start and stop the sessions)."
I have just wrote such a code, but it is quite possible to eliminate such a gap by overlapping two AVAssetWriters. Since it uses the hardware encoder, I strongly recommend this approach.
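A minimal sketch of that overlap, assuming a capture callback that delivers CMSampleBuffers; the class name, chunk length, and output settings here are illustrative, not from the original code:
import AVFoundation

// Illustrative chunked recorder: the next AVAssetWriter is started
// *before* the previous one is finished, so no frames fall into the gap
// between recording sessions.
final class ChunkedRecorder {
    private var writer: AVAssetWriter?
    private var input: AVAssetWriterInput?
    private var chunkStart = CMTime.invalid
    private let chunkLength = CMTime(seconds: 1, preferredTimescale: 600)

    private func makeWriter() throws -> (AVAssetWriter, AVAssetWriterInput) {
        let url = URL(fileURLWithPath: NSTemporaryDirectory())
            .appendingPathComponent(UUID().uuidString + ".mp4")
        let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 320,
            AVVideoHeightKey: 240,
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        return (writer, input)
    }

    // Call this from the capture callback for every video frame.
    func append(_ sample: CMSampleBuffer) throws {
        let pts = CMSampleBufferGetPresentationTimeStamp(sample)
        if writer == nil || pts - chunkStart >= chunkLength {
            let oldWriter = writer
            let oldInput = input
            let (newWriter, newInput) = try makeWriter()
            _ = newWriter.startWriting()           // new chunk starts first...
            newWriter.startSession(atSourceTime: pts)
            writer = newWriter
            input = newInput
            chunkStart = pts
            oldInput?.markAsFinished()             // ...then the old one closes
            oldWriter?.finishWriting {
                // Hand the finished chunk to the segmenter/uploader here.
            }
        }
        if input?.isReadyForMoreMediaData == true {
            _ = input?.append(sample)
        }
    }
}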
We have similar needs; to be more specific, we want to implement streaming video & audio between an iOS device and a web UI. The goal is to enable high-quality video discussions between participants using these platforms. We did some research on how to implement this:
We decided to use OpenTok and managed to pretty quickly implement a proof-of-concept style video chat between an iPad and a website using the OpenTok getting started guide. There's also a PhoneGap plugin for OpenTok, which is handy for us as we are not doing native iOS.
Liblinphone also seemed to be a potential solution, but we didn't investigate further.
iDoubs also came up, but again, we felt OpenTok was the most promising one for our needs and thus didn't look at iDoubs in more detail.

Turning an iPhone or iPod into a wireless webcam

I'd like to stream video from the camera on an iOS device to a receiver via WiFi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it out via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back for each frame.
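In rough outline, that capture setup could look like the Swift sketch below; the class name is hypothetical, and it assumes camera permission has already been granted:
import AVFoundation

// Sketch: an AVCaptureSession whose delegate receives every camera frame
// as a CMSampleBuffer, the object mentioned above.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // One video frame per call; encode it or hand it to the streaming code.
    }
}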
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming using FTP.
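A minimal sketch of the CFFTPStream route; the server URL, credentials, and file name are placeholders:
import Foundation
import CFNetwork

// Sketch: push one recorded chunk to an FTP server via CFFTPStream.
let ftpURL = URL(string: "ftp://user:pass@192.168.0.10/upload/chunk0.mov")!
let stream = CFWriteStreamCreateWithFTPURL(kCFAllocatorDefault, ftpURL as CFURL)
    .takeRetainedValue()
CFWriteStreamOpen(stream)
var payload = [UInt8](try Data(contentsOf: URL(fileURLWithPath: "chunk0.mov")))
CFWriteStreamWrite(stream, &payload, payload.count)  // check the return value for partial writes in real code
CFWriteStreamClose(stream)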
If you are only building this for yourself, you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute it to other users, you could use a trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clock apps in the App Store work. Here's a tutorial. =)
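As a side note, there is also a direct API for keeping the screen awake while your app is frontmost, which avoids the audio trick entirely; a one-line sketch:
import UIKit

// Ask iOS not to dim or auto-lock the screen while this app is in the foreground.
UIApplication.shared.isIdleTimerDisabled = true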
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone, and voilà! You've got live streaming video.
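For reference, the live playlist for 1-second chunks looks roughly like this (segment names and the sequence number are illustrative); the server rewrites it as new segments appear, and the absence of an #EXT-X-ENDLIST tag is what marks the stream as live:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:1
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:1.0,
segment42.ts
#EXTINF:1.0,
segment43.ts
#EXTINF:1.0,
segment44.ts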
Last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.

Streaming live H.264 video via RTSP to iPhone does work! (with example)

Using FFmpeg, Live555, JSON
I am not sure exactly how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they are using a combination of open source projects like FFmpeg, Live555, JSON, etc. Using Wireshark to sniff the packets sent from one of the public cameras available to view with the free "Dropcam for iPhone" app in the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which looks like maybe some of the stream is tunneled?
Maybe someone could take a look at the open source files and explain how they got RTSP to work on the iPhone.
Thanks for the info, TinC0ils. After digging a little deeper, I've read that they have modified the Axis camera with custom firmware to limit the streaming to just a single 320x240 H.264 feed, to better provide consistent-quality video over different networks and, as you point out, to be less of a draw on the phone's hardware, etc. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the inherent latency involved with HTTP Live Streaming. I think Dropcam has done an excellent job with their hardware/software combo; I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: the RTSP PROTOCOL DOES WORK ON THE IPHONE!
They are using open source projects to receive the frames and decode them in software instead of using the hardware decoder. This will work; however, it runs counter to Apple's requirement that you use their HTTP streaming. It will also require greater CPU resources, such that it may not decode video at the desired fps/resolution on older devices, and it will decrease battery life compared to HTTP streaming.