iPhone Network Camera Feed

I plan on mounting a wireless network camera on my robot: http://mydlink.dlink.com/DCS930L . D-Link has an iPhone app for viewing the live video, but I want to integrate the video with the iPhone remote controller I made.
Is it possible to get that video feed into my own app?
Where should I start looking?

I know this was posted some time ago, but the easiest way to integrate a network camera feed into your own application is with a UIWebView: point it at the camera's IP address followed by /video/mjpg.cgi.
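A minimal sketch of that approach in Swift, assuming the camera is reachable at 192.168.1.50 on your local network (substitute your camera's actual address, and add credentials if the feed is password-protected):

import UIKit

class CameraFeedViewController: UIViewController {
    // Address of the camera on the local network -- replace with your own.
    private let cameraURL = URL(string: "http://192.168.1.50/video/mjpg.cgi")!

    override func viewDidLoad() {
        super.viewDidLoad()

        // A UIWebView renders the MJPEG stream as a continuously updating page.
        let webView = UIWebView(frame: view.bounds)
        webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        webView.scalesPageToFit = true
        view.addSubview(webView)

        webView.loadRequest(URLRequest(url: cameraURL))
    }
}

You could just as easily point a WKWebView, or an image view fed by your own MJPEG parser, at the same URL; the web view is simply the least code.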

Related

Unity broadcast live video

I'm developing an app in Unity that receives live video from another device over a local Wi-Fi network, and the app must broadcast that video over the internet so another user can watch it live. Can anybody suggest a way?

iOS streaming camera feed to another device

Has anyone tried streaming the camera feed of one iOS device to another device? For example, an iPhone camera feed to an iPad 1. I guess you could keep taking pictures and sending them over Bluetooth, but that would probably work very badly.
The ideal solution would be to stream video and location to the other device over Wi-Fi and be able to send data back.
Check out this project. I hope it helps.
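As an illustration of the send-frames-over-the-local-network idea from the question (not necessarily what the linked project does), here is a minimal sketch using MultipeerConnectivity to push JPEG-compressed frames to a connected peer; the session setup and the source of the frames are assumed to exist elsewhere in your app:

import MultipeerConnectivity
import UIKit

final class FrameSender {
    private let session: MCSession

    init(session: MCSession) {
        self.session = session
    }

    // Compress a captured frame and send it to every connected peer.
    // .unreliable keeps latency low; dropping frames is acceptable for live video.
    func send(frame: UIImage) {
        guard !session.connectedPeers.isEmpty,
              let data = frame.jpegData(compressionQuality: 0.4) else { return }
        try? session.send(data, toPeers: session.connectedPeers, with: .unreliable)
    }
}

The receiving device implements MCSessionDelegate's session(_:didReceive:fromPeer:), turns the data back into a UIImage, and can send location or control data back over the same session.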

iPhone send video to iPad live streaming and wireless control (like AR Drone)

How can I send video from an iPhone to an iPad?
I'm building a robot that is an iPhone controlling an Arduino. For the next phase I would like to send live streaming video from the iPhone to an iPad, and have the iPad send commands back to the iPhone.
So how do I send live streaming video from one device to the other (Wi-Fi preferred, or Bluetooth), and how do I control one device wirelessly from the other?
EDIT:
The best example of what I intend to do is the Parrot AR Drone and another app for the toy:
app clone to pilot the Quadracopter
The difference is that I would be getting the image from an iPhone and sending the control commands to the iPhone from an iPad as well, not from separate hardware.
Thanks a lot!
Most of the apps I've seen that do this use AVFoundation to capture data from the video camera, then push the frames to a server somewhere. You probably won't want to push every frame. For the receiving side of things, I would have a server hosting a web page with an HTML5 video tag pointing at an m3u8 playlist, and have the files from the iPhone go into the playlist folder.
<video src="http://yourserver.com/path/to/stream/yourPlaylist.m3u8">
Your browser does not support the VIDEO tag
</video>
Then set your view on the iPad or computer to look at that web page. There is surely a more direct way of sending the files straight to the iPad for viewing, but I like being able to view the video from any browser :)
If you want to stay away from a web view on the iPad, you can also fetch the files as you would retrieve any file over a network. The web view is just the easiest way, in my opinion.
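For the capture side on the iPhone, a minimal AVFoundation sketch looks roughly like this; the frame handling in the delegate callback is a placeholder for whatever encoding and upload mechanism you choose:

import AVFoundation

final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.capture")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        // Drop frames we can't keep up with; we don't want to push every frame anyway.
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)

        session.startRunning()
    }

    // Called for every captured frame; thin the frames out and hand them to your uploader.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Convert the sample buffer to JPEG (or feed a video encoder),
        // then upload to the server that owns the m3u8 playlist.
    }
}

Turning the uploaded data into an actual HLS playlist (segmenting the video and updating the .m3u8) happens on the server and is independent of this capture code.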
How to integrate Live555 in XCode (iOS SDK)
Hope this helps!

How do media browser plugins function?

If I want to use Google Video Chat in my browser, I have to download and install a plugin for it to work. I would like to make a piece of software that creates some interactions with a video displayed in the browser.
I assume it might be problematic to do this with one solution for all browsers, so if I need to focus on only one browser, let's talk about Firefox, although I think the Firefox Add-on SDK would not let me do anything as complex as video interaction.
But how does the Google Video Chat plugin work in the browser? It's only an example of one of those plugins that let you do things with your browser (media, in this case) that are normally impossible.
As I understand it, Google Video Chat uses Flash.
I'm looking for something official-looking to back that up now...
Edit: I think this explains it pretty well.
Flash Player exposes certain audio/video functions to the (SWF) application, but it does not give the application access to the raw real-time audio/video data. There are some ActionScript API classes and methods: the Camera class allows you to capture video from your camera, the Microphone class allows you to capture audio from your microphone, the NetConnection/NetStream classes allow you to stream video from Flash Player to a remote server and vice versa, and the Video class allows you to render video either captured by a Camera or received on a NetStream. Given these, to display video in Flash Player it must either be captured by a Camera object or received from a remote server on a NetStream. Luckily, ActionScript allows you to choose which Camera to use for capture.
When the Google plugin is installed, it exposes itself as two Camera devices; actually virtual device drivers. These devices are called 'Google Camera Adaptor 0' and 'Google Camera Adaptor 1', which you can see in the Flash Player settings when you right-click on the video. One of the devices is used to display the local video and the other to display the remote participant's video.
The Google plugin also implements the full networking protocol and stack, which I think is based on the GTalk protocol. In particular, it implements XMPP with the (P2P) Jingle extension, and UDP-based media transport for carrying real-time audio/video. The audio path is completely independent of the Flash Player.
In the video path, the plugin captures video from the actual camera device installed on your PC and feeds it to the Flash Player via one of the virtual camera device drivers; it also encodes that video and sends it to the remote user. In the reverse direction, it receives video (over UDP) from the remote user and hands it to the Flash Player via the second virtual camera device driver. The SWF application running in the browser creates two Video objects and attaches them to two Camera objects, one for each of the two virtual video devices, instead of attaching them to your real camera device. This way, the SWF application can display both the local and the remote video.

How to connect the camera from iPhone to Mac

I need to know if there is a way to connect the iPhone's camera to a Mac. My aim is to create a kind of spy camera, where the iPhone acts as the spying device and we see the view from its camera on the Mac screen.
There is no API for this. The "answer" is to build your own client: capture images on the iPhone and transmit them to a server running on the Mac.
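A minimal sketch of the iPhone side of that approach, assuming the Mac runs a TCP listener at 192.168.1.20 on port 5000 and the frames arrive as JPEG data from your capture pipeline (address, port, and framing are all placeholders):

import Foundation
import Network

final class FrameUploader {
    // Address and port of the listener running on the Mac -- replace with your own.
    private let connection = NWConnection(host: "192.168.1.20", port: 5000, using: .tcp)

    func start() {
        connection.start(queue: .global(qos: .userInitiated))
    }

    // Send one JPEG-encoded frame, prefixed with its length so the Mac can split the stream.
    func send(jpegFrame: Data) {
        var length = UInt32(jpegFrame.count).bigEndian
        var packet = Data(bytes: &length, count: 4)
        packet.append(jpegFrame)
        connection.send(content: packet, completion: .contentProcessed { error in
            if let error = error { print("send failed: \(error)") }
        })
    }
}

On the Mac side, an NWListener (or any TCP server) reads the 4-byte length prefix, then that many bytes, and turns each chunk back into an image for display.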