I want to write an application for the iPhone which will:
do some digital processing on video from the iPhone camera (in real time)
send some graphics out through TV-Out
Could you tell me if the above is possible on an iPhone 3G or 3GS?
Can I access each individual pixel of the video captured by the iPhone camera?
There are private access points to do both live video recording and TV-Out. They are not well documented, so you won't find much about them, and you wouldn't be able to submit anything that uses them to the App Store. Real-time video capture will probably eventually become a public part of the SDK, but TV-Out is probably less likely at this point.
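For reference, real-time capture did eventually become public with AVFoundation in iOS 4. Once that is available, per-pixel access looks roughly like this sketch of an AVCaptureVideoDataOutput delegate callback (assuming the output is configured for 32BGRA; the inner loop body is a placeholder):

    #import <AVFoundation/AVFoundation.h>
    #import <CoreVideo/CoreVideo.h>

    // Delegate callback for AVCaptureVideoDataOutput (iOS 4+).
    // Assumes the output was configured with kCVPixelFormatType_32BGRA.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        uint8_t *base   = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t width    = CVPixelBufferGetWidth(pixelBuffer);
        size_t height   = CVPixelBufferGetHeight(pixelBuffer);
        size_t rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer);

        for (size_t y = 0; y < height; y++) {
            uint8_t *row = base + y * rowBytes;
            for (size_t x = 0; x < width; x++) {
                uint8_t *bgra = row + x * 4;  // B, G, R, A bytes for this pixel
                // ... process bgra[0..3] here (placeholder) ...
            }
        }

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }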
To cut a long story short, I'm developing an app that's like Sonos. That is, I'm trying to stream music from the iPod library to a piece of hardware over wifi, so the sound comes out of a device that is connected to the iPhone by wifi.
Doing this is easy enough while the app is in the foreground, but in the background the app gets terminated. This is different from simply continuing to output audio in the background on the iPhone itself.
Also, I notice that Sonos supports this feature only on iOS 6, so I wonder if there's some new magical API that allows it.
I hope one of you gurus can guide me down the right path.
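For context rather than an answer: the usual starting point is declaring the audio background mode and keeping an audio session active, as in the sketch below. Whether that alone keeps alive an app that only sends audio over the network (instead of playing it locally) is exactly the open question here:

    #import <AVFoundation/AVFoundation.h>

    // Requires UIBackgroundModes = ["audio"] in Info.plist.
    // Note: iOS keeps the app alive only while it is actually rendering
    // audio, which is why apps that merely *send* audio over the network
    // may still be suspended in the background.
    - (void)configureBackgroundAudio
    {
        NSError *error = nil;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setCategory:AVAudioSessionCategoryPlayback error:&error];
        [session setActive:YES error:&error];
        if (error) {
            NSLog(@"Audio session error: %@", error);
        }
    }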
Has anyone tried streaming the camera feed of one iOS device to another device? For example, an iPhone camera feed to an iPad 1. I guess you could keep taking pictures and sending them over Bluetooth, but that would probably work very badly.
The ideal solution would be to stream video and location over to the other device via wifi and be able to send data back.
Check out this project; I hope it helps.
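For what it's worth, one crude approach along the lines the question suggests is to JPEG-compress each frame and push it to peers over GameKit. A sketch, assuming a hypothetical, already-connected GKSession; expect poor quality and latency:

    #import <UIKit/UIKit.h>
    #import <GameKit/GameKit.h>

    // Sends one captured frame to all connected peers.
    // `session` is a hypothetical, already-connected GKSession;
    // GKSendDataUnreliable drops late frames instead of stalling.
    - (void)sendFrame:(UIImage *)frame overSession:(GKSession *)session
    {
        NSData *jpeg = UIImageJPEGRepresentation(frame, 0.4); // heavy compression
        NSError *error = nil;
        if (![session sendDataToAllPeers:jpeg
                            withDataMode:GKSendDataUnreliable
                                   error:&error]) {
            NSLog(@"Send failed: %@", error);
        }
    }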
If it helps, I am using Xcode 4.3.1 and Objective-C to program simple apps on an iPhone 4S running iOS 5.1.
I would like to find the documentation for a class with methods to capture the digital music signal that iTunes sends to an output device (speakers, headphones). I assume it must be accessible, since it exists in the phone before it reaches the speakers. I am not attempting anything piracy-related; rather, I would like to route this music signal into the phone's outgoing wireless signal so that it can be heard clearly by someone on the other end of a call (e.g., a method to play a favorite song, with decent sound quality, for a friend out of town). Can anyone point me in a general direction (if one exists) so that I can begin learning more?
Thanks,
Seth
One, I'm pretty sure there's no way to get the raw audio samples of a song in the library. You can get a list of the tracks and tell the system to play one, but that all happens outside your app. Two, apps can't access the cellular voice channel; there's no way to send audio from your app into a phone call. Three, even if the first two did work, calls are heavily compressed and tuned for voice data. Call a friend, have them play a song through the phone, and see how it sounds. Not very good, I'll bet.
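For the supported part (listing tracks and telling the system to play one), the route looks like this sketch using the MediaPlayer framework:

    #import <MediaPlayer/MediaPlayer.h>

    // The supported path: query the library and hand playback to the
    // system player. The samples themselves never pass through the app.
    - (void)playFirstSong
    {
        MPMediaQuery *query = [MPMediaQuery songsQuery];
        MPMusicPlayerController *player =
            [MPMusicPlayerController applicationMusicPlayer];
        [player setQueueWithQuery:query];
        [player play];
    }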
I'm not very well versed in the iPhone and Android APIs, so please bear with me if this is a stupid question.
As I understand it, Square's card reader works by converting the magnetic information on the card stripe into an audio tone that its software can then process. [1]
In a similar way, is there a way to somehow read what exactly is being displayed on the device's screen simply through a small device inserted into the audio jack of that device?
[1] http://www.quora.com/How-does-Squares-hardware-work
It's not quite clear what you wish to achieve. You can indeed make an app that would output a representation (perhaps audio frequency-shift keying?) of the screen's contents to the iPhone's audio jack.
The iPhone (and other iOS-based devices) uses a TRRS connector for bi-directional audio (and hence arbitrary modulated data) communication, and there are well-supported, publicly documented APIs for using this interface.
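To make the frequency-shift keying idea concrete, here is a rough sketch that modulates arbitrary bytes into 16-bit PCM samples. The frequencies and bit rate are arbitrary assumptions, and you would still need to play the resulting buffer through something like an Audio Queue:

    #import <Foundation/Foundation.h>
    #include <math.h>

    // Very naive FSK modulator: each bit becomes a short burst of one of
    // two tones. The frequencies and rate here are arbitrary assumptions.
    static NSData *FSKModulate(NSData *payload)
    {
        const double sampleRate = 44100.0;
        const double markFreq   = 2400.0;   // bit = 1
        const double spaceFreq  = 1200.0;   // bit = 0
        const int samplesPerBit = 441;      // ~100 bits/sec

        NSMutableData *pcm = [NSMutableData data];
        const uint8_t *bytes = [payload bytes];
        double phase = 0.0;

        for (NSUInteger i = 0; i < [payload length]; i++) {
            for (int bit = 7; bit >= 0; bit--) {
                double freq = ((bytes[i] >> bit) & 1) ? markFreq : spaceFreq;
                for (int s = 0; s < samplesPerBit; s++) {
                    phase += 2.0 * M_PI * freq / sampleRate;
                    int16_t sample = (int16_t)(32767.0 * 0.5 * sin(phase));
                    [pcm appendBytes:&sample length:sizeof(sample)];
                }
            }
        }
        return pcm;  // 16-bit mono PCM at 44.1 kHz
    }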
That said, if you're writing your own app: why would you want to output the contents of the screen? If you are developing the app in question, why not transmit the salient data in a more effective manner? Which leads me to my next assumption:
You want to read what's being displayed on the device's screen at any time, not just when an app of your creation is open. In this case, the answer is that it is not possible, with the possible exception of a jailbroken solution. That said, I can't imagine a jailbroken solution being useful for much longer, given that iOS 5 introduced "display mirroring" by means of AirPlay.
On Android, I have no idea. :-)
No. The screen is not connected to the audio jack.
I think you could make an app that takes a screenshot and then encodes that image as audio to play it.
It won't sound good, though. :)
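The screenshot half of that idea is at least real, within your own app's views; a minimal sketch (it cannot capture other apps or the home screen):

    #import <QuartzCore/QuartzCore.h>

    // Renders a view hierarchy into a UIImage. This works only for your
    // own app's views; it cannot capture other apps' screens.
    - (UIImage *)snapshotOfView:(UIView *)view
    {
        UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);
        [view.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return image;
    }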
For this kind of task, there is the built-in camera.
I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back.
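A minimal sketch of that capture setup, assuming a class that adopts AVCaptureVideoDataOutputSampleBufferDelegate and requests 32BGRA frames:

    #import <AVFoundation/AVFoundation.h>

    // Minimal capture pipeline: camera -> AVCaptureVideoDataOutput.
    // Frames arrive as CMSampleBufferRefs in the delegate callback.
    - (AVCaptureSession *)startCaptureSession
    {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;

        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (input) [session addInput:input];

        AVCaptureVideoDataOutput *output =
            [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings = [NSDictionary dictionaryWithObject:
            [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                 forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        dispatch_queue_t queue = dispatch_queue_create("camera.frames", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        if ([session canAddOutput:output]) [session addOutput:output];

        [session startRunning];
        return session;
    }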
For the second and third parts, you should check out the CFNetwork framework, specifically CFFTPStream for streaming using FTP.
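A hedged sketch of the FTP side with CFFTPStream; the URL and credentials are placeholders, and a real streamer would keep the stream open and write continuously rather than uploading one buffer:

    #import <CFNetwork/CFNetwork.h>

    // Uploads one data buffer to an FTP server via CFFTPStream.
    // The URL and credentials are placeholders.
    static void UploadBuffer(NSData *data)
    {
        CFURLRef url = CFURLCreateWithString(kCFAllocatorDefault,
            CFSTR("ftp://192.168.1.10/stream/segment0.ts"), NULL);
        CFWriteStreamRef stream =
            CFWriteStreamCreateWithFTPURL(kCFAllocatorDefault, url);
        CFWriteStreamSetProperty(stream, kCFStreamPropertyFTPUserName,
                                 CFSTR("user"));
        CFWriteStreamSetProperty(stream, kCFStreamPropertyFTPPassword,
                                 CFSTR("pass"));

        if (CFWriteStreamOpen(stream)) {
            const UInt8 *bytes = [data bytes];
            CFIndex remaining = [data length];
            while (remaining > 0) {
                CFIndex written = CFWriteStreamWrite(stream, bytes, remaining);
                if (written <= 0) break;   // error or stream closed
                bytes += written;
                remaining -= written;
            }
            CFWriteStreamClose(stream);
        }
        CFRelease(stream);
        CFRelease(url);
    }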
If you are only building this for yourself, you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute it to other users, you could use the trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clock apps in the App Store work. Here's a tutorial. =)
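Note that for the narrower problem of just preventing auto-lock while your app is frontmost, there is also a supported one-liner (it doesn't help with backgrounding, but avoids the audio trick):

    #import <UIKit/UIKit.h>

    // Supported way to keep the screen awake while the app is frontmost.
    [UIApplication sharedApplication].idleTimerDisabled = YES;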
I hope that helped, at least a little bit.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from the video input.
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
The last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.
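For anyone following along, the index file served in step 3 is just a text playlist. A hedged sketch that writes a minimal live .m3u8; the segment names and durations are placeholders that a real implementation would rewrite as new segments are produced:

    #import <Foundation/Foundation.h>

    // Writes a minimal HTTP Live Streaming index file. Segment filenames
    // and durations are placeholders; a real implementation rewrites this
    // playlist as new segments are produced.
    - (void)writePlaylistToPath:(NSString *)path
    {
        NSString *playlist =
            @"#EXTM3U\n"
            @"#EXT-X-TARGETDURATION:10\n"
            @"#EXT-X-MEDIA-SEQUENCE:0\n"
            @"#EXTINF:10,\n"
            @"segment0.ts\n"
            @"#EXTINF:10,\n"
            @"segment1.ts\n";
        [playlist writeToFile:path
                   atomically:YES
                     encoding:NSUTF8StringEncoding
                        error:NULL];
    }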