Is it possible to transfer data to/from an iPhone through the headset jack using some kind of modulation (FSK, for example)? From a theoretical point of view it looks completely possible.
Yes, you could consider doing FSK demodulation in code.
Take a look at O'Reilly's iPhone Hacks - there's a nice chapter on building a serial modem.
True, it uses the headphone jack, but you'll get a feel for how to handle audio 'data'.
You can also download the examples from the iPhone Hacks source code page.
EDIT: there's also an awesome link collection in not really Jake's answer here:
Using an iPhone audio dongle to transmit data
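To make the idea concrete, here's a rough sketch (my own illustration, not the book's code) of how bits could be mapped to mark/space tones in software before being played out of the jack; the sample rate, baud rate, and frequencies are arbitrary example values:

```swift
import Foundation

// A rough sketch of software FSK encoding: each bit becomes a short burst of
// either a "mark" or a "space" tone. All numbers below are example values.
func fskEncode(bits: [Bool],
               sampleRate: Double = 44_100,
               baud: Double = 300,
               markHz: Double = 1_200,               // tone for a 1 bit
               spaceHz: Double = 2_200) -> [Float] { // tone for a 0 bit
    let samplesPerBit = Int(sampleRate / baud)
    var phase = 0.0
    var samples: [Float] = []
    samples.reserveCapacity(bits.count * samplesPerBit)

    for bit in bits {
        let frequency = bit ? markHz : spaceHz
        let phaseIncrement = 2.0 * Double.pi * frequency / sampleRate
        for _ in 0..<samplesPerBit {
            samples.append(Float(sin(phase)))
            phase += phaseIncrement   // continuous phase avoids clicks at bit edges
        }
    }
    return samples
}

// Example: encode one byte, LSB first, then hand the samples to an audio output.
let byte: UInt8 = 0b0101_0011
let bits = (0..<8).map { (byte >> $0) & 1 == 1 }
let samples = fskEncode(bits: bits)
```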
I'm not very well versed in the iPhone and Android APIs, so please bear with me if this is a stupid question.
As I understand it, Square's card reader works by converting the magnetic information on the card stripe into an audio tone that its software can then process. [1]
In a similar way, is there a way to read exactly what is being displayed on the device's screen simply through a small device inserted into its audio jack?
[1] http://www.quora.com/How-does-Squares-hardware-work
It's not quite clear what you wish to achieve. You can indeed make an app that would output a representation (perhaps audio frequency-shift keying?) of the screen's contents to the iPhone's audio jack.
The iPhone (and other iOS-based devices) uses a TRRS connector for bi-directional audio (and hence arbitrary modulated data) communication, and there are well-supported, publicly documented APIs for using this interface.
That said, if you're writing your own app: why would you want to output the contents of the screen? If you are developing the app in question, why not transmit the salient data in a more effective manner? Which leads me to my next assumption:
You want to read what's being displayed on the device's screen at any time, not just when an app of your creation is open. In this case, the answer is that it is not possible, with the possible exception of a jailbroken solution. That said, I can't imagine a jailbroken solution being useful much longer, given that iOS 5 introduced "display mirroring" by means of AirPlay.
On Android, I have no idea. :-)
No. The screen is not connected to the audio jack.
I think you could make an app that takes a screenshot and then encodes that image as audio and plays it.
It won't sound good though :)
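For what it's worth, capturing your own app's view is straightforward. A minimal sketch (standard UIKit calls; the helper name is my own) that produces a UIImage you could then serialize and modulate into audio:

```swift
import UIKit

// A minimal sketch (your own app's views only, not other apps' screens):
// render the window into a UIImage, which could then be serialized and
// encoded as audio for playback out of the headphone jack.
func captureScreen(of window: UIWindow) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
    return renderer.image { _ in
        // drawHierarchy captures what this app is currently showing on screen.
        window.drawHierarchy(in: window.bounds, afterScreenUpdates: true)
    }
}
```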
For this kind of task, there is the built-in camera.
Are there any external buttons/controls that can be plugged into the iPhone and used within an app? If so, does anyone have links to any code for using these controls?
I was thinking of some kind of iPod controller that I could hack and plug into the connector slot on the bottom of the phone.
Cheers.
Yes. Several USB MIDI controllers are supported via the Camera Connection Kit on stock OS iOS devices. An app can use CoreMIDI in/out messages to get input from the buttons on these external MIDI controllers.
But an app can't use a generic hackable USB input device under the stock OS, unless the developer is the manufacturer and also a member of Apple's MFi program.
ADDED:
...or you hack the USB device so that it imitates one of Apple's supported MIDI devices. An example of doing this with an AVR microcontroller is here.
Apple's CoreMIDI reference is here.
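To give a feel for the app side, here's a minimal sketch of receiving MIDI input using the block-based CoreMIDI calls (iOS 9+); the client, port, and variable names are arbitrary:

```swift
import CoreMIDI

// A minimal sketch: create a client, open an input port, and connect every
// available MIDI source so button presses on an external controller arrive
// as MIDI packets.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("ButtonClient" as CFString, &client, nil)

var inputPort = MIDIPortRef()
MIDIInputPortCreateWithBlock(client, "ButtonInput" as CFString, &inputPort) { packetList, _ in
    // Each packet typically carries a Note On/Off or Control Change message
    // from one of the controller's buttons; packet timestamps give you ordering.
    print("Received \(packetList.pointee.numPackets) MIDI packet(s)")
}

// Connect all currently attached sources (e.g. a USB MIDI controller on the
// Camera Connection Kit).
for index in 0..<MIDIGetNumberOfSources() {
    MIDIPortConnectSource(inputPort, MIDIGetSource(index), nil)
}
```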
ADDED #2:
If you want even more accuracy for a timer app, consider using the mic audio input jack, and connecting some buttons to audio chirp generators (could be done either with analog circuitry or a tiny cheap micro). Use different chirp frequencies for different buttons. Some suitable DSP code on the iPhone could probably determine the relative timings of audio input chirp signals with sub-millisecond accuracy.
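As a sketch of the detection side, running a Goertzel filter per button frequency over short windows of mic input is one common approach; the frequencies below are hypothetical, and the window index in which a chirp's energy spikes gives you the press timing:

```swift
import Foundation

// Single-frequency power detection with the Goertzel algorithm.
// Run this over short windows of mic samples, once per button frequency,
// and compare the energies to decide which button chirped.
func goertzelPower(samples: [Float], sampleRate: Double, targetHz: Double) -> Double {
    let k = (Double(samples.count) * targetHz / sampleRate).rounded()
    let omega = 2.0 * Double.pi * k / Double(samples.count)
    let coeff = 2.0 * cos(omega)
    var sPrev = 0.0
    var sPrev2 = 0.0
    for sample in samples {
        let s = Double(sample) + coeff * sPrev - sPrev2
        sPrev2 = sPrev
        sPrev = s
    }
    // Power of the target frequency bin in this window.
    return sPrev * sPrev + sPrev2 * sPrev2 - coeff * sPrev * sPrev2
}

// Hypothetical example: button A chirps at 1 kHz, button B at 2 kHz.
// let window: [Float] = ...  // samples from the headset mic input
// let buttonA = goertzelPower(samples: window, sampleRate: 44_100, targetHz: 1_000)
// let buttonB = goertzelPower(samples: window, sampleRate: 44_100, targetHz: 2_000)
```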
There are many solutions to this; most involve jailbreaking the iPhone.
The most famous/popular is the iControlPad.
A few days ago I saw an interesting device for the iPhone, Square: https://squareup.com/
You plug it into the iPhone's headphone socket and it transfers data to the iPhone, which a running app can receive.
Does anyone know how it is implemented? I guess it encodes data into an audio stream and "sings" it, and the app on the phone records the sound and decodes it. But how? Is there a protocol or SDK?
The implementation is likely to be no different from that of a simple acoustic modem. The relevant APIs include Audio Units (low-level) or Audio Queue Services (higher level).
Matt Gallagher has written an excellent (as always!) post on creating an iOS tone generator, which is one way of enabling what you are after.
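As a rough sketch of the tone-generation side (Matt's post works at the Audio Unit level; this uses the higher-level AVAudioEngine, with an example frequency and buffer length of my choosing), you can fill a buffer with a sine wave and loop it out of the audio jack; a modem would switch frequencies to encode bits:

```swift
import AVFoundation

// A minimal tone-generator sketch using AVAudioEngine: fill one second of
// audio with a 1 kHz sine wave and loop it through the main mixer.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let sampleRate = 44_100.0
let frequency = 1_000.0   // example tone

let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)!
let frameCount = AVAudioFrameCount(sampleRate)   // one second of audio
let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount)!
buffer.frameLength = frameCount

let channel = buffer.floatChannelData![0]
for frame in 0..<Int(frameCount) {
    channel[frame] = Float(sin(2.0 * Double.pi * frequency * Double(frame) / sampleRate))
}

engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: format)

do {
    try engine.start()
    player.scheduleBuffer(buffer, at: nil, options: .loops)
    player.play()
} catch {
    print("Audio engine failed to start: \(error)")
}
```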
I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back.
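In case it helps, here's a minimal sketch of that first step (assuming AVCaptureSession with AVCaptureVideoDataOutput; the class name and queue label are my own); each captured frame arrives as a CMSampleBuffer, ready to be encoded and handed off to the streaming code for steps 2 and 3:

```swift
import AVFoundation

// A sketch of the capture side: frames arrive one at a time in the delegate
// callback as CMSampleBuffer objects.
final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Encode and stream the CMSampleBuffer here (steps 2 and 3).
    }
}
```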
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming using FTP.
If you are only building this for yourself, then you can always turn off the Auto-Lock feature in the settings. If, on the other hand, you would like to distribute this to other users, you could use the trick of playing a mute sound every 10 seconds. This is more or less how all the alarm clock apps in the App Store work. Here's a tutorial. =)
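A rough sketch of that trick ("silence.wav" is a hypothetical bundled file; the tutorial linked above covers the remaining details, such as audio session configuration):

```swift
import AVFoundation

// Keep-awake sketch: repeatedly play a (near-)silent sound so the device
// never idles out while streaming.
var keepAwakePlayer: AVAudioPlayer?

func startKeepAwake() {
    guard let url = Bundle.main.url(forResource: "silence", withExtension: "wav") else { return }
    keepAwakePlayer = try? AVAudioPlayer(contentsOf: url)
    keepAwakePlayer?.volume = 0

    Timer.scheduledTimer(withTimeInterval: 10, repeats: true) { _ in
        keepAwakePlayer?.play()   // a short, silent clip every 10 seconds
    }
}
```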
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
Last time I touched the code I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.
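For reference, the playlist that ties the segments from step 2 together is a plain .m3u8 text file, roughly like this (hypothetical segment names and durations; a live playlist just keeps getting appended to, so it has no #EXT-X-ENDLIST tag):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
```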
Anyone know if this is allowed by the iPhone's various APIs, or even if Apple allows this?
Example: Plug something in the audio jack and use it as a "taser" (this is just a hypothetical/proof-of-concept example).
Yes. The standard way of "sending electrical signals through the audio jack" is known as "playing audio", and I'm pretty sure this is possible on the iPhone.