If it helps, I am using Xcode 4.3.1 and Objective-C to program simple apps on an iPhone 4S running iOS 5.1.
I would like to find documentation for a class with methods to capture the digital music signal that iTunes sends to an output device (speaker, headphones). I assume it must be accessible, since it exists in the phone before it reaches the speakers. I am not attempting anything piratical; rather, I would like to route this music signal into the phone's outgoing call audio so that it can be heard clearly by someone on the other end of a call (e.g., a method to play a favorite song, at decent quality, for a friend out of town). Can anyone point me in a general direction (if that direction exists) so that I can begin learning more?
Thanks,
Seth
One, I'm pretty sure there's no way to get the raw audio samples of a song in the library. You can get a list of tracks and tell the system to play one, but the playback happens outside your app. Two, apps can't access the cellular voice channel; there's no way to send audio from your app into a phone call. Three, even if the first two worked, calls are heavily compressed and tuned for voice data. Call a friend and have them play a song through the phone, and see how it sounds. Not very good, I'll bet.
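For completeness, the supported route looks roughly like this: a minimal Media Player framework sketch (error handling omitted). The query and play calls run in your app, but decoding and output happen in the system process, so the samples never pass through your code:

    #import <MediaPlayer/MediaPlayer.h>

    // What IS allowed: query the library and hand a queue to the system's
    // iPod player. The decoded audio never enters your app's sandbox.
    MPMediaQuery *query = [MPMediaQuery songsQuery];
    MPMusicPlayerController *player = [MPMusicPlayerController iPodMusicPlayer];
    [player setQueueWithQuery:query];
    [player play];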
Related
I'd like to record what the iPhone is currently outputting. So I'm thinking about recording audio from apps like Music (iPod), Skype, any radio streaming app, Phone, Instacast... I don't want to record my own audio or the mic input.
Is there an official way to do this? How do I do it? It seems like AVAudioRecorder does not allow this, can somebody confirm?
Officially you can't. The audio stream belongs to the app playing it, and to iOS.
The sandbox paradigm means that a resource owned by your app can't be used by another app. "Resource" here means an audio/video stream or file. The exceptions are when a mediator such as a document interaction controller is used.
If you want to do this, you'd have to start by digging into AVFoundation's private methods to find out if there's a way in. Needless to say, the result wouldn't be sellable on the App Store and will probably only be possible on a jailbroken device.
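To confirm the AVAudioRecorder point from the question: it records the microphone input, not the device's output mix. A minimal sketch of what it officially does (file name and settings are arbitrary choices here):

    #import <AVFoundation/AVFoundation.h>

    // AVAudioRecorder captures the microphone, not other apps' output.
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"mic.caf"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey,
        [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
        [NSNumber numberWithInt:1], AVNumberOfChannelsKey, nil];
    NSError *error = nil;
    AVAudioRecorder *recorder = [[AVAudioRecorder alloc]
        initWithURL:[NSURL fileURLWithPath:path] settings:settings error:&error];
    [recorder record];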
Good Luck.
TL;DR:
This is feasible, but only as an occasional workaround, as it's a time-expensive process.
You can record the screen while listening to your songs in Spotify, Music, or whatever music application.
This will generate a video in your Photos library. That video can be converted to MP3 on your computer.
Actually, this is not true. The screen recording will not contain the audio from Apple Music at all, as Apple Music blocks it. Discord uses the same mechanism, so you cannot record Discord audio this way either.
I'm not very well versed in the iPhone and Android APIs, so please bear with me if this is a stupid question.
As I understand it, Square's card reader works by converting the magnetic information on the card stripe into an audio tone that its software can then process. [1]
In a similar way, is it possible to read what exactly is being displayed on the device's screen through a small device inserted into its audio jack?
[1] http://www.quora.com/How-does-Squares-hardware-work
It's not quite clear what you wish to achieve. You can indeed make an app that would output a representation (perhaps audio frequency-shift keying?) of the screen's contents to the iPhone's audio jack.
The iPhone (and other iOS-based devices) use TRRS connectors for bi-directional audio (and hence arbitrary modulated data) communication and there are well-supported publicly-documented APIs for using these interfaces.
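To make the modulation idea concrete, here is a hedged sketch. The helper name and the Bell 202-style tone frequencies are my own choices for illustration, not an established iOS API; it turns one byte into frequency-shift-keyed PCM samples that could then be played out of the jack:

    #import <Foundation/Foundation.h>
    #include <math.h>

    // Hypothetical helper: encode one byte as 16-bit mono FSK samples
    // (1200 Hz for a 0 bit, 2200 Hz for a 1 bit, Bell 202-style).
    static NSData *FSKSamplesForByte(uint8_t byte, double sampleRate, double baud) {
        NSMutableData *data = [NSMutableData data];
        NSUInteger samplesPerBit = (NSUInteger)(sampleRate / baud);
        double phase = 0.0;
        for (int bit = 0; bit < 8; bit++) {
            double freq = ((byte >> bit) & 1) ? 2200.0 : 1200.0;
            for (NSUInteger i = 0; i < samplesPerBit; i++) {
                int16_t sample = (int16_t)(sin(phase) * 30000.0);
                [data appendBytes:&sample length:sizeof(sample)];
                phase += 2.0 * M_PI * freq / sampleRate; // keep phase continuous across bits
            }
        }
        return data;
    }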
That said, if you're writing your own app: why would you want to output the contents of the screen? If you are developing the app in question, why not transmit the salient data in a more effective manner? Which leads me to my next assumption:
You want to read what's being displayed on the device's screen at any time, not just when an app of your creation is open. In this case, the answer is that it is not possible, with the possible exception of a jailbroken solution. That said, I can't imagine a jailbroken solution being useful much longer, since iOS 5 introduced display mirroring by means of AirPlay.
On Android, I have no idea. :-)
No. The screen is not connected to the audio jack.
I think you could make an app that takes a screenshot and then encodes that image as audio to play it.
It won't sound good though :)
For this kind of task, there is the built-in camera.
A few days ago I saw an interesting device for the iPhone, Square: https://squareup.com/
You can plug it into the iPhone's earphone socket, and it can transfer data to the iPhone, where a running app can receive it.
Does anyone know how it is implemented? My guess is that it encodes data into an audio stream and "sings" it, and the app on the phone records the sound and decodes it. But how? Is there a protocol or SDK?
The implementation is likely no different from that of a simple acoustic modem. The relevant APIs include Audio Units (low-level) and Audio Queue Services (higher-level).
Matt Gallagher has written an excellent (as always!) post on creating an iOS tone generator, which is one way of enabling what you are after.
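As a rough illustration of the Audio Queue route (this is a hedged sketch of my own, not Matt Gallagher's code; his post builds the generator on an Audio Unit instead): a callback fills each buffer with sine samples and re-enqueues it, and you prime a few buffers before starting the queue.

    #import <AudioToolbox/AudioToolbox.h>
    #include <math.h>

    static double gPhase = 0.0;
    static const double kFrequency = 1000.0;   // tone pitch in Hz
    static const double kSampleRate = 44100.0;

    // Fills a buffer with 16-bit sine samples and hands it back to the queue.
    static void ToneCallback(void *inUserData, AudioQueueRef inAQ,
                             AudioQueueBufferRef inBuffer) {
        int16_t *samples = (int16_t *)inBuffer->mAudioData;
        UInt32 count = inBuffer->mAudioDataBytesCapacity / sizeof(int16_t);
        for (UInt32 i = 0; i < count; i++) {
            samples[i] = (int16_t)(sin(gPhase) * 30000.0);
            gPhase += 2.0 * M_PI * kFrequency / kSampleRate;
        }
        inBuffer->mAudioDataByteSize = count * sizeof(int16_t);
        AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
    }

    void StartTone(void) {
        AudioStreamBasicDescription fmt = {0};
        fmt.mSampleRate = kSampleRate;
        fmt.mFormatID = kAudioFormatLinearPCM;
        fmt.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
        fmt.mBitsPerChannel = 16;
        fmt.mChannelsPerFrame = 1;
        fmt.mBytesPerFrame = 2;
        fmt.mFramesPerPacket = 1;
        fmt.mBytesPerPacket = 2;

        AudioQueueRef queue;
        AudioQueueNewOutput(&fmt, ToneCallback, NULL, NULL, NULL, 0, &queue);
        for (int i = 0; i < 3; i++) {
            AudioQueueBufferRef buf;
            AudioQueueAllocateBuffer(queue, 4096, &buf);
            ToneCallback(NULL, queue, buf); // prime the buffer before starting
        }
        AudioQueueStart(queue, NULL);
    }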
I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples (see the AV Foundation capture API documentation). Make sure you work through their example to the end, as you will get CMSampleBufferRef data objects back; a minimal sketch follows.
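Something like this (the class name is mine; error handling omitted) gets you each camera frame as a CMSampleBufferRef:

    #import <AVFoundation/AVFoundation.h>

    @interface FrameGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @end

    @implementation FrameGrabber

    - (void)start {
        self.session = [[AVCaptureSession alloc] init];
        self.session.sessionPreset = AVCaptureSessionPresetMedium;

        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
        [self.session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [output setSampleBufferDelegate:self queue:dispatch_queue_create("frames", NULL)];
        [self.session addOutput:output];

        [self.session startRunning];
    }

    // Called for every captured frame; this is where steps 2 and 3 would hook in.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // Hand the CMSampleBufferRef to your encoder/streamer here.
    }

    @end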
For the second and third parts, you should check out the CFNetwork framework, especially CFFTPStream for streaming over FTP.
If you are only building this for yourself, you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute it to other users, you could use the trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clocks in the App Store work. Here's a tutorial. =)
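If the goal is only to stop the device from locking while your app is frontmost, there is also a documented one-liner (it won't keep the app alive in the background the way the audio trick can, though):

    #import <UIKit/UIKit.h>

    // Prevents auto-lock while the app is frontmost (re-enable it when done).
    [[UIApplication sharedApplication] setIdleTimerDisabled:YES];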
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use with HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
Last time I touched the code I was trying to debug why the live stream wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.
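In case it helps anyone attempting the same, the index file from step 2 is just text. Here's a hedged sketch (the helper name is mine) that writes a minimal live .m3u8 playlist for pre-chopped transport-stream segments:

    #import <Foundation/Foundation.h>

    // Hypothetical helper: write an HTTP Live Streaming index for a set of
    // segment files, each roughly targetDuration seconds long.
    static void WritePlaylist(NSArray *segmentNames, NSString *path, int targetDuration) {
        NSMutableString *m3u8 = [NSMutableString string];
        [m3u8 appendString:@"#EXTM3U\n"];
        [m3u8 appendFormat:@"#EXT-X-TARGETDURATION:%d\n", targetDuration];
        [m3u8 appendString:@"#EXT-X-MEDIA-SEQUENCE:0\n"];
        for (NSString *name in segmentNames) {
            [m3u8 appendFormat:@"#EXTINF:%d,\n%@\n", targetDuration, name];
        }
        // No #EXT-X-ENDLIST tag: the playlist stays "live" and clients keep polling it.
        [m3u8 writeToFile:path atomically:YES encoding:NSUTF8StringEncoding error:NULL];
    }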
I want to allow the user to switch to my app during a phone call and play some sound that the other person will hear.
I see here that it was possible even before iOS 4, and here that no one knows how to do it.
I would really appreciate it if someone could shed some light on this issue.
Thank you.
If you play a song and then place a phone call, every version of iPhone OS will stop your song, simply because the phone call has priority over audio playback.
What you can do is play a song after the call has started, but in this case it will only come out of the speaker.
I've tried the second approach, and with good audio tracks and a few simple tweaks you can achieve the desired effect.
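For reference, a minimal sketch of that second approach, assuming a bundled track.mp3 (whether iOS actually lets it play mid-call, and routes it to the speaker, is up to the OS, not your code):

    #import <AVFoundation/AVFoundation.h>

    // Start playback after the call is already in progress.
    NSError *error = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
    [[AVAudioSession sharedInstance] setActive:YES error:&error];

    NSURL *url = [[NSBundle mainBundle] URLForResource:@"track" withExtension:@"mp3"];
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    [player play];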
You could achieve this using the Twilio API.