Is it possible to route audio such as a local mp3 file to play through the iPhone earpiece instead of speakerphone?
It is possible, though a bit more involved than it might appear. Have a look at the section Audio Session Category Route Overrides in the Audio Session Services Reference to get into the matter.
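As a minimal sketch, assuming the (now-deprecated) C Audio Session Services API referenced above: the PlayAndRecord category routes output to the receiver (earpiece) by default, so selecting it and clearing any speaker override is enough. The function name `routeToEarpiece` is just for illustration.

```c
#include <AudioToolbox/AudioToolbox.h>

// Sketch: route playback to the earpiece (receiver).
// PlayAndRecord sends output to the receiver by default;
// kAudioSessionOverrideAudioRoute_None clears any speaker override.
void routeToEarpiece(void) {
    AudioSessionInitialize(NULL, NULL, NULL, NULL);

    UInt32 category = kAudioSessionCategory_PlayAndRecord;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);

    UInt32 route = kAudioSessionOverrideAudioRoute_None;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute,
                            sizeof(route), &route);

    AudioSessionSetActive(true);
}
```

Note this API only compiles against the iOS SDK; on current systems AVAudioSession (with `overrideOutputAudioPort:`) is the replacement.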
Please help me, I have a problem when playing audio from local files. If I play one file at a time, it works. But I want my app to behave like typical audio applications, with play, next, stop, etc., running through a background service. The file data is displayed successfully in the application; my audio data is in the form of a LIST, while the data the audio service requests is in the form of a MEDIAITEM.
I suggest you have a look at the audio_service package. It is meant for playing audio in the background, which is exactly the use case you are looking for (play, next, pause).
I'd like to record what the iPhone is currently outputting. So I'm thinking about recording audio from Apps like Music (iPod), Skype, any Radio Streaming App, Phone, Instacast... I don't want to record my own audio or the mic input.
Is there an official way to do this? How do I do it? It seems like AVAudioRecorder does not allow this, can somebody confirm?
Officially you can't. The audio stream belongs to the app playing it, and to iOS.
The sandbox paradigm means that a resource owned by your app can't be used by another app. "Resource" here means an audio/video stream or file. The exceptions are when a mediator like the Document Interaction Controller is used.
If you want to do this you'd have to start by reverse-engineering AVFoundation's private methods and finding out whether there's a way in there. Needless to say, this wouldn't be salable on the App Store and would probably only be possible on a jailbroken device.
Good Luck.
TL;DR:
This is only feasible occasionally, as it's a time-consuming process.
You can record the screen while listening to your songs in Spotify, Music, or whatever music application.
This will save a video to your Photos app. That video can then be converted to MP3 on your computer.
Actually, this is not true. Screen recordings will not contain the audio from Apple Music at all, as it blocks recording. Discord uses this same mechanism, so you cannot record Discord audio this way either.
I'm currently working on a project where it is necessary to record sound being played by the iPhone. By this, I mean recording sound being played in the background like a sound clip or whatever, NOT using the built-in microphone.
Can this be done? I am currently experimenting with the AVAudioRecorder but this only captures sound with the built-in microphone.
Any help would be appreciated!
This is possible only when your app itself plays the sound using the Audio Unit RemoteIO API or the Audio Queue API with uncompressed raw audio, and with no background audio mixed in. In that case you have full access to the audio samples and can queue them up to be saved to a file.
It is not possible to record sound output of the device itself using any of the other public audio APIs.
Just to elaborate on hotpaw2's answer: if you are responsible for generating the sound, then you can retrieve it. But if you are not, you cannot. You only have control over sounds in your own process. Yes, you can choose to stifle sounds coming from other processes, but you can't actually get the data for those sounds or process them in any way.
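To make the point above concrete, here is a hedged sketch of the only sanctioned path: when your own code renders the samples (here via a RemoteIO render callback), you can copy each buffer before it reaches the hardware. The helper `my_save_samples` is hypothetical, standing in for whatever file or ring-buffer writer you use.

```c
#include <AudioToolbox/AudioToolbox.h>

// Hypothetical helper: append PCM bytes to your own file/ring buffer.
extern void my_save_samples(const void *data, UInt32 byteCount);

// RemoteIO render callback: because this app produces the samples,
// it can both play them and keep a copy. Audio produced by other
// processes never passes through here, so it cannot be captured.
static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData) {
    // ... fill ioData with this app's own audio (synth, decoded file, ...) ...

    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        AudioBuffer buf = ioData->mBuffers[i];
        my_save_samples(buf.mData, buf.mDataByteSize);  // tap a copy
    }
    return noErr;
}
```

The callback would be installed on the RemoteIO unit with `AudioUnitSetProperty` and `kAudioUnitProperty_SetRenderCallback`; the tap sees only this process's audio, which is exactly the limitation described above.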
I was trying to do the following on iPhone:
load a webpage on UIWebView control in my application.
The webpage has an audio tag (<audio/> in HTML5) with source of the audio file set correctly.
The webpage loads correctly, but when I click the play button the audio comes out of the speaker.
How do I make the audio to come out of the receiver instead of the speaker?
What have I already done?
I read about the Audio Session Services on Apple platforms.
My initial category was kAudioSessionCategory_PlayAndRecord.
By default, according to the documentation, audio in this category should go to the receiver.
The above did not work.
So I started listening for the kAudioSessionProperty_AudioRouteChange property. When I played audio using the audio tag I got a callback, and in the callback I queried the audio route, which came back as the string "ReceiverAndMicrophone".
So it looks like the browser control is doing some magic to override the app's default audio session settings. How do I change that behavior from my app?
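One approach, sketched here under the assumption that the web view's internal player re-activates the session with its own override, is to watch for route changes and re-apply your preferred route each time. The function name `installRouteWatcher` is illustrative.

```c
#include <AudioToolbox/AudioToolbox.h>

// Re-route to the receiver whenever something (e.g. the web view's
// internal player) changes the audio route out from under the app.
static void routeChanged(void *inClientData,
                         AudioSessionPropertyID inID,
                         UInt32 inDataSize,
                         const void *inData) {
    // For PlayAndRecord, "None" means no speaker override, i.e. receiver.
    UInt32 route = kAudioSessionOverrideAudioRoute_None;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute,
                            sizeof(route), &route);
}

void installRouteWatcher(void) {
    AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
                                    routeChanged, NULL);
}
```

This is a workaround rather than a documented contract; whether it wins depends on when the web view applies its own settings.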
In my iPhone app I am trying to record audio and play iPod music at the same time, so I set the audio session category to kAudioSessionCategory_PlayAndRecord. But once I set this, all system audio (including vibrate) stops working, although iPod audio still works. Does anyone know if this is a bug in the SDK, or how to get around it? Please help!
Thanks in advance!
Looking at the documentation for kAudioSessionOverrideAudioRoute tells us that the default for the PlayAndRecord category is to route the audio to the receiver (the speaker used when you're talking on the phone). Is it possible that all the audio is being routed there and you just can't hear it without putting your ear to it?
If you want to change where the audio is going you need to call AudioSessionSetProperty, and pass it the constant specifying where you want the audio to go. These constants are
kAudioSessionOverrideAudioRoute_None, which specifies that you wish the audio to be routed to the receiver (as it is now), or
kAudioSessionOverrideAudioRoute_Speaker, which specifies that audio should be routed to the speaker at the bottom of the phone
#include <AudioToolbox/AudioToolbox.h>
// Route playback to the bottom speaker instead of the receiver:
UInt32 routeVar = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(routeVar), &routeVar);