iPhone Media Player framework: accessing the raw audio file

I'm interested in using the Media Player framework on iPhone to access audio in the user's library. I want to be able to load a given audio file and then perform some specialised filtering on it.
Ideally I want to be able to directly load the audio file (or a portion thereof, for streaming), then use Audio Converter Services to perform the decompression. Once I have the linear PCM data, I want to perform some filtering on it before supplying the audio directly to an audio queue.
Is this possible on iPhone?
If so, can anyone tell me how I would access the audio file directly? Is it just a matter of using the URL to load it with NSFile (presumably I obtain that URL through MPMediaItemPropertyAssetURL)? Or do I need to do something more complicated?
Cheers!

The URL you get from MPMediaItemPropertyAssetURL is a library URL that you can use to initialize an AVURLAsset. Then create an AVAssetReader with the asset and add to it an AVAssetReaderOutput (an instance of AVAssetReaderAudioMixOutput or AVAssetReaderTrackOutput, depending on what you need). This latter object will finally give you access to the media data (-copyNextSampleBuffer).
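For illustration, here is a minimal sketch of that pipeline, assuming you already have an MPMediaItem (for example from an MPMediaPickerController or an MPMediaQuery). It asks the reader for linear PCM, so the decompression is done for you and you get samples you can filter before handing them to your audio queue:

    #import <MediaPlayer/MediaPlayer.h>
    #import <AVFoundation/AVFoundation.h>
    #import <CoreMedia/CoreMedia.h>
    #import <AudioToolbox/AudioToolbox.h>

    static void ReadPCMFromMediaItem(MPMediaItem *item)
    {
        NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];

        NSError *error = nil;
        AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];

        // Request linear PCM so the reader performs the decompression for us.
        NSDictionary *settings = [NSDictionary dictionaryWithObject:
                                      [NSNumber numberWithInt:kAudioFormatLinearPCM]
                                                              forKey:AVFormatIDKey];
        AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVAssetReaderTrackOutput *output =
            [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track
                                                       outputSettings:settings];
        [reader addOutput:output];
        [reader startReading];

        CMSampleBufferRef sampleBuffer = NULL;
        while ((sampleBuffer = [output copyNextSampleBuffer])) {
            // The PCM lives in the sample buffer's block buffer; copy it out,
            // run your filter over it, then enqueue it on your audio queue.
            CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
            size_t length = CMBlockBufferGetDataLength(blockBuffer);
            NSLog(@"read %zu bytes of PCM", length);
            // CMBlockBufferCopyDataBytes(blockBuffer, 0, length, yourDestinationBuffer);
            CFRelease(sampleBuffer);
        }
    }

Note that MPMediaItemPropertyAssetURL returns nil for DRM-protected tracks, so check for that before building the asset.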

Related

Can I use libspotify to play audio without using a low-level audio API?

Is it possible to use libspotify to obtain the URI of a track and play it using a higher-level media player? For instance, I'm interested in doing this with a QMediaPlayer in Qt. I ask this because one of Spotify's sample applications uses lower-level APIs to write the samples directly.
I'd rather do this the simpler way, similar to how Grooveshark's API works: it returns a URL for the track, which you simply set on the high-level media player.
No, the only way to get audio for a Spotify track is using libspotify's audio delivery callback, which delivers raw PCM data.
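For reference, the delivery point looks roughly like this in libspotify's C API; the ring-buffer helper is hypothetical, and in practice you would drain that buffer from whatever audio output you use:

    #include <libspotify/api.h>
    #include <stdint.h>
    #include <stddef.h>

    /* libspotify pushes decompressed audio into this callback; there is no track
     * URL you can hand to a high-level player such as QMediaPlayer. */
    static int music_delivery(sp_session *session, const sp_audioformat *format,
                              const void *frames, int num_frames)
    {
        if (num_frames == 0)
            return 0; /* audio discontinuity (e.g. a seek); nothing consumed */

        /* 16-bit native-endian samples, format->channels per frame,
         * at format->sample_rate Hz. */
        size_t bytes = (size_t)num_frames * format->channels * sizeof(int16_t);
        /* push_to_ring_buffer(frames, bytes);  -- hypothetical buffering helper */
        (void)session; (void)frames; (void)bytes;

        return num_frames; /* tell libspotify how many frames we consumed */
    }

    /* Registered with the session at creation time:
     * static sp_session_callbacks callbacks = { .music_delivery = music_delivery, ... };
     */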

objective-c record audio session output

I am writing an app that generates music. I am using OpenAL to modify gain, modify pitch, mix audio, and play the resulting audio. I now need to record the audio as it is being played. I understand that OpenAL does not let you record the output audio. The other option I have found is to use Audio Units. However, because I need to mix/pitch/gain the audio and record it, it seems I need to write all the audio processing myself so I can have access to the output buffer. Is this correct? Or is there a different iOS API I can use to do this? If not, is there a third-party solution already that lets me record the output (paid solutions are fine)?
You are correct.
Audio Units are the only iOS public API that allows an app to both process and then record audio.
Trying to record the OpenAL output may well be a violation of Apple's rules against using non-public APIs.
The alternative may be to completely rewrite the portions of OpenAL you need (there may be open source for some portions) so that they run on top of the RemoteIO Audio Unit.
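As a rough sketch of how recording on top of RemoteIO can look: you can attach a render-notification callback to the output unit and write whatever it just rendered to a file. The globals here (the unit and an ExtAudioFileRef already opened for writing) are assumptions:

    #include <AudioToolbox/AudioToolbox.h>

    // Assumed to be set up elsewhere: the RemoteIO unit and an audio file
    // opened for writing with Extended Audio File Services.
    extern AudioUnit        gRemoteIOUnit;
    extern ExtAudioFileRef  gOutputFile;

    // Render-notify callbacks fire before and after each render cycle.
    static OSStatus RenderTap(void *inRefCon,
                              AudioUnitRenderActionFlags *ioActionFlags,
                              const AudioTimeStamp *inTimeStamp,
                              UInt32 inBusNumber,
                              UInt32 inNumberFrames,
                              AudioBufferList *ioData)
    {
        // Only capture after the unit has produced its output samples.
        if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
            ExtAudioFileWriteAsync(gOutputFile, inNumberFrames, ioData);
        }
        return noErr;
    }

    // During setup:
    // AudioUnitAddRenderNotify(gRemoteIOUnit, RenderTap, NULL);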
The best way to go is likely to be Core Audio, since it will give you as much flexibility as you need. Take a look at the Extended Audio File Services reference pages.
Using an extended audio file, you should be able to set up a file format and audio stream buffer to send the final mixed output to, and then use the ExtAudioFileWrite() function to write the samples to the file.
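A minimal sketch of that setup, assuming your mixed output is interleaved 16-bit stereo PCM at 44.1 kHz (adjust the stream description to whatever your code actually produces):

    #include <AudioToolbox/AudioToolbox.h>

    ExtAudioFileRef CreateOutputFile(CFURLRef fileURL)
    {
        // Describe the PCM we intend to hand to the file.
        AudioStreamBasicDescription pcm = {0};
        pcm.mSampleRate       = 44100.0;
        pcm.mFormatID         = kAudioFormatLinearPCM;
        pcm.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
        pcm.mChannelsPerFrame = 2;
        pcm.mBitsPerChannel   = 16;
        pcm.mBytesPerFrame    = pcm.mChannelsPerFrame * (pcm.mBitsPerChannel / 8);
        pcm.mBytesPerPacket   = pcm.mBytesPerFrame;
        pcm.mFramesPerPacket  = 1;

        ExtAudioFileRef file = NULL;
        ExtAudioFileCreateWithURL(fileURL, kAudioFileCAFType, &pcm, NULL,
                                  kAudioFileFlags_EraseFile, &file);

        // The client format is what we will pass to ExtAudioFileWrite(); since it
        // matches the file format here, no conversion happens on write.
        ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                                sizeof(pcm), &pcm);
        return file;
    }

    // For each mixed buffer of frameCount frames wrapped in an AudioBufferList:
    //     ExtAudioFileWrite(file, frameCount, bufferList);
    // and when recording is finished:
    //     ExtAudioFileDispose(file);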

Is it possible to access raw iphone audio output?

Is it possible to access raw PCM data from the iPhone's audio output?
I know I can embed an MP3 and use an Audio Unit. But if the user is playing music in the background from their iTunes library, is it possible to access that audio data?
This is for an app that shows visual effects, which react to the music.
From what I can tell, it isn't possible, but that's just from lack of finding any information at all, rather than actual confirmation that it can't be done.
If it isn't possible to access the audio stream from the iPod app, is it possible to access raw audio output from the Media Player inside an app, or is it pretty much not permitted to access raw audio data from the iTunes library at all?
EDIT: I found this question: iOS - Access output audio from background program, which says I can't access the audio from a background app. But is it possible to get the audio data from the iTunes library if I play it inside the app?
I am busy coding something similar, and as far as I know an AUGraph is needed, with the hardware pulling audio from the recorder. You will have to get the URL of the MPMediaItem for the track the user selected with Apple's MPMediaPickerController, then use that URL with Core Audio. Core Audio is a beast.
If your app is playing raw audio PCM samples, then your app has access to those samples. An app does not have access to the audio samples that another app (including the Music player) is playing via any public API.
An app can use AVAssetReader and AVAssetWriter to convert MP3 files from the iTunes library into raw audio (WAV) files.
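For the picker step, a minimal sketch (assuming a view controller that adopts MPMediaPickerControllerDelegate) could look like the following; the asset URL it logs is what you would feed into AVURLAsset and AVAssetReader:

    #import <MediaPlayer/MediaPlayer.h>

    - (void)pickTrack
    {
        MPMediaPickerController *picker =
            [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
        picker.delegate = self;
        [self presentModalViewController:picker animated:YES];
    }

    - (void)mediaPicker:(MPMediaPickerController *)mediaPicker
        didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
    {
        MPMediaItem *item = [[mediaItemCollection items] objectAtIndex:0];
        NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
        // assetURL is nil for DRM-protected or cloud-only tracks.
        NSLog(@"Picked asset URL: %@", assetURL);
        [self dismissModalViewControllerAnimated:YES];
    }

    - (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker
    {
        [self dismissModalViewControllerAnimated:YES];
    }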

Play a specific local iPhone video

I am in need of some help, as I am stuck on a problem with my current iPhone application. I won't go into every detail, but the main outline is as follows:
I am currently playing videos from a remote URL. Everything up to this point is working. But we need to add a check: if the video exists locally on the iPhone, play that version; otherwise, get the remote version. I get this information from an XML feed, which gives me the name of the video and its remote URL.
I've implemented ALAssetsLibrary as a way to retrieve the local videos and transferred 3-4 videos with custom names. After some struggling, I could play these local videos. But when I loop through them, all I get are names like 00001.jpg, etc.
Is there any way to get a local video's name? I don't mind if this needs another library, but I would appreciate it if someone could point me to a way of doing it.
Thanks for your time,
AP
You don't have access to the local filenames, and even if you did, those filenames would probably not be what you are expecting (Apple can and probably does rename files while saving them to the Camera Roll).
You can check the metadata on the ALAssetRepresentation for the video to see if a suitable name or other identifier can be found in there. You might also be able to retrieve the raw data and hash it, but that would fail if Apple does any recoding or metadata alteration when saving the video. If your program itself downloads and saves the videos to the Camera Roll using writeVideoAtPathToSavedPhotosAlbum:completionBlock:, you could store the returned assetURL to remember the correspondence. Or you could save the videos to your app's local storage instead of to the Camera Roll, but that would prevent the user from managing the videos with Apple's photo application and such.
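As a rough sketch of the metadata route, you can enumerate the library and see what each video's default representation exposes; whether anything useful comes back depends on how the video got onto the device (note that -filename requires iOS 5 or later):

    #import <AssetsLibrary/AssetsLibrary.h>

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                           usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
            if ([[asset valueForProperty:ALAssetPropertyType] isEqualToString:ALAssetTypeVideo]) {
                ALAssetRepresentation *rep = [asset defaultRepresentation];
                // See what identifying information is actually available.
                NSLog(@"url=%@ filename=%@ metadata=%@",
                      [rep url], [rep filename], [rep metadata]);
            }
        }];
    }                     failureBlock:^(NSError *error) {
        NSLog(@"Assets library access denied: %@", error);
    }];

If your app saved the videos itself, storing the assetURL returned by writeVideoAtPathToSavedPhotosAlbum:completionBlock: is the more reliable mapping between your names and the library's assets.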

iPhone SDK: Core Audio: how to continue recording to a file after the user stops recording by leaving the application and then re-opens it?

The iPhone's AVAudioRecorder class will not allow you to open an existing file to continue a recording. Instead, it overwrites it. I'd like to know an approach that would allow me to continue recording to an existing file using Core Audio APIs.
The best bet would be to take a look at the Audio Queue Services API. This is basically the next "deeper" level into the Core Audio stack provided by Apple. Unfortunately, the chasm between AVAudioRecorder and Audio Queue Services is vast. AQS is a C-based API and a fairly low-level abstraction of the even more "raw" lowest levels of Core Audio. I would suggest reviewing the Audio Queue Services Programming Guide, then taking a look at the SpeakHere example. It should easily be able to handle your current requirement.
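For orientation, the heart of a SpeakHere-style recorder is the input callback sketched below (the RecorderState struct is hypothetical). Each filled buffer is written to an audio file at a running packet index, and that index is also the natural place to pick up from if you want to continue into an existing file:

    #include <AudioToolbox/AudioToolbox.h>

    typedef struct {
        AudioFileID                 file;         // destination audio file
        AudioStreamBasicDescription format;       // recording format
        SInt64                      packetIndex;  // next packet position to write
    } RecorderState;

    // Audio Queue input callback: write each filled buffer to the file, advance
    // the packet index, and hand the buffer back to the queue.
    static void HandleInputBuffer(void *inUserData,
                                  AudioQueueRef inAQ,
                                  AudioQueueBufferRef inBuffer,
                                  const AudioTimeStamp *inStartTime,
                                  UInt32 inNumPackets,
                                  const AudioStreamPacketDescription *inPacketDesc)
    {
        RecorderState *state = (RecorderState *)inUserData;

        if (inNumPackets == 0 && state->format.mBytesPerPacket != 0) {
            // Constant-bit-rate data (e.g. PCM): derive the packet count ourselves.
            inNumPackets = inBuffer->mAudioDataByteSize / state->format.mBytesPerPacket;
        }

        if (AudioFileWritePackets(state->file, false, inBuffer->mAudioDataByteSize,
                                  inPacketDesc, state->packetIndex, &inNumPackets,
                                  inBuffer->mAudioData) == noErr) {
            state->packetIndex += inNumPackets;
        }
        AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
    }

One possible way to resume into the old file is to open it with AudioFileOpenURL using read/write permission and initialize packetIndex from kAudioFilePropertyAudioDataPacketCount, though I have not verified that every file type handles appending cleanly.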
Whichever API you use, you will have to handle the "intermediate" storage of your PCM data: probably temporarily storing it as a WAV or raw PCM file, which you then reload and append new PCM data to when you continue recording.
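One simple way to handle that intermediate storage is to keep the captured samples as headerless raw PCM and append to the same file each time recording resumes; the path and the final conversion step are assumptions for illustration:

    NSString *pcmPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"recording.pcm"];

    if (![[NSFileManager defaultManager] fileExistsAtPath:pcmPath]) {
        [[NSFileManager defaultManager] createFileAtPath:pcmPath contents:nil attributes:nil];
    }

    NSFileHandle *handle = [NSFileHandle fileHandleForWritingAtPath:pcmPath];
    [handle seekToEndOfFile];   // resume where the previous session left off

    // In your Audio Queue (or Audio Unit) callback, append each captured buffer:
    // [handle writeData:[NSData dataWithBytes:buffer->mAudioData
    //                                  length:buffer->mAudioDataByteSize]];

    // When the user is completely done, read the PCM file back and write it out
    // once as a WAV/CAF, e.g. with Extended Audio File Services.
    [handle closeFile];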