Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput - iphone

I need AVCaptureVideoDataOutput and AVCaptureMovieFileOutput to work at the same time. The code below runs, but the video recording does not work: the didFinishRecordingToOutputFileAtURL delegate is called immediately after startRecordingToOutputFileURL is called. If I remove AVCaptureVideoDataOutput from the
AVCaptureSession by simply commenting out the line:
[captureSession addOutput:captureDataOutput];
then the video recording works, but the SampleBufferDelegate is no longer called (which I need).
How can I have AVCaptureVideoDataOutput and AVCaptureMovieFileOutput working simultaneously?
- (void)initCapture {
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:NULL];

    captureDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    m_captureFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureDataOutput setVideoSettings:videoSettings];

    captureSession = [[AVCaptureSession alloc] init];
    [captureSession addInput:captureInput];
    [captureSession addOutput:m_captureFileOutput];
    [captureSession addOutput:captureDataOutput];

    [captureSession beginConfiguration];
    [captureSession setSessionPreset:AVCaptureSessionPresetLow];
    [captureSession commitConfiguration];

    [self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
    [self performSelector:@selector(stopRecording) withObject:nil afterDelay:15.0];

    [captureSession startRunning];
}
- (void)startRecording
{
    [m_captureFileOutput startRecordingToOutputFileURL:[self tempFileURL] recordingDelegate:self];
}

- (void)stopRecording
{
    if ([m_captureFileOutput isRecording])
        [m_captureFileOutput stopRecording];
}
- (NSURL *)tempFileURL
{
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"camera.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath]) {
        [fileManager removeItemAtPath:outputPath error:nil];
    }
    [outputPath release];
    return [outputURL autorelease];
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"start record video");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"end record");
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // do stuff with sampleBuffer
}
I should add that I am getting the error:
Error Domain=NSOSStatusErrorDomain Code=-12780 "The operation couldn’t be completed. (OSStatus error -12780.)" UserInfo=0x23fcd0 {AVErrorRecordingSuccessfullyFinishedKey=false}
from
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
Cheers

I contacted an engineer at Apple's developer support, and he told me that simultaneous AVCaptureVideoDataOutput + AVCaptureMovieFileOutput use is not supported. I don't know if they will support it in the future, but he used the words "not supported at this time".
I encourage you to file a bug report / feature request on this, as I did (bugreport.apple.com). They measure how strongly people want something, so perhaps we will see this in the near future.

Even 9 years later, Apple apparently still does not want these two outputs to work together.
But you can easily work around it with AVAssetWriter.
You can't use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time. But you can use AVCaptureVideoDataOutput, analyse or modify the frames, and then use AVAssetWriter to write them to a file.
Source: https://developer.apple.com/forums/thread/98113
How to save video with output using AVAssetWriter:
Save AVCaptureVideoDataOutput to movie file using AVAssetWriter in Swift
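For illustration only, here is a minimal sketch of that approach (it is not taken from the linked thread; assetWriter, writerInput and sessionStarted are hypothetical ivars, and error handling is omitted):
// Sketch: write frames delivered by AVCaptureVideoDataOutput with AVAssetWriter.
- (void)setUpWriterForURL:(NSURL *)outputURL size:(CGSize)size
{
    NSError *error = nil;
    assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&error];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @(size.width),
                                AVVideoHeightKey : @(size.height) };
    writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                     outputSettings:settings];
    writerInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:writerInput];
    [assetWriter startWriting];
}

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (!sessionStarted) {
        [assetWriter startSessionAtSourceTime:timestamp];
        sessionStarted = YES;
    }
    // Analyse or modify the frame here, then append it to the file.
    if (writerInput.isReadyForMoreMediaData) {
        [writerInput appendSampleBuffer:sampleBuffer];
    }
}
When recording should stop, call [writerInput markAsFinished] followed by [assetWriter finishWritingWithCompletionHandler:...].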

Although you cannot use AVCaptureVideoDataOutput, you can use AVCaptureVideoPreviewLayer simultaneously with AVCaptureMovieFileOutput. See the "AVCam" example on Apple's Website.
In Xamarin.iOS, the code looks like this:
var session = new AVCaptureSession();

var camera = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
var mic = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Audio);
if (camera == null || mic == null) {
    throw new Exception("Can't find devices");
}

// Wrap the devices in AVCaptureDeviceInput before adding them to the session
var cameraInput = AVCaptureDeviceInput.FromDevice(camera);
var micInput = AVCaptureDeviceInput.FromDevice(mic);
if (session.CanAddInput(cameraInput)) {
    session.AddInput(cameraInput);
}
if (session.CanAddInput(micInput)) {
    session.AddInput(micInput);
}

var layer = new AVCaptureVideoPreviewLayer(session);
layer.LayerVideoGravity = AVLayerVideoGravity.ResizeAspectFill;
layer.VideoGravity = AVCaptureVideoPreviewLayer.GravityResizeAspectFill;

cameraView = new UIView();
cameraView.Layer.AddSublayer(layer);

var filePath = Path.Combine(Path.GetTempPath(), "temporary.mov");
var fileUrl = NSUrl.FromFilename(filePath);

var movieFileOutput = new AVCaptureMovieFileOutput();
var recordingDelegate = new MyRecordingDelegate();
session.AddOutput(movieFileOutput);

// The session has to be running before recording can produce any frames
session.StartRunning();
movieFileOutput.StartRecordingToOutputFile(fileUrl, recordingDelegate);

Xcode 14.1 already supports it.
Xcode 13.4: does not work
Xcode 14.1: works

Related

iPhone: keep audio recording app running in the background

I have searched for suitable answers to my question but did not find any helpful ones so far.
I want to measure the decibel level of the environment. If a specific threshold is exceeded, the app shall play a sound or song file. Everything works fine so far, but I am having trouble keeping the app running in the background.
I have already added the "Application does not run in the background" key and set its value to "NO". I've read that one should add the "external-accessory" element to the "Required background modes". I added that too, but it still does not work.
I am using AVAudioRecorder to record the sound and AVPlayer to play the sound/music file. First I used the MPMediaController iPodMusicPlayer, but it throws an exception in combination with the "Required background modes" setting.
EDIT:
I am using Xcode 4.5 with iOS 6.
EDIT 2:
When I add the string "voip" to the "Required background modes", it seems to continue recording while in the background. But it still does not play the music file while in the background. I also tried adding the "audio" value, but that did not help either.
EDIT 3:
I've consulted Apple's developer reference. It seems you have to configure your AVAudioSession. With that it seems to work (link to reference). But now I have trouble playing more than one file, because as soon as the first track has finished playing the app goes into suspended mode again. As far as I know there is no way to initialize AVPlayer or AVAudioPlayer with more than one file. I used the delegate method audioPlayerDidFinishPlaying:successfully: to set the next track, but it did not work.
EDIT 4: One possibility is to avoid stopping the recorder, i.e. removing the [recorder stop] call, so that it keeps recording even while music is played. That workaround works, but I would still appreciate a better solution, one that does not require keeping the recorder running all the time.
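For reference, a minimal sketch of the AVAudioSession setup that EDIT 3 refers to (my own summary, not necessarily identical to the linked reference), used together with the "audio" background mode:
NSError *sessionError = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
// PlayAndRecord keeps both the recorder and the player usable in the background
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
[audioSession setActive:YES error:&sessionError];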
the relevant code:
I initialize everything in the viewDidLoad method:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    //[MPMusicPlayerController applicationMusicPlayer];

    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];

    NSError *error;
    recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];

    lowPassResults = -120.0;
    thresholdExceeded = NO;

    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.03 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
    } else {
        NSString *errorDescription = [error description];
        NSLog(@"%@", errorDescription);
    }
}
The levelTimer Callback that is called every 0.03 seconds:
- (void)levelTimerCallback:(NSTimer *)timer {
    // Refreshes the average and peak power meters (the meter uses a logarithmic scale,
    // with -160 being complete quiet and zero being maximum input).
    [recorder updateMeters];

    const double ALPHA = 0.05;
    float averagePowerForChannel = [recorder averagePowerForChannel:0];

    // adjust the referential
    averagePowerForChannel = averagePowerForChannel / 0.6;

    // low-pass filter the values
    lowPassResults = ALPHA * averagePowerForChannel + (1.0 - ALPHA) * lowPassResults;

    float db = lowPassResults + 120;
    db = db < 0 ? 0 : db;

    if (db >= THRESHOLD)
    {
        [self playFile];
    }
}
Finally the playFile method which plays the music file:
- (void)playFile {
    NSString *title = @"(You came down) For a day";
    NSString *artist = @"Forge";

    NSMutableArray *songItemsArray = [[NSMutableArray alloc] init];

    MPMediaQuery *loadSongsQuery = [[MPMediaQuery alloc] init];
    MPMediaPropertyPredicate *artistPredicate = [MPMediaPropertyPredicate predicateWithValue:artist forProperty:MPMediaItemPropertyArtist];
    MPMediaPropertyPredicate *titlePredicate = [MPMediaPropertyPredicate predicateWithValue:title forProperty:MPMediaItemPropertyTitle];
    [loadSongsQuery addFilterPredicate:artistPredicate];
    [loadSongsQuery addFilterPredicate:titlePredicate];

    NSArray *itemsFromGenericQuery = [loadSongsQuery items];
    if ([itemsFromGenericQuery count])
        [songItemsArray addObject:[itemsFromGenericQuery objectAtIndex:0]];

    if ([songItemsArray count])
    {
        MPMediaItemCollection *collection = [[MPMediaItemCollection alloc] initWithItems:songItemsArray];
        if ([collection count]) {
            MPMediaItem *mpItem = [[collection items] objectAtIndex:0];
            NSURL *mediaUrl = [mpItem valueForProperty:MPMediaItemPropertyAssetURL];
            AVPlayerItem *item = [AVPlayerItem playerItemWithURL:mediaUrl];
            musicPlayer = [[AVPlayer alloc] initWithPlayerItem:item];
            [musicPlayer play];
        }
    }
}
Can anybody help me with my problem? Did I miss anything else?
Try this,
AppDelegate.m
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    __block UIBackgroundTaskIdentifier task = 0;
    task = [application beginBackgroundTaskWithExpirationHandler:^{
        NSLog(@"Expiration handler called %f", [application backgroundTimeRemaining]);
        [application endBackgroundTask:task];
        task = UIBackgroundTaskInvalid;
    }];
}

Getting a still image from the video output on the iphone?

I am writing an application to show stats on the light conditions as seen by the iPhone camera. I take an image every second and then perform calculations on it.
To capture an image, I am using the following method:
- (void)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in captureManager.stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [captureManager.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        latestImage = [[UIImage alloc] initWithData:imageData];
    }];
}
However, the captureStillImageAsynchronously... method causes the 'shutter' sound to be played by the phone, which is no good for my application because it will be capturing images constantly.
I have read that it is not possible to disable this sound effect. Instead, I want to capture frames from the video input for the phone:
AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
and hopefully turn these into UIImage objects.
How would I achieve this? I don't know much about how AVFoundation works - I downloaded some example code and modified it for my purposes.
Don't use a still camera for this. Instead, grab from the video camera of the device and process the data contained within the pixel buffer you get in response to being an AVCaptureVideoDataOutputSampleBufferDelegate.
You can set up a video connection using code like the following:
// Grab the back-facing camera
AVCaptureDevice *backFacingCamera = nil;
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices)
{
    if ([device position] == AVCaptureDevicePositionBack)
    {
        backFacingCamera = device;
    }
}

// Create the capture session
captureSession = [[AVCaptureSession alloc] init];

// Add the video input
NSError *error = nil;
videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:backFacingCamera error:&error] autorelease];
if ([captureSession canAddInput:videoInput])
{
    [captureSession addInput:videoInput];
}

// Add the video frame output
videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];
[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

if ([captureSession canAddOutput:videoOutput])
{
    [captureSession addOutput:videoOutput];
}
else
{
    NSLog(@"Couldn't add video output");
}

// Start capturing
[captureSession setSessionPreset:AVCaptureSessionPreset640x480];
if (![captureSession isRunning])
{
    [captureSession startRunning];
}
You'll then need to process these frames in a delegate method that looks like the following:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(cameraFrame, 0);
    int bufferHeight = CVPixelBufferGetHeight(cameraFrame);
    int bufferWidth = CVPixelBufferGetWidth(cameraFrame);

    // Process pixel buffer bytes here

    CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
}
The raw pixel bytes for your BGRA image will be contained within the array starting at CVPixelBufferGetBaseAddress(cameraFrame). You can iterate over those to obtain your desired values.
However, you'll find that any operation performed over the entire image on the CPU will be a little slow. You can use Accelerate to help with an average color operation, like you want here. I've used vDSP_meanv() in the past to average luminance values, once you have those in an array. For something like that, you might be best served to grab YUV planar data from the camera instead of the BGRA values I pull down here.
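As a rough illustration of iterating over those bytes (my own sketch, not part of the original answer), here is an average-colour computation that would go where the "Process pixel buffer bytes here" comment sits in the delegate above; it assumes the 32BGRA settings shown earlier and uses CVPixelBufferGetBytesPerRow to account for row padding:
// Rough sketch: average the BGRA channels of the current frame.
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);
unsigned char *base = (unsigned char *)CVPixelBufferGetBaseAddress(cameraFrame);

unsigned long long sumB = 0, sumG = 0, sumR = 0;
for (int y = 0; y < bufferHeight; y++)
{
    unsigned char *row = base + y * bytesPerRow; // rows may be padded, so step by bytesPerRow
    for (int x = 0; x < bufferWidth; x++)
    {
        sumB += row[x * 4 + 0];
        sumG += row[x * 4 + 1];
        sumR += row[x * 4 + 2];
    }
}
unsigned long long pixelCount = (unsigned long long)bufferWidth * bufferHeight;
NSLog(@"Average B %llu, G %llu, R %llu", sumB / pixelCount, sumG / pixelCount, sumR / pixelCount);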
I've also written an open source framework to process video using OpenGL ES, although I don't yet have whole-image reduction operations in there like you'd need for the kind of image analysis here. My histogram generator is probably the closest I have to what you're trying to do.

How to change video orientation for AVCaptureVideoDataOutput

Here's the problem. I am using AVCaptureVideoDataOutput to get video frames from the camera and make a video from them with AVAssetWriter. It's working OK, but the video I get is upside down, because the default orientation of the device for my app is landscape left, not landscape right as AVCaptureVideoDataOutput assumes by default. I'm trying to change the orientation on the AVCaptureConnection, but isVideoOrientationSupported is always false. Is it somehow possible to fix this?
Here is some code:
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput
                                      deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                      error:nil];

/* We set up the output */
AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
captureOutput.alwaysDiscardsLateVideoFrames = YES;
captureOutput.minFrameDuration = CMTimeMake(1, 24); // specify a minimum duration for each video frame
[captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

// Set the video output to store frames in BGRA (it is supposed to be faster)
NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[captureOutput setVideoSettings:videoSettings];

/* And we create a capture session */
self.captureSession = [[AVCaptureSession alloc] init];
self.captureSession.sessionPreset = AVCaptureSessionPresetLow;

/* We add input and output */
if ([self.captureSession canAddInput:captureInput])
{
    [self.captureSession addInput:captureInput];
}
if ([self.captureSession canAddOutput:captureOutput])
{
    [self.captureSession addOutput:captureOutput];
}

/* We add the preview layer */
self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
if ([self.prevLayer isOrientationSupported])
{
    [self.prevLayer setOrientation:AVCaptureVideoOrientationLandscapeLeft];
}
self.prevLayer.frame = self.view.bounds;
self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:self.prevLayer];

AVCaptureConnection *videoConnection = NULL;

[self.captureSession beginConfiguration];

for (AVCaptureConnection *connection in [captureOutput connections])
{
    for (AVCaptureInputPort *port in [connection inputPorts])
    {
        if ([[port mediaType] isEqual:AVMediaTypeVideo])
        {
            videoConnection = connection;
        }
    }
}
if ([videoConnection isVideoOrientationSupported]) // **Here it is, it's always false**
{
    [videoConnection setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
}

[self.captureSession commitConfiguration];

[self.captureSession startRunning];
Update: I figured out that when exporting the video, AVAssetExportSession loses the preferredTransform info.
I ran into the same problem and poked around the AVCamDemo from WWDC. I don't know why (yet), but if you query your videoConnection right after you create all the inputs/outputs/connections, then both isVideoOrientationSupported and supportsVideoOrientation return NO.
However, if you query supportsVideoOrientation or isVideoOrientationSupported at some later point (after the GUI is set up, for instance), it will return YES. For instance, I query it right after the user taps the record button, just before I call [[self movieFileOutput] startRecordingToOutputFileURL...]
Give it a try; it works for me.
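For illustration, the late query could look roughly like this (my own sketch, not the answerer's code; movieFileOutput and outputURL are hypothetical names, and connectionWithMediaType: requires iOS 5 or later):
// Called from the record-button handler, after the session and GUI are fully set up.
AVCaptureConnection *connection = [[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo];
if ([connection isVideoOrientationSupported])
{
    [connection setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
}
[[self movieFileOutput] startRecordingToOutputFileURL:outputURL recordingDelegate:self];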
From here: http://developer.apple.com/library/ios/#qa/qa1744/_index.html#//apple_ref/doc/uid/DTS40011134
Currently, the capture outputs for a movie file (AVCaptureMovieFileOutput) and still image (AVCaptureStillImageOutput) support setting the orientation, but the data output for processing video frames (AVCaptureVideoDataOutput) does not.
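Since the update above mentions an AVAssetWriter pipeline, one workaround (my suggestion, not part of the quoted Q&A) is to leave the frames untouched and set a rotation transform on the writer input; players honour that metadata on playback, although, as the update notes, AVAssetExportSession may not preserve it when re-exporting. writerOutputSettings is a hypothetical name here:
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:writerOutputSettings];
// 180 degrees compensates for a landscape-left device when the connection can't be rotated.
writerInput.transform = CGAffineTransformMakeRotation(M_PI);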

Getting exposure values from camera on iPhone OS 4.0

Exposure values from the camera can be acquired when you take a picture (without saving it to Saved Photos). A light meter application on the iPhone does this, probably by using some private API.
That application does it on the iPhone 3GS only, so I guess it may be somehow related to the EXIF data which is populated with this information when the image is created.
This all applies to the 3GS.
Has anything changed with iPhone OS 4.0?
Is there a regular way to get these values now?
Does anyone have a working code example for reading these camera/photo setting values?
Thank you
If you want realtime* exposure information, you can capture video using AVCaptureVideoDataOutput. Each frame's CMSampleBuffer is full of interesting data describing the current state of the camera.
*up to 30 fps
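To make that concrete, a small sketch (my own, not from the original answer) of reading the per-frame metadata in the AVCaptureVideoDataOutput delegate; kCGImagePropertyExifDictionary comes from ImageIO/CGImageProperties.h:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Each frame carries an EXIF-style attachment dictionary with exposure data.
    CFDictionaryRef exif = CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exif)
    {
        NSLog(@"Per-frame EXIF: %@", exif);
    }
}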
With AVFoundation in iOS 4.0 you can mess with exposure; refer specifically to AVCaptureDevice, here is a link: AVCaptureDevice ref. I'm not sure if it's exactly what you want, but you can look around AVFoundation and probably find some useful stuff.
I think I finally found the lead to the real EXIF data. It'll be a while before I have actual code to post, but I figured this should be publicized in the meantime.
Google captureStillImageAsynchronouslyFromConnection. It's a method of AVCaptureStillImageOutput, and the following is an excerpt from the (long sought for) documentation:
imageDataSampleBuffer -
The data that was captured.
The buffer attachments may contain metadata appropriate to the image data format. For example, a buffer containing JPEG data may carry a kCGImagePropertyExifDictionary as an attachment. See ImageIO/CGImageProperties.h for a list of keys and value types.
For an example of working with AVCaptureStillImageOutput see WWDC 2010 sample code, under AVCam.
Peace,
O.
Here is the complete solution. Don't forget to import the appropriate frameworks and headers.
In the exifAttachments variable in the captureNow method you'll find all the data you are looking for.
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>

AVCaptureStillImageOutput *stillImageOutput;
AVCaptureSession *session;

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupCaptureSession];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                   completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
        {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
    }];
}

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPreset352x288;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [device lockForConfiguration:nil];
    device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
    device.focusMode = AVCaptureFocusModeLocked;
    [device unlockForConfiguration];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    stillImageOutput = [AVCaptureStillImageOutput new];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    if ([session canAddOutput:stillImageOutput])
        [session addOutput:stillImageOutput];

    // Start the session running to start the flow of data
    [session startRunning];
    [self captureNow];
}
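As a follow-up, a hedged example of pulling specific values out of exifAttachments; the key names come from ImageIO/CGImageProperties.h and the exact contents can vary by device:
// Inside the completionHandler above, after checking that exifAttachments is non-NULL.
// The __bridge cast assumes ARC, matching the __strong qualifier in the handler.
NSDictionary *exif = (__bridge NSDictionary *)exifAttachments;
NSNumber *brightness   = [exif objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue];
NSNumber *exposureTime = [exif objectForKey:(NSString *)kCGImagePropertyExifExposureTime];
NSLog(@"Brightness %@, exposure time %@", brightness, exposureTime);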

How can I play a sound on the iPhone using MonoTouch?

I am looking for something like
PlaySound (uint frequency)
Does it exist?
From the HowTo at: http://wiki.monotouch.net/HowTo/Sound/Play_a_Sound_or_Alert
var sound = SystemSound.FromFile (new NSUrl ("File.caf"));
sound.PlaySystemSound ();
I don't know about Mono, but in the iPhone SDK it isn't that easy to create and play a sound at an arbitrary frequency. The alternatives are to provide the sound as a file and play that, or to create an array representing a sinusoid, wrap it in an audio wrapper, and pass it to one of the many sound APIs.
If Mono proves to be just as limited, then search stackoverflow.com for System Sound Services and AVAudioPlayer as starting points.
Here are two ways to play a sound file:
SoundEffect.m (based on Apple's):
#import "SoundEffect.h"
#implementation SoundEffect
+ (id)soundEffectWithContentsOfFile:(NSString *)aPath {
if (aPath) {
return [[[SoundEffect alloc] initWithContentsOfFile:aPath] autorelease];
}
return nil;
}
- (id)initWithContentsOfFile:(NSString *)path {
self = [super init];
if (self != nil) {
NSURL *aFileURL = [NSURL fileURLWithPath:path isDirectory:NO];
if (aFileURL != nil) {
SystemSoundID aSoundID;
OSStatus error = AudioServicesCreateSystemSoundID((CFURLRef)aFileURL, &aSoundID);
if (error == kAudioServicesNoError) { // success
_soundID = aSoundID;
} else {
NSLog(#"Error %d loading sound at path: %#", error, path);
[self release], self = nil;
}
} else {
NSLog(#"NSURL is nil for path: %#", path);
[self release], self = nil;
}
}
return self;
}
-(void)dealloc {
AudioServicesDisposeSystemSoundID(_soundID);
NSLog(#"Releasing in SoundEffect");
[super dealloc];
// self = nil;
}
-(void)play {
AudioServicesPlaySystemSound(_soundID);
}
-(void)playvibe {
AudioServicesPlayAlertSound(_soundID);
}
+(void)justvibe {
AudioServicesPlayAlertSound(kSystemSoundID_Vibrate);
}
#end
SoundEffect.h:
#import <AudioToolbox/AudioServices.h>

@interface SoundEffect : NSObject {
    SystemSoundID _soundID;
}

+ (id)soundEffectWithContentsOfFile:(NSString *)aPath;
- (id)initWithContentsOfFile:(NSString *)path;
- (void)play;
- (void)playvibe;
+ (void)justvibe;

@end
How to use it:
// load the sound
gameOverSound = [[SoundEffect alloc] initWithContentsOfFile:[mainBundle pathForResource:@"buzz" ofType:@"caf"]];
// play the sound
[gameOverSound playvibe];
This is useful for when you want to play sound at the same volume as the iPhone's volume control setting, and you won't need to stop or pause the sound.
Another way is:
+ (AVAudioPlayer *)newSoundWithName:(NSString *)name
{
    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:name ofType:@"caf"];
    NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:soundFilePath];
    AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
    [fileURL release];
    // if the sound is large and you need to preload it:
    [newPlayer prepareToPlay];
    return newPlayer;
}
and use it (you can see all the extras when you go with AVAudioPlayer):
timePassingSound = [AVAudioPlayer newSoundWithName:@"ClockTicking"];
[timePassingSound play];

// change the volume
timePassingSound.volume = 0.5;

// pause to keep it at the same place in the sound
[timePassingSound pause];

// stop to stop completely, and go back to the beginning
[timePassingSound stop];

// check to see if the sound is still playing
[timePassingSound isPlaying];