I am making a piano application for iPad. I am using AudioServicesPlaySystemSound(toneSSID) to play the sound of the keys, but there is also functionality to increase the volume, and I cannot find a way to change the playback volume. Can anybody help me?
Thanks.
Add the AudioStreamer.h file to your project, then add this method to AudioStreamer.m:
- (void)setVolume:(float)level
{
    OSStatus errorMsg = AudioQueueSetParameter(audioQueue, kAudioQueueParam_Volume, level);
    if (errorMsg) {
        NSLog(@"AudioQueueSetParameter returned %d when setting the volume.", (int)errorMsg);
    }
}
Then call it on your AudioStreamer object, for example:

AudioStreamer *streamer = [[AudioStreamer alloc] init];
[streamer setVolume:1.0];
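Note that AudioServicesPlaySystemSound itself always plays at the current system volume and offers no per-sound volume control. If switching APIs is an option, AVAudioPlayer exposes a volume property in the 0.0 to 1.0 range. A minimal sketch, with a placeholder file name:

#import <AVFoundation/AVFoundation.h>

NSURL *keyUrl = [[NSBundle mainBundle] URLForResource:@"C4" withExtension:@"caf"]; // placeholder key sample
AVAudioPlayer *keyPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:keyUrl error:NULL];
keyPlayer.volume = 0.8f; // 0.0 = silent, 1.0 = full volume
[keyPlayer prepareToPlay];
[keyPlayer play];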
I'm trying to get to grips with OpenAL, working through a tutorial here: http://benbritten.com/2008/11/06/openal-sound-on-the-iphone/
My problem is that the sound does not play, although there are no iOS errors thrown. There is an OpenAL error though. The code sample below is the body of an IBAction method, and results in an AL_INVALID_OPERATION at alGenSources(1, &sourceID). sourceID reports as NULL.
I've tried this on the device and the simulator.
This code sample seems to be in pretty wide use, but I can't find anybody complaining of this particular problem. Can anybody throw any light on this? Many thanks for any help.
NSString *audioFileName = [[NSBundle mainBundle] pathForResource:@"1" ofType:@"caf"];
AudioFileID fileID = [self openAudioFile:audioFileName];
UInt32 filesize = [self audioFileSize:fileID];
unsigned char *outData = malloc(filesize);
OSStatus result = noErr;
result = AudioFileReadBytes(fileID, false, 0, &filesize, outData);
AudioFileClose(fileID);
if (result != 0) {
    NSLog(@"Can't load file..");
}
ALuint bufferID;
//NSLog(@"bufferID %@", [NSNumber numberWithUnsignedInt:bufferID]);
alGenBuffers(1, &bufferID);
//NSLog(@"bufferID %@", [NSNumber numberWithUnsignedInt:bufferID]);
alBufferData(bufferID, AL_FORMAT_STEREO16, outData, filesize, 44100);
[bufferStorageArray addObject:[NSNumber numberWithUnsignedInt:bufferID]];
alGetError();
ALuint sourceID;
alGenSources(1, &sourceID);
if (alGetError() == AL_INVALID_OPERATION)
{
    printf("\n++++ Error creating buffers INVALID_OPERATION!!\n");
    //exit(1);
}
else
{
    printf("No errors yet.");
}
alSourcei(sourceID, AL_BUFFER, bufferID);
alSourcef(sourceID, AL_PITCH, 1.0f);
alSourcef(sourceID, AL_GAIN, 1.0f);
if (loops) {
    alSourcei(sourceID, AL_LOOPING, AL_TRUE);
}
[soundDictionary setObject:[NSNumber numberWithUnsignedInt:sourceID] forKey:@"sound"];
if (outData) {
    free(outData);
    outData = NULL;
}
[self playSound:@"sound"];
For your pitch problem, make sure the sound file you are loading matches the sample rate you are feeding into alBufferData. Your caf file is probably saved at 22050 Hz.
AudioStreamBasicDescription's mSampleRate will tell you what the audio file's sample rate really is.
You should also check mChannelsPerFrame to make sure it really is stereo sound.
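A quick way to check both, assuming the fileID returned by the question's openAudioFile: helper, is to ask Audio File Services for the file's data format:

AudioStreamBasicDescription fileFormat;
UInt32 propSize = sizeof(fileFormat);
OSStatus status = AudioFileGetProperty(fileID, kAudioFilePropertyDataFormat, &propSize, &fileFormat);
if (status == noErr) {
    // mSampleRate should match the rate passed to alBufferData (44100 above)
    NSLog(@"sample rate: %.0f, channels: %u", fileFormat.mSampleRate, (unsigned)fileFormat.mChannelsPerFrame);
}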
Also, OpenAL by default on iOS only generates 4 stereo sources. If you try to load more than 4 sources with stereo data, your audio will sound like garbage. You can change that by specifying attributes ALC_STEREO_SOURCES and ALC_MONO_SOURCES when you create a context. You have a maximum of 32 sources (by default it sets up 28 mono and 4 stereo sources).
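Requesting a different split when creating the context might look like this (the 8/24 split below is only an example; the mono and stereo counts together must stay within the 32-source limit):

ALCdevice *device = alcOpenDevice(NULL);
// Ask for 8 stereo and 24 mono sources instead of the default 4/28 split
ALCint attrs[] = { ALC_STEREO_SOURCES, 8, ALC_MONO_SOURCES, 24, 0 };
ALCcontext *context = alcCreateContext(device, attrs);
alcMakeContextCurrent(context);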
Stupid mistake on my part: I had initialised OpenAL in initWithNibName:, which was never being called. Moving the init into viewDidLoad has got everything working, although playback is chipmunk-style high speed (presumably the sample-rate mismatch described above).
I am new to AVFoundation and I am trying to implement a video camera with it. Here is my basic setup: when you click a button, it calls the showCamera method, which creates the session, adds an audio input and a video input, and then adds the video output.

Where in here do I add the AVCaptureConnection, and how do I do it? Is there some tutorial that shows how to use the connections? Any help is appreciated.
- (IBAction)showCamera
{
    //Add the camView to the current view, on top of the controller
    [[[[UIApplication sharedApplication] delegate] window] addSubview:camView];

    session = [[AVCaptureSession alloc] init];

    //Set a preset on the session to record at high quality
    if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }

    // Add inputs and outputs.
    NSArray *devices = [AVCaptureDevice devices];

    //Walk all capture devices on the phone
    for (AVCaptureDevice *device in devices)
    {
        if ([device hasMediaType:AVMediaTypeVideo])
        {
            if ([device position] == AVCaptureDevicePositionBack)
            {
                //Add rear video input to the session
                [self addRearCameraInputToSession:session withDevice:device];
            }
        }
        else if ([device hasMediaType:AVMediaTypeAudio])
        {
            //Add microphone input to the session
            [self addMicrophoneInputToSession:session withDevice:device];
        }
        else
        {
            //Show an error that the phone does not have a camera
        }
    }

    //Add movie output
    [self addMovieOutputToSession:session];

    //Construct the preview layer
    [self constructPreviewLayerWithSession:session onView:camView];
}
You don't add AVCaptureConnections manually. When you have both an input and an output added to the AVCaptureSession object, the connections are automatically created for you. Quoth the documentation:
When an input or an output is added to a session, the session greedily forms connections between all the compatible capture inputs’ ports and capture outputs.
Unless you need to disable one of the automatically-created connections, or change the videoMirrored or videoOrientation properties, you shouldn't have to worry about them at all.
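If you do need one of those connections later, say to fix the orientation of recorded video, you can look it up on the output instead of creating it. A sketch, assuming movieOutput is the AVCaptureMovieFileOutput added in addMovieOutputToSession::

AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in movieOutput.connections) {
    for (AVCaptureInputPort *port in connection.inputPorts) {
        if ([port.mediaType isEqualToString:AVMediaTypeVideo]) {
            videoConnection = connection;
        }
    }
}
if (videoConnection.supportsVideoOrientation) {
    videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}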
Take a look at the following URLs...
Documentation from Apple:
An Article:
Video Recording using AVFoundation Framework iPhone?
I think it will help you.
I added OpenAL code to my iPhone game. When I start the game, it runs for one second, stalls for two seconds, then resumes (a hiccup effect). I believe the delay is due to the sound files loading. What is the solution? Can anyone recommend a book, site, or source code (not the iPhone reference, please)? Should there be a separate loading step, and where should I initialize it? Would that help?

Below, I have included the relevant parts of the OpenAL code I have implemented. The sound file is played when an if statement in the game loop fires. The OpenALSoundController class creates the sound sources and buffers, and the initOpenAL method is invoked from it. MyView is a customized subclass of UIView connected to the main view (I didn't use the default view).
// MyView.m
// A customized UIView used as the main view.
#import "OpenALSoundController.h"

- (void)startPlaying {
    ...
    [self initializeValuables];
    ...
    [self initializeTimer];
}

- (void)initializeTimer {
    if (theTimer == nil) {
        theTimer = [CADisplayLink displayLinkWithTarget:self selector:@selector(gameLoop)];
        theTimer.frameInterval = 2;
        [theTimer addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
    }
}

- (void)gameLoop {
    ...
    if (something) {
        // Play the sound
        [[OpenALSoundController sharedSoundController] playSound1];
    }
    ...
}
...
@end
// OpenALSoundController.h
@interface OpenALSoundController : NSObject {
    ...
}
...
+ (OpenALSoundController *)sharedSoundController;
...
@end
// OpenALSoundController.m
// Singleton accessor
+ (OpenALSoundController *)sharedSoundController
{
    static OpenALSoundController *shared_sound_controller;
    @synchronized(self)
    {
        if (nil == shared_sound_controller)
        {
            shared_sound_controller = [[OpenALSoundController alloc] init];
        }
        return shared_sound_controller;
    }
    return shared_sound_controller;
}
- (void)initOpenAL {
    ...
    file_url = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"fire" ofType:@"wav"]];
    firePcmData = MyGetOpenALAudioDataAll((CFURLRef)file_url, &data_size, &al_format, &sample_rate);
    alBufferData(fireOutputBuffer, al_format, firePcmData, data_size, sample_rate);
    [file_url release];
    ...
    alSourcei(outputSourceFire, AL_BUFFER, fireOutputBuffer);
    ...
}
You might be interested in Finch, an OpenAL sound engine for iOS. It's very well suited to games. It's usually better to reuse existing code than to develop and maintain your own.
First, it's better to use MP3: WAV files are huge, and loading them from disk takes time. MP3 files are smaller on disk; they get loaded into memory and decompressed there for playing. Try experimenting with reducing the MP3 bitrate/encoding quality too.

Also, you need to preload sounds to avoid hiccups; otherwise you will get a delay the first time each sound is played. A preload step might look like the sketch below.
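The sketch reuses the question's MyGetOpenALAudioDataAll helper (its exact parameter types are assumed here) and would run once at startup, before the game loop begins, so every buffer is already decoded in memory:

// Called once at startup, before the game loop runs.
- (ALuint)preloadSoundNamed:(NSString *)name
{
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"wav"];

    ALsizei dataSize = 0;
    ALenum format = 0;
    ALsizei sampleRate = 0;
    void *pcmData = MyGetOpenALAudioDataAll((CFURLRef)url, &dataSize, &format, &sampleRate);

    ALuint buffer;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, format, pcmData, dataSize, sampleRate); // OpenAL copies the bytes
    free(pcmData);

    // Keep the buffer ID around, e.g. in a dictionary keyed by name.
    return buffer;
}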
So I have an application and I want to keep it working even when the screen is turned off.

Previously, when I wanted to do that, I used this hack/trick: I played a silent/empty sound in a loop in the background (AudioServicesPlaySystemSound), so if the user pressed the on/off button the application kept working in the background. It never let the iPhone go into sleep mode; it just turned off the screen and maybe things like WiFi or Bluetooth (and, on the iPod touch, the accelerometers, as far as I remember). And it worked. I wanted to use the same trick in my new application, but testing it now, it seems it no longer works. The sound in the background plays (when I replace the "empty" audio file with an audible one I can hear it) even with the screen turned off, but the sound it should play (using AVAudioPlayer) doesn't play (even when I turn the screen on again).

I don't know at which point it stopped working (it worked on 3.x OS for sure). Am I doing something wrong? Did Apple change/fix the "hack" that allowed your app to work even with the screen turned off? Is there another way to let the device go to sleep (and drain the battery less) but keep the app working?
This is the code I use to play background/silent sound:
- (void)playSilentSound
{
    CFBundleRef mainBundle = CFBundleGetMainBundle();
    CFURLRef silentUrl = CFBundleCopyResourceURL(mainBundle, CFSTR("silence"), CFSTR("aiff"), NULL);
    AudioServicesCreateSystemSoundID(silentUrl, &silentSound);
    CFRelease(silentUrl); // the URL is no longer needed once the sound ID exists
    silentTimer = [NSTimer scheduledTimerWithTimeInterval:2.0 target:self selector:@selector(playSilence) userInfo:nil repeats:YES];
}

- (void)playSilence
{
    AudioServicesPlaySystemSound(silentSound);
}
And this is how I play the sound that should play even if the screen is turned off:
- (BOOL)playSound:(NSString *)path withLoops:(BOOL)loops stopAfter:(int)seconds
{
    NSError *error;
    player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:&error];
    player.delegate = self;
    player.numberOfLoops = 0; // looping is handled manually in audioPlayerDidFinishPlaying:
    player.volume = volume;
    secondsPlayed = 0;
    loop = loops;
    BOOL played = [player play];
    if (played && seconds > 0)
    {
        timer = [[NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(stopPlaying:) userInfo:[NSNumber numberWithInt:seconds] repeats:YES] retain];
        secondsLimit = seconds;
    } else {
        secondsLimit = -1;
    }
    return played;
}
- (void)stopPlaying:(NSTimer *)theTimer
{
    if (secondsLimit > 0 && (secondsPlayed + [player currentTime]) >= secondsLimit)
    {
        [player stop];
        [timer invalidate]; // stop the repeating timer before releasing it
        [timer release];
        timer = nil;
    }
}
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)sender successfully:(BOOL)flag
{
    if (sender == player)
    {
        if (flag) { secondsPlayed += sender.duration; }
        if (loop) { [player play]; }
    }
}
The code is maybe a bit complicated; the essence is just the first few lines. It's built this way so you can play only X seconds of a sound (if the sound is shorter than X, it will play a few loops until the total time is >= X). And of course everything works fine while the screen is left on.

Also, if you find the code useful in your projects (like playSound:withLoops:stopAfter:), feel free to use it (but it would be cool if you sent me a message, so I would know that I helped :)).
So the answer is to set the audio session category. Some categories simply mute sound when the device is locked. For me the best option was AVAudioSessionCategoryPlayback:
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error: nil];
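For completeness, the session can also be activated explicitly once the category is set; a short sketch with error handling:

NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[session setActive:YES error:&sessionError];
if (sessionError) {
    NSLog(@"Audio session setup failed: %@", sessionError);
}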
I don't know about Apple fixing stuff, but in iOS 4 there are now official ways to keep doing things in the background, which you should take advantage of.
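For audio, that means adding the audio value under the UIBackgroundModes key in Info.plist alongside the playback category above; for finite work there is also the task-completion API. A sketch of the latter (the work itself is a placeholder):

UIApplication *app = [UIApplication sharedApplication];
__block UIBackgroundTaskIdentifier taskId;
taskId = [app beginBackgroundTaskWithExpirationHandler:^{
    // Called if time runs out before the work finishes.
    [app endBackgroundTask:taskId];
}];
// ... do the finite background work here ...
[app endBackgroundTask:taskId];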
First, thanks to the StackOverflow team, because it's been a very useful website since I started developing on iPhone.

Second, please excuse my language. I'm a Frenchie and, like every Frenchie, I'm very bad at English.

I have a very strange problem with the sounds in my iPhone program:

I implemented a class which plays a short AIFF sound. Here it is:
@implementation SoundPlayer

- (id)initWithFile:(NSString *)file {
    self = [super init];
    NSString *soundPath = [[NSBundle mainBundle] pathForResource:file ofType:@"aiff"];
    AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:soundPath], &soundID);
    return self;
}

- (void)play {
    if (SOUND_ACTIVATED) {
        AudioServicesPlaySystemSound(soundID);
    }
}

- (void)dealloc {
    [super dealloc];
}

@end
It works quite well, but even though my instances are initialized the same way, they are not on the same audio stream!

I noticed that because when I push the volume+ and volume- buttons of the iPhone, in some cases they control the main audio stream, and in other cases the ringer volume.

If I set the main stream to volume 0, sound A won't be audible, but sound B will be.

Has anyone had a similar problem? Do you have any idea?

Thanks a lot.

Martin
OK, I found something that may be interesting for answering the problem.

There's a global function which initializes the audio context. It seems that I don't use it the right way, but I think the problem comes from there.
// Initialize the audio context
AudioSessionInitialize(
    NULL, // 'NULL' to use the default (main) run loop
    NULL, // 'NULL' to use the default run loop mode
    NULL, // a reference to your interruption callback
    self  // data to pass to your interruption listener callback
);

// What kind of sound will be played
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty(
    kAudioSessionProperty_AudioCategory,
    sizeof(sessionCategory),
    &sessionCategory
);
Despite these two functions, one sound remains on the ring stream, and that's really strange. Can someone help me?