Mixing of five songs in the iPhone SDK

In my app I have five buttons, and each button has its own song. In the main view there is a five-pointed star. When the user drags a button onto one of the star's points, it should play that button's song. The same goes for the other four points, and we want to mix those songs into a single file, like a ringtone, and save it to the tmp directory. I have tried a lot but I am unable to solve this.
The code I'm using is:
- (void)doAudioCallback:(NSTimer *)timer {
    NSString *resName1 = @"drum1.aif";
    NSString *resName2 = @"drum2.aif";
    NSString *resPath1 = [[NSBundle mainBundle] pathForResource:resName1 ofType:nil];
    NSString *resPath2 = [[NSBundle mainBundle] pathForResource:resName2 ofType:nil];
    NSString *tmpDir = NSTemporaryDirectory();
    NSString *tmpFilename = @"MixedoftwoSongs.aif";
    NSString *tmpPath = [tmpDir stringByAppendingPathComponent:tmpFilename];
    OSStatus status;
    // ... the actual mix of resPath1 and resPath2 into tmpPath, which sets
    // status, is elided here ...
    if (status == OSSTATUS_MIX_WOULD_CLIP) {
        // mix failed: the combined samples would clip
    } else {
        NSURL *url = [NSURL fileURLWithPath:tmpPath];
        NSData *urlData = [NSData dataWithContentsOfURL:url];
        NSLog(@"wrote mix file of size %d : %@", [urlData length], tmpPath);
        AVAudioPlayer *avAudioObj = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
        self.avAudio = avAudioObj;
        [avAudioObj prepareToPlay];
        [avAudioObj play];
    }
}

I think you have to use performSelector:withObject:afterDelay: to delay each audio clip:
[self performSelector:@selector(yourmethod) withObject:nil afterDelay:0.5];
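For the mixing itself, here is a minimal sketch of one way to overlay two audio files with AVMutableComposition and export the result to the tmp directory. This is an assumption about the elided mix step, not the poster's routine, and AVAssetExportSession writes an M4A file rather than AIFF:

- (void)mixAudioAtURL:(NSURL *)url1 withAudioAtURL:(NSURL *)url2
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    for (NSURL *url in [NSArray arrayWithObjects:url1, url2, nil]) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        AVAssetTrack *srcTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *dstTrack =
            [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        // Both tracks are inserted at time zero, so they play simultaneously (a mix).
        [dstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                          ofTrack:srcTrack
                           atTime:kCMTimeZero
                            error:nil];
    }
    NSString *tmpPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"MixedOfTwoSongs.m4a"];
    // The export fails if the destination already exists.
    [[NSFileManager defaultManager] removeItemAtPath:tmpPath error:nil];
    AVAssetExportSession *exportSession =
        [[[AVAssetExportSession alloc] initWithAsset:composition
                                          presetName:AVAssetExportPresetAppleM4A] autorelease];
    exportSession.outputURL = [NSURL fileURLWithPath:tmpPath];
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"wrote mix file to %@", tmpPath);
        } else {
            NSLog(@"export failed: %@", exportSession.error);
        }
    }];
}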

Related

Too much time to get and merge video frames and write movies

I am building an app in which I have to get a thumbnail from a recorded video. I have an array of alpha video frames. I have to merge each set of frames and then make a movie with these final frames. Here is the piece of code I am dealing with:
- (void)createFrameForVideo
{
    NSString *filePath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/movie.mp4"];
    NSURL *outputURL = [NSURL fileURLWithPath:filePath];
    player = [[MPMoviePlayerController alloc] initWithContentURL:outputURL];
    float frame = 0.00;
    int count = 14;
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSString *docPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    docPath = [docPath stringByAppendingPathComponent:@"OutPut"];
    // Start with a fresh output directory for the merged frames.
    BOOL success = [fileManager fileExistsAtPath:docPath];
    if (success) {
        [fileManager removeItemAtPath:docPath error:nil];
    }
    [fileManager createDirectoryAtPath:docPath withIntermediateDirectories:YES attributes:nil error:nil];
    // Walk 7 seconds of video at ~30 fps, merging an overlay PNG into each frame.
    for (frame = frameStartTime; frame < (frameStartTime + 7); frame += 0.033)
    {
        @autoreleasepool
        {
            UIImage *singleFrameImage = [player thumbnailImageAtTime:frame timeOption:MPMovieTimeOptionExact];
            [player pause];
            NSString *imageName = [NSString stringWithFormat:@"export2%d.png", count];
            NSString *file = [[NSBundle mainBundle] pathForResource:imageName ofType:nil];
            UIImage *overlayImage = [UIImage imageWithData:[NSData dataWithContentsOfFile:file]];
            count = count + 1;
            NSString *imagePath = [NSString stringWithFormat:@"%@/%@", docPath, imageName];
            if (overlayImage)
            {
                UIImage *outImage = [self mergeImage:singleFrameImage withImage:overlayImage];
                NSData *imgData = [[NSData alloc] initWithData:UIImagePNGRepresentation(outImage)];
                [fileManager createFileAtPath:imagePath contents:imgData attributes:nil];
                [imgData release];
            }
            else
            {
                NSData *imgData = UIImagePNGRepresentation(singleFrameImage);
                [fileManager createFileAtPath:imagePath contents:imgData attributes:nil];
            }
            [outputFramesArray addObject:imagePath];
        }
    }
    [player release];
    if ([fileManager fileExistsAtPath:filePath])
    {
        [fileManager removeItemAtPath:filePath error:nil];
    }
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/movie1.mp4"];
    NSLog(@"filePath %@", path);
    if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
        [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    }
    // Compile the merged PNG frames into a movie.
    [self writeImageAsMovie:outputFramesArray toPath:path size:CGSizeMake(480, 320) duration:10];
    NSLog(@"Your layering is completed");
    [outputFramesArray removeAllObjects];
    [outputFramesArray release];
    success = [fileManager fileExistsAtPath:docPath];
    if (success) {
        [fileManager removeItemAtPath:docPath error:nil];
    }
    //[self performSelectorOnMainThread:@selector(CompileFilesToMakeMovieWithAudio) withObject:nil waitUntilDone:YES];
    [self CompileFilesToMakeMovieWithAudio];
    // [self compileFinalOutputMovie];
}
The problem is that it takes too much time to deal with the frames in this loop. Could anyone please help me speed up the process? I have already tried ffmpeg, but I think the problem is in the merging. If anyone has suggestions, please share them.
Try using Instruments to see where your memory is getting tied up and whether there are any leaks.
I had a similar problem recently with leaks in a piece of code that ran constantly in a loop like that. Changing the local variables into instance variables and then reusing them kept the app from allocating too much memory.
Your issue is one of basic approach. If you want better execution time, you need to change the basic way your app works; there is no single "change this code" step that will make your existing code run faster.
What you should try is to encode each video frame and write it directly into the compiled H.264 output, instead of what you do now, which is to create each frame on disk and then combine them all at the end in another loop. Like so:
1: Read the input PNG
2: Read the video frame
3: Combine the video frame and the input PNG
4: Write the combined frame to the movie via the AVAsset APIs (see the sketch below)
That avoids reading each frame twice, and it avoids writing all the PNG images to disk again. I/O takes a lot of time, and compressing the images to PNG takes a lot of time; the suggested approach would be much faster because it avoids these unneeded steps.
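As a rough sketch of step 4: assuming each combined frame is already available as a CVPixelBufferRef, appending frames straight into the movie with AVAssetWriter might look like this. Here movieURL, frameCount, and the pixelBufferForCombinedFrame: helper are all hypothetical placeholders:

AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:movieURL
                                                  fileType:AVFileTypeQuickTimeMovie
                                                     error:nil];
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    AVVideoCodecH264, AVVideoCodecKey,
    [NSNumber numberWithInt:480], AVVideoWidthKey,
    [NSNumber numberWithInt:320], AVVideoHeightKey, nil];
AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                               outputSettings:settings];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                                     sourcePixelBufferAttributes:nil];
[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];
for (int i = 0; i < frameCount; i++) {
    // Hypothetical helper that merges the overlay into the video frame
    // and returns the result as a retained pixel buffer.
    CVPixelBufferRef buffer = [self pixelBufferForCombinedFrame:i];
    while (!input.isReadyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.01]; // or use requestMediaDataWhenReadyOnQueue:usingBlock:
    }
    [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(i, 30)]; // ~30 fps
    CVPixelBufferRelease(buffer);
}
[input markAsFinished];
[writer finishWriting];
[writer release];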

How do you make the iPhone accelerometer trigger sound more effectively using AVAudioPlayer?

I have created a simple app that triggers three different sounds based on the x, y, and z axes of the accelerometer, like an instrument. At the moment, if I set the update interval of the accelerometer too low, it plays the sounds too often, and if I set it too high, it isn't responsive enough. I am a complete beginner to Objective-C and iPhone development, as you can tell by the code!
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    [accelerometer setUpdateInterval:25.0 / 10.0f];
    [[AVAudioSession sharedInstance] setDelegate:self];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
    [accelerometer setDelegate:self];
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
    player.volume = 0.5;
    player.numberOfLoops = 0;
    player.delegate = self;
}
- (void)accelerometer:(UIAccelerometer *)acel didAccelerate:(UIAcceleration *)aceler
{
    if (aceler.x > 0.5) {
        NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"snare" ofType:@"mp3"];
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:nil];
        NSLog(@"acceleration.x = %+.6f greater", aceler.x);
        [player play];
    }
    else if (aceler.y > 0.5) {
        NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"kick2" ofType:@"mp3"];
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:nil];
        NSLog(@"acceleration.y = %+.6f greater", aceler.y);
        [player play];
    }
    else if (aceler.z > 0.5) {
        NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"hat" ofType:@"mp3"];
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:nil];
        NSLog(@"acceleration.z = %+.6f greater", aceler.z);
        [player play];
    }
    else {
        [player stop];
    }
}
Setting the accelerometer's update frequency to a low value won't help here. Imagine the following situation:
time:                            ------------------------->
real world acceleration event:   ___X_____X_____X_____X___
acceleration update:             X_____X_____X_____X_____X
sound output started:            _________________________
This diagram represents a user shaking the device at the same frequency as the accelerometer updates, but with each shake event occurring midway between two updates. As a result, the shake events are never registered and no sound is played.
Now consider a high accelerometer update frequency, in contrast to the prior approach:
real world acceleration event:   ___X_____X_____X_____X___
acceleration update:             XXXXXXXXXXXXXXXXXXXXXXXXX
sound output started:            ___X_____X_____X_____X___
In this situation, essentially every real-world event results in a sound being played.
The amended code is the following:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSLog(@"viewDidLoad");
    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    //[accelerometer setUpdateInterval:25.0 / 10.0f];
    [accelerometer setUpdateInterval:0.01f];
    [[AVAudioSession sharedInstance] setDelegate:self];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
    [accelerometer setDelegate:self];
    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
    //player.volume = 0.5;
    //player.numberOfLoops = 0;
    //player.delegate = self;
}

- (void)accelerometer:(UIAccelerometer *)acel didAccelerate:(UIAcceleration *)aceler
{
    if ((aceler.x > ACC_THRESHOLD) && ((playerX == nil) || (playerX.isPlaying == NO))) {
        NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"snare" ofType:@"mp3"];
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        playerX = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:nil];
        NSLog(@"acceleration.x = %+.6f greater", aceler.x);
        playerX.delegate = self;
        [playerX play];
    }
    if ((aceler.y > ACC_THRESHOLD) && ((playerY == nil) || (playerY.isPlaying == NO))) {
        NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"kick2" ofType:@"mp3"];
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        playerY = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:nil];
        NSLog(@"acceleration.y = %+.6f greater", aceler.y);
        playerY.delegate = self;
        [playerY play];
    }
    if ((aceler.z > ACC_THRESHOLD) && ((playerZ == nil) || (playerZ.isPlaying == NO))) {
        NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"hat" ofType:@"mp3"];
        NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
        playerZ = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:nil];
        NSLog(@"acceleration.z = %+.6f greater", aceler.z);
        playerZ.delegate = self;
        [playerZ play];
    }
    //else {
    //    [player stop];
    //};
}
Please note that multiple sounds may be triggered by a single event, since each axis is now evaluated separately. A threshold constant has been introduced via #define ACC_THRESHOLD 0.5f. A new sound for a given axis is only started after the previous play has finished.
After this general change to the event handling, you can start optimizing with a signal filter.
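For example, a minimal high-pass filter sketch in the style of Apple's accelerometer samples, which isolates sudden motion by removing the gravity component (gravityX here is assumed to be an instance variable):

#define kFilteringFactor 0.1

// Low-pass: slowly track the gravity component of the x axis.
gravityX = (aceler.x * kFilteringFactor) + (gravityX * (1.0 - kFilteringFactor));
// High-pass: subtract gravity so only sudden motion remains.
double highPassX = aceler.x - gravityX;
if (highPassX > ACC_THRESHOLD) {
    // trigger the sound as above
}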
Additionally you may use AVAudioPlayer's delegate methods for more detailed sound handling:
#pragma mark -
#pragma mark AVAudioPlayerDelegate methods

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    NSLog(@"audioPlayerDidFinishPlaying: %i", flag);
}

- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error {
    NSLog(@"audioPlayerDecodeErrorDidOccur: %@", error.description);
}

UIImagePickerController: Strange result of video compression

In my app, the user can either record a video or select one from their library. Naturally, I use a UIImagePickerController with its sourceType set to either UIImagePickerControllerSourceTypeCamera or UIImagePickerControllerSourceTypePhotoLibrary.
The videoQuality is set to UIImagePickerControllerQualityTypeMedium in both cases.
Now, I did the following: I took a 15-second video with my iPhone lying on its back, so the footage is pitch-black. When I choose this video from the library, it is about 0.6 MB. When I shoot the same video from within my app (15 seconds, pitch-black), I get a file of over 4 MB.
Can anybody confirm this? I can hardly believe that I'm the first one to notice, but then again, there is not much room for me to screw this up (which I probably did nonetheless). Or does anybody have an explanation/solution for this?
(I'm on iOS 5.1 with an iPhone 4.)
Have you figured it out?
I have the same problem now: with a video from the photo library (a bit over 2 minutes long), when I get it using UIImagePickerController it is only about 30 MB, but when I get it via asset.defaultRepresentation (the way described in "Getting video from ALAsset"), it reaches about 300 MB. Maybe UIImagePickerController compresses the data in some way; I need to figure it out, but I have made no progress so far.
EDIT: UIVideoEditorController can compress a video to a smaller size, and you can set its videoQuality just as with UIImagePickerController. Perhaps like this: when you use UIImagePickerController to choose a video and set allowsEditing = YES, it will present a UIVideoEditorController to compress the video, and you then get the compressed video at the smaller size.
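A sketch of that route might look like the following, assuming self adopts UIVideoEditorControllerDelegate and UINavigationControllerDelegate, and videoPath points at the picked file:

if ([UIVideoEditorController canEditVideoAtPath:videoPath]) {
    UIVideoEditorController *editor = [[UIVideoEditorController alloc] init];
    editor.videoPath = videoPath;
    editor.videoQuality = UIImagePickerControllerQualityTypeMedium;
    editor.delegate = self;
    [self presentModalViewController:editor animated:YES];
    [editor release];
    // The re-encoded file arrives in
    // videoEditorController:didSaveEditedVideoToPath:
}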
I've figured it out now. The solution is not to decrease the dimensions but the bitrate. That is, I think, what Apple does when you select a video from the library.
Check out my answer here: https://stackoverflow.com/a/16035330/884119
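The core of that approach is to re-encode with explicit compression properties. A hedged sketch of the relevant video settings for an AVAssetWriter input (the numbers are placeholders, not the linked answer's exact values):

NSDictionary *compressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:1000000], AVVideoAverageBitRateKey, // ~1 Mbit/s
    nil];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    AVVideoCodecH264, AVVideoCodecKey,
    [NSNumber numberWithInt:1280], AVVideoWidthKey,  // keep the dimensions...
    [NSNumber numberWithInt:720], AVVideoHeightKey,
    compressionProps, AVVideoCompressionPropertiesKey, // ...and lower the bitrate
    nil];
AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];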
In my view, the best way to compress the video is the following. Here is the code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
    NSString *tempFilePath = [videoURL path];
    NSData *data = [NSData dataWithContentsOfURL:videoURL];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    // Build a unique file name from the current date.
    NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0];
    NSString *caldate = [NSString stringWithFormat:@"%@.mov", [now description]];
    caldate = [caldate stringByReplacingOccurrencesOfString:@" " withString:@""];
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *path = [NSString stringWithFormat:@"%@/%@", documentsDirectory, caldate];
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    NSURL *selectedVideoUrl;
    if (CFStringCompare((__bridge CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
        tempFilePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
        selectedVideoUrl = [info objectForKey:UIImagePickerControllerMediaURL];
    }
    NSLog(@"old movie %@", path);
    NSURL *url = [NSURL fileURLWithPath:tempFilePath];
    // Copy the original movie into the Documents directory.
    [data writeToFile:path atomically:YES];
    // Destination URL for the compressed movie.
    NSArray *paths1 = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory1 = [paths1 objectAtIndex:0];
    NSString *path1 = [NSString stringWithFormat:@"%@/%@", documentsDirectory1, caldate];
    NSURL *url1 = [NSURL fileURLWithPath:path1];
    NSLog(@"new movie %@", path);
    [self convertVideoToLowQuailtyWithInputURL:url outputURL:url1 handler:nil];
    [picker dismissModalViewControllerAnimated:YES];
}
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL
                                   outputURL:(NSURL *)outputURL
                                     handler:(void (^)(AVAssetExportSession *))handler {
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    // The medium-quality preset re-encodes the video at a lower bitrate.
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
    {
        if (exportSession.status != AVAssetExportSessionStatusCompleted) {
            NSLog(@"exportSession error %@", exportSession.error);
        }
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"export done");
        }
        // Forward the session to the caller's handler, if one was given.
        if (handler) {
            handler(exportSession);
        }
    }];
}
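An example call that makes use of the completion handler (now that the method above forwards the export session to it):

[self convertVideoToLowQuailtyWithInputURL:url
                                 outputURL:url1
                                   handler:^(AVAssetExportSession *session) {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"compressed video written to %@", session.outputURL);
    }
}];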

App sounds cause iPod music, Pandora or Pod cast sound to stop

We have different sounds for actions and buttons within our app. We've heard a number of complaints that users' background music stops as soon as one of our sounds is played. We're not quite sure why this is happening. Here is the code we're using to play our sounds:
- (void)playSound:(NSString *)sound {
    if (playSoundPath == nil) {
        playSoundPath = [[NSBundle mainBundle] pathForResource:sound ofType:@"wav"];
        playSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:playSoundPath] error:NULL];
        playSound.delegate = self;
    } else {
        playSoundPath = nil;
        [playSound release];
        playSoundPath = [[NSBundle mainBundle] pathForResource:sound ofType:@"wav"];
        playSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:playSoundPath] error:NULL];
        playSound.delegate = self;
    }
    BOOL enabled = [[NSUserDefaults standardUserDefaults] boolForKey:@"sounds_key"];
    if (enabled) {
        [playSound stop];
        [playSound play];
    }
}
Update: We've actually ended up changing the above to the following, per a few users' suggestions. This allows iPod music or podcasts to continue playing while our sounds play, without stopping the background audio, by marking our sounds as "ambient" sounds. This requires adding the AudioToolbox framework and importing it into your class as follows:
#import <AudioToolbox/AudioToolbox.h>
Here is the code we're using now:
BOOL enabled = [[NSUserDefaults standardUserDefaults] boolForKey:@"sounds_key"];
if (enabled) {
    AudioSessionInitialize(NULL, NULL, NULL, self);
    UInt32 category = kAudioSessionCategory_AmbientSound;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(category), &category);
    AudioSessionSetActive(YES);
    NSString *path = [NSString stringWithFormat:@"%@/%@.wav", [[NSBundle mainBundle] resourcePath], sound];
    SystemSoundID soundID;
    NSURL *filePath = [NSURL fileURLWithPath:path isDirectory:NO];
    AudioServicesCreateSystemSoundID((CFURLRef)filePath, &soundID);
    AudioServicesPlaySystemSound(soundID);
    // Note: call AudioServicesDisposeSystemSoundID(soundID) once the sound
    // is no longer needed, or the sound IDs will accumulate.
}
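If you prefer to keep the AVAudioPlayer-based playSound: method above, a sketch of the equivalent "ambient" setup through AVAudioSession (set once, e.g. at launch) would be:

NSError *error = nil;
// Ambient category: our sounds mix with, rather than interrupt, background audio.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];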

How do I cycle through several sounds using one action?

Right now I have a single sound playing from one action: a button press, the accelerometer, etc.
I would like to know how I can cycle through several sounds (I'm using 3 sounds for my project) from a single action that the user initiates.
I am currently using the code shown below, and it serves its purpose for playing a single sound for each user action. I have not used NSArray in a project before, so if you're including one, please include the details.
NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/Jet.wav", [[NSBundle mainBundle] resourcePath]]];
NSError *error;
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
audioPlayer.numberOfLoops = 0;
if (audioPlayer == nil)
    NSLog(@"%@", [error description]);
else
    [audioPlayer play];
If you are only using 3 sounds, you can get away with a plain C array of NSString literals, but if you need a dynamic number of sounds you should switch to an NSArray (see the sketch after the code below).
// .m
NSString *MySounds[3] = {
    @"sound1.wav",
    @"sound2.wav",
    @"sound3.wav",
};

@implementation ...
Then in your method you need to add a little extra logic:
- (void)playSound
{
    NSString *path = [NSString stringWithFormat:@"%@/%@", [[NSBundle mainBundle] resourcePath], [self nextSoundName]];
    NSURL *url = [NSURL fileURLWithPath:path];
    NSError *error;
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    audioPlayer.numberOfLoops = 0;
    if (audioPlayer == nil) {
        NSLog(@"%@", [error description]);
    } else {
        [audioPlayer play];
    }
}

- (NSString *)nextSoundName
{
    // Cycle 0 -> 1 -> 2 -> 0 ... across calls.
    static NSInteger currentIndex = -1;
    if (++currentIndex > 2) {
        currentIndex = 0;
    }
    return MySounds[currentIndex];
}
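For a dynamic number of sounds, the same lookup with an NSArray might look like this sketch (in practice soundNames would be an instance variable populated elsewhere):

// Keep the names in an NSArray instead of a fixed C array:
NSArray *soundNames = [NSArray arrayWithObjects:
    @"sound1.wav", @"sound2.wav", @"sound3.wav", nil];
// nextSoundName then cycles with a modulo over the array's count:
static NSInteger currentIndex = -1;
currentIndex = (currentIndex + 1) % [soundNames count];
NSString *name = [soundNames objectAtIndex:currentIndex];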
- (IBAction)playSound:(UIButton *)sender {
    // From another answer: a single IBAction for every button,
    // using the button's tag as the index into the C array above.
    NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%@", [[NSBundle mainBundle] resourcePath], MySounds[[sender tag]]]];
    NSError *error;
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    audioPlayer.numberOfLoops = 0;
    if (audioPlayer == nil)
        NSLog(@"%@", [error description]);
    else
        [audioPlayer play];
}
Here you can use a single action for all of the buttons; you just need to set a tag on each button. Use the array as described by Paul above.