Sound issue in Xcode 4.3.2 - iPhone

I updated to OS X Lion 10.7.3 and to Xcode 4.3.2. I am unable to play a sound on the iPhone Simulator or on an iPhone device with the following code. I was able to play the sound using the same code on Snow Leopard with Xcode 4.2. I have no idea why I can't hear the sound in 4.3.2. Can anyone help?
.h file
#import <UIKit/UIKit.h>
#import <AudioToolbox/AudioToolbox.h>
@interface soundViewController : UIViewController {
}
- (IBAction)playSounds:(id)sender;
@end
.m file
- (IBAction)playSounds:(id)sender {
    CFBundleRef mainBundle = CFBundleGetMainBundle();
    CFURLRef soundFileURLRef;
    soundFileURLRef = CFBundleCopyResourceURL(mainBundle, (CFStringRef)@"E", CFSTR("caf"), NULL);
    SystemSoundID soundID;
    AudioServicesCreateSystemSoundID(soundFileURLRef, &soundID);
    AudioServicesPlaySystemSound(soundID);
}

Xcode 4.3 sometimes excludes files from the Copy Bundle Resources list for no apparent reason. Verify that the file is included in your target; even after you add a file, it sometimes does not become a resource of the target. Select the file, open the left-most tab of the inspector (the File inspector), and make sure the checkbox next to your target is checked in the "Target Membership" section.
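If the .caf file is missing from the bundle, CFBundleCopyResourceURL returns NULL and the subsequent calls silently do nothing, which matches the symptom above. A quick runtime check (a sketch based on the question's code) makes the failure visible:

// Sketch: confirm the resource really made it into the app bundle.
CFURLRef soundFileURLRef = CFBundleCopyResourceURL(CFBundleGetMainBundle(),
                                                   CFSTR("E"), CFSTR("caf"), NULL);
if (soundFileURLRef == NULL) {
    // File was not copied - check Target Membership / Copy Bundle Resources.
    NSLog(@"E.caf not found in the main bundle");
} else {
    SystemSoundID soundID;
    AudioServicesCreateSystemSoundID(soundFileURLRef, &soundID);
    AudioServicesPlaySystemSound(soundID);
    CFRelease(soundFileURLRef);
}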

Related

iOS: enabling AirPrint for UIWebView contents

I am super new to Xcode and app development. I am currently loading up a web-based application in UIWebViews on the iPad. When one particular page is loaded, it displays a PDF file. I would like to be able to print this PDF file using AirPrint. I am looking for a simple solution. Currently the app I am working on uses 6 files:
-ViewController.m
-ViewController.h
-MainStoryboard_iPad.storyboard
-MainStoryboard_iPhone.storyboard
-AppDelegate.h
-AppDelegate.m
In the MainStoryboard files, there are many windows (graphical) which are linked to a central navigation system. If possible, please take the time to really explain what I need to do rather than just saying 'take a look at this link.' I have programming experience, but never with Xcode or any product related to Apple.
I figured out how to do this. First I found a piece of code here, iOS Air print for UIwebview. I had no idea how to implement it at the time, but then I did the following.
My application was a single-view Xcode project.
In my storyboard I inserted a button (on my navigation bar) and changed its Identifier to 'Action'. Then, making sure I had the Assistant ('tuxedo') editor view open and displaying my ViewController.m file, I control-dragged from the button into the ViewController.m file. This inserted my IBAction method after asking me to name it:
myActionButton
Then I copied in the code specified in response 3 of the question. My ViewController.m looked something like this:
#import "ViewController.h"
#interface ViewController()
#end
#implementation ViewController()
//Some stuff here
#end
#synthesize webView
- (IBAction)myActionButton:(id)sender {
    UIPrintInfo *pi = [UIPrintInfo printInfo];
    pi.outputType = UIPrintInfoOutputGeneral;
    pi.jobName = webView.request.URL.absoluteString;
    pi.orientation = UIPrintInfoOrientationPortrait;
    pi.duplex = UIPrintInfoDuplexLongEdge;
    UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
    pic.printInfo = pi;
    pic.showsPageRange = YES;
    pic.printFormatter = webView.viewPrintFormatter;
    [pic presentAnimated:YES completionHandler:^(UIPrintInteractionController *pic2, BOOL completed, NSError *error) {
        // indicate done or error
    }];
}

@end
Also, in my ViewController.h file:
#import ...

@interface ViewController : ...
{
    IBOutlet UIWebView *webView;
}
@property (nonatomic, retain) IBOutlet UIWebView *webView;
@end
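One small, hedged addition (not part of the original answer): AirPrint is not supported on every device, so it is safer to check availability before building the print job.

// At the top of myActionButton:, before creating the UIPrintInfo.
if (![UIPrintInteractionController isPrintingAvailable]) {
    NSLog(@"Printing is not available on this device");
    return;
}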
I didn't set up the web views, so I am not 100% sure how they are created, but there is a good series for beginners on YouTube by HackLife87 which shows how to make a single-view app. I think Xcode Tutorial Video #7 covers setting up views.
Since I am extremely green at Xcode and iPad app development, I solved my problem by combining what I learned from the aforementioned Xcode tutorial videos with the solution provided on Stack Overflow by Hafthor.

AVMediaTypeVideo not working for flashlight app

This line in my code:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
For AVMediaTypeVideo, Xcode tells me:
Use of undeclared identifier 'AVMediaTypeVideo'; did you mean 'kCMMediaType_Video'?
As a side note, I do have #import <AVFoundation/AVCaptureDevice.h> in my header.
I am trying to make a "flashlight" effect in an app; that is the only line of code with a problem, according to Xcode.
Have you imported the AVFoundation framework into your project?
Add #import <AVFoundation/AVFoundation.h> to your .m
Try adding #import <AVFoundation/AVMediaFormat.h> (the header that declares AVMediaTypeVideo).
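Once AVFoundation is linked and imported, a typical flashlight toggle looks roughly like this (a sketch with minimal error handling; the method name is just illustrative):

#import <AVFoundation/AVFoundation.h>

- (void)setTorchOn:(BOOL)on {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![device hasTorch]) {
        return; // no flash hardware, e.g. the Simulator or an iPod touch
    }
    if ([device lockForConfiguration:NULL]) {
        device.torchMode = on ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
        [device unlockForConfiguration];
    }
}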

Save videos in the iPhone Simulator & upload them to a web service

I just want to know how I can save videos in the iPhone Simulator and how I can upload them to a web service.
Thanks.
vpc=[[UIImagePickerController alloc] init];
vpc.delegate=self;
vpc.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
vpc.mediaTypes = [UIImagePickerController availableMediaTypesForSourceType:vpc.sourceType];
vpc.allowsEditing = NO;
vpc.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
[self presentModalViewController:vpc animated:YES];
When I do this, it gives an error: kUTTypeMovie undeclared before it is used. I don't know about kUTTypeMovie; please reply as soon as possible.
Thanks
I think you are missing one of the frameworks below:
libz.1.2.3.dylib
MobileCoreServices.framework
SystemConfiguration.framework
You need to import the framework header: #import <MobileCoreServices/UTCoreTypes.h>
If you get an error when you import it, try the method below, which I found:
1) Open your project file
2) Build Phases > Link Binary With Libraries
3) Click +, then add MobileCoreServices.framework
4) You will now see it on the left side. In the framework's Headers folder, find the file named MobileCoreServices.h
5) Open your .h file and simply drag it in below #import <UIKit/UIKit.h>
Note: make sure it ends up as an #import <...> statement.
That's it. You can now use kUTTypeMovie in your .m file. Thanks!
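Once kUTTypeMovie resolves, the picked video arrives in the UIImagePickerController delegate callback, and from there you can read the file and post it to your web service. A rough sketch (the upload URL is a placeholder, and a plain POST of the raw bytes is just the simplest option):

#import <MobileCoreServices/UTCoreTypes.h>

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeMovie]) {
        // Local file URL of the picked movie.
        NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
        NSData *videoData = [NSData dataWithContentsOfURL:videoURL];

        NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
            [NSURL URLWithString:@"http://example.com/upload"]]; // placeholder URL
        [request setHTTPMethod:@"POST"];
        [request setHTTPBody:videoData];
        [NSURLConnection sendAsynchronousRequest:request
                                           queue:[NSOperationQueue mainQueue]
                               completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
            NSLog(@"Upload finished: %@", error ? error : @"OK");
        }];
    }
    [picker dismissModalViewControllerAnimated:YES];
}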

What is the best way to play sounds quickly upon fast button presses in Xcode?

I have a soundboard: it's just a screen with about 8 buttons.
Each individual button has its own sound, which is played when the button is pressed.
There are a couple of ways I could play the sound, such as System Sound Services or AVAudioPlayer.
System sounds so far seem to have the quickest response times; AVAudioPlayer is quite slow and can't keep up if the user taps the buttons really fast. I created an audio player for each sound, which was quite messy.
This is how I'm playing the sounds at the moment:
the .h file
@interface MainScreenViewController : UIViewController <AVAudioPlayerDelegate, UITabBarControllerDelegate> {
    AVAudioPlayer *player;
    //KE NORMAL
    CFURLRef keNURL;
    SystemSoundID keNObject;
    //KE LOUD
    CFURLRef keLURL;
    SystemSoundID keLObject;
    //GE NORMAL
    CFURLRef geNURL;
    SystemSoundID geNObject;
    //GE LOUD
    CFURLRef geLURL;
    SystemSoundID geLObject;
    //NA NORMAL
    CFURLRef naNURL;
    SystemSoundID naNObject;
    //NA LOUD
    CFURLRef naLURL;
    SystemSoundID naLObject;
    //RA
    CFURLRef raURL;
    SystemSoundID raObject;
    //DAGGA CLICK
    CFURLRef daCURL;
    SystemSoundID daCObject;
    //TILLI CLICK
    CFURLRef tiCURL;
    SystemSoundID tiCObject;
}
@property (nonatomic, retain) AVAudioPlayer *player;
@property (readwrite) CFURLRef keNURL;
@property (readonly) SystemSoundID keNObject;
@property (readwrite) CFURLRef keLURL;
@property (readonly) SystemSoundID keLObject;
@property (readwrite) CFURLRef geNURL;
@property (readonly) SystemSoundID geNObject;
@property (readwrite) CFURLRef geLURL;
@property (readonly) SystemSoundID geLObject;
@property (readwrite) CFURLRef naNURL;
@property (readonly) SystemSoundID naNObject;
@property (readwrite) CFURLRef naLURL;
@property (readonly) SystemSoundID naLObject;
@property (readwrite) CFURLRef raURL;
@property (readonly) SystemSoundID raObject;
@property (readwrite) CFURLRef daCURL;
@property (readonly) SystemSoundID daCObject;
@property (readwrite) CFURLRef tiCURL;
@property (readonly) SystemSoundID tiCObject;
@end
Then come the actions that play the individual sounds.
In the .m file, after importing the .h file and the right frameworks and synthesizing all the variables, I write the code for each action (a sketch of the one-time sound setup follows the action below).
This is what's in the individual actions:
- (IBAction)geSound {
    AudioServicesPlaySystemSound(self.geNObject);
}
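For reference, the SystemSoundID values used above would typically be created once up front, e.g. in viewDidLoad, along these lines (a sketch; the file names are assumed, not taken from the question):

- (void)viewDidLoad {
    [super viewDidLoad];
    // Create each SystemSoundID once so the IBActions only have to play it.
    geNURL = CFBundleCopyResourceURL(CFBundleGetMainBundle(),
                                     CFSTR("geNormal"), CFSTR("caf"), NULL);
    AudioServicesCreateSystemSoundID(geNURL, &geNObject);
    // ...repeat for keNURL/keNObject, naNURL/naNObject, and the rest...
}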
I just wanted to know if System Sound Services is the way forward for me to create a soundboard, especially when the user will tap on the board really fast, alternating between beats.
If not, what is the best way to play the sounds so that they respond really well? Thanks.
Speaking from experience, AVAudioPlayer works quite well at playing multiple sounds at the same time or very quickly one after the other. The best way to use it is to create one method that you feed an NSString, and it plays the sound file whose name is held in that NSString (a sketch of such a helper follows the next paragraph); that way, you create a new player for each sound file that you play. Just be careful not to release the allocated players until you know you are finished with them.
Unless you have very large sound files which might take a short second to buffer (you'll have to figure out for yourself if you can live with any latency or not), I've never had any issues with it being slow. If it's slow, you're probably doing something against Apple's recommendations in regards to decoding certain files (i.e. for multiple sounds at once, Apple recommends the CAF format since it is hardware decoded versus software decoded): http://developer.apple.com/iphone/library/documentation/AVFoundation/Reference/AVAudioPlayerClassReference/Reference/Reference.html
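As an illustration of the "one method, pass in the sound name" approach described above, here is a sketch (the file extension and memory-management details are my assumptions):

- (void)playSoundNamed:(NSString *)name {
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"caf"];
    NSError *error = nil;
    AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    if (!newPlayer) {
        NSLog(@"Could not create a player for %@: %@", name, error);
        return;
    }
    newPlayer.delegate = self;  // released later in audioPlayerDidFinishPlaying:
    [newPlayer prepareToPlay];
    [newPlayer play];
}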
Update
Here is how I encode my files with a shell script. Put this in a file called batchit.sh in its own directory. Then place whatever .WAV files you want to encode as .CAF files in that directory. Open up Terminal on your Mac, cd to that directory, type sh batchit.sh, and let it do its thing. It will batch-convert all the files. Here's the code:
for f in *; do
if [ "$f" != "batchit.sh" ]
then
/usr/bin/afconvert -f caff -d ima4 "$f"
echo "$f converted"
fi
done
Notice I didn't do anything in the audioPlayerDidFinishPlaying method, which is a delegate method of AVAudioPlayer. You will need to add code there to properly release the players after each one finishes playing (hey, I can't do all the work for you :) so that memory is managed correctly. Otherwise you will eventually run out of memory if you keep creating players without releasing them. Since it seems like you're having a rough day.. see below for a big hint.
That's all the help I can give though. You gotta learn & earn the rest
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
// do something here if you want.. you know like maybe release the old players by uncommenting the line below :)
//[player release];
}
22/03/2014 - Update: Got rid of my overly excited post (which was written a long time ago) and updated to portray a more appropriate answer.
I had to achieve fine control over audio so opted to use OpenAL.
WOW is one way to sum it up. Once I took it upon myself to use OpenAL, which required a bit of extra legwork to get all the required methods implemented and everything set up, I found myself with practically no noticeable latency at all. I was absolutely pleased with the excellent results.
I was able to achieve both fine control and no latency at all. I remember jumping in joy at the time when I had it working.
Here are the resources I used that helped ease the endeavor of implementing OpenAL for the first time:
OpenAL on the iPhone
It was because of the video tutorial at that link that I was able to create an excellent sound manager singleton. It allowed me to play all the sounds I needed from any class I wanted with very little sound management; the singleton took care of everything for me.
I humbly suggest using OpenAL if you require fine control over your audio and, more importantly, low-latency on-demand audio: especially for games, or any audio-based application, where you need to be sure a sound plays exactly when you expect it to.
Just going to add for the benefit of others, even though this is 2 years after the issue.
I've found that if they are short sounds, you will be OK(ish) if, at application load, you buffer all your tracks by calling prepareToPlay on a player for every sound in a for loop (then releasing each one; that is just enough for the system to remember where the file is).
This removes the initial loading delay the next time you come to a sound and lets it start more promptly; a rough sketch is below.
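A rough sketch of that warm-up pass (the sound names are assumed; written for manual retain/release, as in the rest of this question):

// At launch: touch every sound once so the first real play starts promptly.
for (NSString *name in [NSArray arrayWithObjects:@"keNormal", @"geNormal", @"naNormal", nil]) {
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"caf"];
    AVAudioPlayer *warmUp = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
    [warmUp prepareToPlay];
    [warmUp release]; // just priming the buffers; omit the release under ARC
}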
You could create an AVAudioPlayer for each sound and then just pause/play it.
Have you tried playing back each sound on its own thread? You could try a basic NSOperation; a minimal sketch is below.
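A minimal sketch of that idea (soundQueue would be an ivar created once, and playSoundNamed: is the hypothetical helper from earlier in this answer thread):

// Created once, e.g. in viewDidLoad.
soundQueue = [[NSOperationQueue alloc] init];

// In the button action: hand the playback off to the queue.
[soundQueue addOperationWithBlock:^{
    [self playSoundNamed:@"geNormal"];
}];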

MPMoviePlayerController undeclared (first use in the function) message

I don't know why I am getting this message when I have this code:
MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:url];
Is there something I am missing here?
Thanks
You need to import/include the movie player:
#import <MediaPlayer/MediaPlayer.h>
Furthermore, the MediaPlayer.framework must be added to your "Frameworks" folder in the Xcode project.
To add the framework, right-click on "Frameworks", then select the path on your system where this framework resides. On my system it is under the following path:
/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS3.0.sdk/System/Library/MediaPlayer.framework
Make sure you are linking to MediaPlayer.framework in your Xcode project. That's where MPMoviePlayerController comes from and if you don't link to it, the linker won't know what it is.
MediaPlayer.framework must be added to your "Frameworks" group, and #import <MediaPlayer/MediaPlayer.h> added to your .h file.
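With the framework linked and the header imported, a minimal playback setup looks roughly like this (a sketch):

#import <MediaPlayer/MediaPlayer.h>

- (void)playMovieAtURL:(NSURL *)url {
    MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:url];
    mp.view.frame = self.view.bounds;
    [self.view addSubview:mp.view];
    [mp play];
    // Keep a strong reference (ivar/property) to mp, or playback will stop
    // as soon as the controller is deallocated.
}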