Does the accelerometer work in the iPhone/iPad Simulator? - iphone

From what I can tell, my app should be firing accelerometer events while I'm using the iPad simulator in Xcode, but it's not.
I have googled around and it seems that the accelerometer is not implemented in the simulator. Is this correct? If so, why on earth would they have a "Hardware → Shake Gesture" menu option?
My code is as follows:
.h file:
@interface MyViewController : UIViewController <UIPickerViewDataSource, UIPickerViewDelegate, UIAccelerometerDelegate> {
    UIAccelerometer *accelerometer;
    // ...other stuff
}
@property (nonatomic, retain) UIAccelerometer *accelerometer;
@end
then the .m file:
@implementation MyViewController
@synthesize accelerometer;

- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
    NSLog(@"X: %f", acceleration.x);
    NSLog(@"Y: %f", acceleration.y);
    NSLog(@"Z: %f", acceleration.z);
}

- (void)viewDidLoad {
    [super viewDidLoad];
    self.accelerometer = [UIAccelerometer sharedAccelerometer];
    self.accelerometer.updateInterval = .1;
    self.accelerometer.delegate = self;
}
@end
Does this look right?

There's no accelerometer in the simulator.
The "Hardware → Shake Gesture" menu item is generated by directly sending a UIEvent to the active application.
Although the Shake Gesture is implemented using the accelerometer on the device, conceptually they are two different kinds of user input: the gesture is a discrete event, while acceleration is a continuous signal. Thus the Shake Gesture menu item exists for apps that only use the gesture (e.g. Shake to Undo) but do not need to know the exact acceleration vector.
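For reference, an app that only cares about the gesture can pick it up in the simulator without UIAccelerometer. A minimal sketch (the view controller must be first responder to receive motion events):

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // become first responder so motion events reach this controller
    [self becomeFirstResponder];
}

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (motion == UIEventSubtypeMotionShake) {
        // fires for Hardware → Shake Gesture in the simulator
        NSLog(@"Shake detected");
    }
}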

The accelerometer does not work in the iPhone/iPad Simulator. Neither the "Hardware → Shake Gesture" menu item nor Safari's orientation change in the simulator uses the accelerometer.


CLLocation Region Monitoring behaves differently in iOS 5.1 and iOS 6

UPDATE: I am looking for a workaround to this problem on iOS 5.1. I now have evidence that this issue is in fact known. However, I suspect it is related to an updated Xcode rather than region monitoring actually being broken in iOS 5.1.
The simple code below behaves differently between iOS 5 and iOS 6. It works as expected on iOS 6.
But on iOS 5, the didEnterRegion callback is only triggered the first time the region is entered. If you exit the region and then re-enter it, the callback will not be triggered. If you close and restart the app, entering the region will not trigger the callback either.
The difference in behavior was seen on the iOS 5 and iOS 6 simulators, and the broken iOS 5 behavior was also seen on an iPhone 4S running iOS 5. Xcode 4.6 was used, the CoreLocation framework was added properly, locMan is a property of the AppDelegate, and a clean new project was created for this test.
Can someone please find a workaround to this problem? The fix needs to use region monitoring, not active location updating.
#import <UIKit/UIKit.h>
#import <CoreLocation/CoreLocation.h>
@interface AppDelegate : UIResponder <UIApplicationDelegate, CLLocationManagerDelegate>
@property (strong, nonatomic) UIWindow *window;
@property (nonatomic, strong) CLLocationManager *locMan;
@end
// AppDelegate implementation file
@implementation AppDelegate

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // Location manager
    self.locMan = [[CLLocationManager alloc] init];
    self.locMan.delegate = self;
    CLLocationCoordinate2D coordinates = CLLocationCoordinate2DMake(40.0, 40.0);
    CLLocationDistance distance = 100.0;
    CLRegion *region = [[CLRegion alloc] initCircularRegionWithCenter:coordinates radius:distance identifier:@"hello"];
    [self.locMan startMonitoringForRegion:region];

    [self.window makeKeyAndVisible];
    return YES;
}

- (void)locationManager:(CLLocationManager *)manager didEnterRegion:(CLRegion *)region {
    NSLog(@"didEnterRegion");
}

@end
After leaving this question open for a week, I feel I have enough evidence that region monitoring is broken in iOS 5.1. Time to close out the question.

Why is my app crashing when I try to play a sound using the Finch library for iPhone SDK?

I followed the instructions in the read-me file exactly, but for some reason, every time I hit the UIButton wired to the code that plays the sound ("[soundA play];"), the app just crashes without any detailed error description beyond dropping into lldb. I'm using Finch because it plays audio using OpenAL, and I need OpenAL for the type of app I'm making; AVAudioPlayer and System Sounds are not usable for it. Here is the code I am using.
Main file:
#import "ViewController.h"
@interface ViewController ()
@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    soundFactory = [[FIFactory alloc] init];
    engine = [soundFactory buildSoundEngine];
    [engine activateAudioSessionWithCategory:AVAudioSessionCategoryPlayback];
    [engine openAudioDevice];
    soundA = [soundFactory loadSoundNamed:@"1.caf" maxPolyphony:16 error:NULL];
    soundB = [soundFactory loadSoundNamed:@"2.caf" maxPolyphony:16 error:NULL];
    soundC = [soundFactory loadSoundNamed:@"3.caf" maxPolyphony:16 error:NULL];
    soundD = [soundFactory loadSoundNamed:@"4.caf" maxPolyphony:16 error:NULL];
}

- (void)viewDidUnload
{
    [super viewDidUnload];
    // Release any retained subviews of the main view.
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown);
    } else {
        return YES;
    }
}

- (IBAction)PlaySoundA { [soundA play]; }
- (IBAction)PlaySoundB { [soundB play]; }
- (IBAction)PlaySoundC { [soundC play]; }
- (IBAction)PlaySoundD { [soundD play]; }

@end
Header file:
#import <UIKit/UIKit.h>
#import "FISoundEngine.h"
#import "FIFactory.h"
#import "FISound.h"
@interface ViewController : UIViewController {
    FIFactory *soundFactory;
    FISoundEngine *engine;
    FISound *soundA;
    FISound *soundB;
    FISound *soundC;
    FISound *soundD;
}
@end
Any help would be appreciated! Thanks!
Most likely the sound player cannot find your sound file. Try right-clicking the audio file and selecting "Show in Finder", and make sure the file is actually in your project directory and not just referenced from somewhere else.
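One way to see the real failure is to stop passing error:NULL. A quick sketch using the error parameter of the same loadSoundNamed:maxPolyphony:error: call from the question, plus a bundle check:

// Sketch: surface the actual load error instead of discarding it with NULL.
NSError *error = nil;
soundA = [soundFactory loadSoundNamed:@"1.caf" maxPolyphony:16 error:&error];
if (soundA == nil) {
    NSLog(@"Failed to load 1.caf: %@", error);
}

// A nil path here means the file never made it into the app bundle.
NSString *path = [[NSBundle mainBundle] pathForResource:@"1" ofType:@"caf"];
NSLog(@"Bundle path for 1.caf: %@", path);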

Detect touch globally

I'm trying to figure out how to solve this (fairly) simple problem, but I'm failing miserably, so I really need your advice.
My application consists of a UITabBar with several tabs. In one of them I have a bunch of UIImageViews, each of which represents the thumbnail of a picture. Similar to how you remove apps on the iPhone by pressing on an app icon for a second, I implemented a UILongPressGestureRecognizer that starts wobbling the thumb. If the user taps the 'X' that appears on the corner of the thumb, the picture gets removed.
The logic that starts and stops the wobbling animation is inside a subclass of UIImageView that is used to show the thumb.
What I'm trying to do is cancel the wobble effect if the user presses anywhere outside the thumb. Ideally, if possible, I would prefer to place the code that detects this cancel touch inside the UIImageView subclass.
To catch all touch events globally I ended up subclassing UIWindow as follows:
// CustomUIWindow.h
#import <UIKit/UIKit.h>
#define kTouchPhaseBeganCustomNotification @"TouchPhaseBeganCustomNotification"

@interface CustomUIWindow : UIWindow
@property (nonatomic, assign) BOOL enableTouchNotifications;
@end

// CustomUIWindow.m
#import "CustomUIWindow.h"

@implementation CustomUIWindow
@synthesize enableTouchNotifications = enableTouchNotifications_;

- (void)sendEvent:(UIEvent *)event
{
    [super sendEvent:event]; // Apple says you must always call this!
    if (self.enableTouchNotifications) {
        [[NSNotificationCenter defaultCenter] postNotificationName:kTouchPhaseBeganCustomNotification object:event];
    }
}
@end
Then whenever I need to start listening to all touches globally I do the following:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(stopThumbnailWobble:)
                                             name:kTouchPhaseBeganCustomNotification
                                           object:nil];
((CustomUIWindow *)self.window).enableTouchNotifications = YES;
In stopThumbnailWobble I remove the observer and process the UITouch event to decide whether to remove the thumb or not:
- (void)stopThumbnailWobble:(NSNotification *)notification
{
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:kTouchPhaseBeganCustomNotification
                                                  object:nil];
    ((CustomUIWindow *)self.window).enableTouchNotifications = NO;
    UIEvent *touchEvent = notification.object;
    // process touchEvent and decide what to do
    ...
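The processing itself might look something like this (a sketch; thumbView and stopWobble are assumed names for the wobbling thumbnail and its method):

// Sketch of the elided processing: hit-test the touch against the thumb.
UITouch *touch = [[touchEvent allTouches] anyObject];
if (touch.phase == UITouchPhaseBegan) {
    CGPoint point = [touch locationInView:self.thumbView];
    if (!CGRectContainsPoint(self.thumbView.bounds, point)) {
        [self.thumbView stopWobble]; // hypothetical method on the UIImageView subclass
    }
}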
Hope this helps others.
If you must include the touch detection in your UIImageView subclass, then I would tell the app delegate that a touch was received and where. The app delegate could then either tell all your UIImageViews or tell the view controller, which would tell its UIImageViews.
Untested code:

MyAppDelegate *appDelegate = (MyAppDelegate *)[[UIApplication sharedApplication] delegate];
[appDelegate touchedAt:xPos yPos:yPos];
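And a hypothetical receiving side on the app delegate (currentViewController and cancelWobbleForTouchAt: are made-up names for illustration):

// Hypothetical receiver matching the call above.
- (void)touchedAt:(int)xPos yPos:(int)yPos {
    CGPoint point = CGPointMake(xPos, yPos);
    // forward to whoever owns the thumbnails
    [self.currentViewController cancelWobbleForTouchAt:point];
}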

How to subclass MPMoviePlayer controller

I am not sure I understand what subclassing MPMoviePlayerController means.
a) Is it when you create a new view controller and add an MPMoviePlayerController instance to it?
b) Something else? Some code example would be very helpful.
Thanks
This couldn't be a comment above as it takes up too many characters.
Okay @1110, I'm going to assume you want to add a UITapGestureRecognizer to the player view; don't forget it already supports pinch gestures for entering and leaving full screen.
The code below assumes you're working with a view controller that has an MPMoviePlayerController as an iVar.
You probably don't want to detect a single tap, because the player already uses a tap count of 1 to show and hide its playback controls.
Below is a code example for you with a gesture recognizer for a double tap
Code for PlayerViewController.h
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
@interface PlayerViewController : UIViewController {}

// iVar
@property (nonatomic, retain) MPMoviePlayerController *player;

// methods
- (void)didDetectDoubleTap:(UITapGestureRecognizer *)tap;
@end
Code for PlayerViewController.m
#import "PlayerViewController.h"
@implementation PlayerViewController
@synthesize player;

- (void)dealloc
{
    [player release];
    [super dealloc];
}

- (void)viewDidLoad
{
    // initialize an instance of MPMoviePlayerController and set it to the iVar
    MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"http://path/to/video.whatever"]];

    // the frame is the size of the video on the view
    mp.view.frame = CGRectMake(0.0, 0.0, self.view.bounds.size.width, self.view.bounds.size.height / 2);
    self.player = mp;
    [mp release];
    [self.view addSubview:self.player.view];
    [self.player prepareToPlay];

    // add a tap gesture recognizer to the player.view property
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(didDetectDoubleTap:)];
    tapGesture.numberOfTapsRequired = 2;
    [self.player.view addGestureRecognizer:tapGesture];
    [tapGesture release];

    // tell the movie to play
    [self.player play];
    [super viewDidLoad];
}

- (void)didDetectDoubleTap:(UITapGestureRecognizer *)tap {
    // do whatever you want to do here
    NSLog(@"double tap detected");
}
@end
FYI, I've checked this code out and it works.
If you're not sure what subclassing means, you should research the topic of "inheritance". There is plenty of material covering the subject for iOS specifically; when you create almost any file in Xcode, you're already subclassing a class, most often NSObject or UIViewController when creating a view and so on.
You probably don't want to subclass MPMoviePlayerController, as it's quite an advanced class built for streaming. MPMoviePlayerViewController works like a normal view controller, except it already comes with an MPMoviePlayerController as an iVar.
condensed declaration below:
@interface MPMoviePlayerViewController : UIViewController {}
@property (nonatomic, readonly) MPMoviePlayerController *moviePlayer;
@end
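Presenting it is then a one-liner via the category MediaPlayer adds to UIViewController; a minimal sketch (the URL is a placeholder):

// Sketch: present an MPMoviePlayerViewController modally.
MPMoviePlayerViewController *playerVC = [[MPMoviePlayerViewController alloc]
    initWithContentURL:[NSURL URLWithString:@"http://path/to/video.whatever"]];
[self presentMoviePlayerViewControllerAnimated:playerVC];
[playerVC release];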
You can find some sample code in the Apple documentation center here:
https://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/MultimediaPG/UsingVideo/UsingVideo.html#//apple_ref/doc/uid/TP40009767-CH3-SW1
Movie player Xcode project you'll find here:
https://developer.apple.com/library/ios/#samplecode/MoviePlayer_iPhone/Introduction/Intro.html#//apple_ref/doc/uid/DTS40007798
Playing movies from an iOS device is really easy; make sure you read it all up in Apple's documentation. Checking out the header files for these classes will also give you more insight into what you're dealing with.
Hope that helps.

iPhone UIWebView, Landscape, Zoom!

I have a UIWebView in my app and I am having trouble making it work correctly simultaneously with landscape and the viewport zoom.
If I load pages in portrait and then rotate the phone, using the autoresize, it works correctly, zooming in for pages that are set to zoom. However, if I start the webview in landscape mode, I have to rotate to portrait and then rotate back to landscape to get the zoom correct. Mobile Safari does not have this problem.
For example if you load mobile.washingtonpost.com in the webview starting in landscape mode, the text is small. If you rotate to portrait and then back to landscape, the text is larger (as it should be). Mobile safari is smart enough to have the text large immediately when you load it in landscape. I want that :)
I have a workaround, but it is a total hack and I am wondering if there is a better way. Basically, I can get the landscape view to have the right zoom, but only after the page is loaded, by twiddling the frame size of the webview to 320 points wide and then back to 480 wide. This gets me there, but: A) it seems silly, and B) I can't do it until the page is loaded, so you see the fonts resize. (A possibly cleaner alternative is sketched after the code below.)
Below is the code for a very simple app that does this. It is easiest to try in the simulator by setting the initial orientation to landscape, adding this key to the Info.plist:
UIInterfaceOrientation: UIInterfaceOrientationLandscapeRight
And here is the code:
// webviewAppDelegate.m
#import "webviewAppDelegate.h"

@implementation webviewAppDelegate
@synthesize window;

UIWebView *wv;

- (void)doWebView {
    wv = [[UIWebView alloc] init];
    wv.frame = CGRectMake(0, 0, 480, 300);
    [self.view addSubview:wv];
    NSURLRequest *r = [NSURLRequest requestWithURL:[NSURL URLWithString:@"http://mobile.washingtonpost.com"]];
    wv.delegate = self;
    [wv loadRequest:r];
}

- (void)webViewDidFinishLoad:(UIWebView *)webView {
    // HACK: twiddle the frame width to force the viewport zoom to recompute
    wv.frame = CGRectMake(0, 0, 320, 300);
    wv.frame = CGRectMake(0, 0, 480, 300);
}

- (void)applicationDidFinishLaunching:(UIApplication *)application {
    // Override point for customization after application launch
    [window makeKeyAndVisible];
    UIView *blank = [[UIView alloc] init];
    self.view = blank;
    [window addSubview:self.view];
    NSTimer *t = [NSTimer timerWithTimeInterval:2.0 target:self selector:@selector(doWebView) userInfo:nil repeats:NO];
    [[NSRunLoop currentRunLoop] addTimer:t forMode:NSDefaultRunLoopMode];
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    return YES;
}

- (void)dealloc {
    [window release];
    [super dealloc];
}
@end
--------------------------------------------------------
// webviewAppDelegate.h
#import <UIKit/UIKit.h>
@interface webviewAppDelegate : UIViewController <UIApplicationDelegate, UIWebViewDelegate> {
    UIWindow *window;
}
@property (nonatomic, retain) IBOutlet UIWindow *window;
@end
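As an untested alternative to the frame-twiddling hack above, one could try nudging the page's viewport from JavaScript once the load finishes, instead of resizing the view. A sketch (whether this fully matches Mobile Safari's behavior is an open question):

- (void)webViewDidFinishLoad:(UIWebView *)webView {
    // Force the viewport to the device width instead of twiddling the frame.
    NSString *js = @"var vp = document.querySelector('meta[name=viewport]');"
                    "if (vp) { vp.setAttribute('content', 'width=device-width'); }";
    [webView stringByEvaluatingJavaScriptFromString:js];
}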