Does NSUndoManager work on iPhone without shake gestures? - iphone

I have an app that uses the shake gesture for something else besides undo. I want to use NSUndoManager and all the examples and how-tos say I have to put the following line in my applicationDidFinishLaunching method.
application.applicationSupportsShakeToEdit = YES;
Is it possible to use NSUndoManager without supporting the shake to undo gesture?

Sure. Create an Undo button, then:
- (IBAction)undoButtonPressed {
    [myUndoManager undo];
}
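The shake gesture only affects how the undo UI is presented; the button drives the same NSUndoManager, so each change still has to be registered with it. A minimal sketch, assuming a hypothetical myUndoManager ivar and a score property (neither is in the original answer):
// Hypothetical model mutation that records its own inverse with the
// undo manager, so tapping the Undo button has something to revert.
- (void)setScore:(NSInteger)newScore {
    [[myUndoManager prepareWithInvocationTarget:self] setScore:_score];
    [myUndoManager setActionName:@"Change Score"];
    _score = newScore;
}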

Related

iPod Touch iOS 5.1.1 Not calling viewDidDisappear:animated

I've posted this question elsewhere, but as SO is such a great community I'm doing so here as well.
First up, I'm using Cocos2D 2.0-gles20 to put a multiplayer/team oriented game together.
I've been integrating GameKitHelper into the app. To date it's been working just fine on my iPhone 4, my iPad 2, and in the Simulator, but now when I try to use it on a 4th-generation iPod Touch I'm getting assertions in [CCDirectorIOS startAnimation], because the app receives a viewWillAppear: when it shouldn't and no call to viewDidDisappear: when it should.
The reason this matters is that these methods on the CCDirectorIOS class cause Cocos2D to start/stop animation whilst another UIKit view is in front. This is something that I've managed myself with Cocos2D-0.99 but with 2.0 it is handled nicely within the director so that each app doesn't have to handle it specifically.
The GameKitHelper class has the following methods for pushing a GKMatchmakerViewController onto the screen:
// Present a Game Center matchmaker for an incoming invite.
-(void) showMatchmakerWithInvite:(GKInvite*)invite
{
    GKMatchmakerViewController* inviteVC =
        [[[GKMatchmakerViewController alloc] initWithInvite:invite] autorelease];
    if (inviteVC != nil)
    {
        inviteVC.matchmakerDelegate = self;
        [self presentViewController:inviteVC];
    }
}

// In Cocos2D 2.0 CCDirectorIOS is itself a UIViewController.
-(UIViewController*) getRootViewController
{
    return [CCDirector sharedDirector];
}

-(void) presentViewController:(UIViewController*)vc
{
    UIViewController* rootVC = [self getRootViewController];
    [rootVC presentModalViewController:vc animated:YES];
}

-(void) dismissModalViewController
{
    UIViewController* rootVC = [self getRootViewController];
    [rootVC dismissModalViewControllerAnimated:YES];
}
When I call showMatchmakerWithInvite on the iPhone 4 and iPad 2, I see a call to viewDidDisappear: on the CCDirectorIOS object, which stops animation. This is fine. When the GK view is gone, I see a call to viewWillAppear: which restarts the animation. Sweet.
On the iPod Touch, however, running exactly the same project, the call to viewDidDisappear: is never made, but a call to viewWillAppear: arrives before the GK view has gone.
I can't fathom why there would be a difference. All devices are running iOS 5.1.1.
It's almost as if the behaviour of UIKit is different on the iPod Touch, but I find that hard to believe. My other thought was that it was a timing issue, so I put some code in to let the app keep running despite the problem, but the call to viewDidDisappear: never happened.
I can work around this I think by managing the start/stop of animation myself, but I would have preferred not to customise the Cocos2D code.
Does anyone have any ideas?
Thanks
Well, being the impatient person I am, rather than leave it to others and work on something else, I nutted it out.
It turns out that the iPod Touch devices in question had multiplayer games disabled in the Restrictions settings. This seems to cause the GK view not to show "properly", and as a result events like viewDidDisappear: and viewWillAppear: don't occur the way I was expecting.
So I've been able to revert all of my tweaks and instrumentation in the Cocos2D code, and simply apply a correction to the GameKitHelper class to ensure that if features such as multi-player are disabled, the player isn't able to request them.
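As an aside (not part of the original fix, which predates this API): modern GameKit exposes the restriction directly, so on iOS 13 and later a guard like the following sketch could gate the matchmaker; on iOS 5 the check had to be inferred from matchmaking errors instead.
#import <GameKit/GameKit.h>

// Hypothetical guard for GameKitHelper; isMultiplayerGamingRestricted is iOS 13+.
- (BOOL)multiplayerAllowed
{
    if (@available(iOS 13.0, *)) {
        return ![GKLocalPlayer localPlayer].isMultiplayerGamingRestricted;
    }
    return YES; // older systems: assume allowed and handle matchmaking errors instead
}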

Show iPhone soft keyboard even though a hardware keyboard is connected

My iPad app uses an external "device" that acts as a hardware keyboard. But, at some point in the settings, I need to input text and I can't use the "device" ("device" is not a keyboard).
So, is there any way to force the soft keyboard to pop up even though I have a hardware keyboard connected?
Yes. We've done this in a few of our apps for when the user has a Bluetooth scanner "keyboard" paired with the device. What you can do is make sure your textField has an inputAccessoryView and then force the frame of the inputAccessoryView yourself. This will cause the keyboard to display on screen.
We added the following two functions to our AppDelegate. The 'inputAccessoryView' variable is a UIView* we have declared in our app delegate:
// This responds to UITextFieldTextDidBeginEditingNotification for every text field.
// We add an accessory view and use it to force the keyboard's frame,
// so the keyboard appears even when the scanner is attached.
-(void) textFieldBegan:(NSNotification *)theNotification
{
    UITextField *theTextField = [theNotification object];
    // NSLog(@"textFieldBegan: %@", theTextField);
    if (!inputAccessoryView) {
        inputAccessoryView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, navigationController.view.frame.size.width, 1)];
    }
    theTextField.inputAccessoryView = inputAccessoryView;
    [self performSelector:@selector(forceKeyboard) withObject:nil afterDelay:0];
}
// Change the inputAccessoryView frame - this is correct for portrait;
// use a different frame for landscape (see the sketch below).
-(void) forceKeyboard
{
    inputAccessoryView.superview.frame = CGRectMake(0, 759, 768, 265);
}
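A hedged sketch of the landscape case mentioned in the comment above; the landscape frame values are assumptions (the iPad landscape keyboard was roughly 352 points tall at the time), not measurements from the original app:
// Assumed variant: choose the forced frame from the current interface orientation.
-(void) forceKeyboard
{
    UIInterfaceOrientation orientation = [UIApplication sharedApplication].statusBarOrientation;
    if (UIInterfaceOrientationIsLandscape(orientation)) {
        inputAccessoryView.superview.frame = CGRectMake(0, 416, 1024, 352); // assumed landscape values
    } else {
        inputAccessoryView.superview.frame = CGRectMake(0, 759, 768, 265);
    }
}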
Then, in our applicationDidFinishLaunching, we added this notification observer so we get an event any time a text field begins editing:
// Set up the text field notifications
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(textFieldBegan:)
                                             name:UITextFieldTextDidBeginEditingNotification
                                           object:nil];
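For completeness (this teardown is my addition, not part of the original answer), the observer can be removed when the delegate goes away:
// Hypothetical teardown for the observer added above (pre-ARC, to match the answer).
- (void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:UITextFieldTextDidBeginEditingNotification
                                                  object:nil];
    [inputAccessoryView release];
    [super dealloc];
}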
Hope that helps!
There’s no way to do this with the current SDK. Please let Apple know via the Bug Reporter.
The solutions here didn't work on iOS 13 or aren't App Store compatible, so I solved the problem by creating my own soft keyboard. It is pretty basic but works. Feel free to contribute!
Project on Github
All you have to do is add SoftKeyboardView.swift to your project and somewhere (e.g. appDidFinishLaunching) hit the singleton:
Usage:
SoftKeyboardManager.shared.disabled = false
Since I have the same problem, the closest solution I have found is to use Erica Sadun's app called KeysPlease, which is available via Cydia and ModMyi. Its description is "Use soft kb even when connected to a BT kb."
Additionally, I have found that if you also have a physical keyboard attached, in my case via the iPad keyboard dock, you can bring up the soft keyboard using a key which seems to map to the eject key on a Bluetooth keyboard. Perhaps there is a way to inject this key as if it were pressed on an attached keyboard?
I really wish there was a more official coding solution to this.
When my app connects to a Bluetooth device, the keyboard wouldn't show. I tried forcing the frame of the inputAccessoryView as Brian Robbins suggests, but it didn't work.
Then I used a crude workaround: I found that if I tap the text field or text view one more time, the keyboard shows.
So I just need to simulate a touch in the text field or text view once, and it works.
If you want to simulate touches, check this:
https://github.com/HUYU2048/PTFakeTouch

iOS 4: Remote controls for background audio

I'm currently attempting to set up background audio for an app I'm developing for iOS 4. Unlike other background audio apps such as Pandora, however, the app doesn't have a dedicated music-player view controller, which makes the task a bit more confusing.
I've set the appropriate Info.plist settings correctly and have an AVAudioPlayer object in my app delegate which is accessible from everywhere. When the user plays a song, I replace the AVAudioPlayer with a new one initialized with the song and play it. This all works great, except now I have no idea how to go about supporting remote control events.
Based on Apple's documentation, I have this:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            if ([iPhoneAppDelegate backgroundAudioPlayer].playing)
                [iPhoneAppDelegate pauseBackgroundAudioPlayer];
            else
                [iPhoneAppDelegate playBackgroundAudioPlayer];
            break;
    }
}
The thing is, where do I put this? Apple's documentation seems to suggest it should go in some view controller, but my app has lots of view controllers and navigation controllers. Wherever I try to put it, for some reason tapping the Toggle Play/Pause button in the multitasking tray's remote controls either causes the song to pause for a moment and then unpause, or somehow causes it to play twice.
The documentation examples are a bit misleading, but there is no need to subclass anything anywhere. The correct place to put remoteControlReceivedWithEvent: is in the application delegate, as it remains in the responder chain regardless of whether the app is in the foreground or not. Also the begin/end receiving remote control events should be based on whether you actually need the events, not on the visibility of some random view.
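A minimal sketch of that app-delegate approach, assuming the backgroundAudioPlayer property and the playBackgroundAudioPlayer helper named in the question, and an app delegate that is a UIResponder subclass:
// Only listen for remote-control events while playback actually needs them.
- (void)playBackgroundAudioPlayer
{
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self.backgroundAudioPlayer play];
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type != UIEventTypeRemoteControl) return;
    if (event.subtype == UIEventSubtypeRemoteControlTogglePlayPause) {
        if (self.backgroundAudioPlayer.playing)
            [self.backgroundAudioPlayer pause];
        else
            [self.backgroundAudioPlayer play];
    }
}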
I found a couple of solutions to receiving global remote control events on the Apple Developer Forums after a bit of searching.
One way is to subclass UIWindow and override its remoteControlReceivedWithEvent:.
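A sketch of that first approach (the class name and the forwarding target are illustrative, not from the forum post):
// UIWindow subclass that catches remote-control events at the end of the
// responder chain and hands them to the app delegate (assumed to own the player).
@interface RemoteControlWindow : UIWindow
@end

@implementation RemoteControlWindow

- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    if (event.type == UIEventTypeRemoteControl) {
        [(id)[[UIApplication sharedApplication] delegate] remoteControlReceivedWithEvent:event];
    } else {
        [super remoteControlReceivedWithEvent:event];
    }
}

@end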
The second, perhaps nicer way is to subclass UIApplication and override sendEvent:. That way, you can intercept all the remote control events and handle them there globally, and not have any other responders handle them later in the responder chain.
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Handle event
    } else {
        [super sendEvent:event];
    }
}
The second method didn't work for me; sendEvent: was never called. However, the first method (subclassing UIWindow) worked just fine.
I struggled with this one for a while and none of the answers above worked. The bug in my code, and I hope this helps someone reading this, was that I had the audio session set to mix with others. You want to be the foreground audio player to get remote-control events. Check whether you have INCORRECT code like this:
[[AVAudioSession sharedInstance] setDelegate: self];
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error: nil];

UInt32 doSetProperty = 1;   // TRUE = mix with others, which is what blocks remote-control events
AudioSessionSetProperty (
    kAudioSessionProperty_OverrideCategoryMixWithOthers,
    sizeof (doSetProperty),
    &doSetProperty
);

NSError *activationError = nil;
[[AVAudioSession sharedInstance] setActive: YES error: &activationError];
And remove the AudioSessionSetProperty call, or change doSetProperty to 0.
No need to subclass Window or forward events. Simply handle it from your main view controller. See the Audio Mixer (MixerHost) example for details.
http://developer.apple.com/LIBRARY/IOS/#samplecode/MixerHost/Listings/Classes_MixerHostViewController_m.html
The documentation also explains it very well:
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Remote-ControlEvents/Remote-ControlEvents.html
One thing that seems to influence this behavior is any category options you set for your AVAudioSession using setCategory:withOptions:error: instead of just setCategory:error:. In particular, from trial and error, it appears that if you set AVAudioSessionCategoryOptionMixWithOthers you will not get remote control events; the now playing controls will still control the iPod app. If you set AVAudioSessionCategoryOptionDuckOthers you will get remote control events, but it seems like there may be some ambiguity regarding which app is controlled. Setting the categoryOptions to 0 or just calling setCategory:error: works best.
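A minimal sketch of the session setup that, per the above, reliably gets remote-control events (the category constants are from AVFoundation; everything else here is my assumption):
// Plain playback category with no MixWithOthers/DuckOthers options,
// so the app becomes the "now playing" app and receives remote-control events.
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                 withOptions:0
                                       error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];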

Why doesn't the undo/redo panel appear when I start a shake gesture in iPhone Simulator?

I've created an NSUndoManager for the Managed Object Context of Core Data, like this:
NSUndoManager *undoManager = [[NSUndoManager alloc] init];
[undoManager setLevelsOfUndo:10];
[managedObjectContext setUndoManager:undoManager];
[undoManager release];
In the app delegate's didFinishLaunching method, I did this:
application.applicationSupportsShakeToEdit = YES;
For some reason, I never get that undo/redo panel when I make a shake gesture in iPhone Simulator (from the menu). Must I enable undo/redo somewhere else, maybe in the Info.plist file?
As lukya pointed out, you already asked this question back in March.
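For what it's worth (an illustrative note, not from the original answers): the shake panel only appears when some responder in the chain returns a non-nil undoManager, so a sketch like the following, in the view controller that edits the context, is usually also needed:
// Let shake-to-edit find the Core Data context's undo manager.
- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (NSUndoManager *)undoManager {
    return managedObjectContext.undoManager;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self becomeFirstResponder];
}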

Using lock screen for my app?

I'd like to make my app use the audio buttons on the lock screen while multitasking.
(Yes, like Pandora.)
What API am I looking to use?
See the Remote Control of Multimedia docs. Basically, you just need to call -beginReceivingRemoteControlEvents on your shared application instance, then register something (probably your main view controller) as the first responder and implement the -remoteControlReceivedWithEvent: method on it. You will get events both from the lock-screen controls and from the headset clicker, as well as the control buttons to the left of the multitasking drawer. To play audio while your application isn't foremost, you should also check out this information on background audio.
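A compact sketch of that pre-iOS 7 setup in the main view controller; the player property is an assumption (an AVAudioPlayer here):
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlPlay:
            [self.player play];
            break;
        case UIEventSubtypeRemoteControlPause:
            [self.player pause];
            break;
        case UIEventSubtypeRemoteControlTogglePlayPause:
            if (self.player.playing) [self.player pause];
            else                     [self.player play];
            break;
        default:
            break;
    }
}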
It is even easier now, as of iOS 7. Here's the example for the play/pause toggle (headset button). See the docs for MPRemoteCommandCenter and MPRemoteCommand for more options.
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[commandCenter.togglePlayPauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent * _Nonnull event) {
    NSLog(@"toggle button pressed");
    return MPRemoteCommandHandlerStatusSuccess;
}];
or, if you prefer to use a method instead of a block:
[commandCenter.togglePlayPauseCommand addTarget:self action:@selector(toggleButtonAction)];
To stop:
[commandCenter.togglePlayPauseCommand removeTarget:self];
or:
[commandCenter.togglePlayPauseCommand removeTarget:self action:@selector(toggleButtonAction)];
You'll need to add this to the includes area of your file:
@import MediaPlayer;