Detect user input using headsets - iPhone

I'm trying to detect user input (a click) on the headphones connected to an iPhone. So far I've only found how to detect interruptions using AVAudioSession. Is AVAudioSession the right approach, or is there another way? If so, how?

You want this:
beginReceivingRemoteControlEvents
You implement something like this in one of your view controller classes:
// If using a nonmixable audio session category, as this app does, you must activate reception of
// remote-control events to allow reactivation of the audio session when running in the background.
// Also, to receive remote-control events, the app must be eligible to become the first responder.
- (void) viewDidAppear: (BOOL) animated {
    [super viewDidAppear: animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL) canBecomeFirstResponder {
    return YES;
}
// Respond to remote control events
- (void) remoteControlReceivedWithEvent: (UIEvent *) receivedEvent {
    if (receivedEvent.type == UIEventTypeRemoteControl) {
        switch (receivedEvent.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                [self playOrStop: nil];
                break;
            default:
                break;
        }
    }
}
See the sample code here.
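For symmetry, here is a minimal sketch (my addition, not from the sample code) of tearing reception down again when the view goes away:
- (void) viewWillDisappear: (BOOL) animated {
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
    [super viewWillDisappear: animated];
}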

It is even easier now, as of iOS 7.1. To execute a block when the headphone play/pause button is pressed:
MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[commandCenter.togglePlayPauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent * _Nonnull event) {
    NSLog(@"toggle button pressed");
    return MPRemoteCommandHandlerStatusSuccess;
}];
or, if you prefer to use a method instead of a block:
[commandCenter.togglePlayPauseCommand addTarget:self action:@selector(toggleButtonAction)];
To stop:
[commandCenter.togglePlayPauseCommand removeTarget:self];
or:
[commandCenter.togglePlayPauseCommand removeTarget:self action:@selector(toggleButtonAction)];
You'll need to add this to the includes area of your file:
@import MediaPlayer;

Related

How to assign AVPlayer play/pause button iOS?

I asked this question a few days ago, but nobody answered and I could not find the solution.
So I want to ask the same question again; please answer if you know the answer.
I want to assign the buttons in the picture to my AVPlayer's play/pause button.
Note: my application's icon appears in the Now Playing bar instead of the Music icon, and my application works fine in the background.
Any help please.
To allow delivery of remote-control events, you must call the beginReceivingRemoteControlEvents method of UIApplication.
You then respond to the remote control events by implementing the remoteControlReceivedWithEvent: method like this:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    ...
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [[UIApplication sharedApplication] becomeFirstResponder];
    return YES;
}
- (void)remoteControlReceivedWithEvent:(UIEvent *)event
{
    switch (event.subtype)
    {
        case UIEventSubtypeRemoteControlPlay:
            // play code
            ...
            break;
        case UIEventSubtypeRemoteControlTogglePlayPause:
            // toggle code
            ...
            break;
        case UIEventSubtypeRemoteControlNextTrack:
            // next code
            ...
            break;
        default:
            break;
    }
}
Note:
"In iOS 7.1 and later, use the shared MPRemoteCommandCenter object to register for remote control events. You do not need to call this method (note: beginReceivingRemoteControlEvents) when using the shared command center object."
(Source)
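Building on that note, here is a minimal sketch of the MPRemoteCommandCenter route for this case (the `self.player` AVPlayer property is an assumption, not from the question):
@import MediaPlayer;

MPRemoteCommandCenter *commandCenter = [MPRemoteCommandCenter sharedCommandCenter];
[commandCenter.playCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    [self.player play];
    return MPRemoteCommandHandlerStatusSuccess;
}];
[commandCenter.pauseCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    [self.player pause];
    return MPRemoteCommandHandlerStatusSuccess;
}];
[commandCenter.nextTrackCommand addTargetWithHandler:^MPRemoteCommandHandlerStatus(MPRemoteCommandEvent *event) {
    // advance to the next item here
    return MPRemoteCommandHandlerStatusSuccess;
}];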

Swallow touches, unless touching a child of my current layer

I am writing a pause menu using a CCLayer. I need the layer to swallow touches so that you cannot press the layer below, however I also need to be able to use the buttons on the pause layer itself.
I can get the layer to swallow touches, but then the menu doesn't work either.
Here is my code:
pauseLayer.m
#import "PauseLayer.h"
#implementation PauseLayer
#synthesize delegate;
+ (id) layerWithColor:(ccColor4B)color delegate:(id)_delegate
{
return [[[self alloc] initWithColor:color delegate:_delegate] autorelease];
}
- (id) initWithColor:(ccColor4B)c delegate:(id)_delegate {
self = [super initWithColor:c];
if (self != nil) {
NSLog(#"Init");
self.isTouchEnabled = YES;
CGSize wins = [[CCDirector sharedDirector] winSize];
delegate = _delegate;
[self pauseDelegate];
CCSprite * background = [CCSprite spriteWithFile:#"pause_background.png"];
[self addChild:background];
CCMenuItemImage *resume = [CCMenuItemImage itemFromNormalImage:#"back_normal.png"
selectedImage:#"back_clicked.png"
target:self
selector:#selector(doResume:)];
resume.tag = 10;
CCMenu * menu = [CCMenu menuWithItems:resume,nil];
[menu setPosition:ccp(0,0)];
[resume setPosition:ccp([background boundingBox].size.width/2,[background boundingBox].size.height/2)];
[background addChild:menu];
[background setPosition:ccp(wins.width/2,wins.height/2)];
}
return self;
}
-(void)pauseDelegate
{
NSLog(#"pause delegate");
if([delegate respondsToSelector:#selector(pauseLayerDidPause)])
[delegate pauseLayerDidPause];
}
-(void)doResume: (id)sender
{
if([delegate respondsToSelector:#selector(pauseLayerDidUnpause)])
[delegate pauseLayerDidUnpause];
[self.parent removeChild:self cleanup:YES];
}
- (void)onEnter {
[[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:INT_MIN+1 swallowsTouches:YES];
[super onEnter];
}
- (void)onExit {
[[CCTouchDispatcher sharedDispatcher] removeDelegate:self];
[super onExit];
}
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
return YES;
}
-(void)dealloc
{
[super dealloc];
}
#end
Why don't you just disable touches on the game layer?
For example, in the onEnter method disable touches on the game layer, and in onExit re-enable them. Something like:
- (void)onEnter {
    gameLayer.isTouchEnabled = NO;
    ....
}
- (void)onExit {
    gameLayer.isTouchEnabled = YES;
    ...
}
Also, you won't need CCTouchDispatcher.
According to your code, the problem is that the modal layer is swallowing events even when they are meant for its own children.
To solve this, you have to give the children a higher touch priority than the modal layer itself.
In other words, set the menu's touch priority value below the modal layer's.
There are two solutions.
Simply override "CCMenu::registerWithTouchDispatcher()" and set the higher priority from the beginning.
Change the menu's touch priority using "setPriority" of the touchDispatcher or "setHandlerPriority" of the menu itself.
For the second solution, you have to pay attention to timing.
"CCMenu::registerWithTouchDispatcher()" is called somewhere AFTER "onEnter" and "onEnterTransitionDidFinish".
So use "scheduleOnce" or something like that.
Sample code:
- (id) initWithColor:(ccColor4B)c delegate:(id)_delegate {
    self = [super initWithColor:c];
    if (self != nil) {
        // your code ... keep the CCMenu in an instance variable
        [self scheduleOnce:@selector(setMenuPriority:) delay:0];
    }
    return self;
}

- (void) setMenuPriority:(ccTime)dt {
    // priority INT_MIN because you set the layer's priority to INT_MIN+1
    [[[CCDirector sharedDirector] touchDispatcher] setPriority:INT_MIN forDelegate:menu];
}
PS: My English is not so good, so if there are any loose sentences, corrections are very welcome :)
The problem is that the layer/node hierarchy is not considered when propagating touches.
Touches are handed from the touch handler with the smallest priority value to the one with the largest, and they are not forwarded any further once one of the responders swallows the touch.
You can use an approach similar to CCMenu: CCMenu handles all touches and detects which CCMenuItem has been selected, based on the position of those items.
If you implement this the same way, you let your PauseLayer handle and swallow all touches and use a separate mechanism to determine which child element in your PauseLayer has been selected.
Example Implementation: CCMenu Subclass
I have implemented a solution in this repository:
https://github.com/Ben-G/MGWU-Widgets/tree/master/Projectfiles/Components/CCMenuBlocking
That component is a CCMenuSubclass that swallows all touches and does not forward them.
Example Implementation: CCNode
Here is a more general solution of a CCNode that swallows touches and only forwards them to its own children:
https://github.com/Ben-G/MGWU-Widgets/blob/master/Projectfiles/Components/Popup/PopUp.m
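As a rough, untested sketch of that idea (the real implementations are in the repositories above; the child hit-test below is only my assumption of one way to do it), the pause layer claims every touch and decides for itself whether it landed on one of its own children:
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint location = [self convertTouchToNodeSpace:touch];
    for (CCNode *child in [self children]) {
        if (CGRectContainsPoint([child boundingBox], location)) {
            // The touch is for one of our own children - handle or forward it here.
            return YES;
        }
    }
    // Not on a child: still claim the touch so it never reaches the layers underneath.
    return YES;
}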

iOS: How to access the `UIKeyboard`?

I want to get a pointer reference to UIKeyboard *keyboard to the keyboard on screen so that I can add a transparent subview to it, covering it completely, to achieve the effect of disabling the UIKeyboard without hiding it.
In doing this, can I assume that there's only one UIKeyboard on the screen at a time? I.e., is it a singleton? Where's the method [UIKeyboard sharedInstance]? Brownie points if you implement that method via a category. Or, even more brownie points if you convince me why it's a bad idea to assume only one keyboard and give me a better solution.
Try this:
// my func
- (void) findKeyboard {
    // Locate non-UIWindow.
    UIWindow *keyboardWindow = nil;
    for (UIWindow *testWindow in [[UIApplication sharedApplication] windows]) {
        if (![[testWindow class] isEqual:[UIWindow class]]) {
            keyboardWindow = testWindow;
            break;
        }
    }

    // Locate UIKeyboard.
    UIView *foundKeyboard = nil;
    for (UIView *possibleKeyboard in [keyboardWindow subviews]) {
        // iOS 4 sticks the UIKeyboard inside a UIPeripheralHostView.
        if ([[possibleKeyboard description] hasPrefix:@"<UIPeripheralHostView"]) {
            possibleKeyboard = [[possibleKeyboard subviews] objectAtIndex:0];
        }
        if ([[possibleKeyboard description] hasPrefix:@"<UIKeyboard"]) {
            foundKeyboard = possibleKeyboard;
            break;
        }
    }
}
How about using -[UIApplication beginIgnoringInteractionEvents]?
Also, another trick to get the view containing the keyboard is to initialize a dummy view with CGRectZero and set it as the inputAccessoryView of your UITextField or UITextView. Then get its superview. Such shenanigans are private/undocumented, but I've heard of apps doing that and getting accepted anyhow. I mean, how else would Instagram be able to make their comment keyboard interactive (dismiss on swipe) like the Messages keyboard?
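A minimal sketch of that inputAccessoryView trick, assuming an existing `myTextField` (the names here are illustrative, not an official API):
// Attach a zero-sized accessory view so we can later walk up to the keyboard's container.
UIView *dummyAccessory = [[UIView alloc] initWithFrame:CGRectZero];
myTextField.inputAccessoryView = dummyAccessory;
// Once the keyboard is on screen, the accessory's superview sits inside the
// keyboard's host view hierarchy:
UIView *keyboardContainer = dummyAccessory.superview;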
I found that developerdoug's answer wasn't working on iOS 7, but by modifying things slightly I managed to get access to what I needed. Here's the code I used:
-(UIView *)findKeyboard
{
    UIView *keyboard = nil;
    for (UIWindow *window in [UIApplication sharedApplication].windows)
    {
        for (UIView *possibleKeyboard in window.subviews)
        {
            if ([[possibleKeyboard description] hasPrefix:@"<UIPeripheralHostView"])
            {
                keyboard = possibleKeyboard;
                break;
            }
        }
    }
    return keyboard;
}
From what I could make out, in iOS 7 the keyboard is composed of a UIPeripheralHostView containing two subviews: a UIKBInputBackdropView (which provides the blur effect on whatever's underneath the keyboard) and a UIKeyboardAutomatic (which provides the character keys). Manipulating the UIPeripheralHostView seems to be equivalent to manipulating the entire keyboard.
Disclaimer: I have no idea whether Apple will accept an app that uses this technique, nor whether it will still work in future SDKs.
Be aware, Apple has made it clear that applications which modify private view hierarchies without explicit approval beforehand will be rejected. Take a look in the Apple Developer Forums for various developers' experience on the issue.
If you're just trying to disable the keyboard (prevent it from receiving touches), you might try adding a transparent UIView that is the full size of the screen for the current orientation. If you add it as a subview of the main window, it might work. Apple hasn't provided any public way of disabling the keyboard that I'm aware of; you might want to use one of your support incidents with Apple, as maybe they will let you in on the solution.
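A minimal, untested sketch of that suggestion, assuming the app delegate's `self.window` is the main window (and noting, as the answer does, that it may or may not actually cover the keyboard, which lives in its own window):
// Transparent full-screen view that swallows touches.
UIView *blocker = [[UIView alloc] initWithFrame:self.window.bounds];
blocker.backgroundColor = [UIColor clearColor];
blocker.userInteractionEnabled = YES;
[self.window addSubview:blocker];
// ... later, to re-enable input:
[blocker removeFromSuperview];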
For an app I am currently developing I am using a really quick and easy method:
Add this in the header file:
// Add in interface
UIWindow * _window;
// Add as property
@property (strong, nonatomic) IBOutlet UIView * _keyboard;
Then add this code in the bottom of the keyboardWillShow function:
-(void) keyboardWillShow: (NSNotification *) notification {
    .... // other keyboard will show code //
    _window = [UIApplication sharedApplication].windows.lastObject;
    [NSTimer scheduledTimerWithTimeInterval:0.05
                                     target:self
                                   selector:@selector(allocateKeyboard)
                                   userInfo:nil
                                    repeats:NO];
}
This code looks for when the keyboard is raised and then grabs the current window. I then added a timer before grabbing the keyboard itself, as there were some issues when doing it immediately.
- (void)allocateKeyboard {
    if (!_keyboard) {
        if (_window.subviews.count) {
            // The keyboard is always the 0th subview
            _keyboard = _window.subviews[0];
        }
    }
}
We now have a reference to the keyboard, which gives you direct "access" to it as the question asks.
Hope this helps
Under iOS 8 it appears you have to jump down the chain more than in the past. The following works for me to get the keyboard, although with custom keyboards available and such I wouldn't rely on this working unless you're running in a controlled environment.
- (UIView *)findKeyboard {
    for (UIWindow *window in [UIApplication sharedApplication].windows) {
        UIView *inputSetContainer = [self viewWithPrefix:@"<UIInputSetContainerView" inView:window];
        if (inputSetContainer) {
            UIView *inputSetHost = [self viewWithPrefix:@"<UIInputSetHostView" inView:inputSetContainer];
            if (inputSetHost) {
                UIView *kbinputbackdrop = [self viewWithPrefix:@"<_UIKBCompatInput" inView:inputSetHost];
                if (kbinputbackdrop) {
                    UIView *theKeyboard = [self viewWithPrefix:@"<UIKeyboard" inView:kbinputbackdrop];
                    return theKeyboard;
                }
            }
        }
    }
    return nil;
}

- (UIView *)viewWithPrefix:(NSString *)prefix inView:(UIView *)view {
    for (UIView *subview in view.subviews) {
        if ([[subview description] hasPrefix:prefix]) {
            return subview;
        }
    }
    return nil;
}

iPhone keyboard touch events

I need to be able to detect touch events on the keyboard. I have an app which shows a screen after a certain period of inactivity (i.e. no touch events). To handle this, I have subclassed UIWindow and overridden the sendEvent: method, which lets me catch touch events for the whole application in one place. This works everywhere except when the keyboard is presented and the user is typing on it. What I need to know is: is there a way to detect touch events on the keyboard, kind of like what sendEvent: does for UIWindow? Thanks in advance.
I found a solution to the problem. If you observe the following notifications, you get an event whenever a key is pressed. I added these notifications in my custom UIWindow class, so doing it in one place allows me to get these touch events throughout the application.
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(keyPressed:) name:UITextFieldTextDidChangeNotification object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(keyPressed:) name:UITextViewTextDidChangeNotification object:nil];

- (void)keyPressed:(NSNotification *)notification
{
    [self resetIdleTimer];
}
Anyway, hope it helps someone else.
iPhoneDev: here is what I am doing.
I have a custom UIWindow object. In this object there is an NSTimer that is reset whenever there is a touch. To get this touch you have to override the sendEvent: method of UIWindow.
This is what the sendEvent: method looks like in my custom window class:
- (void)sendEvent:(UIEvent *)event
{
    if ([super respondsToSelector:@selector(sendEvent:)])
    {
        [super sendEvent:event];
    }
    else
    {
        NSLog(@"%@", @"CUSTOM_Window super does NOT respond to selector sendEvent:!");
        ASSERT(false);
    }

    // Only want to reset the timer on a Began touch or an Ended touch, to reduce the number of timer resets.
    NSSet *allTouches = [event allTouches];
    if ([allTouches count] > 0)
    {
        // anyObject works here.
        UITouchPhase phase = ((UITouch *)[allTouches anyObject]).phase;
        if (phase == UITouchPhaseBegan || phase == UITouchPhaseEnded)
        {
            [self resetIdleTimer];
        }
    }
}
Here is the resetIdleTimer:
- (void)resetIdleTimer
{
    if (self.idleTimer)
    {
        [self.idleTimer invalidate];
    }
    self.idleTimer = [NSTimer scheduledTimerWithTimeInterval:PASSWORD_TIMEOUT_INTERVAL target:self selector:@selector(idleTimerExceeded) userInfo:nil repeats:NO];
}
After this, in idleTimerExceeded, I send a message to the window delegate (in this case, the appDelegate).
- (void)idleTimerExceeded
{
    [MY_CUSTOM_WINDOW_Delegate idleTimeLimitExceeded];
}
When I create this custom window object in the appDelegate, I set the appDelegate as the delegate for this window. The appDelegate's implementation of idleTimeLimitExceeded is where I do what I have to when the timer expires. The key thing is to create the custom window and override the sendEvent: method. Combine this with the two keyboard notifications shown above (which I added in the init method of the custom window class) and you should be able to get 99% of all touch events anywhere on screen in the application.

Lock/unlock events - iPhone

How can I detect lock/unlock events on the iPhone? Assuming it's only possible for jailbroken devices, can you point me to the correct API?
By lock events, I mean showing or hiding the Lock Screen (which might need a password to unlock, or not).
You can use Darwin notifications to listen for the events. From my testing on a jailbroken iOS 5.0.1 iPhone 4, I think that one of these events might be what you need:
com.apple.springboard.lockstate
com.apple.springboard.lockcomplete
Note: according to the poster's comments to a similar question I answered here, this should work on a non-jailbroken phone, too.
To use this, register for the event like this (this registers for just the first event above, but you can add an observer for lockcomplete, too):
CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(), // center
                                (void*)self,       // observer (can be NULL)
                                lockStateChanged,  // callback
                                CFSTR("com.apple.springboard.lockstate"), // event name
                                NULL,              // object
                                CFNotificationSuspensionBehaviorDeliverImmediately);
where lockStateChanged is your event callback:
static void lockStateChanged(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo) {
    NSLog(@"event received!");
    if (observer != NULL) {
        MyClass *this = (MyClass *)observer;
    }
    // you might try inspecting the `userInfo` dictionary, to see
    // if it contains any useful info
    if (userInfo != nil) {
        CFShow(userInfo);
    }
}
The lockstate event occurs when the device is locked and unlocked, but the lockcomplete event is only triggered when the device locks. Another way to determine whether the event is for a lock or unlock event is to use notify_get_state(). You'll get a different value for lock vs. unlock, as described here.
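A minimal sketch of querying the state with notify_get_state() on demand (my assumption of how to apply it here; the zero/non-zero meaning follows the answers below):
#import <notify.h>

static BOOL deviceIsLocked(void) {
    int token = 0;
    uint64_t state = UINT64_MAX;
    notify_register_check("com.apple.springboard.lockstate", &token);
    notify_get_state(token, &state);
    notify_cancel(token);
    return state != 0; // observed: non-zero while locked, zero when unlocked
}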
Roundabout answer:
applicationWillResignActive: gets called in all sorts of scenarios, and from all my testing, even if your application stays awake while backgrounded, there is no way to determine that the screen is locked (CPU speed isn't reported, bus speed remains the same, mach_time denom/numer doesn't change)...
However, it seems Apple does turn off the accelerometer when the device is locked... Enable iPhone accelerometer while screen is locked
(tested iOS4.2 on iPhone 4 has this behavior)
Thus...
In your application delegate:
- (void)applicationWillResignActive:(UIApplication *)application
{
    NSLog(@"STATUS - Application will Resign Active");
    // Start checking the accelerometer (while we are in the background)
    [[UIAccelerometer sharedAccelerometer] setDelegate:self];
    [[UIAccelerometer sharedAccelerometer] setUpdateInterval:1]; // Ping every second
    _notActiveTimer = [NSTimer scheduledTimerWithTimeInterval:2 target:self selector:@selector(deviceDidLock) userInfo:nil repeats:NO]; // 2 seconds for wiggle
}

// Deprecated in iOS 5
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    NSLog(@"STATUS - Update from accelerometer");
    [_notActiveTimer invalidate];
    _notActiveTimer = [NSTimer scheduledTimerWithTimeInterval:2 target:self selector:@selector(deviceDidLock) userInfo:nil repeats:NO];
}

- (void)deviceDidLock
{
    NSLog(@"STATUS - Device locked!");
    [[UIAccelerometer sharedAccelerometer] setDelegate:nil];
    _notActiveTimer = nil;
}

- (void)applicationDidBecomeActive:(UIApplication *)application
{
    NSLog(@"STATUS - Application did become active");
    [[UIAccelerometer sharedAccelerometer] setDelegate:nil];
    [_notActiveTimer invalidate];
    _notActiveTimer = nil;
}
I know... It's kind of a hack, but it has worked like a charm for me so far. Please update if you see any issues that prevent this from working.
There is a prettier way of telling apart task switching and screen locking-originated applicationWillResignActive: callbacks which doesn't even involve undocumented features such as the accelerometer state.
When the app is moving to the background, the app delegate is first sent an applicationWillResignActive:, then an applicationDidEnterBackground:. When the app is interrupted by pressing the Lock button or by an incoming phone call, the latter method is not called. We can use this information to distinguish between the two scenarios.
Say you want to be called back in the screenLockActivated method if the screen gets locked. Here's the magic:
- (void)applicationWillResignActive:(UIApplication*)aApplication
{
    [self performSelector:@selector(screenLockActivated)
               withObject:nil
               afterDelay:0];
}

- (void)applicationDidEnterBackground:(UIApplication*)aApplication
{
    [NSObject cancelPreviousPerformRequestsWithTarget:self];
}

- (void)screenLockActivated
{
    NSLog(@"yaay");
}
Explanation:
By default, we assume that every call to applicationWillResignActive: is because of an active->inactive state transition (as when locking the screen) but we generously let the system prove the contrary within a timeout (in this case, a single runloop cycle) by delaying the call to screenLockActivated. In case the screen gets locked, the system finishes the current runloop cycle without touching any other delegate methods. If, however, this is an active->background state transition, it also invokes applicationDidEnterBackground: before the end of the cycle, which allows us to simply cancel the previously scheduled request from there, thus preventing it from being called when it's not supposed to.
Enjoy!
As of the time of writing there are two fairly reliable ways to detect device locking:
Data Protection
By enabling the Data Protection entitlement your app can subscribe to the applicationProtectedDataWillBecomeUnavailable: and applicationProtectedDataDidBecomeAvailable: notifications to determine with high probability when a device that uses passcode/TouchID Authentication is locked/unlocked. To determine if a device uses a passcode/TouchID LAContext can be queried.
Caveats: This method relies on the "protected data becoming unavailable" coinciding with the phone being locked. When the phone is using TouchID and the sleep/lock button is pressed then the phone is locked, protected data becomes unavailable, and a passcode will immediately be required to unlock it again. This means that protected data becoming unavailable essentially indicates that the phone has been locked. This is not necessarily true when someone is using just a passcode since they can set the "requires passcode" time to anywhere from immediately to something like 4 hours. In this case the phone will report being able to handle protected data but locking the phone will not result in protected data becoming unavailable for quite some time.
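A minimal sketch of this approach in the app delegate (assuming the Data Protection entitlement is enabled; the passcode check via LAContext is my illustration):
#import <LocalAuthentication/LocalAuthentication.h>

- (void)applicationProtectedDataWillBecomeUnavailable:(UIApplication *)application {
    // With Touch ID, or a passcode set to require immediately, this is a strong
    // hint that the device has just been locked.
    NSLog(@"Protected data unavailable - device probably locked");
}

- (void)applicationProtectedDataDidBecomeAvailable:(UIApplication *)application {
    NSLog(@"Protected data available - device unlocked");
}

// Whether a passcode/Touch ID is set at all:
- (BOOL)devicePasscodeIsSet {
    LAContext *context = [[LAContext alloc] init];
    NSError *error = nil;
    return [context canEvaluatePolicy:LAPolicyDeviceOwnerAuthentication error:&error];
}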
Lifecycle Timing
If your app is in the foreground there will be a noticeable change in time difference between the two lifecycle events UIApplicationWillResignActiveNotification and UIApplicationDidEnterBackgroundNotification depending on what triggers them.
(This was tested in iOS 10 and may change in future releases)
Pressing the home button results in a significant delay between the two (even when the Reduced Motion setting is enabled):
15:23:42.517 willResignActive
15:23:43.182 didEnterBackground
15:23:43.184 difference: 0.666346
Locking the device while the app is open creates a more trivial (<~0.2s) delay between the two events:
15:22:59.236 willResignActive
15:22:59.267 didEnterBackground
15:22:59.267 difference: 0.031404
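A minimal sketch of that timing heuristic in the app delegate (the 0.3 s threshold is illustrative, derived only from the measurements quoted above):
@implementation AppDelegate {
    NSTimeInterval _resignActiveTime;
}

- (void)applicationWillResignActive:(UIApplication *)application {
    _resignActiveTime = [NSDate timeIntervalSinceReferenceDate];
}

- (void)applicationDidEnterBackground:(UIApplication *)application {
    NSTimeInterval delta = [NSDate timeIntervalSinceReferenceDate] - _resignActiveTime;
    if (delta < 0.3) {
        NSLog(@"Short gap (%f s) - probably the lock button", delta);
    } else {
        NSLog(@"Longer gap (%f s) - probably the home button or app switch", delta);
    }
}

@end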
In iOS 8, locking the screen or pressing the home button both push the app into the background, but you can't tell which one caused it. My solution is the same as Nits007ak's: use notify_register_dispatch to get the state.
#import <notify.h>

int notify_token;
notify_register_dispatch("com.apple.springboard.lockstate",
                         &notify_token,
                         dispatch_get_main_queue(),
                         ^(int token)
{
    uint64_t state = UINT64_MAX;
    notify_get_state(token, &state);
    if (state == 0) {
        NSLog(@"unlock device");
    } else {
        NSLog(@"lock device");
    }
});
As long as the app is running, in the foreground or background (not suspended), you can get this event.
And you can use notify_token as a parameter of notify_get_state to get the current state anywhere; this is useful when you want to know the state even while it isn't changing.
If a passcode is set, you can use these events in the AppDelegate:
-(void)applicationProtectedDataWillBecomeUnavailable:(UIApplication *)application
{
}
- (void)applicationProtectedDataDidBecomeAvailable:(UIApplication *)application
{
}
Just add #import <notify.h> before using this code. Enjoy!!
-(void)registerAppforDetectLockState {
    int notify_token;
    notify_register_dispatch("com.apple.springboard.lockstate", &notify_token, dispatch_get_main_queue(), ^(int token) {
        uint64_t state = UINT64_MAX;
        notify_get_state(token, &state);
        if (state == 0) {
            NSLog(@"unlock device");
        } else {
            NSLog(@"lock device");
        }
        NSLog(@"com.apple.springboard.lockstate = %llu", state);

        UILocalNotification *notification = [[UILocalNotification alloc] init];
        notification.repeatInterval = NSDayCalendarUnit;
        [notification setAlertBody:@"Hello world!! I come becoz you lock/unlock your device :)"];
        notification.alertAction = @"View";
        notification.alertAction = @"Yes";
        [notification setFireDate:[NSDate dateWithTimeIntervalSinceNow:1]];
        notification.soundName = UILocalNotificationDefaultSoundName;
        [notification setTimeZone:[NSTimeZone defaultTimeZone]];
        [[UIApplication sharedApplication] presentLocalNotificationNow:notification];
    });
}
From a lot of trial and error, I discovered that monitoring the blank screen, lock complete, and lock state events gives a consistent lock screen indicator. You'll need to monitor a state transition.
// callback
void displayStatusChanged(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo)
{
    // notifications come in the order of
    // "com.apple.springboard.hasBlankedScreen" notification
    // "com.apple.springboard.lockcomplete" notification, only if locked
    // "com.apple.springboard.lockstate" notification
    AppDelegate *appDelegate = (__bridge AppDelegate *)observer; // don't transfer ownership here; the callback fires repeatedly
    NSString *eventName = (__bridge NSString *)name;
    NSLog(@"Darwin notification NAME = %@", name);

    if ([eventName isEqualToString:@"com.apple.springboard.hasBlankedScreen"])
    {
        NSLog(@"SCREEN BLANK");
        appDelegate.bDeviceLocked = false; // clear
    }
    else if ([eventName isEqualToString:@"com.apple.springboard.lockcomplete"])
    {
        NSLog(@"DEVICE LOCK");
        appDelegate.bDeviceLocked = true; // set
    }
    else if ([eventName isEqualToString:@"com.apple.springboard.lockstate"])
    {
        NSLog(@"LOCK STATUS CHANGE");
        if (appDelegate.bDeviceLocked) // if a lock is set
        {
            NSLog(@"DEVICE IS LOCKED");
        }
        else
        {
            NSLog(@"DEVICE IS UNLOCKED");
        }
    }
}
-(void)registerforDeviceLockNotif
{
    // screen and lock notifications
    CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(), // center
                                    CFBridgingRetain(self),                      // observer
                                    displayStatusChanged,                        // callback
                                    CFSTR("com.apple.springboard.hasBlankedScreen"), // event name
                                    NULL,                                        // object
                                    CFNotificationSuspensionBehaviorDeliverImmediately);

    CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(), // center
                                    CFBridgingRetain(self),                      // observer
                                    displayStatusChanged,                        // callback
                                    CFSTR("com.apple.springboard.lockcomplete"), // event name
                                    NULL,                                        // object
                                    CFNotificationSuspensionBehaviorDeliverImmediately);

    CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(), // center
                                    CFBridgingRetain(self),                      // observer
                                    displayStatusChanged,                        // callback
                                    CFSTR("com.apple.springboard.lockstate"),    // event name
                                    NULL,                                        // object
                                    CFNotificationSuspensionBehaviorDeliverImmediately);
}
To have the screen lock indicators run in the background, you need to enable background processing by calling the following when the app launches:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.backgroundTaskIdentifier =
        [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{
            [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTaskIdentifier];
        }];
    [self registerforDeviceLockNotif];
    return YES;
}
If your app is running and the user locks the device, your app delegate will receive a call to applicationWillResignActive:. If your app was running when the device was locked, it will receive a call to applicationDidBecomeActive: when the device is unlocked. But you get the same calls if the user gets a phone call and then chooses to ignore it. You can't tell the difference as far as I know.
And if your app wasn't running at any of these times, there is no way to be notified, since your app isn't running.
The simplest way to get screen lock and unlock events is to add observers using NSNotificationCenter in your view controller. I added the following observer in the viewDidLoad method. This is what I did:
[[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(applicationEnteredForeground:)
                                              name:UIApplicationWillEnterForegroundNotification
                                            object:nil];
Then I added the following selector to the viewcontroller. This selector will get called when the screen is unlocked.
- (void)applicationEnteredForeground:(NSNotification *)notification {
    NSLog(@"Application Entered Foreground");
}
If you want to detect the event when the screen gets locked, you can replace UIApplicationWillEnterForegroundNotification with UIApplicationDidEnterBackgroundNotification.