Dispatching touch events to another app from a service

I am creating an Android activity that starts a service. The service is intended to receive touch events even while the user is using another application.
Its onCreate() method is as follows:
public void onCreate() {
    super.onCreate();
    // create linear layout
    touchLayout = new LinearLayout(this);
    // set layout width to 30 px and height to full screen
    LayoutParams lp = new LayoutParams(30, LayoutParams.MATCH_PARENT);
    touchLayout.setLayoutParams(lp);
    // set a background color if you want the layout visible on screen
    //touchLayout.setBackgroundColor(Color.CYAN);
    // set on-touch listener
    touchLayout.setOnTouchListener(this);
    // fetch the window manager object
    mWindowManager = (WindowManager) getSystemService(WINDOW_SERVICE);
    // set the layout parameters of the window
    WindowManager.LayoutParams mParams = new WindowManager.LayoutParams(
            //30, // width of layout: 30 px
            WindowManager.LayoutParams.MATCH_PARENT,
            WindowManager.LayoutParams.MATCH_PARENT, // height equal to full screen
            WindowManager.LayoutParams.TYPE_PHONE, // TYPE_PHONE: non-application windows providing user interaction with the phone (in particular incoming calls)
            WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE, // this window won't ever get key input focus
            PixelFormat.TRANSLUCENT);
    mParams.gravity = Gravity.LEFT | Gravity.TOP;
    Log.i(TAG, "add View");
    mWindowManager.addView(touchLayout, mParams);
}
In the code above I am creating a window that spans the full height and width of the screen, and I have set it to listen for touch events. But doing so stops other applications from receiving touch events. So I am looking for a way to dispatch the touch events received by my service to the background application on which my window is placed.
Please help!

Sandip,
Adding an overlay view to the window manager automatically places that view at the top of that window's view hierarchy, meaning that it will intercept touches instead of the view(s) behind it. The only way for views behind it to receive touch events is for the user to touch outside of the top view (which is impossible since it spans the entire screen), or for the top view to label itself "untouchable".
So, either restrict the size of your view, or add the flag WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE to your params flags.
In either case, unless the user currently has your app in the foreground, any touch outside of your view will deliver an ACTION_OUTSIDE touch event, but without the coordinates of the touch. (If the user does have your app in the foreground, then you'll receive coordinates with your ACTION_OUTSIDE events.)
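A minimal sketch of the pass-through option, reusing the field names from the question's code (touchLayout and mWindowManager are taken from the question; the flag combination is an assumption): FLAG_NOT_TOUCHABLE lets touches fall through to the app behind, and FLAG_WATCH_OUTSIDE_TOUCH requests the ACTION_OUTSIDE events described above.

```java
// Sketch: the same overlay as in the question, but marked untouchable so
// touches pass through to the app behind it. Note that TYPE_PHONE is
// deprecated on newer Android versions in favor of TYPE_APPLICATION_OVERLAY.
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.TYPE_PHONE,
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE     // touches fall through
                | WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH, // still get ACTION_OUTSIDE
        PixelFormat.TRANSLUCENT);
params.gravity = Gravity.LEFT | Gravity.TOP;
mWindowManager.addView(touchLayout, params);
```

With these flags the overlay never consumes the touch, so the background application keeps working normally; your onTouch listener will only ever see ACTION_OUTSIDE events.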

Related

UIView animation acting weird

My view controller's view has a child view (a "login" panel of sorts, with two input text fields) placed at its center, with autolayout constraints set in Interface Builder. The view controller class also has an outlet referencing the vertical constraint so it can manipulate it at run time.
On startup, in viewDidLayoutSubviews(), I cache the value of the vertical constraint in a property (constraintInitialValue), calculate a value that will hide the panel beneath the view's bounds (based on both the panel's and the view's sizes), and apply that value immediately, effectively hiding the panel before the user sees the view. I also cache this calculated "off-screen" value of the constraint in another property (constraintOffscreenValue) for later use (e.g., to "hide" the panel again).
(I do all this initial setup in viewDidLayoutSubviews() because it is the first opportunity to get the actual bounds of my view controller's view.)
(For the record, the original constraint value is 0.0: center Y, no offset. For an iPhone 6 and the current size of my panel, the "off-screen" value is -453.0.)
In the background, I authenticate the user. If that fails, I animate the login panel back into its original position (the center of the screen).
So far, so good.
Next, when the user enters their credentials and hits the return key for the password (the last input field), I perform some local validation (e.g., strings are not empty) and "dismiss" the panel by animating it back into its off-screen position. If I do it with the following code:
UIView.animateWithDuration(NSTimeInterval(0.3),
    delay: NSTimeInterval(0.0),
    options: UIViewAnimationOptions.CurveEaseOut,
    animations: { () -> Void in
        self.panelVerticalSpaceConstraint.constant = self.constraintOffscreenValue
        self.view.layoutIfNeeded()
    },
    completion: { (finished) -> Void in
    }
)
(viewDidLayoutSubviews() gets called several times during the animation, but from the second call onwards my code just returns right away. The initial setup I mentioned above does not get executed more than once)
My panel moves slightly upwards, and then the same amount downwards, ending at the center of the screen (instead, it should disappear past the bottom edge).
If I try changing the animation duration from 0.3 to 10.0 (to get a proper look at what's happening), instead the panel quickly "jumps" to almost above the upper margin of the view, and slowly animates back into the center. That is, if the initial jump didn't happen, I would obtain the desired result (move to the bottom of the view).
Any suggestions? Thanks in advance...
ADDENDUM: If, instead of trying to animate the panel downwards, I set the constraint to its off-screen value immediately, like this:
self.panelVerticalSpaceConstraint.constant = self.constraintOffscreenValue
self.view.layoutIfNeeded()
...it immediately disappears, only to bounce back into the center of the view right away! Where did that animation come from? Is the constraint somehow "resisting change"?
OK, I found the answer. By some weird quirk of fate, I can only find my mistakes after posting a question to SO (curse me a million times!).
It turns out another, separate animation was being executed concurrently with (and thus interfering with) my lower-the-panel-below-the-bottom-of-the-screen animation:
I happen to have a similar animation that adjusts the panel's position upward when the keyboard appears (a "duck the keyboard" animation) and back down when the keyboard is dismissed, but only on devices where actual overlap occurs (i.e., smaller screens), determined by checking the keyboard's dimensions (passed along with the UIKeyboardWillShowNotification notification) against those of the main view and the panel.
I had completely forgotten about that one because I was testing on the iPhone 6 simulator, whose screen size causes only minimal (almost imperceptible, but nevertheless present) keyboard overlap, and hence a panel-adjustment animation. When I tried the code on the iPhone 6+ simulator (where the keyboard-ducking animation is skipped altogether because no overlap occurs), the problem disappeared completely and all animations behaved as expected.
This other restore-panel-after-ducking-the-keyboard animation is no longer needed in my code, because the keyboard is only dismissed once valid credentials have been entered and authentication begins (lowering the panel all the way down, off-screen). The panel only reappears if the entered credentials fail to be authenticated on the server side.

MKMapKit Follow User

How can I have my MKMapView follow my user around until they scroll, and then have a button to follow the user around again?
Here is the flow I would like it to have.
View Loads:
Zoom in and center on the users current location, then follow the user around.
User scrolls:
Do nothing until a button is pressed
Button pressed:
Same code as 'View Loads'
Your location manager continuously provides you with new location information via the delegate method locationManager:didUpdateToLocation:fromLocation:. Change the map's region whenever you get an update, but first check a flag ("shouldFollowCurrentLocation" or similar) that is set by default. Unset the flag when your map view delegate receives mapView:regionWillChangeAnimated: (you will, of course, have to keep track of the occasions when you cause the region to change programmatically), and set it again in the button's action method.
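The flag logic could be sketched like this in modern Swift (shouldFollowCurrentLocation and isProgrammaticRegionChange are names invented for the sketch; the delegate methods are the current MapKit/CoreLocation equivalents of the ones named above):

```swift
// Assumed state on the view controller:
var shouldFollowCurrentLocation = true   // follow by default
var isProgrammaticRegionChange = false   // set when *we* move the map

// CLLocationManagerDelegate: recenter on every fix while following.
func locationManager(_ manager: CLLocationManager,
                     didUpdateLocations locations: [CLLocation]) {
    guard shouldFollowCurrentLocation, let loc = locations.last else { return }
    let region = MKCoordinateRegion(center: loc.coordinate,
                                    latitudinalMeters: 500,
                                    longitudinalMeters: 500)
    isProgrammaticRegionChange = true
    mapView.setRegion(region, animated: true)
}

// MKMapViewDelegate: a region change we didn't trigger means the user scrolled.
func mapView(_ mapView: MKMapView, regionWillChangeAnimated animated: Bool) {
    if isProgrammaticRegionChange {
        isProgrammaticRegionChange = false
    } else {
        shouldFollowCurrentLocation = false
    }
}

// Button action: resume following on the next location update.
@IBAction func followTapped(_ sender: Any) {
    shouldFollowCurrentLocation = true
}
```

Alternatively, MKMapView has a userTrackingMode property (e.g. .follow) that implements much of this behavior for you.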

MVC - Game - Objective C

I am developing a game for the iPhone. The game basically animates objects on the screen, allowing a user to click on them. I am trying to use the MVC pattern. I have the functionality laid out as such:
Model:
Holds data about the targets (speed, relative size, image, etc)
Has timer running that adds targets to the list (should this be in the controller?)
Controller:
Subscribes to events fired from the model (such as target added)
Subscribes to events fired from the view (such as target clicked)
View:
Displays targets
The flow can be as follows:
Controller tells Model to start game
Model fires timer which says to add a target
Controller hears event and passes it to view
View adds image to screen (animating it)
User clicks on image
View fires event which says image was clicked
Controller hears event and passes it to model
Model removes target from itself
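The flow above can be sketched framework-free with plain listener interfaces (all class and method names here are illustrative, not from any framework): the model fires "target added", the controller relays it to the view, the view reports a click back through the controller, and the model removes the target.

```java
import java.util.ArrayList;
import java.util.List;

class Target {
    final String name;
    Target(String name) { this.name = name; }
}

class Model {
    interface Listener { void onTargetAdded(Target t); }
    private final List<Target> targets = new ArrayList<>();
    Listener listener;

    void addTarget(Target t) {            // called by the game timer
        targets.add(t);
        if (listener != null) listener.onTargetAdded(t);
    }
    void removeTarget(Target t) { targets.remove(t); }
    int targetCount() { return targets.size(); }
}

class View {
    interface Listener { void onTargetClicked(Target t); }
    Listener listener;
    final List<Target> displayed = new ArrayList<>();

    void showTarget(Target t) { displayed.add(t); }    // would animate here
    void userClicks(Target t) {                        // simulated user input
        displayed.remove(t);
        if (listener != null) listener.onTargetClicked(t);
    }
}

class Controller implements Model.Listener, View.Listener {
    private final Model model;
    private final View view;

    Controller(Model model, View view) {
        this.model = model;
        this.view = view;
        model.listener = this;
        view.listener = this;
    }
    @Override public void onTargetAdded(Target t) { view.showTarget(t); }
    @Override public void onTargetClicked(Target t) { model.removeTarget(t); }
}

public class Main {
    public static void main(String[] args) {
        Model model = new Model();
        View view = new View();
        new Controller(model, view);

        Target t = new Target("balloon");
        model.addTarget(t);        // timer fires -> controller -> view shows it
        System.out.println(view.displayed.size());   // prints 1
        view.userClicks(t);        // user taps -> controller -> model removes it
        System.out.println(model.targetCount());     // prints 0
    }
}
```

Note that the model never references the view directly (or vice versa); the controller is the only piece that knows both, which is the point of the pattern.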
Lastly, I am unsure where to put the animation. Should the view construct the animation (based on settings from the model)?
I'd consider having the view or controller handle the animation so you can take advantage of Cocoa's built-in animation support. For example, it could go like this:
View queries model for position
View adds image to screen at position
View queries model for velocity
View computes new position after 0.1s and animates image to new position
Cocoa fires animation complete event
View hears event
View computes new position after 0.1s more and animates image to new position
And if targets can change velocity, you have some options. You can have the model fire events for velocity changes, and cancel the animation in progress and start a new one. Or you can just have the view re-query the model each time an animation completes, ask for the current position and velocity, and compute the next position. With the latter there would be some desynchronization between the model and the view, but with updates every 0.1 s it wouldn't get too far off. It depends how precise you need to be.
I would think that an animation would be part of a model (or a model within a model), specified by a controller, and rendered by a view.
Here's a good example for the iPhone:
http://www.bit-101.com/blog/?p=1969

Why is UIView exclusiveTouch property not blocking?

I am launching a simple UIView with a text field (let's call it orderSetNameView) upon a button tap. I wish to make this view modal, but without using a
[UIViewController presentModalViewController:animated:].
It seems I could simply set textInputView.exclusiveTouch = YES to achieve that.
Apple documentation says about exclusiveTouch:
A Boolean value indicating whether the receiver handles touch events
exclusively. If YES, the receiver blocks other views in the same
window from receiving touch events; otherwise, it does not. The
default value is NO.
I assume "same window" means same UIWindow, of which I have only one.
The problem is that when I instantiate my orderSetNameView, add it as a subview, and set exclusiveTouch = YES, touch events happen in all other views of my app, i.e., touch events in other views are not blocked as expected.
// ....
[self.view addSubview:self.orderSetNameView];
[self.orderSetNameView openWithAnimationForAnimationStyle:kMK_AnimationStyleScaleFromCenter];
}
// Set as modal
self.orderSetNameView.exclusiveTouch = YES;
Shouldn't orderSetNameView block touch events in all other views? What am I missing?
From Apple developer forums:
exclusiveTouch only prevents touches in other views during the time in which there's an active touch in the exclusive touch view. That is, if you put a finger down in an exclusive touch view touches won't start in other views until you lift the first finger. It does not prevent touches from starting in other views if there are currently no touches in the exclusiveTouch view.
To truly make this view the only thing on screen that can receive touches you'd need to either add another view over top of everything else to catch the rest of the touches, or subclass a view somewhere in your hierarchy (or your UIWindow itself) and override hitTest:withEvent: to always return your text view when it's visible, or to return nil for touches not in your text view.
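The hitTest:withEvent: approach might look like this in Swift (ModalCatchingWindow and the modalPanel property are names invented for the sketch; you would set modalPanel when showing the view and clear it when dismissing):

```swift
// Sketch: a UIWindow subclass that, while a modal panel is visible,
// routes touches inside the panel normally and swallows all others.
class ModalCatchingWindow: UIWindow {
    weak var modalPanel: UIView?

    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        guard let panel = modalPanel, !panel.isHidden else {
            return super.hitTest(point, with: event)   // no modal: normal behavior
        }
        let pointInPanel = convert(point, to: panel)
        if panel.bounds.contains(pointInPanel) {
            // Let the panel's own hit-testing find the text field, etc.
            return panel.hitTest(pointInPanel, with: event)
        }
        return nil   // swallow touches outside the panel
    }
}
```

The other option mentioned above (a full-screen "catcher" view added behind the panel) is usually simpler, and also gives you a natural place to dim the background.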
Put this in your AppDelegate (or anywhere that runs once, at startup):
// Multi Touch Disable
UIView.appearance().isExclusiveTouch = true
UIButton.appearance().isExclusiveTouch = true

iPhone Input with Stacked Views

I've got a situation in an iPhone application where buttons are not receiving input as I expect. Here's the setup:
ViewMain - The main view full of various images and labels
ViewOverlay - A HUD like overlay view with two UIButton objects.
To create my scene I do the following:
viewController.view = ViewMain
[ViewMain addSubview:ViewOverlay]
This view renders as expected, with ViewOverlay correctly rendered on top of ViewMain. However, the two buttons inside ViewOverlay do not receive touch events and cannot be pressed. Tapping them does nothing at all.
I very well may be going about this in the entirely wrong direction. Any ideas?
1) Check that you've connected the buttons' touch events (in Interface Builder or programmatically: something like Touch Up Inside or Touch Down).
2) Check that all parent views of the buttons can receive user touches (userInteractionEnabled == YES for all parent views); if some parent view can't receive touches, the responder chain for its subviews won't be consulted.
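A small debug helper for the second check (not a UIKit API, just a sketch in Swift) that walks up from a button and reports any ancestor that blocks touches for its subtree:

```swift
// Returns every ancestor of `view` that would stop hit-testing for its
// subtree: interaction disabled, hidden, or nearly transparent (UIKit
// skips views with alpha below 0.01 during hit-testing).
func findTouchBlockers(of view: UIView) -> [UIView] {
    var blockers: [UIView] = []
    var current: UIView? = view.superview
    while let v = current {
        if !v.isUserInteractionEnabled || v.isHidden || v.alpha < 0.01 {
            blockers.append(v)
        }
        current = v.superview
    }
    return blockers
}
```

Also make sure the buttons actually lie within their superview's bounds: touches on a subview that extends outside its parent's frame are not delivered either.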