My UIView already handles single-finger touches with gesture recognisers for tap and pan just fine.
However, I would like touches with two fingers to be passed through to the parent view, behind this view. (The parent is a WKWebView with a JavaScript-generated map that I'd like the user to be able to pinch-zoom, even while this other view is in front. This works OK when the other view is not in front, but of course when the front view is there, it doesn't pass through the touches.)
I have tried to detect this using either of the following in the front view, but in both cases, allTouches is an empty set (zero touches):
override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
    if let event = event {
        print("\(event)")
        if let touches = event.allTouches {
            print("\(touches)")
            if touches.count > 1 {
                return false
            }
        }
    }
    return super.point(inside: point, with: event)
}
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    if let event = event {
        print("\(event)")
        if let touches = event.allTouches {
            print("\(touches)")
            if touches.count > 1 {
                return nil
            }
        }
    }
    return super.hitTest(point, with: event)
}
How can I continue to use my existing gesture recognisers, but pass through multi-finger touches to the superview?
More of a work-around than an actual answer, but here's what solved it for me...
Instead of the in-front child view covering all of the parent view with a transparent layer, I reduced it to only include the small area needed to actually present information and form fields to the user.
Then I changed the single-touch gestures to be added to the parent view, instead of to the child view.
So now NONE of the touches are captured by the child view (except for its form fields and buttons).
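Here's a minimal sketch of that arrangement (all names are hypothetical, not from my actual project):

import UIKit

class MapContainerViewController: UIViewController {
    @IBOutlet var infoPanel: UIView! // the reduced child view; covers only the info/form area

    override func viewDidLoad() {
        super.viewDidLoad()
        // Single-touch gestures now live on the parent view, not the child,
        // so multi-finger pinches elsewhere reach the WKWebView untouched.
        view.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
        view.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        // handle single taps here
    }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        // handle single-finger pans here
    }
}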
This is perhaps the better way to handle my particular situation anyhow. So I was probably asking the wrong question (an XY problem!).
Related
I want to receive mouse move events during a button click (before mouse up).
I'm adding an NSTrackingArea to the view I want to track mouse-moved and mouse-dragged events on, but I still don't receive these events.
I assume that mouseDown in NSButton is blocking mouse events, so the only solution I've come up with is overriding the mouseDown function of NSButton and not calling super.mouseDown. But then I need to handle button selection manually, and I'm not sure this is the right approach.
Is this the right solution for my problem? Will it cause other problems? Is there a better solution?
Here is test code; just add a button to a new project and assign the TestButton class to it.
class TestButton: NSButton {
    override func mouseDown(with event: NSEvent) {
        print("Mouse down")
        super.mouseDown(with: event) // Blocks until mouse up; remove this call and the tracking-area events arrive.
        print("Mouse up")
    }
}
class ViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let trackingArea = NSTrackingArea(rect: view.bounds, options: [.activeAlways, .mouseMoved, .enabledDuringMouseDrag], owner: self, userInfo: nil)
        view.addTrackingArea(trackingArea)
    }

    override func mouseMoved(with event: NSEvent) {
        print("Mouse moved")
        super.mouseMoved(with: event)
    }

    override func mouseDragged(with event: NSEvent) {
        print("Mouse dragged")
        super.mouseDragged(with: event)
    }
}
I had a similar requirement in my project, so initially I started out with NSButton just like you. But it turned out to be more hectic than I thought: handling the action and the selected state were the main concerns I faced. You could proceed with NSButton if it actually meets your requirement, but I eventually moved to NSView and customised it. I've open-sourced my implementation for handling move and drag events; here is the link.
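For reference, a minimal sketch of what such a custom NSView can look like (my own hypothetical version, not the open-sourced code):

import AppKit

class PressableView: NSView {
    override func mouseDown(with event: NSEvent) {
        // Track the selected/pressed state manually here.
        print("Mouse down")
    }
    override func mouseDragged(with event: NSEvent) {
        print("Mouse dragged to \(event.locationInWindow)")
    }
    override func mouseUp(with event: NSEvent) {
        // Fire the action manually here.
        print("Mouse up")
    }
}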
The traditional way to do this is to use a custom subclass of NSButtonCell and override continueTracking(last:current:in:) (inherited from NSCell). Your override should generally call through to super and return what it returns, but you can do something else in addition, to respond to the mouse movements. The issue is that NSCell and its subclasses have been "soft deprecated" for a while.
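A minimal sketch of that cell-based approach (TrackingButtonCell is a hypothetical name; assign it as the button's cell class in Interface Builder):

import AppKit

class TrackingButtonCell: NSButtonCell {
    override func continueTracking(last lastPoint: NSPoint, current currentPoint: NSPoint, in controlView: NSView) -> Bool {
        // Called repeatedly while the mouse is dragged within the button.
        print("Dragging at \(currentPoint)")
        return super.continueTracking(last: lastPoint, current: currentPoint, in: controlView)
    }
}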
That said, it would be very surprising to me as a user that anything other than the button would react to the mouse movements while I'm interacting with the button (clicked in it and dragging).
I am making a mac app using Swift and this app has a custom view (a class extending NSView and overriding its draw method). Now, I want to disable all mouse clicks and mouse drags on this view and pass them on to the other applications running beneath my application.
I have tried the following ways (gleaned from Apple documentation and other SO questions) to disable clicks on my view, and nothing has worked for me so far:
1. Overriding hitTest inside my custom View class
override func hitTest(_ point: NSPoint) -> NSView? {
    let view = super.hitTest(point)
    return view == self ? nil : view
}
2. Overriding acceptsFirstMouse inside my custom View class
override func acceptsFirstMouse(for event: NSEvent?) -> Bool {
    return false
}
3. Overriding mouseDown in ViewController as well as in my custom View class
override func mouseDown(with event: NSEvent) {
    // do nothing
}
4. Overriding mouseDragged in ViewController as well as in my custom View class
override func mouseDragged(with event: NSEvent) {
    // do nothing
}
Am I missing something?
This isn't handled at the view level; it's handled at the window level. You can set the window's ignoresMouseEvents property to true.
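A minimal sketch, assuming you do this from the view controller once the view is in a window:

override func viewDidAppear() {
    super.viewDidAppear()
    // Clicks and drags now fall through to whatever is beneath the window.
    view.window?.ignoresMouseEvents = true
}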
The issue is that the Window Server will only dispatch an event to a single process. So, once it has arrived in your app, it's not going to another. And there's no feasible way for your app to forward it along, either.
I am trying to catch mouse down events on some of my controls in the (Cocoa with Storyboards) application window.
If I override mouseDown(with event: NSEvent) in my ViewController class, I face two issues:
It is not possible to identify (directly) exactly which control in the window has been clicked. To do so, I use the following:
Swift 4:
override func mouseDown(with event: NSEvent) {
    let point: NSPoint = event.locationInWindow
    guard let view = self.view.hitTest(point) else { return } // hitTest can return nil
    if type(of: view) == NSTextField.self { // matching with tags is also fine
        // do something with the control (in this example an NSTextField)
        // (view as! NSTextField).backgroundColor = NSColor.systemPink
    }
}
I am a little puzzled why, for such a basic GUI operation, there isn't a "native" event handler, as there is for a mouse click (by creating an @IBAction).
Am I missing something, or is this the way to catch and handle mouse down events?
For some controls, e.g. NSLevelIndicator, my overridden method is not called. Why?
You must override mouseDown on the NSView (the control itself), not on the NSViewController.
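For example, a minimal sketch for the NSLevelIndicator case (ClickableLevelIndicator is a hypothetical subclass you assign to the control in Interface Builder):

import AppKit

class ClickableLevelIndicator: NSLevelIndicator {
    override func mouseDown(with event: NSEvent) {
        print("Level indicator clicked at \(event.locationInWindow)")
        super.mouseDown(with: event) // keep the control's normal tracking behaviour
    }
}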
I have a container view with a button over it which hides and shows the view. Within the shown view, there are N number of mini buttons that have actions.
The problem I'm having is, when I tap on the mini buttons, those targets are ignored and the larger view button is what receives the action.
How do I configure things so that the larger tappable button on the view still works in most places but where the mini buttons exist, those tap actions register as well?
Thanks!
There are two possible solutions.
First
Change the view hierarchy so the large UIButton is at the top of the stack in Interface Builder (so the mini buttons sit in front of it), like:
-Largebutton
-minibutton1
-minibutton2
...
-minibuttonN
Second
Use a gesture on the container view, like:
let hideViewGesture = UITapGestureRecognizer(target: self, action: #selector(hideView))
containerView.addGestureRecognizer(hideViewGesture)

@objc func hideView() {
    containerView.isHidden = true
}
It's not really possible to get a button action working within another button, because:
1) Adding a button [the large button] on top of the container view means the large button always receives the touch, so the buttons inside it are never detected.
2) In terms of layers, the large button's layer is on top, so logically the large button always comes into action first, not the views inside the container view.
Possible solutions:
1) Try making use of gestures on the container view.
2) Use a segmented control for show and hide, or a UIButton placed beside the container view rather than over it, so you can perform all the required actions.
This is an old question but there actually is a simple solution, by overriding hitTest(_ point: CGPoint, with event: UIEvent?).
Inside your outer button:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    if self.innerButton.frame.contains(point) {
        return self.innerButton
    } else if self.bounds.contains(point) {
        return self
    } else {
        return super.hitTest(point, with: event)
    }
}
It tests whether the touch point is within the inner button (or you could have several inner buttons); if so, it returns that button, and the touch is registered only by the inner button. If not, but the point is within the outer button's bounds, the outer button takes it. If neither contains the point, it calls super so the touch is handled correctly by other views.
I'm trying to move a UIView "A" (subview) inside another UIView "B" (superview) using the touchesMoved method. I'm able to move the UIView outside the superview, but I want to restrict the UIView to the superview's bounds. Is there a generic way to test whether the subview is inside the visible rect of the superview?
It sounds like you want to constrain the movement of the subview (viewA) so it is always completely contained by the superview (viewB). CGRectContainsRect is the right answer, but it must be applied carefully, since a subview's frame is specified in its superview's coordinate system.
// inside touches moved, compute the newViewAFrame based on the movement,
// but only assign it if it meets the containment constraint:
if (CGRectContainsRect(viewB.bounds, newViewAFrame)) {
    viewA.frame = newViewAFrame;
}
Notice that we don't mention viewB.frame in the check. viewB's position in its parent is not relevant to whether viewB contains viewA.
Use the clipsToBounds property or CGRectContainsRect:
yourSuperView.clipsToBounds = YES;
I think it will be helpful to you.
I had the same problem and this post helped me a lot, so I'll share my answer (that seems to fit exactly what you need) :
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        if SuperView.frame.contains(self.frame) {
            let oldCenter = self.center
            self.center = touch.location(in: SuperView)
            if !SuperView.frame.contains(self.frame) {
                self.center = oldCenter
            }
        }
    }
}
Quite simple, and you can use this in the touchesMoved and touchesEnded methods too, so it won't stop you when your UIView reaches the limits (this is why oldCenter exists); see the sketch below.
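For instance, a sketch of the same guard in touchesMoved, reusing the names above:

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    if let touch = touches.first {
        let oldCenter = self.center
        self.center = touch.location(in: SuperView)
        if !SuperView.frame.contains(self.frame) {
            self.center = oldCenter // snap back when an edge is reached
        }
    }
}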
As mentioned by #dahn, if your view is INSIDE the SuperView, you must take care, because the coordinates of the first will be restrict to the frame of the second, so it may fail if the SuperView is not a full-screen view.
If it is not a full-screen view, the SuperView cannot be the dad of the SubView, because it will cause bugs. The solution is to keep the SuperView and the SubView both inside a third View (so their coordinate system will be the same and it will work fine).
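A minimal sketch of that sibling arrangement (all names hypothetical):

// Both views share containerView's coordinate system, so frame
// containment checks between them are valid.
let containerView = UIView(frame: CGRect(x: 0, y: 0, width: 400, height: 400))
let boundaryView = UIView(frame: CGRect(x: 20, y: 20, width: 300, height: 300))
let draggableView = UIView(frame: CGRect(x: 40, y: 40, width: 60, height: 60))
containerView.addSubview(boundaryView)
containerView.addSubview(draggableView)
let isInside = boundaryView.frame.contains(draggableView.frame) // true here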
Hope that helps someone someday :)
Swift 2.2:
let rect1 = CGRect(...)
let rect2 = CGRect(...)
There is a method to check this: CGRectContainsRect(rect1, rect2)