I'm experimenting with the Google Maps for Flutter plugin, but I'm running into a problem managing gesture priority.
My map, which needs to handle gestures, is rendered inside a PageView that also listens for gestures. The problem is that the PageView's gesture handler appears to take priority over the map's: if I drag vertically on the map, the map moves, but if I drag horizontally, the page switches instead of the map panning.
Is there a way to manage the priority between widgets' gesture handlers to fix this sort of problem?
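Roughly, the structure looks like this (a simplified sketch using google_maps_flutter; the second page and the camera position are placeholders):

import 'package:flutter/material.dart';
import 'package:google_maps_flutter/google_maps_flutter.dart';

// Sketch of the setup: a map rendered as one page of a PageView.
// Horizontal drags on the map currently switch pages instead of panning.
class MapPager extends StatelessWidget {
  const MapPager({super.key});

  @override
  Widget build(BuildContext context) {
    return PageView(
      children: [
        GoogleMap(
          initialCameraPosition: const CameraPosition(
            target: LatLng(48.8566, 2.3522), // placeholder coordinates
            zoom: 12,
          ),
        ),
        const Center(child: Text('Another page')),
      ],
    );
  }
}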
Related
I'm creating some kind of image editor (kinda like Canva). For zooming in and out and panning, I use InteractiveViewer. The thing is, once I started using InteractiveViewer, I noticed that my other gestures for the elements inside the canvas (e.g. tapping and dragging an element should move it around) stopped working. Tapping and dragging now pan the whole canvas instead.
This is still the case even if I turn off panning and scaling (panEnabled: false, scaleEnabled: false). Thankfully, single taps still work because InteractiveViewer doesn't claim them, but I need all gesture input to reach the elements whenever it isn't intended for the InteractiveViewer. For example, if the user taps an element so it becomes selected, then all gesture input should go to the tapped element instead of to InteractiveViewer.
I considered simply not using InteractiveViewer while an element is selected, but then the user immediately loses the zoom and pan they had set. I need a solution where InteractiveViewer preserves its zoom and pan state while letting all gesture input go through to the elements inside. How can I do that?
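Stripped down, the setup looks something like this (a sketch; the element, its handlers, and the controller wiring are placeholders):

import 'package:flutter/material.dart';

// Sketch of the described setup: even with pan/scale disabled,
// InteractiveViewer still claims drags meant for the child elements.
class CanvasView extends StatelessWidget {
  const CanvasView({super.key, required this.controller});

  // Holds the zoom/pan state so it survives rebuilds.
  final TransformationController controller;

  @override
  Widget build(BuildContext context) {
    return InteractiveViewer(
      transformationController: controller,
      panEnabled: false,
      scaleEnabled: false,
      child: Stack(
        children: [
          Positioned(
            left: 40,
            top: 40,
            child: GestureDetector(
              onTap: () => debugPrint('element selected'),
              onPanUpdate: (d) => debugPrint('drag element by ${d.delta}'),
              child: Container(width: 100, height: 100, color: Colors.amber),
            ),
          ),
        ],
      ),
    );
  }
}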
I am wondering what the equivalent of the web API DocumentOrShadowRoot.elementFromPoint() is in Flutter.
Specifically, I am wondering how I could figure out which leaf element/widget instance in a widget hierarchy sits under a given Offset.
For example, consider the following structure:
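(The original structure came from a screenshot; this is a rough, hypothetical reconstruction of the kind of nesting it showed, with illustrative colors and sizes:)

Container(
  color: Colors.yellow,
  padding: const EdgeInsets.all(32),
  child: Stack(
    children: [
      Container(color: Colors.blue), // fills the Stack
      Positioned(
        left: 20,
        top: 20,
        width: 120,
        height: 120,
        child: Container(color: Colors.green),
      ),
      Positioned(
        left: 40,
        top: 40,
        width: 60,
        height: 60,
        child: Container(color: Colors.red),
      ),
    ],
  ),
)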
For the first Offset, marked with a dark circle, I would expect to get some sort of data that tells me the offset is over the Container.
For the second Offset, also marked with a dark circle, I would expect the Stack.
For the last one, it would be the Positioned element.
A bit of context
I'm exploring the implementation of a visual editor similar to Figma in Flutter. I have experience implementing such a rendering system with web technologies.
I want to render a selection indicator or outline when a tap/click happens on an element. These elements are nested, and adding multiple nested event handlers triggers all of them: for example, mouse-enter and mouse-leave events fired while moving the mouse over the Stack or Positioned element also trigger all the parent handlers.
Any help or guidance would be appreciated.
Simple answer to your exact question: there is no direct equivalent. It's possible to implement, but not advisable.
You could theoretically implement your own version of elementFromPoint() by looking at how GestureBinding works in Flutter. That would be a deep dive for sure, and you might learn from it, but there is a simpler solution. Even if you implemented your own method, you would still need to resolve conflicts when more than one element is found, and that is something Flutter solves out of the box with the gesture arena.
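For completeness, here is a rough sketch of how such a probe could look, assuming you start from the RenderBox of some known BuildContext (the helper name is mine):

import 'package:flutter/gestures.dart';
import 'package:flutter/rendering.dart';
import 'package:flutter/widgets.dart';

// Rough sketch: collect the hit-test path under a global position.
// The returned path is ordered from the deepest target outwards.
List<HitTestEntry> hitEntriesAt(BuildContext context, Offset globalPosition) {
  final renderBox = context.findRenderObject()! as RenderBox;
  final result = BoxHitTestResult();
  renderBox.hitTest(result, position: renderBox.globalToLocal(globalPosition));
  return result.path.toList();
}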
I see that you expect the top-most or deepest child to be reported, which you can obtain by using the GestureDetector widget. What you're looking for is making your gesture detectors opaque. A GestureDetector has a property called behavior of type HitTestBehavior; the default is deferToChild. Here are the possible values:
/// How to behave during hit tests.
enum HitTestBehavior {
  /// Targets that defer to their children receive events within their bounds
  /// only if one of their children is hit by the hit test.
  deferToChild,

  /// Opaque targets can be hit by hit tests, causing them to both receive
  /// events within their bounds and prevent targets visually behind them from
  /// also receiving events.
  opaque,

  /// Translucent targets both receive events within their bounds and permit
  /// targets visually behind them to also receive events.
  translucent,
}
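So, in your case, giving the inner detectors an opaque behavior should make the deepest element win the area it covers; a minimal sketch (the handler body is a placeholder):

GestureDetector(
  behavior: HitTestBehavior.opaque,
  onTap: () => debugPrint('element tapped'), // placeholder action
  child: const SizedBox(width: 100, height: 100),
)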
What follows is only slightly related, so consider it a deep dive into your use case.
Since you're going down this path: I also built a WYSIWYG design system, with selection indicators, handles for rotating and resizing, etc., and I have one piece of advice: completely separate your design rendering from your gesture detectors and selection indicators.
I initially put the gesture detectors "around" the design elements; in your example, the gesture detectors would sit in between yellow / blue / green / red. The reason this is a bad idea is that it complicates a few things. In some cases I needed touch areas larger than the design elements themselves, so I had to add padding and reposition the GestureDetector parents. In other cases the design elements would become fixed or locked and would lose their GestureDetector parent, and Flutter would completely rebuild the contents of the layer because the tree diffing got confused. It gets messy fast. So, stack these layers:
Design on bottom, no interactivity.
Selection indicators and resize/rotate handles. Still no interactivity.
Gesture detectors for all design elements. If you're lucky, you know the exact size, position, and rotation of the design elements and can simply use Positioned. If you have groups of design elements, then your gesture detectors also get grouped and transformed together. If you also have self-sizing design elements (like images), it gets a bit more complicated, but I got around my issues by adding the design element as an invisible child. The way I would do this now is by loading metadata about the images so their sizes are known at build time (as opposed to waiting for images to load and produce layout changes).
Gesture detectors for the selection indicators and resize/rotate handles. These are top-most and also opaque, so they catch everything that hits them.
This setup then allows you to experiment more in the gesture department, lets you use colored boxes to debug, and in general will make your life easier.
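A rough sketch of that layering (the layer widgets, the elements list, and select() are illustrative placeholders, not a drop-in implementation):

Stack(
  children: [
    // 1. Design layer: purely visual, never handles pointer events.
    IgnorePointer(child: designLayer),
    // 2. Selection indicators and handles: also purely visual.
    IgnorePointer(child: selectionOverlay),
    // 3. One positioned, opaque gesture detector per design element.
    for (final element in elements)
      Positioned.fromRect(
        rect: element.bounds,
        child: GestureDetector(
          behavior: HitTestBehavior.opaque,
          onTap: () => select(element),
        ),
      ),
    // 4. Gesture detectors for the handles would go here, top-most and opaque.
  ],
)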
TLDR: Use opaque gesture detectors.
When you swipe vertically, the image is dragged in the direction of the swipe while the black background fades out.
After release, it smoothly returns to its previous position on the screen, like a Hero animation does.
How can this effect be recreated in Flutter, in the same scenario of a fullscreen photo view?
The photo_view package is preferred for the fullscreen view, so the effect shouldn't interfere with zooming.
What you need is actually an out-of-the-box widget, and it's surprisingly easy to use. It's called Hero. Basically, you wrap the widget you want to animate in a Hero widget with a specific tag string, and, when you navigate to another screen, you wrap the destination widget in another Hero with the same tag. An effect like the one you shared can be achieved by wrapping two widgets showing the same photo in Heroes in different PageRoutes.
Check out this Flutter Widget of the Week video to get you started, and this Flutter.dev article for a more detailed explanation of Hero widgets.
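In its simplest form, it looks like this (the tag and image URL are placeholders; the thumbnail and the fullscreen page each get a Hero with the same tag):

// First screen: tapping the thumbnail pushes the fullscreen route.
GestureDetector(
  onTap: () => Navigator.of(context).push(
    MaterialPageRoute<void>(
      builder: (_) => Scaffold(
        backgroundColor: Colors.black,
        body: Center(
          child: Hero(
            tag: 'photo-1', // must match the source widget's tag
            child: Image.network('https://example.com/photo.jpg'),
          ),
        ),
      ),
    ),
  ),
  child: Hero(
    tag: 'photo-1',
    child: Image.network('https://example.com/photo.jpg', width: 120),
  ),
)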
Edit: I see you are looking for a more specific image-viewer behavior. Then I suggest you use the photo_view package, which includes many features for viewing images, including the hero transition with swipe-down-to-dismiss behavior, pinch to zoom, etc.
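A minimal use of it looks something like this (based on the package's documented API; names may differ between versions, and the URL and tag are placeholders):

import 'package:photo_view/photo_view.dart';

// Fullscreen, zoomable image with a hero transition.
PhotoView(
  imageProvider: const NetworkImage('https://example.com/photo.jpg'),
  heroAttributes: PhotoViewHeroAttributes(tag: 'photo-1'),
)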
I found the solution.
To create such an animation, you can use the extended_image package, which provides a slide-out page feature (its ExtendedImageSlidePage widget) for building exactly this kind of transition.
I want to listen for tap events on a widget and take one action for that, and a hold event and take a different action for that. It looks like Flutter's GestureDetector can only detect a single gesture? That would be hugely limiting for mobile development, so I figured there must be a way to detect two different gestures on one widget. How would I do that?
If you look at the API docs, you will see there are callbacks like onDoubleTap, onLongPress, etc.:
https://api.flutter.dev/flutter/widgets/GestureDetector-class.html
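For example, several callbacks can live on the same detector, and the framework's gesture arena decides which recognizer wins (the handler bodies are placeholders):

GestureDetector(
  onTap: () => debugPrint('tap'),
  onDoubleTap: () => debugPrint('double tap'),
  onLongPress: () => debugPrint('long press'),
  child: const SizedBox(
    width: 120,
    height: 120,
    child: ColoredBox(color: Colors.teal),
  ),
)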
"hold and drag" - would be holding down with one finger and dragging horizontally (or vertically) with another.
GestureDetector would recognize it as Scale gesture. I want to differentiate between these 2:
Two fingers moving closer or further apart (normal scale)
One finger holding and another finger moving closer or further (I want to detect this)
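For illustration, a heuristic along these lines is roughly what I'm after (pointerCount and focalPointDelta are real ScaleUpdateDetails fields; the threshold is a guess), but it feels fragile:

// Heuristic sketch: a symmetric pinch keeps the focal point (the midpoint
// between the fingers) nearly still, while "one finger anchored, one moving"
// drags the focal point at roughly half the moving finger's speed.
double _focalTravel = 0;

void _onScaleStart(ScaleStartDetails details) => _focalTravel = 0;

void _onScaleUpdate(ScaleUpdateDetails details) {
  if (details.pointerCount != 2) return;
  _focalTravel += details.focalPointDelta.distance;
  final bool oneFingerAnchored = _focalTravel > 40; // threshold needs tuning
  debugPrint(oneFingerAnchored ? 'hold and drag' : 'symmetric pinch');
}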
There's a built-in widget for that called LongPressDraggable; you can use it just like the Draggable widget.
The main difference is that Draggable starts dragging immediately, while LongPressDraggable requires a hold before the drag begins.
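A minimal usage, mirroring the Draggable API (the payload and child widgets are placeholders):

LongPressDraggable<int>(
  data: 1, // placeholder payload for a DragTarget
  feedback: const FlutterLogo(size: 80), // shown under the finger while dragging
  childWhenDragging: const Opacity(opacity: 0.3, child: FlutterLogo(size: 60)),
  child: const FlutterLogo(size: 60),
)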