Zoom & pan with trackpad on web without clicking - flutter

I'm trying to add pan & zoom functionality on web & desktop for both trackpad & touch (without requiring a modifier key or a click).
The main issue I'm facing is that the widgets I'm using, i.e. GestureDetector/onScale and InteractiveViewer (which is based on onScale), always work as I expect on desktop but fail on web: zoom is fine, but the only way to pan is a tap followed by a two-finger swipe.
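For context, a rough sketch of the kind of setup described; the ZoomPanPage name and the scale limits are hypothetical:

```dart
import 'package:flutter/material.dart';

// Rough reconstruction of the setup described above; ZoomPanPage and the
// scale limits are made up. InteractiveViewer's pan/zoom goes through onScale,
// and it is the pan part that only responds to tap + two-finger swipe on web.
class ZoomPanPage extends StatelessWidget {
  const ZoomPanPage({super.key, required this.child});
  final Widget child;

  @override
  Widget build(BuildContext context) {
    return InteractiveViewer(
      panEnabled: true,   // works on desktop, not via a plain trackpad swipe on web
      scaleEnabled: true, // pinch zoom behaves as expected everywhere
      minScale: 0.5,
      maxScale: 5.0,
      child: child,
    );
  }
}
```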
I'm not sure if this is a limitation of browsers/JS, of the Flutter implementation, or if I'm missing something.
Does anyone have an approach to implement this?
Relevant links:
https://github.com/flutter/engine/pull/36342
https://kenneth.io/post/detecting-multi-touch-trackpad-gestures-in-javascript
I have both behaviors implemented via onPointerSignal, but that only works because I distinguish panning from zooming by requiring a specific key to be held on top of a plain scroll event.
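For reference, a rough sketch of that workaround, assuming a hypothetical ScrollPanZoom widget: the Listener's onPointerSignal receives the scroll events, and HardwareKeyboard is polled for a held Ctrl key to decide whether to zoom or pan (the 1/500 zoom factor and the clamp range are arbitrary):

```dart
import 'package:flutter/gestures.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

// Sketch of the workaround described above: scroll events arrive through
// onPointerSignal, and a held Ctrl key switches between zoom and pan.
// The widget name, zoom factor, and clamp range are made up for illustration.
class ScrollPanZoom extends StatefulWidget {
  const ScrollPanZoom({super.key, required this.child});
  final Widget child;

  @override
  State<ScrollPanZoom> createState() => _ScrollPanZoomState();
}

class _ScrollPanZoomState extends State<ScrollPanZoom> {
  double _scale = 1.0;
  Offset _offset = Offset.zero;

  bool get _ctrlPressed {
    final keys = HardwareKeyboard.instance.logicalKeysPressed;
    return keys.contains(LogicalKeyboardKey.controlLeft) ||
        keys.contains(LogicalKeyboardKey.controlRight);
  }

  void _onPointerSignal(PointerSignalEvent event) {
    if (event is! PointerScrollEvent) return;
    setState(() {
      if (_ctrlPressed) {
        // Ctrl + scroll is treated as zoom.
        _scale = (_scale * (1 - event.scrollDelta.dy / 500))
            .clamp(0.5, 5.0)
            .toDouble();
      } else {
        // A plain scroll (two-finger swipe) is treated as pan.
        _offset -= event.scrollDelta;
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    return Listener(
      onPointerSignal: _onPointerSignal,
      child: Transform(
        transform: Matrix4.identity()
          ..translate(_offset.dx, _offset.dy)
          ..scale(_scale),
        child: widget.child,
      ),
    );
  }
}
```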
onPointerPanZoomStart from Listener is trackpad-specific, but it only works on Windows for me.
Web/JS seems to use the ctrlKey flag attached to the mouse/wheel event to distinguish a trackpad pan from a trackpad zoom.
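Where those trackpad-specific events are delivered (the Windows case above), a sketch of what the Listener callbacks can look like; the TrackpadPanZoom widget is a made-up name and the transform handling is simplified:

```dart
import 'package:flutter/material.dart';

// Sketch of the trackpad-specific Listener callbacks (Flutter 3.3+).
// TrackpadPanZoom is a made-up name; the transform handling is simplified.
class TrackpadPanZoom extends StatefulWidget {
  const TrackpadPanZoom({super.key, required this.child});
  final Widget child;

  @override
  State<TrackpadPanZoom> createState() => _TrackpadPanZoomState();
}

class _TrackpadPanZoomState extends State<TrackpadPanZoom> {
  Offset _offset = Offset.zero;
  double _scale = 1.0;
  double _scaleAtGestureStart = 1.0;

  @override
  Widget build(BuildContext context) {
    return Listener(
      // These fire only for trackpad pan/zoom gestures, not for touch or mouse.
      onPointerPanZoomStart: (event) => _scaleAtGestureStart = _scale,
      onPointerPanZoomUpdate: (event) {
        setState(() {
          // panDelta is the two-finger swipe since the previous update;
          // scale is cumulative from the start of the gesture.
          _offset += event.panDelta;
          _scale = _scaleAtGestureStart * event.scale;
        });
      },
      child: Transform(
        transform: Matrix4.identity()
          ..translate(_offset.dx, _offset.dy)
          ..scale(_scale),
        child: widget.child,
      ),
    );
  }
}
```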
At the moment I'm exploring MultiTapGestureRecognizer; if it is supported on web and is not touch-specific, it should let me get the different pointer ids and calculate proper deltas.
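Since MultiTapGestureRecognizer reports taps (down/up) rather than moves, one alternative sketch for getting per-pointer deltas is to track raw pointer events from a Listener by pointer id; the MultiPointerTracker name and the debugPrint placeholders are purely illustrative:

```dart
import 'package:flutter/material.dart';

// Alternative sketch: track raw pointer events by pointer id with a Listener
// and compute per-pointer deltas. MultiPointerTracker and the debugPrint
// placeholders are illustrative only.
class MultiPointerTracker extends StatefulWidget {
  const MultiPointerTracker({super.key, required this.child});
  final Widget child;

  @override
  State<MultiPointerTracker> createState() => _MultiPointerTrackerState();
}

class _MultiPointerTrackerState extends State<MultiPointerTracker> {
  // Last known position per pointer id.
  final Map<int, Offset> _positions = <int, Offset>{};

  @override
  Widget build(BuildContext context) {
    return Listener(
      onPointerDown: (event) => _positions[event.pointer] = event.position,
      onPointerMove: (event) {
        final Offset? last = _positions[event.pointer];
        if (last == null) return;
        final Offset delta = event.position - last;
        _positions[event.pointer] = event.position;
        if (_positions.length >= 2) {
          // Two or more pointers down: combine their deltas into pan/zoom.
          debugPrint('pointer ${event.pointer} moved by $delta');
        } else {
          debugPrint('single-pointer pan by $delta');
        }
      },
      onPointerUp: (event) => _positions.remove(event.pointer),
      onPointerCancel: (event) => _positions.remove(event.pointer),
      child: widget.child,
    );
  }
}
```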

Related

Remove UI square when many touches

I am coding a game on a touchscreen with many players at the same time. The issue is that when there are two or more touches, a little square appears on the screen. It seems to be a Unity built-in feature, as it is still present in an empty project.
Is there a way to prevent this annoying little square from appearing? I already disabled the magic touch shortcuts in Windows, and the square does not appear on the desktop home screen.
I am able to listen to the touches. It seems to be only a visual thing.
Even when I disable multitouch with Input.multiTouchEnabled = false;, it still appears.
I also tried removing the 18 default axes in the Input Manager.
My goal is to handle every touch separately, without listening to pinch, long-press, or scroll interactions. Each player only has to tap somewhere on the screen.
Thanks for your time.
Solved it myself: I completely disabled touch feedback in the Windows settings. I don't think it is the only way to do it, but it works.
Control Panel > Pen and Touch
Uncheck "Show visual feedback when touching the screen"

UI HoloLens - HandDraggable Issues

I've recently created a 2D app for the HoloLens. It is a UI panel with several buttons in it. In order to drag the panel and position it wherever the user wants, I implemented the HandDraggable.cs functionality (from the HoloToolkit). However, whenever I try to move the panel, it also rotates.
To change that, I modified the Rotation Mode from "Default" to "Orient Towards User" and "Orient Towards User and Keep Upright". But then it works even worse: with either of those settings, whenever I select the panel and try to drag it somewhere, it runs out of my field of view and suddenly disappears.
I wanted to ask if somebody has already tried to use HandDraggable in a HoloLens UI app and knows how to fix this unwanted rotation.
I'm currently working on HoloLens UI for one of my projects, and to manipulate the UI I used the TwoHandManipulatable script, which is built into the MixedRealityToolkit. In that script's Manipulation Mode you can set "Move" as the only option, which lets you move a menu with two hands as well as one. (I wanted a menu that can also be rotated and scaled, which works perfectly with this script; you can lock the axes around which rotation is enabled to avoid unwanted manipulation.)
For your HandDraggable script, did you try setting RotationMode to "Lock Object Rotation"? It sounds like that could solve the problem.

Using pen to control scrollviewer and disable touch

For a page with a ScrollViewer, touch gestures let the user scroll up, down, and across. Using a pen doesn't do anything.
The requirement is to have a button that reverses the functionality of pen and touch.
When this button is pushed, touch can draw ink strokes on a canvas and the pen cannot. This works.
I'm not entirely sure how to handle the remaining situations:
the pen acting as the "gesture" device that scrolls the page instead of touch;
touch drawing ink strokes on a canvas that is inside a ScrollViewer.
Is there a method or attribute I can set that makes the pen/touch accept gestures?
Thanks.
So I found a suggestion (can't remember where) that I should disable zooming and scrolling. Once that was set, I called the method that allows inking. I'm not entirely sure that's the correct way of doing it, but through various testing it worked.

How to know if the user has reached the edges of a WebView?

I'm trying to develop an app based on the idea of Twitter for iPad, with sliding panels (for this, I'm using StackScrollView: https://github.com/Reefaq/StackScrollView).
I have a UIWebView inside a UIScrollView, and what I need is a way to find out when the user has "reached the edge" of the UIWebView. I want the user to be able to pan around the web view and zoom in/out, but I would like to be notified when one of the edges of the web view is reached. When that happens, I don't want the webView to bounce (that's simple: bounces = NO); instead, I want the outer scrollView to start responding to the touch from that point on, even while the user's finger is still inside the webView.
Basically, I want to do the same thing that Twitter for iPad does when it opens a panel to show a site: allow zooming in/out and panning around, and, when the edge is reached, the panel gets swiped.
I think the way to go is to use the scrollView property of the UIWebView, but I don't know exactly how.
Any idea?

Gestures and usability

I'm currently drawing some mockups of my future iPhone app.
One of the app's functionalities is to display a bar graph showing the evolution of a value over time. Users can perform a few gestures on the graph:
swipe/drag to move through time;
pinch to zoom in or zoom out (and therefore display a longer or shorter period of time);
double tap to add a cursor to the graph (i.e. a vertical line with a label on top).
What I'm afraid of is users not noticing these gestures. Of course, I would provide buttons for doing the same tasks, but if users ended up only using those, the interface's usability would not be very great...
Therefore, I am wondering if there is any way to show some visual clues to indicate the presence of gestures on the interface. Do you know any app that does something similar?
I think that if you animate the graph behavior you mentioned, it would be a great clue for users that these actions can be performed with their fingers. For example, if the user chooses another date, you should move the graph through time smoothly with an ease-in-out animation; or if the user changes the scale, you should gradually zoom from scale 1 to scale 2.