Flutter - GestureDetector: detect the direction of a horizontal or vertical drag

I am using GestureDetector and couldn't find any onXYZ callback that tells you the direction of the drag.

Did you try the onPanUpdate(details) callback? Here is how you can do it:
GestureDetector(
  onPanUpdate: (details) {
    if (details.delta.dx > 0) {
      print("Dragging in +X direction");
    } else {
      print("Dragging in -X direction");
    }

    if (details.delta.dy > 0) {
      print("Dragging in +Y direction");
    } else {
      print("Dragging in -Y direction");
    }
  },
  child: Container(
    color: Colors.blue,
    width: double.infinity,
    height: double.infinity,
  ),
)
Note: this callback causes a crash if onHorizontalDragUpdate() or onVerticalDragUpdate() is also provided on the same GestureDetector, as mentioned by anmol.majhail.
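If you only need the direction per axis, a minimal sketch (my own, not from the original answer) using the dedicated drag callbacks; these can be combined with each other, just not with onPanUpdate on the same detector:
GestureDetector(
  // Horizontal and vertical drag callbacks may coexist with each other,
  // but not with onPanUpdate (see the note above).
  onHorizontalDragUpdate: (details) {
    print(details.delta.dx > 0 ? "Dragging in +X direction" : "Dragging in -X direction");
  },
  onVerticalDragUpdate: (details) {
    print(details.delta.dy > 0 ? "Dragging in +Y direction" : "Dragging in -Y direction");
  },
  child: Container(
    color: Colors.blue,
    width: double.infinity,
    height: double.infinity,
  ),
)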

For those who get the "Having both a pan gesture recognizer and a scale gesture recognizer is redundant; scale is a superset of pan." error:
You can implement both drag and zoom with onScaleUpdate in the GestureDetector widget:
onScaleUpdate: (scaleInfo) {
  // For drag, use scaleInfo.focalPointDelta.dx and scaleInfo.focalPointDelta.dy.
  _controller.translate(scaleInfo.focalPointDelta.dx, scaleInfo.focalPointDelta.dy);
  // For zooming, use scaleInfo.scale and don't react when the scale is exactly 1.
  if (scaleInfo.scale > 0.5 && scaleInfo.scale < 2 && scaleInfo.scale != 1) {
    setState(() {
      currentSize = Size(widget.size.width * scaleInfo.scale, widget.size.height * scaleInfo.scale);
    });
  }
},

Related

Flutter: I want to know how to detect the number of fingers with the GestureDetector

I'm making a whiteboard app and I plan to implement the draw and zoom functions through the GestureDetector.
Each one works well on its own, but when I use both together, only the zoom function runs in onScaleUpdate() and the draw function produces no output.
So I want to implement it so that touching with two fingers only uses the zoom function, and touching with one finger only uses the draw function.
Can I tell the number of fingers touching the screen using the GestureDetector?
Or is there another good way?
The following is part of my code:
GestureDetector(
  onScaleStart: (details) {
    Tool tool = context.read<DrawProvider>().tool;
    double seletedPenWidth = context.read<DrawProvider>().seletedPenWidth;
    Color seletedPenColor = context.read<DrawProvider>().seletedPenColor;
    RenderBox box = context.findRenderObject() as RenderBox;
    // zoom test
    context.read<DrawProvider>().onScaleStart(details);
    // Use Pen
    if (tool == Tool.pen) {
      Offset point = box.globalToLocal(details.focalPoint);
      point = Offset(point.dx, point.dy);
      currentLine = DrawingModel(
        pointList: [point],
        color: seletedPenColor,
        width: seletedPenWidth,
      );
    } else {
      // TODO Other Tool
    }
  },
  onScaleUpdate: (details) {
    Tool tool = context.read<DrawProvider>().tool;
    double seletedPenWidth = context.read<DrawProvider>().seletedPenWidth;
    Color seletedPenColor = context.read<DrawProvider>().seletedPenColor;
    RenderBox box = context.findRenderObject() as RenderBox;
    // zoom test
    context.read<DrawProvider>().onScaleUpdate(details);
    if (tool == Tool.pen) {
      Offset point = box.globalToLocal(details.focalPoint);
      point = Offset(point.dx, point.dy);
      List<Offset> path = List.from(currentLine!.pointList!)..add(point);
      currentLine = DrawingModel(
        pointList: path,
        color: seletedPenColor,
        width: seletedPenWidth,
      );
      currentLineStreamController.add(currentLine!);
    }
  },
  onScaleEnd: (details) {
    Tool tool = context.read<DrawProvider>().tool;
    // zoom test
    context.read<DrawProvider>().onScaleEnd(details);
    if (tool == Tool.pen) {
      allLines = List.from(allLines)..add(currentLine!);
      linesStreamController.add(allLines);
    }
  },
provider.dart, zoom functions
Offset _offset = Offset.zero;
Offset _initialFocalPoint = Offset.zero;
Offset _sessionOffset = Offset.zero;
double _scale = 1.0;
double _initialScale = 1.0;

void onScaleStart(ScaleStartDetails details) {
  // TODO if use move tool
  // _initialFocalPoint = details.focalPoint;
  _initialScale = _scale;
  notifyListeners();
}

void onScaleUpdate(ScaleUpdateDetails details) {
  // TODO if use move tool
  // _sessionOffset = details.focalPoint - _initialFocalPoint;
  _scale = _initialScale * details.scale;
  notifyListeners();
}

void onScaleEnd(ScaleEndDetails details) {
  // TODO if use move tool
  // _offset += _sessionOffset;
  _sessionOffset = Offset.zero;
  notifyListeners();
}
whiteboard screen widget
Transform.translate(
  offset: _offset + _sessionOffset,
  child: Transform.scale(
    scale: _scale,
    child: buildAllPaths(allLines: allLines), // drawing screen
  ),
),
Use details.pointerCount from onScaleUpdate and onScaleStart in GestureDetector.
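For example, a minimal sketch of that idea (the _startStroke, _extendStroke and _applyZoom helpers are hypothetical placeholders for your own draw and zoom logic):
GestureDetector(
  onScaleStart: (details) {
    if (details.pointerCount == 1) {
      _startStroke(details.focalPoint); // one finger: begin a new pen line (hypothetical helper)
    }
  },
  onScaleUpdate: (details) {
    if (details.pointerCount == 1) {
      _extendStroke(details.focalPoint); // one finger: extend the current line (hypothetical helper)
    } else if (details.pointerCount >= 2) {
      _applyZoom(details.scale, details.focalPointDelta); // two fingers: zoom/pan only (hypothetical helper)
    }
  },
  child: Container(color: Colors.white),
)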

Flutter: how can I pass gestures to children from GestureDetector?

I have a GestureDetector, and I only handle onHorizontalDrag if it started near the edges. How can I pass the gesture on to the InteractiveViewer when the gesture is not valid for me, i.e. when isHorizontalDragActive == false?
return GestureDetector(
  behavior: HitTestBehavior.deferToChild,
  onHorizontalDragStart: (DragStartDetails details) {
    // allows start only near L/R edges.
    final double dX = details.localPosition.dx;
    final double dW = constraints.maxWidth / 6;
    if (dX < dW || dX > (constraints.maxWidth - dW)) {
      isHorizontalDragActive = true;
    } else {
      isHorizontalDragActive = false;
    }
  },
  onHorizontalDragEnd: (DragEndDetails details) {
    if (isHorizontalDragActive == true) {
      final double velocity = details.primaryVelocity ?? 0;
      if (velocity < 0) {
        manager.toForwardPage();
      } else if (velocity > 0) {
        manager.toBackwardPage();
      }
    }
    isHorizontalDragActive = false;
  },
  child: InteractiveViewer
Now, if I scale the InteractiveViewer, I can't start moving the content inside it in a horizontal direction.
A GestureDetector wrapping an InteractiveViewer will not respond to gesture events as expected. Try a Listener instead:
Listener(
  onPointerMove: (moveEvent) {
    if (moveEvent.delta.dx > 0) {
      print("user swiped right");
    }
  },
  child: InteractiveViewer(),
  // ...
)
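For the edge-only page turns, a rough sketch of the same idea (it assumes the constraints and manager from the question, plus hypothetical _edgeDrag and _dragDx state fields); since Listener only observes raw pointer events and does not join the gesture arena, the InteractiveViewer keeps its own pan and zoom behaviour:
Listener(
  onPointerDown: (event) {
    // Only treat the drag as a page turn if it starts near the L/R edges.
    final double edge = constraints.maxWidth / 6;
    final double dx = event.localPosition.dx;
    _edgeDrag = dx < edge || dx > constraints.maxWidth - edge;
    _dragDx = 0;
  },
  onPointerMove: (event) {
    if (_edgeDrag) {
      _dragDx += event.delta.dx; // accumulate horizontal movement
    }
  },
  onPointerUp: (event) {
    if (_edgeDrag) {
      if (_dragDx < 0) manager.toForwardPage();
      if (_dragDx > 0) manager.toBackwardPage();
    }
    _edgeDrag = false;
  },
  child: InteractiveViewer(
    // ...
  ),
)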

How to make GestureDetector detect two-finger drag in Flutter Web?

I'm trying to implement smooth scrolling in a PageView, so I have set the physics property to NeverScrollableScrollPhysics() and wrapped the items in the builder function with a GestureDetector that detects drag gestures.
When a drag gesture is detected, a function is triggered and the page is changed accordingly.
My problem is that on Flutter Web, the vertical drag is only detected when dragging with three fingers, and nothing happens when swiping with two. Is there any way to solve this?
By the way, the result is the same with the onPanUpdate property.
Feel free to suggest other ways to implement smooth scrolling. Thanks in advance.
Here is my code if you want to reproduce it:
PageView.builder(
  physics: NeverScrollableScrollPhysics(),
  controller: /* PAGECONTROLLER */,
  scrollDirection: Axis.vertical,
  pageSnapping: false,
  itemBuilder: (context, index) {
    return GestureDetector(
      onVerticalDragUpdate: (DragUpdateDetails details) {
        if (details.delta.dy > 5.0) {
          /* FUNCTION TO MOVE TO PREVIOUS PAGE */
        }
        if (details.delta.dy < -5.0) {
          /* FUNCTION TO MOVE TO NEXT PAGE */
        }
      },
      child: /* PAGES OF THE PAGEVIEW, JUST USE EMPTY CONTAINERS WITH DIFFERENT COLORS */,
    );
  },
  itemCount: /* NUMBER OF PAGES */,
),
By default, GestureDetector in Flutter only detects single-finger gestures. To detect multi-finger gestures, you can use the Listener widget together with a GestureRecognizer.
Here's a code snippet that may help, using Listener and PanGestureRecognizer:
Listener(
  onPointerDown: (event) {
    if (event.kind == PointerDeviceKind.touch && event.pointer != 0) {
      // A second finger is down, start recognizing the gesture
      _gestureRecognizer.addPointer(event);
    }
  },
  onPointerUp: (event) {
    if (event.kind == PointerDeviceKind.touch && event.pointer != 0) {
      // The second finger is up, stop recognizing the gesture
      _gestureRecognizer.removePointer(event);
    }
  },
  child: Container(
    // Your widget tree
  ),
);

// Initialize the gesture recognizer in your widget state
PanGestureRecognizer _gestureRecognizer = PanGestureRecognizer()
  ..onStart = (details) {
    // This is called when the gesture starts
    // Check that it is a two-finger gesture before handling it
    if (_gestureRecognizer.numberOfPointers == 2) {
      // Handle the two-finger drag gesture
      print('Two-finger drag started');
    }
  }
  ..onUpdate = (details) {
    // This is called when the gesture is updated
    // Check that it is a two-finger gesture before handling it
    if (_gestureRecognizer.numberOfPointers == 2) {
      // Handle the two-finger drag gesture
      print('Two-finger drag updated');
    }
  }
  ..onEnd = (details) {
    // This is called when the gesture ends
    // Check that it is a two-finger gesture before handling it
    if (_gestureRecognizer.numberOfPointers == 2) {
      // Handle the two-finger drag gesture
      print('Two-finger drag ended');
    }
  };
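Another option that may be worth trying, in line with the pointerCount answer above, is to keep the GestureDetector and gate its scale callbacks on the number of pointers, since the scale recognizer already tracks multiple fingers. A minimal sketch (the page-change calls are placeholders, as in the question):
GestureDetector(
  onScaleUpdate: (ScaleUpdateDetails details) {
    // Only react when exactly two fingers are down.
    if (details.pointerCount == 2) {
      if (details.focalPointDelta.dy > 5.0) {
        /* FUNCTION TO MOVE TO PREVIOUS PAGE */
      } else if (details.focalPointDelta.dy < -5.0) {
        /* FUNCTION TO MOVE TO NEXT PAGE */
      }
    }
  },
  child: Container(color: Colors.teal),
)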

How to differentiate between finger touch gestures and mouse pointer gestures in flutter web?

I'm working on Flutter web.
I need to implement finger-touch-specific code and mouse-specific code.
I have used the Listener widget, but it recognizes both mouse panning and finger panning on the screen.
I need only mouse panning.
Kindly tell me how to differentiate between finger touch gestures and mouse pointer gestures.
Listener code:
Listener(
  onPointerMove: (details) {
    print('moved');
  },
  child: Container(height: 500, width: 300),
);
From the TapDownDetails you can get the PointerDeviceKind. Please refer to the code below.
GestureDetector(
  onTapDown: (TapDownDetails details) {
    PointerDeviceKind pointerType = details.kind;
  },
);
In a Listener:
Listener(
  onPointerMove: (details) {
    print('moved');
  },
  onPointerDown: (details) {
    PointerDeviceKind pointerType = details.kind;
  },
  child: Container(height: 500, width: 300),
);
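To react only to mouse panning, a minimal sketch that checks the kind of each move event and ignores touch input (the print is a placeholder for your own pan handling):
Listener(
  onPointerMove: (details) {
    // Only handle mouse movement; ignore finger touches.
    if (details.kind == PointerDeviceKind.mouse) {
      print('mouse moved');
    }
  },
  child: Container(height: 500, width: 300),
);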

How to draw custom shape in flutter and drag that shape around?

At the moment I can draw rectangles using CustomPainter. Below is the code inside the paint method of my CustomPainter:
for (var rectPoints in rectangles) {
  paint.color = rectPoints.color;
  paint.strokeWidth = rectPoints.strokeWidth;
  if (rectPoints.selected != null && rectPoints.selected == true) {
    paint.color = Colors.black45;
  }
  var rect = Rect.fromLTWH(
      rectPoints.startPoint.dx,
      rectPoints.startPoint.dy,
      rectPoints.endPoint.dx - rectPoints.startPoint.dx,
      rectPoints.endPoint.dy - rectPoints.startPoint.dy);
  canvas.drawRect(rect, paint);
}

var rect = Rect.fromLTWH(startPoint.dx, startPoint.dy,
    endPoint.dx - startPoint.dx, endPoint.dy - startPoint.dy);
canvas.drawRect(rect, paint);
A rectangle is a custom object with startPoint, endPoint and some other properties needed to draw that specific rectangle. Now I want to select a rectangle and re-position it. Any help would be appreciated. Thanks
You'll need to track the state of the rectangles' positions independent of the canvas drawing. The easiest way to do that is to use a StatefulWidget. You'll also need to use a GestureDetector to capture the pan events. Then you can wire up the gesture details to the position of the rectangles and call the painter to redraw everything.
Here's a simple app that shows how to do it with one rectangle. Should be straightforward to expand it to handle multiple ones.
import 'package:flutter/material.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Draggable Custom Painter',
      home: Scaffold(
        body: CustomPainterDraggable(),
      ),
    );
  }
}

class CustomPainterDraggable extends StatefulWidget {
  @override
  _CustomPainterDraggableState createState() => _CustomPainterDraggableState();
}

class _CustomPainterDraggableState extends State<CustomPainterDraggable> {
  var xPos = 0.0;
  var yPos = 0.0;
  final width = 100.0;
  final height = 100.0;
  bool _dragging = false;

  /// Is the point (x, y) inside the rect?
  bool _insideRect(double x, double y) =>
      x >= xPos && x <= xPos + width && y >= yPos && y <= yPos + height;

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      onPanStart: (details) => _dragging = _insideRect(
        details.globalPosition.dx,
        details.globalPosition.dy,
      ),
      onPanEnd: (details) {
        _dragging = false;
      },
      onPanUpdate: (details) {
        if (_dragging) {
          setState(() {
            xPos += details.delta.dx;
            yPos += details.delta.dy;
          });
        }
      },
      child: Container(
        color: Colors.white,
        child: CustomPaint(
          painter: RectanglePainter(Rect.fromLTWH(xPos, yPos, width, height)),
          child: Container(),
        ),
      ),
    );
  }
}

class RectanglePainter extends CustomPainter {
  RectanglePainter(this.rect);

  final Rect rect;

  @override
  void paint(Canvas canvas, Size size) {
    canvas.drawRect(rect, Paint());
  }

  @override
  bool shouldRepaint(CustomPainter oldDelegate) => true;
}
I have developed a library called touchable for the purpose of adding gesture callbacks to each individual shape you draw on the canvas. You can draw your shapes and add onPanUpdate or onTapDown callbacks to drag a shape around.
Here's what you can do to detect touch and drag on your shape.
Here's a small example taken directly from the pub.dev page:
Wrap your CustomPaint widget with CanvasTouchDetector. It takes a builder function as an argument that expects your CustomPaint widget, as shown below.
import 'package:touchable/touchable.dart';

CanvasTouchDetector(
  builder: (context) => CustomPaint(
    painter: MyPainter(context),
  ),
)
Inside your CustomPainter class's paint method, create and use a TouchyCanvas object (using the context obtained from the CanvasTouchDetector and the canvas) to draw your shape, and pass gesture callbacks like onPanUpdate and onTapDown to detect your drag events.
var myCanvas = TouchyCanvas(context, canvas);
myCanvas.drawRect(rect, Paint(), onPanUpdate: (detail) {
  // This callback runs when you drag this rectangle. Details of the location
  // are available on the detail object.
  // Do stuff here. Probably change your state and animate.
});