ECharts two-finger pinch-to-zoom on a Windows touch-screen computer - echarts

How can ECharts support two-finger pinch-to-zoom on a Windows touch-screen computer, the way it works on mobile? My configuration is:
dataZoom: [
  {
    xAxisIndex: [0, 1],
    type: 'inside'
  }
]
With this configuration, touch zoom works normally on mobile, but pinch-zoom does not work on a Windows touch-screen computer. How can I enable this feature there?
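A likely cause is that desktop browsers on Windows reserve touch gestures (pan and pinch) for scrolling and page zoom, so the chart element never receives them. A minimal sketch, assuming a container element with the hypothetical id "chart" and the global `echarts` object from the standard distribution; the CSS `touch-action: none` opt-out is the key step:

```javascript
// Browsers on Windows touch screens treat pinch as page zoom by default.
// Opting the chart container out with `touch-action: none` lets the
// inside-type dataZoom receive the two-finger gesture, as it does on mobile.
const el = document.getElementById('chart'); // hypothetical container id
el.style.touchAction = 'none'; // hand touch gestures to the chart, not the page

const chart = echarts.init(el);
chart.setOption({
  xAxis: { type: 'category', data: ['Mon', 'Tue', 'Wed'] },
  yAxis: { type: 'value' },
  series: [{ type: 'line', data: [120, 200, 150] }],
  dataZoom: [
    {
      type: 'inside',
      xAxisIndex: [0] // single x-axis in this sketch
    }
  ]
});
```

The same opt-out can be applied in a stylesheet (`#chart { touch-action: none; }`) instead of inline; either way the browser stops intercepting the pinch before it reaches ECharts.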

Related

Flutter: How matrix_gesture_detector rotate in Linux (Ubuntu)?

I want to make an app that can zoom in, zoom out, and rotate by gesture.
I found the package matrix_gesture_detector, which works perfectly on Android, but I don't know how to make it rotate in a Linux (Ubuntu 20.04) app. I tried using the mouse while holding "Ctrl", "Shift", and "Alt", but it only zooms in and out and does not rotate. On Android, I can use two fingers to zoom and rotate.
I tried this example

Touch location Siri Remote tvOS Swift

Is it possible to detect the coordinates of a tap on the surface of the Siri Remote? I would like to recognize the left half and the right half and assign a function to each when the user double-taps on it.
Thanks!
There is no way to get the location of the tap. The coordinates of any gesture on the Siri Remote always start from the center of the touchpad.

How to calculate visual angle when presenting a 2D image Google VR?

I am new to developing in Unity, but would like to show a static image to my participants in an experiment.
I have difficulty understanding what the visual angle of this image will be and how I can control it.
The visual angle would probably be the field of view of the camera. If you are using the default Unity camera, you can set the field of view via script.
If you are placing two cameras (one for each side of the screen, i.e. one for each eye), you would want them to be the same. You can change the field of view via script with Camera.fieldOfView. By changing it via script, you can adjust both at the same time, by the same amount.

Implementing mouse simulation: need visual feedback that the mouse is clicked

I am implementing a mouse-simulation application using C# and WinAPI to control other Windows applications. I need the cursor to blink when I simulate a mouse click event, so that I get visual feedback that the mouse was clicked.
You could use ControlPaint.DrawReversibleFrame() or ControlPaint.DrawReversibleLine() to draw a quick series of disappearing concentric shapes to give feedback. When you draw the exact same frame/line twice with these methods they erase themselves and restore whatever was there before.

Is there a way to see which pixels are at which coordinates with crosshairs on the iPhone simulator?

Is there a tool that will do this? I want to be running the simulator and then be able to put the mouse over some point and have it tell me what the (x,y) coordinates are. Surely there's a simple tool that does this.
I just use the built-in screenshot snapper from OS X. Just Command+Shift+4 and when you drag it shows the dimensions of the snap you'd take. Press escape to drop it. Works great.
In the Developer Tools -> Applications -> Graphics Tools there is a program called Pixie. It will do what you want. In Preferences you can set it up so that an option-drag will count pixels. You can also set it to just show the pixel coordinates and do the math yourself.
I've used the Iconfactory's xScope for this before. If you create rulers that are the size of the display in the Simulator, you can get a readout of the X and Y coordinates of the mouse pointer as you move across the Simulator screen. Getting the rulers precisely aligned with the edge of the Simulator screen can be a little tricky for applications with dark backgrounds, though.