Question about ionic using "mousedown", "mousemove" - ionic-framework

I'm new to app creation and I have a question.
Will the following function work in Ionic so I can use it on a touch tablet, or do "mousemove" and "mousedown" only work on a PC with a mouse?
document.body.addEventListener("mousedown", function(event){
    if(circlePointCollision(event.clientX, event.clientY, handle)){
        document.body.addEventListener("mousemove", onmousemove);
        document.body.addEventListener("mouseup", onmouseup);
    }
});
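For reference, circlePointCollision isn't shown in the snippet above; assuming handle is an object with x, y and radius properties, a typical implementation would just be a distance check, something like:

function circlePointCollision(px, py, circle){
    // hypothetical helper: true if the point (px, py) lies inside the circle
    var dx = px - circle.x;
    var dy = py - circle.y;
    return dx * dx + dy * dy <= circle.radius * circle.radius;
}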

Mouse Events are the oldest and most widely supported user interaction events on the web, so yeah, they are supported by tablets and just about every device you will encounter.
There are also Touch Events (created by Apple) and a later event system called Pointer Events (pretty widely supported at this point) that are meant to replace Mouse Events. On devices that support these newer event types, Mouse Events are still fired for backward compatibility.
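If you later want to target touch and pen input directly rather than relying on the simulated mouse events, Pointer Events follow exactly the same pattern as the mouse code in the question. A minimal sketch, reusing the question's circlePointCollision and handle (with onpointermove/onpointerup standing in for handlers like the question's onmousemove/onmouseup):

document.body.addEventListener("pointerdown", function(event){
    // Pointer Events fire for mouse, touch and pen alike,
    // and expose clientX/clientY just like Mouse Events
    if(circlePointCollision(event.clientX, event.clientY, handle)){
        document.body.addEventListener("pointermove", onpointermove);
        document.body.addEventListener("pointerup", onpointerup);
    }
});

Because the coordinates live on the event itself (unlike Touch Events, which report them per touch point), this is usually the smallest change from existing mouse-based code.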

Related

Intercept Vive controller input?

I'm building an OpenVR app for SteamVR to assist with seated play (my room is small so my tracking area isn't ideal). My app pretty much just adjusts the play-area height when I hold the grip button and "scroll" on the touchpad, so that I can reach objects that are too low/high at variable heights. (I tried "OpenVR Advanced Settings" but its keybinding options are limited to simple button presses, so I decided to make my own version.)
I'd like to prevent touchpad input from being sent to the game while the grip button is held, so that moving on the touchpad doesn't cause movement in-game. Is this possible at all?
I'm assuming it's not possible, but wondering whether anyone has had any experience with this.
After your clarification in the comments, the answer is no, you cannot "eat up" device inputs in an application. I usually work on OpenVR drivers, and there, once you submit a device input and/or any other event, it is available to anything that expects pose update events; event subscribers cannot stop others from receiving those events.
However, there might be a workaround (if it's still an issue). I know of at least one application that can do what you want: OVR Toolkit. When its overlay is active and you click something in the overlay, the game running in parallel will not receive the input; however, that only happens if the OVR Toolkit overlay surface receives the input. It may be a built-in OpenVR overlay feature that requires nothing on your part, or it may be something the developer has to implement; I haven't had a chance to test this right now.
Sadly, OVR Toolkit is not open source, but there is an open-source toolkit for Unity for making overlays that might be the solution you're looking for; it can be found here.

How do you develop simple code for Gtk+ 3 touch input?

I've been searching for a little while to investigate whether or not our Gtk+ 3 application (Gtk#, actually, but bound to Gtk+ 3) will be easily convertible to receive touch events rather than mouse clicks. This is pure investigation at the moment since I don't actually have a touch screen yet (that's also the reason why I can't easily test my theories).
As far as I can tell, GNOME 3 can indeed do this, and the documentation that seems relevant is the gesture stuff, found here.
I've not been able to find any sample Gtk+ code so I'm hoping it's a simple matter of instantiating one of the gesture classes and simply setting it up to deliver events to the main event loop.
Is there anyone with experience with Gtk gestures who can guide me through a simple example? Let's say I have a DrawingArea and I want to receive a very simple, single-touch event that gives me the point within the area that was touched.
Once I have that, I should be able to build on it to handle swipes, pinches and so on.
You cannot inject pointer events as touch events: they are fundamentally different, and they interact with the gesture recognition state machine that lives inside GTK.
GTK has the ability (for debugging purposes) to emulate touch events via pointer, though obviously it cannot emulate multi-touch events because a pointing device only has one event sequence. If you have a recent version of GTK 3.x, you can use the GTK_TEST_TOUCHSCREEN environment variable set to a non-zero value.
If you want to know how to use GtkGesture implementations in your own widgets, then I suggest you look at the GtkGesture API reference, and look at the various types of gestures available, like GtkGestureSwipe or GtkGestureZoom. You're supposed to add GtkGesture instances to your widget when constructing it, and then catch the signals on the specific gesture when they are recognised.
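In case a concrete sketch helps, here is a minimal example in GJS (GNOME's JavaScript bindings) rather than Gtk#, since the same GtkGesture API is exposed there; treat the setup details (particularly the add_events call) as assumptions to double-check against your GTK 3 version:

const Gtk = imports.gi.Gtk;
const Gdk = imports.gi.Gdk;

Gtk.init(null);

let win = new Gtk.Window({ title: "Gesture test", default_width: 300, default_height: 300 });
let area = new Gtk.DrawingArea();
// A DrawingArea does not select button/touch events by default
area.add_events(Gdk.EventMask.BUTTON_PRESS_MASK | Gdk.EventMask.TOUCH_MASK);
win.add(area);

// GtkGestureMultiPress recognises single presses (mouse or touch)
// and reports the coordinates within the widget
let press = new Gtk.GestureMultiPress({ widget: area });
press.connect("pressed", (gesture, nPress, x, y) => {
    print("pressed at " + x + ", " + y);
});

win.connect("destroy", () => Gtk.main_quit());
win.show_all();
Gtk.main();

Once that single-press case works, swapping in GtkGestureSwipe or GtkGestureZoom and connecting to their signals follows the same shape.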

Catching all inputs from external (bluetooth) keyboard

Question
I would like to catch/preview all keyboard inputs that our application receives.
It would be enough if only the inputs from the external keyboard were caught.
I've come across example solutions such as registering a notification observer for UITextViewTextDidChangeNotification and UITextFieldTextDidChangeNotification, but since this requires a TextView or TextField to be the current first responder, it does not cover all our needs.
An example of this is that we use scanning to filter a list, where the entire view consists of the list and navigation buttons.
Is there any way to register an observer that can preview all key inputs the application receives?
Background:
I have an iPhone application that scans barcodes to identify objects in several different views.
Currently we scan the barcodes either with the camera (using the zbar library) or with an MFi-certified barcode scanner.
In order to support more devices (for example an iPad) we are investigating other means to capture bar codes.
One of the device classes we are looking at is Bluetooth scanners that identify as a HID keyboard. This would be a great addition since it would be usable with different products and manufacturers.
In order to
Another option for iOS 7 and above is to declare a keyCommands method on the current view controller. It can trap special key combinations. The downside is that you'll need to explicitly declare what you're looking for.
I answered my own question over here about getting special characters out of 2D barcodes from scanners connected in HID mode.
It seems like usage of IOHID** functions may not be rejected by App Store reviewers, because IOKit is a "white-listed" framework (Will Apple reject Mac OS App which uses IOKit framework?).
So you can try to use the callback function from this topic: IOHIDEventSystemClientScheduleWithRunLoop with EXC_BAD_ACCESS. Hope that helps!
I used this code and it works even if your app is in the background (just set the special background mode); it captures all system touch and keyboard events.

touch events vs mouse click events using actionscript 3

Just wanted to ask if there is any advantage to using either mouse click events or touch tap events when writing apps for mobiles or tablets (especially for the iPhone)?
I know that both of them should work fine, but in terms of performance, is either one better? Are there any things I should be aware of when choosing?
By the way, I am using ActionScript 3 to implement the app.
This is probably the best documentation on Adobe AIR touch support:
http://help.adobe.com/en_US/as3/dev/WSb2ba3b1aad8a27b0-6ffb37601221e58cc29-8000.html
Midway through that page it states:
Note: Listening for touch and gesture events can consume a significant amount of processing resources (equivalent to rendering several frames per second), depending on the computing device and operating system. It is often better to use mouse events when you do not actually need the extra functionality provided by touch or gestures.
The only benefit of touch, I would think, would be multi-touch. The TouchEvent has a touchPointID which allows you to track the movement of each touch point. If you don't care about multi-touch, it sounds like Mouse Events would be the way to go.
Excellent question! Tap events are "technically" slower as they monitor multiple input points. If you're only concerned with a single touch input, the standard mouse event system is just fine. For touch events, there are a couple of objects created per listener to assist with the multitouch functionality (however, this costs only a tiny fraction of a millisecond in performance).
I think TouchEvent is better than MouseEvent when implementing the app on tablets! I have tried it many times; you can test it yourself.

Flash cs5 iOS holding down?

Hey, I was wondering if there is an event or something that can check whether my user is holding down a button? (for iPhone, iOS)
The event class to use is TouchEvent. You'll need to track the touch state in a variable if you want to remember that someone is pressing down after the event itself has fired.
You can use MouseEvent if you need/desire a single touch point. Though you still need the variable.
You need to set Multitouch.inputMode to MultitouchInputMode.TOUCH_POINT in order to turn on touch events on the iPhone. Unfortunately, you can only test touch/gesture events on the iPhone itself. (MouseEvent and TouchEvent are virtually identical in use: MOUSE_DOWN = TOUCH_BEGIN, MOUSE_MOVE = TOUCH_MOVE, MOUSE_UP = TOUCH_END. The main difference is that you can only have one "mouse" yet multiple touches.)
Personally, I use MouseEvent to test on the machine, with an if-then setting up TouchEvent listeners if Multitouch.supportsTouchEvents is true (which it only is on the iPhone/Android).
If the button is a Cocoa Touch UIButton, you can easily connect a number of its gesture events from Interface Builder to IBActions in Xcode (where you write the appropriate methods).
How your project would connect to flash, I am not sure. Give us more details and perhaps someone can better help. ;)