Is it just me, or are the emulated touch events built into the Chrome dev tools really wonky? When I emulate touch events and view a site in any of the given device configurations, the touch events land way off target. For example, I click a button, and the focus jumps to an input field a hundred or so pixels above where I clicked. It's really frustrating.
Has anyone else experienced this? My version of Chrome:
Google Chrome 38.0.2125.101 (Official Build 290379) m
I tried touch events on a few other sites and they seem to work fine. So it must be something to do with the site I'm working on.
How can I check if the device is a touch device or a mouse device?
Using kIsWeb is not sufficient, because when the web version of the app runs on a mobile device, kIsWeb returns true, but I need it to return false because it is a touch device.
Checking the platform doesn't work either: the web version of the app running on an iOS device, for example, returns false for the iOS platform check.
Use case - I have two different types of video players for my app. One is suitable for touch devices (you tap to show and hide controls) and one is suitable for mouse devices (controls show when you mouse into the player and hide when you mouse out).
YouTube has the same idea. If I use the YouTube app or website on my iPhone, I get touch controls. If I use the YouTube app or website on my iPad Pro, I get touch controls. If I use the YouTube website on my Mac, I get mouse controls at all screen sizes (even mobile screen sizes).
So I guess I really just need to know the platform on the web. I can get the platform when not on the web.
Great question @jwasher! I had the same issue: a touch- and swipe-based UI that was great as a native mobile app and as a single-page web app (SPA) in mobile web browsers, but that was weird and clunky for mouse-based interactions when the SPA was used in a desktop web browser.
The solution I have settled on is to wrap sensitive widgets in a MouseRegion (https://api.flutter.dev/flutter/widgets/MouseRegion-class.html) to detect mouse activity at runtime, then react by augmenting the UI with buttons that give mouse users a way of triggering actions previously linked only to touch gestures.
You could use this to "mouse-enable" individual widgets, or you could wrap the whole app in a MouseRegion, trip a state field when activity is detected, and rebuild the widget tree in a substantially different way for point-and-click users, as sketched below.
This strategy may incur some minor complexity/CPU overhead on devices that will never have a mouse attached (like a smartphone), but it is more flexible than determining capabilities at build or configuration time. If a user starts the app on a tablet, then puts it in a stand and attaches a Bluetooth mouse, it will react appropriately.
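Here is a minimal sketch of that whole-app approach, under the assumption that a single hover event is a good enough signal; TouchPlayerPage and MousePlayerPage are hypothetical stand-ins for your two player styles:

    import 'package:flutter/material.dart';

    void main() => runApp(const MyApp());

    class MyApp extends StatefulWidget {
      const MyApp({super.key});

      @override
      State<MyApp> createState() => _MyAppState();
    }

    class _MyAppState extends State<MyApp> {
      // Flips to true the first time any mouse hover is detected.
      bool _mouseDetected = false;

      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          home: MouseRegion(
            // onHover only fires for hovering pointers such as a mouse,
            // so it is a cheap runtime signal that a mouse is in use.
            onHover: (_) {
              if (!_mouseDetected) setState(() => _mouseDetected = true);
            },
            child: _mouseDetected
                ? const MousePlayerPage() // hover-driven controls
                : const TouchPlayerPage(), // tap-driven controls
          ),
        );
      }
    }

    // Hypothetical placeholder for the tap-to-toggle-controls player.
    class TouchPlayerPage extends StatelessWidget {
      const TouchPlayerPage({super.key});

      @override
      Widget build(BuildContext context) =>
          Scaffold(body: const Center(child: Text('Touch UI')));
    }

    // Hypothetical placeholder for the hover-to-show-controls player.
    class MousePlayerPage extends StatelessWidget {
      const MousePlayerPage({super.key});

      @override
      Widget build(BuildContext context) =>
          Scaffold(body: const Center(child: Text('Mouse UI')));
    }

Note the switch here is one-way (touch UI until a mouse shows up); you could just as easily keep both signals live and flip back and forth.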
A device isn't "a mouse device" or "a touch device". Events have an input type (see PointerEvent.kind), but the whole device doesn't. A laptop can have a touch screen, and a tablet can have a stylus or an external mouse; an app running in those environments can receive both types of event.
Without knowing what you are trying to accomplish with this classification, it's hard to advise on how to accomplish it. Trying to style your UI based on a guess at the primary interaction mode, for instance, is a completely different problem from reacting to a mouse event differently than to a touch event.
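If per-event handling fits your case, a minimal sketch using Flutter's Listener widget and PointerEvent.kind might look like this (the widget name and the debugPrint messages are just illustrative):

    import 'package:flutter/gestures.dart';
    import 'package:flutter/material.dart';

    // Reacts per event instead of classifying the whole device: the same
    // widget can receive mouse, touch, and stylus presses over its lifetime.
    class KindAwareArea extends StatelessWidget {
      const KindAwareArea({super.key, required this.child});

      final Widget child;

      @override
      Widget build(BuildContext context) {
        return Listener(
          onPointerDown: (PointerDownEvent event) {
            switch (event.kind) {
              case PointerDeviceKind.touch:
                debugPrint('touch press: show tap-style controls');
                break;
              case PointerDeviceKind.mouse:
                debugPrint('mouse press: show hover-style controls');
                break;
              default:
                debugPrint('other pointer kind: ${event.kind}');
            }
          },
          child: child,
        );
      }
    }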
I started programming with the Google Cardboard v0.6 SDK about a year ago, and I really nailed what I was trying to do with it. The problem is, my software requires a toggle between full-screen and stereo-screen modes, for which I have applied a canvas button. It is also supposed to start in full screen, with an option to switch to stereo mode.
I have three questions:
With the new SDK, is it possible to script a stereo-to-full-screen toggle routine?
I noticed they mount the GoogleVR SDK as a fixed SDK within the build settings. I read something along the lines of widget controls within the Android SDK, but I'm not too savvy with the way Android Studio reads the APK and how to modify it. Honestly, I'm running Visual Studio with source control in TFS, so I want to keep the project out of Android Studio as much as possible.
I also read there is supposed to be a full-screen toggle button programmed directly into the SDK, but it just doesn't pop up on my screen. Perhaps there's a method of making this button pop up that will save the day?
Even if that button exists, I'd love to have the script reference for the toggle so I can apply it on Start in order to launch in full-screen mode.
Will toggling full screen reactivate screen canvases?
I know the new GoogleVR does not allow canvases because of a RenderTexture problem. I'm not too concerned, because I'm going to make the toggle button freeze if no control device is registered over Bluetooth, and if there is one, I have a button on the control device that returns to full screen (or, hopefully, that magical screen button that should exist). But being able to toggle between the two settings won't make a difference if canvases still aren't allowed in full screen.
My greatest frustration right now is the discontinuation of the scripts on build. I've been using GVRViewer and such, which work just like the v0.6 software, but the build appears to completely ignore these scripts and forces the use of the SDK. I've read in the release notes that they currently have no intention of returning to the v0.6 platform, and they even recommend rolling back to v0.6 in cases like this. But honestly, if we are forced to use an antiquated version of the software, how long will it be before it gets phased out? Based on my observations so far, this feels a lot like a "one step forward, two steps back" situation.
I know this might be a bit off topic, as Windows isn't supported by Ionic, but I was wondering if anyone has had any luck fixing some of the CSS oddities I am facing when building my app for a Windows 10 phone.
The first issue is with pull-to-refresh. It still seems to function correctly, but the movement of the element being dragged down is quite jumpy, and in certain cases it won't move at all until the user releases their finger from the screen, at which point it jumps down and refreshes.
The second issue is with footer elements not moving up when the keyboard is showing.
The final issue is with side menus not closing properly. My app uses a side menu on both sides, and when I close either one, a slight portion of the right side menu momentarily shows in the header.
One other thing to note: when I build the app for the local machine (using the x86 architecture), all the transitions seem to work fine.
If anyone has any suggestions or would be able to help that would be greatly appreciated!
I want to know whether OpenLaszlo mouse-down events will be converted to touch events when the application is compiled for mobile.
Yes, at least on the iPad. I tested this myself last year for R&D purposes, on an iPad 2 running my application in OpenLaszlo 4.9.0's HTML5 (aka DHTML) mode, and the following were confirmed to work:
1) Touching a button on the screen in OpenLaszlo HTML5 mode properly triggered the button's onclick event.
2) Dragging and dropping with a finger on a touch screen in OpenLaszlo HTML5 mode had the same result as dragging and dropping with a mouse on a non-touch-screen system.
Note: this was only tested on the iPad 2; it was not tested on Android, Windows Phone, BlackBerry, etc.
Flash is not relevant for mobile (Flash Player has just been removed from the Google Play store), but Adobe AIR for Android and iOS is an option if you want to build native applications. In that case, you have to capture the touch events using the ActionScript 3 API.
My team is developing an HTML5 web application with Edge and JavaScript. We need to support touch devices as well, but we've bumped into a problem:
How can we simulate a rollover or mouse-over event on a touch device?
Any idea is welcome, not necessarily a code example.
This is an ergonomic problem, not a technical one.
And the short answer is: you can't :)
Put simply, all the rollover actions on a standard device must be rethought as click actions.
For example, a top navigation menu that opens on rollover must, on a touch-screen device, open on a click or tap instead.
At least, this is what we do for our multi-device web applications...