How to detect if my Chrome app is focused or not? - google-chrome-app

I want to detect whether my Chrome app is focused or not. Based on the result, I want to either show a simple in-app "toast" or create a notification with chrome.notifications.create.

Related

Flutter: detect if device has "back navigation" built into the OS?

In Flutter, I would like to know if the device has any functionality built into the OS for navigating back; in other words, whether the user can trigger back navigation without an actual button inside the app.
For example, an iPhone 7 does not have a physical back button or a swipe-back gesture. If the app doesn't have its own way to navigate, the user can get stuck.
On the contrary, most modern devices have some way of going back built into the system, like a physical/virtual back button or a swipe-back gesture.
Can I distinguish between these types of devices?
From what I have seen, it is only iOS that lacks built-in back navigation, so it can be handled with:
import 'dart:io' show Platform;

if (Platform.isIOS) {
  // iOS lacks a system-wide back affordance, so show our own button.
  showBackButton();
}
But I have not seen a package or function that specifically exposes this information. Keep me in the loop if you find anything.

In Flutter, how can I check if the device is a mouse device or a touch device?

How can I check if the device is a touch device or a mouse device?
Using kIsWeb is not sufficient: when the web version of the app runs on a mobile device, kIsWeb returns true, but I need it to return false because that is a touch device.
Checking the platform doesn't work either: the web version of the app running on an iOS device, for example, returns false for the iOS platform check.
Use case: I have two different types of video players in my app. One is suitable for touch devices (you tap to show and hide controls) and one is suitable for mouse devices (controls show when you mouse into the player and hide when you mouse out).
YouTube has the same idea. If I use the YouTube app or website on my iPhone I get touch controls. If I use the YouTube app or website on my iPad Pro I get touch controls. If I use the YouTube website on my Mac I get mouse controls at all screen sizes (even mobile screen sizes).
So I guess I really just need to know platform on the web. I can get platform if not on the web.
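For reference, on the web Flutter's defaultTargetPlatform (from package:flutter/foundation.dart) is derived from the browser's user agent, so it still reports the host OS even when kIsWeb is true. A minimal sketch of the heuristic that implies; isTouchDevice is a hypothetical helper, not an official API:

import 'package:flutter/foundation.dart'
    show defaultTargetPlatform, TargetPlatform;

// Hypothetical helper: guess whether the primary input is touch.
// On the web, defaultTargetPlatform reflects the host OS (derived
// from the browser's user agent), so the web app on an iPhone still
// reports TargetPlatform.iOS even though kIsWeb is true.
bool isTouchDevice() {
  return defaultTargetPlatform == TargetPlatform.iOS ||
      defaultTargetPlatform == TargetPlatform.android;
}

That would match the YouTube behavior described above: the deciding factor is the host OS rather than the screen size.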
Great question #jwasher! I had the same issue: a touch- and swipe-based UI that was great as a native mobile app, great as a single-page web app (SPA) on mobile web browsers, but weird and clunky for mouse-based interactions when the SPA was used in a desktop web browser.
The solution I have settled on is to wrap sensitive widgets in a MouseRegion (https://api.flutter.dev/flutter/widgets/MouseRegion-class.html) to detect mouse activity at runtime, then react by augmenting the UI with buttons that provide a mouse-focused way of triggering actions previously linked only to touch triggers.
You could use this to "mouse-enable" individual widgets, or you could wrap the whole app in a MouseRegion, trip a state field when activity is detected, then rebuild the widget tree in a substantially different way for point-and-click users.
This strategy may incur some minor complexity/CPU overhead on devices that will never have a mouse attached (like a smartphone), but it is more flexible than determining capabilities at build or configuration time. If a user starts the app on a tablet, then puts it in a stand and attaches a Bluetooth mouse, it will react appropriately.
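A minimal sketch of that whole-app variant (widget and field names are illustrative, assuming a single permanent "mouse seen" flag):

import 'package:flutter/material.dart';

class AdaptiveHome extends StatefulWidget {
  const AdaptiveHome({super.key});

  @override
  State<AdaptiveHome> createState() => _AdaptiveHomeState();
}

class _AdaptiveHomeState extends State<AdaptiveHome> {
  // Trips permanently the first time any mouse hover is observed.
  bool _mouseDetected = false;

  @override
  Widget build(BuildContext context) {
    return MouseRegion(
      onHover: (_) {
        if (!_mouseDetected) setState(() => _mouseDetected = true);
      },
      // Rebuild the tree differently once a mouse is known to exist.
      child: _mouseDetected
          ? const Text('point-and-click UI goes here')
          : const Text('touch-first UI goes here'),
    );
  }
}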
A device isn't "a mouse device" or "a touch device". Events have an input kind (see PointerEvent.kind), but the whole device does not. A laptop can have a touch screen, and a tablet can have a stylus or an external mouse; an app running in those environments can receive both types of event.
Without knowing what you are trying to accomplish with this classification, it is hard to advise on how to accomplish it. Styling your UI based on a guess about the primary interaction mode, for instance, is a completely different problem from reacting to a mouse event differently than a touch event.
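A small sketch of that per-event approach, using Flutter's Listener widget and PointerDeviceKind (the wrapper function here is hypothetical):

import 'package:flutter/gestures.dart' show PointerDeviceKind;
import 'package:flutter/widgets.dart';

// Classify each event, not the device: a laptop with a touch screen
// can deliver both kinds of event to the same widget.
Widget pointerAware(Widget child) {
  return Listener(
    onPointerDown: (event) {
      if (event.kind == PointerDeviceKind.mouse) {
        debugPrint('mouse press');
      } else if (event.kind == PointerDeviceKind.touch) {
        debugPrint('touch press');
      }
    },
    child: child,
  );
}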

How to start a Flutter app after tapping a hardware button a number of times?

Hey, can anyone tell me what to use to start a Flutter app from hardware-button detection?
I see that there's a hardware-detection plugin, but it only works while the application is open.
I'm trying to get the app to automatically start, or to display an AlertDialog widget, after tapping, say, the power button three times.
I am not sure what technologies/plugins to use. Can anyone assist?
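Launching an app that isn't running at all would need native OS support outside Flutter, but for the in-app half, a hedged sketch of counting presses and showing an AlertDialog on the third; hardwareButtonEvents is a stand-in for whatever event stream a plugin exposes while the app is open, since no concrete plugin is named here:

import 'dart:async';
import 'package:flutter/material.dart';

// hardwareButtonEvents is hypothetical: substitute the stream your
// hardware-detection plugin provides while the app is open.
void watchForTriplePress(
    BuildContext context, Stream<void> hardwareButtonEvents) {
  var presses = 0;
  Timer? window;
  hardwareButtonEvents.listen((_) {
    presses += 1;
    // Reset the count if taps are more than two seconds apart.
    window?.cancel();
    window = Timer(const Duration(seconds: 2), () => presses = 0);
    if (presses >= 3) {
      presses = 0;
      showDialog<void>(
        context: context,
        builder: (_) => const AlertDialog(title: Text('Triggered!')),
      );
    }
  });
}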

How to make my app interact with Android home screen?

I am building an app that uses the front camera to track where the user is looking on the screen. When the user blinks, the smartphone registers that as a tap. Currently it works "inside" my app: I have a few dots around the screen, and when I look at one of them and blink, it changes color.
But how can I make it work on the home screen and with other apps? Say the user looked at the Facebook app and blinked; it would open.
I was thinking of a pop-up window like Skype's. I could design it as a cursor displayed above the home screen and all apps. But if I blinked (tapped), it would perform an action inside that popup/widget and not click "through" it.
Is there any code that can make my app interact with other apps? Maybe accessibility services?
There is a similar app that can create a mouse cursor on the home screen; when the user waves a hand (no physical touch), it registers that as a tap. How can I recreate that?
Picture attached...

Switch to fullscreen

I started programming with the Google Cardboard v0.6 SDK about a year ago. I really nailed what I was trying to do with this software. The problem is, my software requires a toggle between full-screen and stereo-screen modes, for which I have applied a canvas button. It is also supposed to start in full screen with an option of stereo mode.
I have three questions:
With the new SDK, is it possible to script a stereo-to-full-screen toggle routine?
I noticed GoogleVR is now a fixed SDK mount within the build settings. I read something along the lines of widget controls within the Android SDK, but I'm not too savvy with the way Android Studio reads the APK and how to modify it. Honestly, I'm running Visual Studio with a source-control library in TFS, so I want to keep it out of Android Studio as much as possible.
I also read there is supposed to be a full-screen toggle button built directly into the SDK, but it just doesn't pop up on my screen. Perhaps there's a method of making this button appear that will save the day?
Even if the button exists, I'd love to have the script reference for the toggle so I can call it on Start in order to begin in full-screen mode.
Will toggling full screen reactivate screen canvases?
I know the new GoogleVR does not allow canvases because of a RenderTexture problem. I'm not too concerned, because I'm going to make the toggle button freeze if no control device is registered over Bluetooth; if there is one, I have a button on the control device that returns to full screen (or, hopefully, that magical screen button that should exist). Being able to toggle between the two settings won't make a difference if full-screen mode still doesn't allow canvases.
My greatest frustration right now is with the discontinuation of the scripts on build. I've been using GVRViewer and such, which work just like the v0.6 software, but the build appears to completely ignore these scripts and forces the use of the SDK. I've read in the release notes that at the moment they have no intention of returning to the v0.6 platform, and they even recommend rolling back to v0.6 if this is the case. But honestly, if we are forced to use an antiquated version of the software, how long will it be before it gets phased out? Based on my current observations, this feels a lot like a "one step forward, two steps back" situation.