How to trigger the capture button of the native camera programmatically in Android - android-camera

I am trying to develop a remote trigger for the native camera app. I have searched Google but haven't found a way to do it. I don't want to open the camera intent; I want to trigger the capture button of the native camera app, the way a wireless selfie stick does.

Related

In Flutter, how can I check if a device is a mouse device or a touch device?

How can I check if the device is a touch device or a mouse device?
Using kIsWeb is not sufficient, because when the web version of the app is used on a mobile device, kIsWeb returns true, but I need it to return false because the device is a touch device.
Checking the platform doesn't work either, because the web version of the app running on an iOS device, for example, returns false for an iOS platform check.
Use case - I have two different types of video players for my app. One is suitable for touch devices (you tap to show and hide controls) and one is suitable for mouse devices (controls show when you mouse into the player and hide when you mouse out).
YouTube has the same idea. If I use the YouTube app or website on my iPhone, I get touch controls. If I use the YouTube app or website on my iPad Pro, I get touch controls. If I use the YouTube website on my Mac, I get mouse controls at all screen sizes (even mobile screen sizes).
So I guess I really just need to know the platform on the web. I can get the platform if not on the web.
Great question #jwasher! I had the same issue - a touch- and swipe-based UI that was great as a native mobile app, great as a single-page web app (SPA) on mobile web browsers, but weird and clunky for mouse-based interactions when the SPA was used in a desktop web browser.
The solution I have settled on is to wrap sensitive widgets in a MouseRegion (https://api.flutter.dev/flutter/widgets/MouseRegion-class.html) to detect mouse activity at runtime, then react by augmenting the UI with buttons that provide a mouse-focused way of triggering actions previously only linked to touch triggers.
You could use this to "mouse-enable" individual widgets, or you could wrap the whole app in a MouseRegion, trip a state field when activity is detected, and then rebuild the widget tree in a substantially different way for point-and-click users (see the sketch below).
This strategy may incur some minor complexity/CPU overhead on devices that will never have a mouse attached (like a smartphone), but the solution will be more flexible than a build- or configuration-time capability determination. If a user starts the app on a tablet, then puts it in a stand and attaches a Bluetooth mouse, it will react appropriately.
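To make that concrete, here is a minimal sketch of the wrap-the-whole-app variant. It assumes Flutter's material library; the _mouseDetected field and the placeholder labels are illustrative, not taken from the original answer.

import 'package:flutter/material.dart';

void main() => runApp(const MyApp());

class MyApp extends StatefulWidget {
  const MyApp({super.key});

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  // Flipped the first time mouse hover activity is seen.
  bool _mouseDetected = false;

  @override
  Widget build(BuildContext context) {
    return MouseRegion(
      // Hover events come from mouse-style pointers, so touch-only
      // devices never flip the flag.
      onHover: (_) {
        if (!_mouseDetected) setState(() => _mouseDetected = true);
      },
      child: MaterialApp(
        home: Scaffold(
          body: Center(
            // Rebuild with mouse-oriented controls once a mouse is seen;
            // otherwise keep the default touch-oriented controls.
            child: Text(_mouseDetected ? 'point-and-click UI' : 'touch UI'),
          ),
        ),
      ),
    );
  }
}

Hover is a cheap runtime signal: a mouse that is attached but never moved over the app will not trip the flag, which matches the "react when mouse activity is detected" idea above.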
A device isn't "a mouse device" or "a pointer device". Events have an input type (see PointerEvent.kind), but the device as a whole does not. A laptop can have a touch screen, and a tablet can have a stylus or an external mouse; an app running in those environments can receive both types of event.
Without knowing what you are trying to accomplish with this classification, it is hard to advise on how to accomplish it. Trying to style your UI based on a guess about the primary interaction mode, for instance, is a completely different problem from reacting to a mouse event differently than to a touch event.
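For the per-event approach, a Listener widget exposes the kind of each pointer event. This is a small sketch (the InputKindProbe wrapper name is purely illustrative) that branches on PointerDeviceKind instead of classifying the whole device:

import 'package:flutter/gestures.dart';
import 'package:flutter/material.dart';

// Wrap any subtree to see which kind of pointer produced each press.
class InputKindProbe extends StatelessWidget {
  const InputKindProbe({super.key, required this.child});

  final Widget child;

  @override
  Widget build(BuildContext context) {
    return Listener(
      onPointerDown: (PointerDownEvent event) {
        // event.kind describes this event, not the device as a whole.
        switch (event.kind) {
          case PointerDeviceKind.mouse:
            debugPrint('mouse press');
            break;
          case PointerDeviceKind.touch:
            debugPrint('finger touch');
            break;
          default:
            debugPrint('other pointer kind: ${event.kind}');
        }
      },
      child: child,
    );
  }
}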

How can I see the game on my glasses when pressing play in the editor?

How can I see the game on my glasses when pressing play in the editor?
I'm using GearVR. The USB cable is plugged into my headset.
You have to build and deploy the app to your phone, and once it is running, mount the phone in the headset. You cannot use Unity Remote for VR apps, because the GearVR headset takes priority when you mount the phone. VR apps that have the GearVR SDK enabled should default to "insert your phone in the headset" when you run them. An app is either VR or non-VR, and Unity Remote is a non-VR app; it can only be used as a container for non-VR games.

Does the Samsung Gear VR trigger work with the command Input.GetMouseButtonDown(0)?

I made a Cardboard application and I use Input.GetMouseButtonDown(0); on Google Cardboard it works. But I don't have a Gear VR device to try it on. I searched on Google and found that it has a Bluetooth controller with a trigger button.
Yes. GearVR has an external controller and a touchpad to trigger input.
And yes, Input.GetMouseButton(0) works in GearVR too, provided you make some changes.

Is it possible to monitor the calling of the camera intent in Android?

I need to monitor the camera hardware in an Android phone for an application I am building. I want to get an alert every time the camera hardware is accessed. Is there any possible way to do this?
You can set up a BroadcastReceiver that will receive any camera button presses if that's what you want to do.

Will OpenLaszlo mouse events convert to touch events if I compile the code for a mobile target?

I want to know whether OpenLaszlo mouse-down events will be converted to touch events when the code is compiled for a mobile target.
Yes, at least on the iPad. I tested this myself last year for R&D purposes on an iPad 2 running my application in OpenLaszlo 4.9.0 HTML5 (aka DHTML) mode, and the following were confirmed to work:
1) Touching a button on the screen in the application in OpenLaszlo HTML5 mode properly triggered the onclick event of the button.
2) Drag and drop with your finger on a touch screen in OpenLaszlo HTML5 mode has the same result as dragging and dropping with the mouse on a non-touch screen system.
Note: this was only tested on the iPad 2; it was not tested on Android, Windows Phone, BlackBerry, etc.
Flash is not relevant for mobile (Flash Player has just been removed from the Google Play Store), but Adobe AIR for Android and iOS is an option if you want to build native applications. In that case, you have to capture touch events using the ActionScript 3 API.