Detecting when a device is in a Cardboard headset in Unity

I'm building a Unity Cardboard app and would like to detect when the device is in the headset. The NFC tag in theory carries this data, but it does not seem to be exposed in the API. I would like the app to enter VR mode automatically when it is in the headset, without the user needing to toggle in and out of a VR mode.
Basically, I want Cardboard.vrModeEnabled to be automatically updated when you enter or exit a headset.
Is this possible? Thanks!

It used to be in the (non-Unity) SDK but was deprecated, for several reasons. For one, NFC sensors are placed differently on different phones, so the detection was not uniformly reliable. For another, using the sensor this way drains the battery quickly.

There are a lot of Cardboard models on the market, and many of them don't come with an NFC tag, so I wouldn't count on it.
The approach that works best for me is to start in VR mode; when the user touches the screen, disable VR mode, and go back to VR mode 10 seconds after the last touch, as sketched below.
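A minimal sketch of that approach in a Unity script, assuming the Cardboard Unity SDK exposes the flag as Cardboard.SDK.VRModeEnabled (the question calls it Cardboard.vrModeEnabled; the exact property name depends on your SDK version):

using UnityEngine;

// Minimal sketch: leave VR mode on touch, re-enter it 10 seconds
// after the last touch. Attach to any persistent GameObject.
public class VRModeAutoToggle : MonoBehaviour
{
    const float ReenterDelay = 10f; // seconds since the last touch
    float lastTouchTime;

    void Update()
    {
        if (Input.touchCount > 0)
        {
            lastTouchTime = Time.time;
            if (Cardboard.SDK.VRModeEnabled)
                Cardboard.SDK.VRModeEnabled = false; // drop out of VR on touch
        }
        else if (!Cardboard.SDK.VRModeEnabled
                 && Time.time - lastTouchTime >= ReenterDelay)
        {
            Cardboard.SDK.VRModeEnabled = true; // quiet for 10 s: back to VR
        }
    }
}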

Related

In Flutter, how can I check if the device is a mouse device or a touch device?

How can I check if the device is a touch device or a mouse device?
Using kIsWeb is not sufficient: when the web version of the app runs on a mobile device, kIsWeb returns true, but I need it to return false because the device is a touch device.
Checking the platform doesn't work either: running the web version of the app on an iOS device, for example, returns false for the iOS platform check.
Use case - I have two different types of video players for my app. One is suitable for touch devices (you tap to show and hide controls) and one is suitable for mouse devices (controls show when you mouse into the player and hide when you mouse out).
YouTube has the same idea. If I use the YouTube app or website on my iPhone, I get touch controls. If I use the YouTube app or website on my iPad Pro, I get touch controls. If I use the YouTube website on my Mac, I get mouse controls at all screen sizes (even mobile screen sizes).
So I guess I really just need to know platform on the web. I can get platform if not on the web.
Great question #jwasher! I had the same issue: a touch-and-swipe UI that was great as a native mobile app and as a single-page web app (SPA) on mobile web browsers, but that was weird and clunky for mouse-based interaction when the SPA was used in a desktop web browser.
The solution I have settled on is to wrap sensitive widgets in a MouseRegion (https://api.flutter.dev/flutter/widgets/MouseRegion-class.html) to detect mouse activity at runtime, then react by augmenting the UI with buttons that give mouse users a way to trigger actions previously linked only to touch gestures.
You could use this to "mouse-enable" individual widgets, or you could wrap the whole app in a MouseRegion, trip a state field when activity is detected, and then rebuild the widget tree in a substantially different way for point-and-click users (sketched below).
This strategy may incur some minor complexity/CPU overhead on devices that will never have a mouse attached (like a smartphone), but it is more flexible than a build-time or configuration-time capability determination. If a user starts the app on a tablet, then puts it in a stand and attaches a Bluetooth mouse, it will react appropriately.
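A minimal sketch of that wrapper; InputModeDetector is an illustrative name, while MouseRegion and its onHover callback are standard Flutter APIs (hover events come from mice, not touches):

import 'package:flutter/material.dart';

// Minimal sketch: flip a flag on the first mouse movement and rebuild
// descendants through a builder so they can adapt to point-and-click use.
class InputModeDetector extends StatefulWidget {
  const InputModeDetector({super.key, required this.builder});

  final Widget Function(BuildContext context, bool mouseDetected) builder;

  @override
  State<InputModeDetector> createState() => _InputModeDetectorState();
}

class _InputModeDetectorState extends State<InputModeDetector> {
  bool _mouseDetected = false;

  @override
  Widget build(BuildContext context) {
    return MouseRegion(
      onHover: (_) {
        if (!_mouseDetected) setState(() => _mouseDetected = true);
      },
      child: widget.builder(context, _mouseDetected),
    );
  }
}

Wrapping the app's home widget in InputModeDetector and branching on mouseDetected in the builder is one way to pick between the two video players described in the question.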
A device isn't "a mouse device" or "a touch device". Individual events have an input type (see PointerEvent.kind), but the device as a whole does not. A laptop can have a touch screen, and a tablet can have a stylus or an external mouse; an app running in those environments can receive both kinds of event.
Without knowing what you are trying to accomplish with this classification, it is hard to advise on how to accomplish it. Styling your UI based on a guess about the primary interaction mode, for instance, is a completely different problem from reacting to a mouse event differently than to a touch event, as in the sketch below.
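For the per-event approach, a minimal sketch using Listener and PointerDeviceKind, both standard Flutter APIs (the function name is illustrative):

import 'package:flutter/gestures.dart';
import 'package:flutter/widgets.dart';

// Minimal sketch: branch on the kind of each pointer event rather than
// trying to classify the whole device up front.
Widget buildPlayerSurface(Widget player) {
  return Listener(
    onPointerDown: (PointerDownEvent event) {
      switch (event.kind) {
        case PointerDeviceKind.touch:
          // Tap-to-toggle controls.
          break;
        case PointerDeviceKind.mouse:
          // Hover-driven controls.
          break;
        default:
          break;
      }
    },
    child: player,
  );
}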

How can I use the Cardboard SDK for a PC VR game?

I want to create a VR game using Unity3D and the Cardboard SDK for PC (Windows), which I'll stream to my phone screen using kinoConsol. I created a simple scene; when I build it for Android it works fine, meaning it shows the dual side-by-side (SBS) camera view, but a Windows build shows only one normal camera view. Is there a way I can use the Cardboard SDK to show the SBS view in a Windows build? If not, is there anything else available to achieve this?
Side by side is easy: just place two cameras where the eyes should be and set each camera's viewport rect to half the screen width. Now you have a side-by-side stereo renderer without any external library. Cardboard also applies some distortion to compensate for the lenses, but that is not essential in your case.
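A minimal sketch of such a rig; the eye separation value is an assumption to tune for your scene scale:

using UnityEngine;

// Minimal sketch: build a side-by-side stereo rig from two cameras,
// each rendering to one half of the screen.
public class SimpleStereoRig : MonoBehaviour
{
    public float eyeSeparation = 0.064f; // assumed ~64 mm interpupillary distance

    void Start()
    {
        CreateEye("LeftEye", -eyeSeparation / 2f, new Rect(0f, 0f, 0.5f, 1f));
        CreateEye("RightEye", eyeSeparation / 2f, new Rect(0.5f, 0f, 0.5f, 1f));
    }

    void CreateEye(string name, float xOffset, Rect viewport)
    {
        var eye = new GameObject(name);
        eye.transform.SetParent(transform, false);
        eye.transform.localPosition = new Vector3(xOffset, 0f, 0f);
        var cam = eye.AddComponent<Camera>();
        cam.rect = viewport; // viewport rect: half the screen width
    }
}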
Your second, and much bigger, problem is the gyroscope: you have to somehow communicate the orientation of your headset to your Unity app on the PC. This is not trivial and will probably require finding or building a persistent service on your Android device that sends the orientation data to your desktop app.
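As a rough illustration of the desktop side, here is a sketch that assumes the phone streams quaternion components as comma-separated text over UDP; the port and wire format are made up for the example:

using System.Globalization;
using System.Net;
using System.Net.Sockets;
using UnityEngine;

// Minimal sketch: receive "x,y,z,w" packets from the phone and apply
// them as the head orientation.
public class OrientationReceiver : MonoBehaviour
{
    UdpClient client;
    volatile string latest;

    void Start()
    {
        client = new UdpClient(9050); // arbitrary port; must match the phone app
        client.BeginReceive(OnReceive, null);
    }

    void OnReceive(System.IAsyncResult result)
    {
        IPEndPoint source = null;
        byte[] data = client.EndReceive(result, ref source);
        latest = System.Text.Encoding.ASCII.GetString(data);
        client.BeginReceive(OnReceive, null); // keep listening
    }

    void Update()
    {
        var packet = latest;
        if (packet == null) return;
        var p = packet.Split(',');
        if (p.Length == 4)
            transform.rotation = new Quaternion(
                float.Parse(p[0], CultureInfo.InvariantCulture),
                float.Parse(p[1], CultureInfo.InvariantCulture),
                float.Parse(p[2], CultureInfo.InvariantCulture),
                float.Parse(p[3], CultureInfo.InvariantCulture));
    }

    void OnDestroy()
    {
        if (client != null) client.Close();
    }
}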

Will OpenLaszlo mouse events convert to touch events if I compile the code for a mobile target?

I want to know whether OpenLaszlo mouse-down events will be converted to touch events when the code is compiled for a mobile target.
Yes, at least on iPad. I tested this myself last year, for R&D purposes, on an iPad2 running my application in OpenLaszlo 4.9.0's HTML5 (aka DHTML) mode, and the following were confirmed to work:
1) Touching a button on the screen in the application in OpenLaszlo HTML5 mode properly triggered the onclick event of the button.
2) Drag and drop with your finger on a touch screen in OpenLaszlo HTML5 mode has the same result as dragging and dropping with the mouse on a non-touch screen system.
Note: This was only tested on the iPad2, it was not tested on Android, Windows Phone, Blackberry, etc.
Flash is not relevant for mobile (Flash Player has just been removed from the Google Play store), but Adobe AIR for Android and iOS is an option if you want to build native applications. In that case, you have to capture the touch events using the ActionScript 3 API, for example:
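A minimal sketch of that, written as ActionScript 3 frame script; Multitouch, MultitouchInputMode, and TouchEvent are standard APIs:

import flash.ui.Multitouch;
import flash.ui.MultitouchInputMode;
import flash.events.TouchEvent;

// Minimal sketch: switch from simulated mouse events to raw touch
// events, then listen for touches on the stage.
Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;

stage.addEventListener(TouchEvent.TOUCH_BEGIN, onTouchBegin);

function onTouchBegin(event:TouchEvent):void {
    trace("Touch began at", event.stageX, event.stageY);
}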

Can Siri be disabled within an app?

I'm working on an iOS game and recently tested it on an iPhone 4S. Siri sometimes activates when my thumb covers the proximity sensor. This is a feature of the 4S: instead of holding the home button, users can raise the phone to their ear to activate Siri. But in my game the activation is unintended and interrupts gameplay.
Can Siri be disabled within an app? Is this an iPhone 4S bug?
Setting:
[UIDevice currentDevice].proximityMonitoringEnabled = YES;
Disables Siri from activating when the proximity sensor is covered. As a side effect, however, it blacks out the screen whenever the sensor is triggered.
Apple deprecated support for this.
http://developer.apple.com/library/ios/#DOCUMENTATION/UIKit/Reference/UIApplication_Class/DeprecationAppendix/AppendixADeprecatedAPI.html#//apple_ref/occ/instp/UIApplication/proximitySensingEnabled
I'd file a bug report.
Discussion
YES if proximity sensing is enabled; otherwise NO. Enabling proximity sensing tells iOS that it may need to blank the screen if the user's face is near it. Proximity sensing is disabled by default.
This is the replacement, which only lets you get notified of proximity changes; it cannot disable the behavior.
Discussion
Enable proximity monitoring only when your application needs to be notified of changes to the proximity state. Otherwise, disable proximity monitoring. The default value is NO.
http://developer.apple.com/library/ios/DOCUMENTATION/UIKit/Reference/UIDevice_Class/Reference/UIDevice.html#//apple_ref/doc/uid/TP40006902-CH3-SW25
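In code, using the replacement looks like this; a minimal sketch for a view controller, using only standard UIKit calls:

#import <UIKit/UIKit.h>

// Minimal sketch: enable proximity monitoring and observe state changes.
// This only reports the state; it does not suppress Siri by itself.
- (void)startProximityMonitoring {
    UIDevice *device = [UIDevice currentDevice];
    device.proximityMonitoringEnabled = YES;

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(proximityChanged:)
                                                 name:UIDeviceProximityStateDidChangeNotification
                                               object:device];
}

- (void)proximityChanged:(NSNotification *)notification {
    BOOL nearFace = [UIDevice currentDevice].proximityState;
    NSLog(@"Proximity sensor %@", nearFace ? @"covered" : @"clear");
}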

Is it possible to open and preview the back-facing and front-facing cameras at the same time?

I'm using an HTC Sensation for testing. The Android version is 2.3.4, and there are two cameras on this device. I can open each camera separately (previewing NOT at the same time). But if I try to open both cameras at the same time, I get a RuntimeException ("Fail to connect to camera service") for the front-facing camera.
Does anyone have an idea?
According to the Android Camera API documentation:
Your application should only have one Camera object active at a time for a particular hardware camera.
So I guess it should not be possible.
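If one preview at a time is enough, a minimal sketch of switching safely with the legacy android.hardware.Camera API available on 2.3 (the class name is illustrative):

import android.hardware.Camera;

// Minimal sketch: release the current camera before opening the other,
// since only one Camera object may be active per hardware camera.
public class CameraSwitcher {
    private Camera camera;

    public void switchTo(int cameraId) {
        if (camera != null) {
            camera.stopPreview();
            camera.release(); // must release before opening another camera
            camera = null;
        }
        // Throws RuntimeException ("Fail to connect to camera service")
        // if the camera is unavailable or held by another client.
        camera = Camera.open(cameraId);
    }

    public void close() {
        if (camera != null) {
            camera.release();
            camera = null;
        }
    }
}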