Update:
    void Awake() {
        // Attempt to recenter tracking when running on the HoloLens
        // (legacy pre-2017.2 UnityEngine.VR namespace).
        if (UnityEngine.VR.VRSettings.loadedDeviceName.Equals("HoloLens"))
            UnityEngine.VR.InputTracking.Recenter();
    }
Doesn't appear to do anything. I found a related post titled "Reset Hololens Origin" asking about similar behavior. Two individuals affiliated with Unity in some way (both bear the 'Unity Technologies' profile tag) brought up InputTracking.Recenter(). One adds, "[S]ome changes landed recently, and part of those changes were hooking up InputTracking.Recenter properly on WindowsMR."
The snippet of code above is an attempt to implement this behavior. I may well be calling it in the wrong place, but as it stands I do not see it working.
end update
When a user opens a Unity HoloLens application, they are given control over where the application starts: a generic white box denotes the application, and they can place it anywhere via the air-tap gesture.
Problem:
If a user blooms out of the application but does not close this white box, the application is left in a suspended or 'tombstoned' state. When the application is opened again, the HoloLens remembers the old white box placement and reloads the application there.
This isn't the desired behavior: with multiple users and/or a variety of locations, the application appears to the user not to load at all, because the HoloLens brings the suspended application back into the running state without updating its position to the current user's gaze.
Question:
Can one detect, in Awake, where the user is looking and move the application manually?
UnityEngine.WSA.Application.windowActivated : Fired when application window is activated.
Somewhere like this?
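For instance, something along these lines (an untested sketch on my part; windowActivated and InputTracking.Recenter are the real Unity APIs, everything else is just a guess at where to call them):

    using UnityEngine;

    public class RecenterOnActivate : MonoBehaviour
    {
        void Awake()
        {
            // Recenter every time the app window is (re)activated,
            // not just on the first launch.
            UnityEngine.WSA.Application.windowActivated += OnWindowActivated;
        }

        void OnWindowActivated(UnityEngine.WSA.WindowActivationState state)
        {
            if (state != UnityEngine.WSA.WindowActivationState.Deactivated)
                UnityEngine.VR.InputTracking.Recenter();
        }
    }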
Any Google searches on this question, or variations in wording, turn up nothing. I'm unsure whether moving the app window itself is even possible.
If this isn't possible, would the next best alternative be to force the app to close when I want, for example while it is in the suspended state, to prevent this kind of multiple-apps-open situation?
I'm working on a system that automatically generates and releases a ScriptableObject that represents uniqueness for each GameObject with a given component attached.
The generation process seems to be working fine, but I'm running into a problem when I want to release the ScriptableObject.
The script in question is tagged "ExecuteInEditMode" and implements the OnDestroy method to notify a manager that its ScriptableObject should be deleted.
The problem is that OnDestroy is called in four situations, from what I can tell:
Click on play
Switching scene
Manual removal (the only one I want to work with)
On editor shutdown
I've been able to avoid the first two with this:
    // proceed only outside play mode and not immediately after a scene load
    if (Application.isPlaying == false && Math.Abs(Time.timeSinceLevelLoad) > 0.00001f)
But I can't find any good solution for "on editor shutdown". I've seen that EditorApplication.quitting can be used for that case, but the documentation warns that this event is not called when Unity crashes or is forced to quit, and I don't want to lose all my ScriptableObjects if the editor crashes.
I'm looking for a robust solution to avoid this problem, if someone can help me please.
Thanks, have a nice day
Just handle the application quitting event.
this event is not called when Unity crashes or is forced to quit, and I don't want to lose all my ScriptableObjects if the editor crashes.
When Unity crashes, so will your program, so it won't work anyway. Just make sure that under normal operating procedure (i.e. a normal shutdown) you save your modifications.
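For what it's worth, the hook itself is only a few lines (a minimal sketch; the class name is just a placeholder, and EditorApplication.quitting needs Unity 2018.1 or later):

    using UnityEditor;

    // Runs when the editor loads and hooks the normal-shutdown event.
    [InitializeOnLoad]
    static class SaveOnQuit
    {
        static SaveOnQuit()
        {
            // Only fires on a clean shutdown, never on a crash or force-quit.
            EditorApplication.quitting += () => AssetDatabase.SaveAssets();
        }
    }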
In Unity3D, if you edit your scene and don't save, you lose all your changes if the editor crashes. The same will happen for what you're building here. To mitigate this risk (since prevention is reasonably difficult, if not impossible), you can opt to save every minute, always save each change to a temporary location, or save on every play.
For example, I added a script to my project that autosaves the scene(s) whenever I press play, so when editing the scene I press play now and then to test, and it all gets autosaved.
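If it's useful, that kind of autosave hook looks roughly like this (a sketch; assumes Unity 2017.2+ for playModeStateChanged, and the class name is a placeholder):

    using UnityEditor;
    using UnityEditor.SceneManagement;

    [InitializeOnLoad]
    static class AutoSaveOnPlay
    {
        static AutoSaveOnPlay()
        {
            EditorApplication.playModeStateChanged += state =>
            {
                // Save open scenes and dirty assets just before entering play mode.
                if (state == PlayModeStateChange.ExitingEditMode)
                {
                    EditorSceneManager.SaveOpenScenes();
                    AssetDatabase.SaveAssets();
                }
            };
        }
    }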
I'm building an OpenVR app for SteamVR to assist with seated play (my room is small, so my tracking area isn't ideal). My app pretty much just adjusts the play-area height while I hold the grip button and "scroll" on the touchpad, so that I can reach objects that are too low or too high at variable heights. (I tried "OpenVR Advanced Settings", but its keybinding options are limited to simple button presses, so I decided to make my own version.)
I'd like to prevent touchpad input from being sent to the game while the grip button is held, so that moving on the touchpad doesn't also cause movement in-game. Is this possible at all?
I'm assuming it's not possible, but wondering whether anyone has had any experience with this.
After your clarification in the comments, the answer is no, you cannot "eat up" device inputs in an application. I usually work on OpenVR drivers, and there, after you submit a device input and/or any other event, it is available to anything that expects pose-update events; event subscribers cannot stop others from receiving those events.
However, there might be a workaround (if it's still an issue). I know of at least one application that can do what you want: OVR Toolkit. When its overlay is active and you click something in the overlay, the game running in parallel does not receive the input. However, that only happens if the OVR Toolkit overlay surface receives the input; it may be a built-in OpenVR overlay feature you get for free, or something the developer has to implement explicitly. I don't really have a way to test this right now.
Sadly, OVR Toolkit is not open source, but there is an open-source overlay toolkit for Unity that might be the solution you're looking for; it can be found here.
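For reference, creating an overlay through the OpenVR C# bindings looks roughly like this (a sketch, untested; the overlay key/name are arbitrary, and whether input sent to the overlay is actually withheld from the game is exactly the behavior you would need to verify):

    using UnityEngine;
    using Valve.VR;

    public class HeightAdjustOverlay : MonoBehaviour
    {
        private ulong handle = OpenVR.k_ulOverlayHandleInvalid;

        void Start()
        {
            if (OpenVR.Overlay == null) return; // SteamVR not running

            var err = OpenVR.Overlay.CreateOverlay("example.heightAdjust", "Height Adjust", ref handle);
            if (err != EVROverlayError.None) { Debug.LogError(err); return; }

            // Ask SteamVR to route laser/touch input to the overlay surface.
            OpenVR.Overlay.SetOverlayInputMethod(handle, VROverlayInputMethod.Mouse);
            OpenVR.Overlay.ShowOverlay(handle);
        }

        void Update()
        {
            if (handle == OpenVR.k_ulOverlayHandleInvalid) return;

            // Drain events targeted at the overlay (mouse move/button, focus changes, ...).
            var vrEvent = new VREvent_t();
            var size = (uint)System.Runtime.InteropServices.Marshal.SizeOf(typeof(VREvent_t));
            while (OpenVR.Overlay.PollNextOverlayEvent(handle, ref vrEvent, size))
            {
                // react to (EVREventType)vrEvent.eventType here
            }
        }
    }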
I'm working on an app that includes a custom camera application on Glass. I want to be able to hard-set different camera parameters, but I'm having difficulty figuring out which ones I actually have access to.
I tried calling parameters.flatten() and got a whole bunch of options that I thought I would be able to use, but when I tried testing them, nothing happened. (For example, when I tried setting the color effect to sepia, the result was still in normal color). Is there any documentation or code I can look at that will tell me which parameter options I actually have?
There are a few open issues on our issue tracker about camera parameters that do not behave as expected:
Issue 302: GDK: Camera effects not registered or take affect
Issue 303: GDK: Request: Additional camera focus modes (with auto focus)
Issue 304: GDK: Camera scene mode does not register or take affect
You may want to follow those so that you can be updated as the GDK evolves.
Hello, I'm trying to get multitasking to work properly, but unfortunately I'm kind of lost. My problem is that when I re-enter the game, it takes several seconds for the game to come back and show the pause screen. My question is: is there any way to put up some sort of loading screen until the game comes back, so I can at least indicate that it's not frozen? I've never used Xcode directly; I'm using Unity 3D to build my game. I did a bit of research, and if I'm not mistaken I'm supposed to use the "applicationDidEnterBackground" app delegate method. How can I put up a custom loading screen using that method in Xcode?
Thanks
In -applicationDidEnterBackground:, you're given the opportunity to "clean up" the UI before the screenshot is taken. Apple says you should remove "sensitive data" (the screenshots might be persisted to "disk"?), but it also lets you do other things. In one app, we hide the label on a countdown timer so it doesn't appear to jump when you switch back to the app.
To change the "loading screen", simply display a full-screen view over the other views and remove it in -applicationWillEnterForeground:. Alternatively, pause the game in the first place!
(Really, you should be pausing the game in -applicationWillResignActive:, which happens when the user double-taps home or receives an SMS/notification. I'm pretty sure it's called when the app is backgrounded, too.)
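Since you're building from Unity rather than editing the Xcode project, the closest equivalent on the Unity side is the OnApplicationPause message. A sketch of that idea (pauseOverlay is a placeholder for whatever full-screen canvas you wire up, and you would have to verify that the frame showing it is rendered before iOS takes its snapshot):

    using UnityEngine;

    public class PauseScreen : MonoBehaviour
    {
        public GameObject pauseOverlay; // assign a full-screen canvas in the Inspector

        void OnApplicationPause(bool paused)
        {
            if (paused && pauseOverlay != null)
            {
                // Show the pause/loading screen before backgrounding, so the snapshot
                // iOS takes (and shows while resuming) already displays it.
                pauseOverlay.SetActive(true);
                Time.timeScale = 0f; // also actually pause the game
            }
        }
    }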
First some background; for the tl;dr version skip to "the problem" below.
Background
This is really more of a user interface question than a technical one, but I think it fits better here than on the UI site anyway.
Since iOS (iPhone OS) 4.0, apps can run in the background, and in fact always do so instead of quitting. Quitting an application requires pressing the home button, pressing it again, holding your finger on the application icon until it starts shaking, tapping the "close" indicator, and then pressing the home button again. Not really intuitive.
The reason for this is, of course, that usually the user shouldn't care whether the app is quit or just suspended, because it doesn't matter. But for some apps it does.
The problem
I have an app that logs location changes; think "RunKeeper" if that's familiar. The user starts the app and chooses to start recording (to a file), or just uses it to view distance, speed, etc. When the user is done (or wants to do something else), they hit the home button. The app disappears. Now one of several things happens:
The app quits, closing any ongoing recording (iPhone OS 3.1.3 and below).
The app continues running in the background, using the GPS and draining the battery (iOS 4). This is appropriate if the user for example wanted to switch to the iPod app to change their soundtrack.
The app continues running in the background, using the GPS and draining the battery (iOS 4). This is completely inappropriate if the user was done and wanted to quit the app.
There is of course no way for the app to tell the difference between the last two cases. A quick look around the RunKeeper forums indicates that many users have no clue what is happening and get seriously confused by this.
Ideas
So what's the best way to solve this from a user interface point of view?
When the home button is pressed, go into the background, as is the default action. It's the user's problem to actually kill the app if they are done; it's the platform's fault that this is non-obvious and slightly tedious.
When the home button is pressed and a recording is in progress, continue in the background. If we are not recording, quit the application.
Present a toggle widget somewhere in the application that allows the user to choose what happens when the home button is pressed. (This might run afoul of the HIG; I'm not sure)
...?
What do you think?
I thought of the third option before even reading it ^^ (I don't think it would be against the HIG.)
So a "big" button, shown while recording, to change the "mode"? Or a switch ;-) (like "Continue recording while in the background?")
And maybe ask the user about it when they start recording? (With a default value available on a prefs page?)
And since you can change songs without quitting the app, maybe set the default to stop recording when the app goes to the background.
But I don't have any other solution for you :-/