I have built an Android game and published it. It works fine in the Unity editor and the BlueStacks emulator, and also works fine on a real device running Jelly Bean (4.1).
The problem is that on Android Marshmallow and Nougat the game starts normally: the main menu appears and the menus work. But whenever the user opens the first level, the game enters something like a paused state. The player does not move, the controls do not respond, and even the traffic stops (it is a racing game). However, the pause menu can still be opened using the back button and works fine; the user can use it normally.
This problem does not appear on Jelly Bean.
I am using Unity 5.6.2f1 Personal (64-bit).
I am using Google AdMob and IAP.
I have Android SDK Build Tools 27.0.3.
NOTE 1: I confirmed that the Start() function runs fine when the scene starts, but Update() does not run; the Update() functions of all scripts are stopped. I am not changing the time scale at all.
NOTE 2: It only happens on Android Marshmallow and higher. Everything works fine on lower versions.
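Since Update() stalling while the UI keeps working often points at the time scale or an exception thrown early in the frame, a small diagnostic component can help narrow this down. This is just a sketch (FreezeProbe is a hypothetical name); attach it to any object in the first level and watch the device log via `adb logcat -s Unity`:

```csharp
using UnityEngine;

// Hypothetical diagnostic component: attach to any object in the
// affected level and read the output in logcat on the device.
public class FreezeProbe : MonoBehaviour
{
    void Start()
    {
        // If timeScale is 0, physics and scripted movement stop even
        // though UI callbacks keep firing, which matches the symptoms.
        Debug.Log("FreezeProbe Start, timeScale = " + Time.timeScale);
    }

    void Update()
    {
        // Log roughly once per second. If these lines never appear on
        // the device, Update really is not being called at all.
        if (Time.frameCount % 60 == 0)
            Debug.Log("FreezeProbe Update, frame " + Time.frameCount);
    }
}
```

If the Update log lines do appear but the game still looks frozen, the problem is in the movement code rather than in Unity's player loop.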
I am developing a VR game (Oculus Quest 2) in Unity.
In the Windows standalone build, the controller and hand tracking do not work if the HMD is removed and then put back on.
The detailed situation is as follows.
When I put on the HMD and start the game, there are no problems at first; it plays fine until I remove the HMD.
If I remove the HMD during a game, wait a moment, and then put it back on, the controller and hand tracking no longer function.
With the HMD removed, a slight movement of the mouse on the PC restores the controller and hand tracking.
At that point, the camera position becomes (0, 0, 0) on the PC screen and HMD position tracking stops.
The above problem does not occur in APK builds.
This does not occur when I press the Play button in the Unity editor and check on the actual device via Oculus Link.
Versions
Oculus application: 38.0
Oculus Quest 2: 38.0
Unity: 2021.2.7f1
Oculus Integration: 38.0
MRTK: 2.7.3
I cannot see what the cause might be at all; I would appreciate any hints or pointers toward a solution.
I was having the same problem, and after a night spent on this I found a solution. There are many bugs in the Oculus Integration package. So instead, in:
Project Settings > XR Plug-in Management, do not use Oculus; use OpenXR instead.
Caveat: if you use the default XR rig, the controllers will not be tracked by default, but that is easily fixed. In Project Settings > XR Plug-in Management > OpenXR, under Interaction Profiles, add an Oculus Touch Controller Profile.
Background: OpenXR solves not only this bug but a range of other bugs I found with the Oculus package. Using Oculus (PCVR via Oculus Link), there were always problems correctly detecting the headset being taken on and off. OpenXR handles this consistently, and it also provides several ways (via OpenXR features) to track the session state (headset taken off, tracking lost/regained, and so on). None of that is reliably available via Oculus; there was not even a consistent way to detect user presence, because the app receives wrong inputs from the Quest through Oculus. Hope this helps.
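One portable way to poll user presence with Unity's built-in XR input API, independent of the Oculus Integration package, is `CommonUsages.userPresence`. This is a sketch of the idea, not the answerer's exact setup:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: polls whether the user is currently wearing the headset.
// Works with the OpenXR plug-in; attach to any scene object.
public class PresenceWatcher : MonoBehaviour
{
    void Update()
    {
        InputDevice hmd = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (hmd.isValid &&
            hmd.TryGetFeatureValue(CommonUsages.userPresence, out bool present))
        {
            // React here, e.g. pause the game when the HMD is taken off.
            if (!present)
                Debug.Log("HMD removed");
        }
    }
}
```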
Every time I start the Unity application on my MacBook Pro, the console ends up expanding until loading has finished, preventing me from accessing any other application until I minimise or close it. How can I prevent the console from expanding to full screen when launching Unity?
This piece of code doesn't work. When I set a breakpoint on the if statement, the debugger breaks, but GetKeyDown never gets triggered when pressing Escape or the mobile back button (Android).
void Update()
{
    if (Input.GetKeyDown(KeyCode.Escape))
        Application.Quit();
}
I also tried if (Input.GetButtonDown("Cancel")); that doesn't work either.
I deployed the game to my mobile device. Maybe some settings have been messed up? Another project (2D desktop) works just fine. I also used the Unity Remote app.
I am using Unity 2018.3.0b2.
EDIT: Solved, somehow. The Unity Remote app may have been the issue; the back button only works on a real Android device.
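For anyone hitting the same thing: on Android the hardware back button is delivered as KeyCode.Escape, but, as the edit above notes, only in an actual on-device build rather than through Unity Remote. A minimal sketch to verify this in a device build (check `adb logcat -s Unity` for the message):

```csharp
using UnityEngine;

// Sketch: log and handle the Android back button / desktop Escape key.
public class BackButtonHandler : MonoBehaviour
{
    void Update()
    {
        // On Android, the hardware back button maps to KeyCode.Escape.
        if (Input.GetKeyDown(KeyCode.Escape))
        {
            Debug.Log("Back/Escape pressed, quitting");
            Application.Quit(); // ignored in the editor; works in a build
        }
    }
}
```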
I've programmed a virtual reality game for the HTC Vive in Unity. I used Steam VR.
If I play the game in Unity, everything runs perfectly. It also runs perfectly when I build it. But if I copy my build to another computer, nothing works.
It looks like this (screenshot: top right corner).
Does anyone know how to solve this?
My Vive is setup correctly and other games are working just fine.
Thank you
Roman
Make sure before each build that OpenVR is set as the first target in the Player Settings and not "None" (sometimes Unity changes it).
The other computers should, obviously, have Steam and SteamVR installed.
Since the Steam workflow is to launch games from the Steam library, which launches SteamVR automatically, make sure the SteamVR app is running before launching your .exe.
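As a sanity check on the target machine, a small script can log which VR device actually loaded at startup. This sketch uses Unity's legacy XRSettings API, which matches the built-in VR support of this Unity era:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: logs whether a VR device (e.g. OpenVR) loaded at startup.
// If loadedDeviceName is empty, SteamVR was not found, or OpenVR is
// missing from the Virtual Reality SDKs list in Player Settings.
public class VRDeviceCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log("XR enabled: " + XRSettings.enabled);
        Debug.Log("Loaded device: " + XRSettings.loadedDeviceName);
        Debug.Log("Supported devices: " +
                  string.Join(", ", XRSettings.supportedDevices));
    }
}
```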
I am currently having issues detecting touches in an iOS test game. I have imported the CrossPlatformInput package provided by Unity and included the DualTouchControls in my scene, like so:
When I run the app on my phone I can see the black box, but when I press the screen my rocket (the character) does not jump. These are the settings of my Jump DualTouchControl:
And here is the rocket itself with its game environment:
What would I like to achieve?
I would like the rocket to jump whenever the user taps the screen on a mobile phone. Previously I didn't work with the DualTouchControls, because I mainly built the game for desktop testing purposes. It works perfectly on desktop, though, using the spacebar.
Hopefully someone can help me out with some tips.
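If the DualTouchControls keep misbehaving, one simple fallback is to read Unity's raw touch API directly and bypass CrossPlatformInput. This is a sketch; RocketController and its Jump() method are hypothetical names standing in for whatever the rocket script exposes:

```csharp
using UnityEngine;

// Sketch: triggers a jump on any screen tap, bypassing DualTouchControls.
// Assumes the rocket script exposes a public Jump() method (hypothetical).
public class TapToJump : MonoBehaviour
{
    public RocketController rocket; // hypothetical component on the rocket

    void Update()
    {
        // Desktop fallback so the spacebar keeps working in the editor.
        if (Input.GetKeyDown(KeyCode.Space))
            rocket.Jump();

        // On mobile, fire on the first frame of any touch.
        if (Input.touchCount > 0 &&
            Input.GetTouch(0).phase == TouchPhase.Began)
            rocket.Jump();
    }
}
```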