I was testing the build settings for my project. It has a main UI canvas that displays some objects; when the mouse hovers over an object, a copy of it is instantiated on a separate canvas that renders above all the others (in the editor hierarchy it sits under the other canvases, so in the rendered view it is on top), on a layer above everything else, like this:
zoomCard = Instantiate(gameObject, new Vector2(980, 750), Quaternion.identity); // copy the hovered card
zoomCard.transform.SetParent(ZCanvas.transform, false); // re-parent under the zoom canvas, keeping local values
zoomCard.layer = LayerMask.NameToLayer("Zoom"); // render above all other layers
// enlarge the copy to produce the zoom effect
RectTransform rect = zoomCard.GetComponent<RectTransform>();
rect.sizeDelta = new Vector2(260, 399);
In the Game view everything works fine.
But as soon as I build for WebGL (HTML5), the zoomed card never shows. I tried Safari, Chrome, and Firefox: same thing.
So I tried building a Mac executable too: same thing.
How can I possibly debug this, given that the Game view render shows no problem?
I've found in other posts that when Unity crashes, or when you upgrade your project to a new version (my case), Unity sometimes fails to import some shaders for certain classes, for no apparent reason.
The workaround that worked for me was this one:
Browse to the folder/directory of your project.
Delete the ProjectSettings folder.
You'll need to re-add the scenes to your build, and everything will work as it should.
I'm pretty new to Unity. I have a button that works perfectly in the editor. When I build it and run on Windows or Mac, the button stops working. Other buttons on the same canvas work fine.
If I recreate the button from scratch and rebuild after each settings change, it seems to stop working when I move it: if the X position is 0 it works, otherwise it does not. It still works in the editor at runtime, but as soon as I build, it stops working.
I'm not sure where to look to see what is happening. Any ideas?
That doesn't seem right. Can you confirm that Raycast Target is checked on your Button's Graphic (e.g. Image) component? It might be off, or another object in the scene may be blocking the click.
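To find out whether something else is swallowing the click, you can log everything the UI raycast hits. This is a minimal debug sketch, not part of the original answer; attach it to any object in a scene with an EventSystem:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Debug helper: on every click, log which UI elements the pointer
// actually hits, topmost first. If your button is not the first entry,
// whatever is listed above it is blocking the click.
public class ClickRaycastLogger : MonoBehaviour
{
    void Update()
    {
        if (!Input.GetMouseButtonDown(0)) return;

        var data = new PointerEventData(EventSystem.current)
        {
            position = Input.mousePosition
        };
        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(data, results);

        foreach (var result in results)
            Debug.Log("Hit: " + result.gameObject.name);
    }
}
```

Running this in the editor and in a development build lets you compare what each one is actually hitting at the problematic X position.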
When exporting a demo project (a sphere plus the OVR controller and avatar) to Oculus Quest, the scene insists on showing me the Oculus Go controller instead of the Oculus Quest controllers.
I use Unity 2018.3.14 and 2019.1.9, Oculus Integration v1.35 and v1.38, on Windows 10.
On Oculus Rift, the whole scene works perfectly.
Among the issues this causes:
1. controller movement is very limited
2. only one hand is shown at a time
3. the trigger does not execute the scripts attached to the event
I followed the proper configuration of an Oculus scene shown here:
https://www.youtube.com/watch?v=qiJpjnzW-mw&t=1s
In OVRCameraRig -> Target Devices I tried all the options (Quest, Gear+Go, and both), but generally made sure it is set to Quest:
https://www.dropbox.com/s/chbhpvz5u5fv9b2/oculus%20state.PNG?dl=0
(is there another place where the controller should be set?)
I made sure the right controller is chosen in the models prefab
https://www.dropbox.com/s/ejof63acjlb491z/oculus%20prefabs.PNG?dl=0
I tried updating the integration to v1.39 (things only got worse: both controllers became invisible, but according to the Oculus forum that's a separate problem).
I tried different unity versions.
I tried to factory reset the device.
I tested Beat Saber to be certain that the controllers work just fine in other apps.
Has anyone encountered a similar issue?
I had the same issue but resolved it by following these steps:
In the Unity menu, click Oculus > Tools > Remove AndroidManifest.xml.
In the Unity menu, click Oculus > Tools > Create store-compatible AndroidManifest.xml.
In the Unity Project window, open Assets > Plugins > Android > AndroidManifest.xml.
Make sure AndroidManifest.xml contains
<category android:name="android.intent.category.LAUNCHER" />
but not
<category android:name="android.intent.category.INFO" />.
Then try Build and Run.
Good luck.
After updating to 5.6.1f1, my WebGL build can't load scenes correctly.
Development Build and Debug Symbols activated: same result. No errors, no warnings.
It always loads an empty scene, containing only the DontDestroyOnLoad objects left over from the previous scene.
Same problem, but no answer.
https://forum.unity3d.com/threads/black-screen-when-changing-scene-in-webgl-using-unity-5-6.473367/
When creating the Android build we used the "Split Application Binary" option in the Player settings. Of course this setting is stored in the project; it just happens to be the setting that makes scene switching in WebGL impossible! After some hard work stripping down our project, it turns out this is the only setting that causes the bug. I really don't know why an Android setting ends up having a major impact on a WebGL build. Something Unity should investigate!
(using Unity 5.6.1f1)
https://forum.unity3d.com/threads/black-screen-when-changing-scene-in-webgl-using-unity-5-6.473367/
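If you want to verify or clear this setting from script rather than hunting through the Player Settings UI, a minimal editor sketch follows. It assumes the "Split Application Binary" checkbox corresponds to the `PlayerSettings.Android.useAPKExpansionFiles` scripting property:

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch: turn off "Split Application Binary"
// (useAPKExpansionFiles) before making a WebGL build, since it
// reportedly breaks scene switching in WebGL builds on 5.6.1.
public static class WebGLBuildFix
{
    [MenuItem("Tools/Disable Split Application Binary")]
    public static void DisableSplitBinary()
    {
        PlayerSettings.Android.useAPKExpansionFiles = false;
        Debug.Log("Split Application Binary disabled.");
    }
}
```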
I am working on a simple AR project using Unity and ARToolKit. For testing purposes, I created a test project that tracks an image and renders a simple 3D sphere, and it works perfectly when I play it inside the Unity editor. The problem is that when I create the .exe build of the project, the application does not augment the marker with the 3D model when I put the marker in front of the camera (it should work as it does when playing inside the editor). It also shows two errors as I build the project. Note that I have included the .dll files in the folder where the .exe file is located.
I also see that after building the project, the UID of the marker disappears, even though it shows perfectly when played inside the Unity editor.
Kindly guide me in this matter, as I have to submit this as my university final-year project. Thank you.
Several things on this:
The disappearing of the marker UID during the build is normal and does not affect your result.
Please check that you have selected the correct scene when building your app: in the Build Settings view, click the "Add Open Scenes" button and tick the scene(s) you would like to include in your app. (It happens quite often that the wrong scene is picked.)
Also ensure that you are copying the correct versions of the DLLs (32-bit vs. 64-bit), but if you copied the ones from the [appname]_Data/Plugins directory you should be fine.
Let me know if that works for you.
I have a scene with many models. I baked the scene and got some lightmaps. I put the lightmaps and models into one asset bundle and the scene into another, since I can't put both the models and the scene in a single bundle. I can download both bundles and load the scene properly; everything works except that the lightmaps never load. They load in the editor, but not on mobile devices (iOS/Android). I also tried logging LightmapSettings.lightmaps.Length, and it gives the proper output on mobile, but the maps still won't load. Does anyone know what the problem is?
Not sure if this is the problem, but try it:
Go to File -> Build Settings, select Android, and click Switch Platform.
Rebuild your asset bundle. Name it AndroidAsset.
Go to File -> Build Settings, select iOS, and click Switch Platform.
Rebuild your asset bundle. Name it iOSAsset.
When building for each platform, load the asset bundle that was built for that platform.
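The steps above can be sketched as an editor script that builds one bundle set per target. This is an illustrative sketch (the menu item name and the `AssetBundles/...` output paths are my own choices, not from the original answer):

```csharp
using System.IO;
using UnityEditor;

// Build one set of asset bundles per target platform. Bundles built for
// one target (including their baked lightmap textures) are not guaranteed
// to load correctly on another, so each platform gets its own output folder.
public static class PlatformBundleBuilder
{
    [MenuItem("Tools/Build Asset Bundles For Android And iOS")]
    public static void BuildAll()
    {
        Build(BuildTarget.Android, "AssetBundles/Android");
        Build(BuildTarget.iOS, "AssetBundles/iOS");
    }

    static void Build(BuildTarget target, string outputPath)
    {
        Directory.CreateDirectory(outputPath);
        BuildPipeline.BuildAssetBundles(
            outputPath,
            BuildAssetBundleOptions.None,
            target);
    }
}
```

At runtime, the app then downloads and loads only the bundles from the folder matching the platform it was built for.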