One of my team members just cloned my Unity project, which was working fine on my PC (Unity 5.5), whereas it is displaying a "Display 1 No cameras rendering" error in the scene on my teammate's PC (Unity 5.4).
We are using a git repository and this is the first time we have encountered this problem.
Another thing we just noticed is that there are placeholders for all the objects that were part of the Hierarchy (on my system). For some reason their names are not getting displayed, but when you select the area, something seems to be selected.
Can someone please help us solve this issue?
One of my team members just cloned my Unity project, which was working fine on my PC (Unity 5.5), whereas it is displaying a "Display 1 No cameras rendering" error in the scene on my teammate's PC (Unity 5.4).
Take a moment and look at the versions mentioned there: the project was made with Unity 5.5 but is being opened with Unity 5.4. You can't do that.
You shouldn't try to open a Unity project made with a higher Unity version in a lower Unity version. This applies to most software, too. What's happening is that Unity added new features and changed some of its binary saved data in Unity 5.5 in ways that Unity 5.4 doesn't have.
When you load a project from Unity 5.5 in Unity 5.4, it won't be able to understand that new data, leading to weird behavior such as the one your teammate is seeing.
Your teammate will have to update to Unity 5.5 in order to load a project made with Unity 5.5.
I just had this same issue with "Display 1 no cameras rendering".
After a quick look around the interface, I clicked on my main camera and in the Camera area of the Inspector panel, I saw that my Target Display was set to "Display 4". Not sure how it got there, but changing that back to "Display 1" fixed the problem for me.
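For reference, the same fix can also be applied from a script rather than the Inspector. This is only a minimal sketch, assuming it is attached to the camera whose Target Display drifted (the class name is just a placeholder):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class ForceDisplayOne : MonoBehaviour
{
    void Awake()
    {
        // targetDisplay is zero-based, so 0 corresponds to "Display 1" in the Inspector.
        GetComponent<Camera>().targetDisplay = 0;
    }
}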
If your code and settings don't have any other errors, I guess you can right-click on the Game window tab.
In the menu that appears, untick the "Warn if No Cameras Rendering" option; that hides the warning.
That's it.
Sorry, my English is not good enough.
I had the same issue. Undoing all my steps back to the last working state, I found I had deleted the Main Camera without noticing.
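If you want to catch that kind of accidental deletion earlier, a startup sanity check along these lines can help; this is just a minimal sketch (the script name is an example, not anything the answer above used):

using UnityEngine;

public class CameraSanityCheck : MonoBehaviour
{
    void Start()
    {
        // Camera.allCamerasCount only counts enabled cameras, which is what
        // the "Display 1 No cameras rendering" warning cares about.
        if (Camera.allCamerasCount == 0)
            Debug.LogWarning("No enabled cameras in the scene - was the Main Camera deleted?");
    }
}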
I'm having a weird issue; it may be a simple fix.
I've got a UI-only "game" using the new UI Toolkit. It's a little drawing program of sorts. I've got a draw area in the middle with "tool buttons" on the sides. Everything works fine with mouse, pen, and touch when drawing (using scripts I can access all types of pointers), but for some reason touch doesn't work on the UI buttons only.
What's even weirder is that touch on UI buttons works when testing directly in Unity Play mode (I've got a touch screen laptop), but doesn't work when I make a Build.
In my Project Settings -> Input System Package, I've got Pen, Mouse, and Touchscreen active under "Supported Devices"
The new UI Toolkit is so new that I can't find any help or similar issues online.
If it's still relevant:
I had the same issue and used "Standalone Input Module" instead of "Input System UI Input Module" in the EventSystem. It says it's the old option, but it works for me :D
I added the touch screen here and it works now.
Just a follow up since I ended up finding my answer somewhere else.
In the "Input System UI Input Module" component in the EventSystem, I changed the "pointer Behavior" to "Single Unified Pointer" and that fixed it. Not sure if that's just a work-around, but it works great now.
I'm currently developing an AR+VR app using GoogleVR and Vuforia on Unity iOS. Everything works fine, but at some point (I don't remember since when!) Google's native UI layer (vertical alignment line, back button, settings button, ...) went missing and I can't change the viewer profile.
Versions are: Gvr 1.10 + Vuforia 6.2
So I tried to manually call ShowSettingsDialog() in GvrViewer.cs, but it doesn't work either.
A new project with a fresh SDK has no problem, so it must be something specific to my project. I doubt the native UI layer would be affected by Unity's camera settings, layers, Canvas settings, or the like.
I can't figure out what might cause this kind of problem, so any suggestions to narrow down the cause would be welcome.
Problem solved.
I also use a custom SDK, OpenCVForUnity, and its plugin OpenCVForUnityAppController.mm causes the problem.
It also attaches a RenderDelegate, sets the GraphicsDevice, etc.
It looks like it overrides GoogleVR's iOSDevice, so GVR couldn't figure out which device it is running on.
I am working on a simple AR project using Unity and ARToolkit. For testing purposes, I have created a test project to track an image and create a simple 3D sphere, and it works perfectly when I play it from inside the Unity editor. The problem is that when I create the build (.exe) of the project, the application does not augment the 3D model onto the marker when I put the marker in front of the camera (it should work as it does when playing from inside the editor). It also shows two errors as I build the project. Note that I have included the .dll files in the folder where the .exe file is located.
I also see that after building the project the UID of the marker disappears, even though it shows up perfectly when played from inside the Unity editor.
Kindly guide me in this matter, as I have to submit this as my university final-year project. Thank you.
Several things on this:
The disappearing of the marker UID during the build of the APP is normal and does not affect your result.
Please check if you have selected the correct scene when building your APP:
(Screenshot: Build Settings view)
select the "Add open scenes" button and tick the scene(s) you would like to include in your APP.
(It happens quite often that the wrong scene is picked)
Also ensure that you are copying the correct versions of the DLLs (32-bit vs. 64-bit), but if you copied the ones from the [appname]_Data/Plugins directory you should be fine.
Let me know if that works for you.
I'm new to augmented reality, and I'm using Vuforia 4.2.3 and Unity 5. I followed all the steps trying to make a test run, and whenever the camera detects the target the whole screen turns white. I've tried many things but none of them worked. Can someone help me?
I had a problem that sounds similar.
There is a known bug that causes a white screen; this could be related to that. See here for more info.
What worked for me was to change the VideoBackground.shader code, as described on that thread.
Go to Project >> Qualcomm Augmented Reality >> Shaders.
Double-click VideoBackground. This opens up Mono.
In the code, change where it says:
"queue"="geometry-11"
to this:
"queue"="Geometry"
Save, rebuild etc.
Worked for me.
I am new to Unity. I have made a character walk animation in 3ds Max and imported it into Unity. I created an Xcode project for iOS through Unity, and the animation works as expected.
I want to develop some UI button controls on screen and play this animation in my iOS app only when the button is clicked. How do I code these UI controls and events? Do I need to add these controls and events in the Xcode project (which Unity created), or can I do everything in Unity itself?
Please advise!
Thank you!
Getsy.
You can do all the UI in Unity itself. You have these options:
Old Unity GUI system. It is pretty easy to program, but it is awful in terms of performance and usability for designers: it's created completely from code, with no editors. It's almost never used in commercial products except for debugging and developer tools. However, it's still a good option for prototypes (see the sketch after this list).
Use another GUI package. There are a lot of 2D and UI packages for Unity of all kinds. Currently the most popular one is NGUI, which is paid, but it also has an evaluation version.
Create your own UI framework (still in Unity). Just wanted to mention that this is a viable option, but it's obviously the worst one for your case.
Wait for the new 2D/GUI Unity framework. It's supposed to come out in version 4.3, which is just around the corner; what's more, the original NGUI author is working on it.
In your place, I'd create basic prototype controls with the built-in Unity3D GUI, and by the time I needed something more presentable, the new Unity GUI would hopefully already be there.
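To make option 1 concrete, here is a minimal sketch of an immediate-mode OnGUI button that plays an imported walk animation; the class name, field, and the "walk" clip name are placeholders, and it assumes the character uses a legacy Animation component:

using UnityEngine;

public class WalkButton : MonoBehaviour
{
    public Animation characterAnimation; // drag the character's Animation component here in the Inspector

    void OnGUI()
    {
        // The Rect is in screen pixels: x, y, width, height.
        if (GUI.Button(new Rect(10, 10, 120, 40), "Walk"))
        {
            if (characterAnimation != null)
                characterAnimation.Play("walk");
        }
    }
}

Attach it to any GameObject in the scene; OnGUI is drawn every frame, so the button needs no further setup, which is why this system is handy for quick prototypes.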