I'm a visual artist and I was using Adobe Aero to bring my VR drawings into AR. Recently I found out about Reality Composer and will definitely switch now because of the people occlusion. The only problem is that I want my shadows to appear darker, similar to Aero (right now they are very weak).
I tried all kinds of things but couldn't find a solution. I heard it might be possible with coding, but unfortunately I can't code. Does anyone know how I could achieve that?
I searched for a long time for a solution to this problem, but I couldn't find one. Maybe one of you knows how I can fix it.
I created a Unity VR project for the Oculus Quest 2 and downloaded the Meta Avatar Plugin. I followed this tutorial on YouTube.
Everything works fine in Play Mode in Unity. But when I build it, the avatars have completely white textures, as in this screenshot.
I am using Unity Version 2021.3.5f1.
I think it has something to do with the build process / shader setup when going from desktop to Android, but I am not sure where or what I can change to make it run.
Does anyone have an idea?
After one day it suddenly worked. I think it was Philipp's answer about the Graphics settings:
Try this: click on your relevant surface in Play Mode and check which shader the material is using. Then stop, and go to Edit -> Project Settings -> Graphics. Scroll down to the Always Included Shaders list and add the shader you noted to that list. Now compile again and see if the issue persists. (If it does, you may want to look into e.g. the Player -> Color Space setting, which can be Gamma or Linear.) –
Philipp Lenssen
I am new to Unity and trying to get basic hands working, in terms of being able to see the hands and having them move in accordance with my own (preferably using controllers, which I know have limited ability to detect what the hands are doing).
I configured OVRHandPrefab as shown in this article, but I do not see the hands. I have also tried using only my (physical) hands, but I still don't see them. I tried disabling hand-tracking support, but that didn't help either.
I've tried all the options under "Hand Tracking Support" in OVRCameraRig, and I am using the default values for the two OVRHandPrefab objects, except for changing one of them to the right hand (since the left hand seems to be the default).
I also tried using OVRCustomHandPrefab_L and ..._R, but while I do see the hands, they don't animate at all when I press buttons or triggers. I'm not sure whether these prefabs are supposed to animate out of the box, though.
If anyone can suggest troubleshooting steps, or any way to get basic animated hand models working, I'd appreciate it.
I'm using Unity 2020.3.18f1.
Use OVRCustomHandPrefab_L and ..._R and click the "Automap Bones" button under OVR Custom Skeleton for each one.
I'm trying to change the barrel distortion coefficients for the HTC Vive to create a distortion in the HMD. Is OpenVR the best way to do this?
The only thing I can suggest to you at this point is to search for OculusRiftEffect.
This is an old plugin for THREE.js that is now useless for normal use, because it showed the deformed view on your screen. In most applications you don't want that, but you might want to show it to the students. The example was hardcoded to the lenses of the Oculus Rift DK2 (or DK1 if you uncomment some stuff inside), but the optics don't differ that much, and the effect should be even more visible.
It has been removed from the current version, so check out old THREE.js revisions or some stale demos on the internet and you'll find something. Look around three years back.
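For context, what such lens-correction effects apply is a radial ("barrel") distortion polynomial. Here is a minimal, language-agnostic sketch in Python (not Vive-specific code); the default `k` values resemble the old Rift DK1 warp parameters and are placeholders only, since the Vive's actual coefficients differ and would have to come from OpenVR:

```python
# A minimal sketch of the radial "barrel" distortion used by HMD
# lens-correction shaders such as the old OculusRiftEffect.
# The default coefficients are placeholders, not real Vive values.

def barrel_distort(x, y, k=(1.0, 0.22, 0.24, 0.0), center=(0.0, 0.0)):
    """Push (x, y) outward from the lens center by a polynomial in the
    squared radius: r' = r * (k0 + k1*r^2 + k2*r^4 + k3*r^6)."""
    dx, dy = x - center[0], y - center[1]
    r_sq = dx * dx + dy * dy
    scale = k[0] + r_sq * (k[1] + r_sq * (k[2] + r_sq * k[3]))
    return center[0] + dx * scale, center[1] + dy * scale

# The center is unchanged; points further out are pushed further still.
print(barrel_distort(0.0, 0.0))   # (0.0, 0.0)
print(barrel_distort(0.5, 0.0))   # x grows beyond 0.5, since scale > 1 off-center
```

A real HMD shader applies this per pixel to the texture coordinates; the coefficients you would change for the Vive play exactly the role of `k` here.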
I’ve been using NGUI for years and I’m thinking of switching to UGUI.
Does anyone know well-known games that have been developed using UGUI? I know a lot that have used NGUI but not sure about UGUI. I’d like to see if the games are good.
I would appreciate any input.
Well, I don't know any well-known games made using uGUI, but you can check out some of the games I helped develop that use uGUI:
Jump and Rush
Poptropica
I also switched from NGUI to uGUI back when Unity 4.6 was released. At the time I felt NGUI was still better, and uGUI just lacked many of the features NGUI had. Now I feel uGUI is better. Still, there were some tools I extracted from NGUI that uGUI lacked and edited to support Unity UI: a tween library and some editor tools. Of course we have iTween, LeanTween, etc., but the interface NGUI provided through the Inspector was better.
I suggest you switch to uGUI. Everything possible with NGUI is also possible with uGUI, and it is much more efficient performance-wise. It is open source, like NGUI.
Check this link for the source: bitbucket.org/Unity-Technologies/ui
I've been trying to get my head around how to do player vision like in Teleglitch (objects the player can see are shown; otherwise everything is black).
(gameplay from Teleglitch: http://www.youtube.com/watch?v=0OBXdEwawqI)
I am currently developing a turn-based strategy game where I would like this feature. The problem is that I already have the vision cones, but I am not sure how to go about the rest.
I've seen some guides on creating vision cones, but how do I actually apply the "darkness"?
I have a feeling that the best way would be a shader, but I can't seem to find what I am looking for.
Thanks
In Teleglitch, it looks like they simply render black polygons over the shadowed areas. Another approach in Unity is to use depth masks. Unity 4.2 Pro also supports stencil buffers, which are useful for masking effects.
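To make the "apply the darkness" step concrete: the mask is simply everything that fails the visibility test. Below is a minimal, engine-agnostic sketch in Python (not Unity code; the function name and parameters are my own, for illustration) of a basic cone test. In Unity you would run this kind of test, plus raycasts against walls, and then cover the failing region with black polygons or a stencil mask:

```python
import math

def in_vision_cone(px, py, ox, oy, facing_deg, half_angle_deg, max_range):
    """True if point (px, py) lies inside the vision cone rooted at
    (ox, oy), facing `facing_deg`, with the given half-angle and range."""
    dx, dy = px - ox, py - oy
    dist = math.hypot(dx, dy)
    if dist > max_range:
        return False          # too far away: darkness
    if dist == 0.0:
        return True           # the viewer always sees its own position
    angle = math.degrees(math.atan2(dy, dx))
    # Wrap the difference into [-180, 180) so the cone works across 0/360.
    delta = (angle - facing_deg + 180.0) % 360.0 - 180.0
    return abs(delta) <= half_angle_deg

# Everything that fails this test gets the black overlay drawn on it.
print(in_vision_cone(3, 2, 0, 2, 0.0, 45.0, 4.0))  # True: straight ahead
print(in_vision_cone(0, 0, 0, 2, 0.0, 45.0, 4.0))  # False: outside the cone angle
print(in_vision_cone(8, 2, 0, 2, 0.0, 45.0, 4.0))  # False: out of range
```

In Teleglitch's style you would then render the complement of the visible region as black geometry; with the stencil-buffer route, you mark visible pixels in the stencil and draw a full-screen black quad wherever the stencil test fails.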