I am very new to Unity and I am trying to add post-processing to a scene. I followed a tutorial; however, when I added the Post-process Layer to my camera and went to set its layer to Post Processing, there was no Post Processing layer.
There should be a layer called post processing here, unless I did something wrong:
It doesn't seem like anyone else has had this problem but that could be because it's an easy fix.
I'd recommend restarting the project once, assuming you've correctly imported Post Processing from the Package Manager (to make sure the layer names load correctly).
Also, layer names can be created by you and assigned to whatever GameObjects you want, so the name "PostProcessing" as a layer doesn't really matter. If it doesn't appear, you can just add the layer name yourself and continue as in the tutorial.
You can assign it to Default if it doesn't appear (Unity will warn you, though).
The PostProcessing layer is usually generated when you import the package. If it's not there, try using a project template that includes Post Processing when creating a new project.
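If the layer still doesn't show up in the dropdown after you create it, you can also point the Post-process Layer at your own layer from a script. A minimal sketch, assuming the Post Processing package is imported and you added a layer named "PostProcessing" under Edit > Project Settings > Tags and Layers (the script and layer names are only illustrative):

using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class AssignPostProcessLayer : MonoBehaviour
{
    void Start()
    {
        // Point the Post-process Layer at the layer your post-process volumes live on.
        var ppLayer = GetComponent<PostProcessLayer>();
        ppLayer.volumeLayer = LayerMask.GetMask("PostProcessing");
    }
}

Attach it to the camera that has the Post-process Layer component; the volumes themselves must sit on that same layer to be picked up.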
I'm new here, so feel free to give tips where needed. I am running into trouble using the Unreal Engine combined with the HoloLens 2.
I would like to access the special black/white cameras of the HoloLens, for tracking purposes. These are normally not accessible. However, they can be activated by using the “perceptionSensorsExperimental” capability. This should be possible, since it also works with Unity: https://github.com/doughtmw/HoloLensForCV-Unity
I have tried to add the capability in the Unreal Project Settings: “Config\HoloLens\HoloLensEngine.ini” -> “+RescapCapabilityList=perceptionSensorsExperimental”. The project still builds as expected, but I noticed that it doesn’t matter what I add here. Even something random like “+abcd=efgh” doesn’t break the build.
However, if I add “+CapabilityList=perceptionSensorsExperimental”, I get “Packaging (HoloLens): ERROR: The 'Name' attribute is invalid - The value 'perceptionSensorsExperimental' is invalid according to its datatype 'http://schemas.microsoft.com/appx/manifest/types:ST_Capability_Foundation' - The Enumeration constraint failed.”. I conclude: 1.) I’m making the changes in the right file. 2.) The right schema needs to be configured in order for “+RescapCapabilityList=perceptionSensorsExperimental” to work as expected.
My question is how do I add the right schema to my Unreal project? (like in the Unity example referenced above, which uses “http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities”), I cannot find any example and I cannot find any proper place to put it. Not in the settings, not in the xml/ini files. Clearly, I am missing something.
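For comparison, in the Unity/HoloLensForCV case the restricted capability ends up in the generated Package.appxmanifest roughly like this (a sketch based on that repo's setup, not something Unreal generates for you):

<Package
  xmlns:rescap="http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities"
  IgnorableNamespaces="rescap">
  ...
  <Capabilities>
    <rescap:Capability Name="perceptionSensorsExperimental" />
  </Capabilities>
</Package>

So the missing piece seems to be getting Unreal's packaging step to declare that rescap namespace in the manifest it generates.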
Any thoughts are much appreciated!
Update: we have released the HoloLens-ResearchMode-Unreal plugin.
I have been trying to use the Forge AR/VR Toolkit to fetch a Revit model from BIM 360 into Unity.
I have created a scene, posted the job, and got the manifest (it shows status complete).
In Unity (v2019.2.12f1), I import the Forge Unity package. It shows the following warnings.
If I skip through these warnings, enter the model URN, access token, and scene ID, and hit Play, nothing gets loaded.
I get error 422: Unprocessable Entity. However, it also says the scene loaded in 2-something seconds. Image shown below.
I tried the same workflow once earlier and it worked. I am not sure what the issue is now.
OK, so I just tried the URN, token, etc. from the Forge sample gallery, and it works. So clearly there is a problem with my workflow.
When creating scenes from BIM360 models, make sure to include the project_id and version_id fields:
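For example, the body of the scene-creation request should carry something along these lines (a rough sketch only; the exact wrapper fields and endpoint depend on the toolkit version, but project_id and version_id must be present for BIM 360 models):

{
  "prj": {
    "urn": "<your model urn>",
    "project_id": "<your BIM 360 project id>",
    "version_id": "<your BIM 360 version id>"
  }
}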
I have tried to get a response on GitHub, but with no activity on the issue there, I will ask here.
I have been following the documentation, and I am stuck: I have imported the WwiseResonanceAudioRoom mixer effect on the bus in Wwise, and I do not see anything in the properties. I am not sure if I am supposed to? Right after that part, the documentation says "Note: If room properties are not configured, the room effects bus outputs silence." I was wondering if this was the case, and yes, it outputs silence. I even switched the effect to see if it would just pass audio, and it does, just not with the Room effect, so at least I know my routing is correct.
So now this leads up to my actual question: how do you configure the plugin? I know there is some documentation, but there is not one tutorial or step-by-step guide for us non-code-savvy audio folk. I have spent the better half of my week trying to figure this out, because frankly, for the time being, this is the only audio spatialization plugin that features audio occlusion, obstruction, and propagation within Wwise.
Any help is appreciated,
Thank you.
I had Room Effects with Resonance Audio working in another project last year, under its former name, GVR. There are no properties on the Room Effect itself. These effect settings and properties reside in the Unity Resonance prefabs.
I presume you've followed the tutorial on Room Effects here:
https://developers.google.com/resonance-audio/develop/wwise/getting-started
Then what you need to do is add the Room Effect assets into your Unity project. The assets are found in the Resonance Audio zip package, next to the authoring and SDK files. Unzip the Unity content into your project, add a Room Effect to your scene, and you should be able to see the properties in the Inspector of the room object.
Figured it out thanks to Egil Sandfeld here: https://github.com/resonance-audio/resonance-audio-wwise-sdk/issues/2#issuecomment-367225550
To elaborate: I had the SDKs implemented, but I went ahead and replaced them anyway, and it worked!
Hi, I am making a game in Unity with AirConsole, and when I try to import the AirConsole controls GitHub repo into my Unity project I get a lot of JavaScript errors, so I can't use the controller generator, and it hinders me a lot.
Thanks in advance, you would save my skin.
Your HTML, CSS and JavaScript files, as well as all additional resources your controller will need (images etc.), need to be in Assets/WebGLTemplates/AirConsole so Unity will not try to compile them into C# or UnityScript.
Your controller HTML file itself can be somewhere else in the project, the file within Assets/WebGLTemplates/AirConsole will be updated automatically to match it as long as it is linked in the AirConsole Object in your scene.
If you add the plugin including example scenes into an empty project, you will see a working folder structure.
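For reference, a typical layout looks roughly like this (file names are only illustrative):

Assets/
  WebGLTemplates/
    AirConsole/
      controller.html
      controller.js
      style.css
      img/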
Edit/Update: in case you are getting errors along the lines of The type or namespace name 'ILGenerator' could not be found (or the same but with DynamicMethod) please see details on Reddit or StackOverflow.
Essentially: you have to set the API Compatibility Level to 4.x in your Player Settings for your selected platform.
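If you prefer doing that from code, a small editor script can flip the setting too. A sketch (place the file in an Editor folder and adjust the BuildTargetGroup to your target platform):

using UnityEditor;

public static class ApiLevelFix
{
    [MenuItem("Tools/Set API Compatibility to 4.x")]
    static void SetApiLevel()
    {
        // Equivalent to Player Settings > Other Settings > Api Compatibility Level.
        PlayerSettings.SetApiCompatibilityLevel(BuildTargetGroup.WebGL, ApiCompatibilityLevel.NET_4_6);
    }
}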
I am developing an Android app where I have to train my app to recognize two images and four objects. I created one single database where I added all the image and object targets on the Vuforia developer site and created the Unity package. Now neither the images nor the objects are getting recognized.
Probably the problem is the same for objects and images.
I think you should share some more info about what you're doing, as well as some meaningful code implementing it.
Without that, I would suggest:
verify that the database and trackables are loaded and active at runtime (see the sketch after this list)
if so, see in console that the trackables are tracked by Vuforia
if so, verify the code enabling your augmentations
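As a starting point for the first two checks, something like this can log what Vuforia actually loaded (a sketch against the classic Vuforia Unity API; class and method names may differ in newer Vuforia Engine versions):

using System.Linq;
using UnityEngine;
using Vuforia;

public class DataSetCheck : MonoBehaviour
{
    void Start()
    {
        // Wait until Vuforia has finished initializing before querying trackers.
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(LogDataSets);
    }

    void LogDataSets()
    {
        var tracker = TrackerManager.Instance.GetTracker<ObjectTracker>();
        var active = tracker.GetActiveDataSets().ToList();

        // Log every data set the tracker knows about and whether it is active.
        foreach (var dataSet in tracker.GetDataSets())
            Debug.Log(dataSet.Path + " active: " + active.Contains(dataSet));
    }
}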
Please confirm whether you have run through these steps already and what results you got. I can share some code and further tips once the issue is a little bit more specific.
Regards