I just tried to create a smoke and fire particle system in Unity 2019.1.14f1. I installed Shader Graph and followed a tutorial by Brackeys, but I couldn't get far because I didn't have HDRP on the project. It was simply a 3D project. I used this website to figure out how to change to HDRP. I did so, but now it won't render the project. It says "Platform WebGL with device OpenGLES3 is not supported, no rendering will occur". I then tried to switch back to the normal rendering system, and now it won't let me share my WebGL game. I don't know for sure that these two problems are connected, but it seems like it. I don't know whether it is a problem with my computer, but I have added the specs below. How do I switch my current rendering system to HDRP so I can create a nice fire/smoke particle system?
WebGL is based on OpenGL ES, and unfortunately HDRP doesn't support OpenGL ES devices, so there is no way to run it on WebGL.
However, URP (the Universal Render Pipeline) is a much better fit for WebGL.
And yes, URP supports Shader Graph.
Here is the URP page: https://unity.com/srp/universal-render-pipeline
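If you want to sanity-check which pipeline the project is actually running, here is a minimal sketch (the script name is mine; `GraphicsSettings.renderPipelineAsset` is the standard Unity API and returns null when the built-in pipeline is active):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class PipelineCheck : MonoBehaviour
{
    void Start()
    {
        // Null means the built-in render pipeline; otherwise this is
        // the URP/HDRP asset assigned in Project Settings > Graphics.
        var pipeline = GraphicsSettings.renderPipelineAsset;
        Debug.Log(pipeline == null
            ? "Built-in render pipeline"
            : "Active SRP asset: " + pipeline.name);
    }
}
```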
Related
I am having trouble playing an audio source in a Unity scene using a Vuforia area target. I am just learning Unity and Vuforia and don't have much experience. I wrote a basic Unity script without using any Vuforia calls. I am assuming I need to include "DefaultTrackableEventHandler.cs". The Vuforia documentation is not very descriptive, and I am looking for some guidance if available.
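For what it's worth, the usual pattern is to subclass Vuforia's DefaultTrackableEventHandler and start the audio in the tracking-found callback. A rough sketch (the class name PlayAudioOnFound is hypothetical; it assumes an AudioSource sits on the same GameObject as the target):

```csharp
using UnityEngine;

// Hypothetical handler; DefaultTrackableEventHandler is Vuforia's stock script
// and exposes OnTrackingFound/OnTrackingLost as overridable callbacks.
public class PlayAudioOnFound : DefaultTrackableEventHandler
{
    private AudioSource audioSource; // assumed to be on the same GameObject

    protected override void Start()
    {
        base.Start();
        audioSource = GetComponent<AudioSource>();
    }

    protected override void OnTrackingFound()
    {
        base.OnTrackingFound();
        if (audioSource != null && !audioSource.isPlaying)
            audioSource.Play();
    }

    protected override void OnTrackingLost()
    {
        base.OnTrackingLost();
        if (audioSource != null)
            audioSource.Stop();
    }
}
```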
I've been working on a project that uses the Universal Render Pipeline and the new Light2D for some scenes. I load the lights in via scripts in Awake (roughly like the sketch below). For some reason, everything looks fine in the editor, but when I make builds of the game, none of the lighting effects display and the project looks just like it did before I switched to URP. I'm using Unity 2019.4.10f1.
Any ideas on if there is a setting I need to enable for the lights to work with the build or if this is a Unity bug or just not supported yet?
Thanks!
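For context, a minimal sketch of the kind of runtime Light2D setup described above (the class name is mine; the namespace is the experimental one Light2D lived in around URP 7.x / Unity 2019.4):

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering.Universal; // Light2D's namespace in URP 7.x

public class RuntimeLightLoader : MonoBehaviour
{
    void Awake()
    {
        // Create a Light2D at runtime; type/shape are left at their defaults here
        // and would normally be configured to match what the scene needs.
        var go = new GameObject("RuntimeLight2D");
        var light2D = go.AddComponent<Light2D>();
        light2D.intensity = 1.2f;
        light2D.color = Color.white;
    }
}
```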
I am new to Unity game development. I'm making a mobile game. In LWRP I made a custom shader graph (a glow shader). It works fine in Unity on my PC, but when I build an APK and install and play it on my Android device, it shows a plain block with no shader.
Am I missing something?
This is my shader graph:
After closer inspection, I noticed most of the shader works; only the occlusion part doesn't work properly. I have set occlusion to 5. Normally what occlusion does in my graph is saturate the material a little and give the shader a deep, reflective glow kind of effect. It works on PC, but on the phone I noticed the color did get saturated a little, though I didn't see the occlusion glow at all.
(I know I can produce nearly similar results without using occlusion, but I really feel occlusion suits my taste better :P, and I will be using occlusion for other objects too.)
Quality settings:
Check your Graphics API in Player Settings; it should be under Other Settings.
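If it's unclear which graphics API the device actually ends up on, you can also log it at runtime; a small sketch (the script name is mine, `SystemInfo.graphicsDeviceType` is standard Unity):

```csharp
using UnityEngine;

public class GraphicsApiLogger : MonoBehaviour
{
    void Start()
    {
        // Logs the graphics API the build actually runs on,
        // e.g. OpenGLES3 or Vulkan on Android.
        Debug.Log("Graphics API: " + SystemInfo.graphicsDeviceType);
    }
}
```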
I'm developing a Unity app for the HoloLens 1 that uses Vuforia. Unfortunately, I cannot get the camera to work with Vuforia: it remains frozen in place and does not follow head movement. When I disable Vuforia, the camera tracks fine.
My setup is as follows:
* Windows 10
* Unity 2019.1.4f1
* MRTK v2.0.0 RC2
* Vuforia 8.1.11
I tried following the steps outlined here:
https://github.com/Microsoft/MixedRealityToolkit-Unity/issues/1461#issuecomment-373714387
To no avail. I also tried having both cameras active, with the same result. The Vuforia HoloLens sample that can be found in the Unity Asset Store is severely outdated (it uses the old HoloToolkit, not MRTK), so it is not very useful to me. I noticed that older versions of Vuforia allowed the script on the camera to set "World Center" to "Camera", but this option is now forced to "Device" when Vuforia is configured for the HoloLens.
Can anyone tell me how to properly configure my scene for MRTK 2 and Vuforia? I'd be eternally grateful for a link to an up to date example project.
EDIT:
This seems to be an issue only when using Unity's Holographic Remoting. I would still very much like to resolve it, though, since deploying is very time-consuming and makes debugging almost impossible.
This worked for me:
1. Import the MRTK package and add it to the scene. This will create a MainCamera under the MixedRealityPlayspace GameObject.
2. Then GameObject > VuforiaEngine > ARCamera. This will create an ARCamera with two components: Vuforia Behaviour and Default Initialization Error Handler. Copy these two components and add them to the MainCamera created when you added MRTK.
3. Finally, delete the ARCamera.
I use Windows 10, Unity 2018.4, MRTK v2.0 and Vuforia 8.
Good luck.
I am working on a MOBA game in Unity with Photon, and I use a world-space canvas to display the player's life. In edit mode I can play without any problem, but in the built EXE I can't see the world-space canvas. I tried using other cameras; the only way I can see those canvases is to set a camera to a render target and display the render result in a raw image. Here are the settings:
Is this a Unity bug? I am using the Lightweight Render Pipeline from Unity 2018.1.3. Or am I doing something wrong?
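For reference, a minimal sketch of how a world-space canvas is typically wired to a camera from script (WorldCanvasSetup is my name for it; it assumes the Canvas component is on the same GameObject and the scene camera is tagged MainCamera):

```csharp
using UnityEngine;

public class WorldCanvasSetup : MonoBehaviour
{
    void Start()
    {
        // Configure the canvas for world space and point it at the main camera;
        // the event camera is what world-space UI uses for raycasting.
        var canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;
        canvas.worldCamera = Camera.main;
    }
}
```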
I fixed it by using 2D sprites instead.