Why do Unity post-processing effects not work in WebGL? - unity3d

Everything is perfect in the Editor: bloom, motion blur... but when I build the project for WebGL and play it in Chrome, there is no bloom effect at all.
I'm using the Built-in Render Pipeline and linear color space. I even manually set the Graphics API to WebGL 2.0, but no luck.
Please help me with this.
Here are the settings:
Player Settings
Quality Settings
Effect in the Editor
Effect in Chrome

It seems you need to enable HDR for your quality level, as described here.
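
As a quick way to confirm the fix, a minimal runtime check along these lines can log whether HDR is actually available in the WebGL build (a sketch; the HdrCheck class name is made up). Bloom in the Built-in pipeline needs an HDR render target, so both the Quality level's HDR toggle and the camera's "Allow HDR" flag must be on:

using UnityEngine;

// Sketch: log whether HDR is actually in effect in the player build.
public class HdrCheck : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        cam.allowHDR = true; // scripted mirror of the camera's "Allow HDR" checkbox
        Debug.Log("Camera HDR: " + cam.allowHDR +
                  ", HDR render textures supported: " +
                  SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.DefaultHDR));
    }
}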

Related

Meta Avatars have white textures in Unity Build

I searched a long time for a solution to this problem, but I couldn't find one. Maybe one of you knows how I can fix it.
I created a Unity VR project for the Oculus Quest 2 and downloaded the Meta Avatar Plugin. I followed this tutorial on YouTube.
Everything is working fine in Game Mode in Unity. But when I build it, the avatars have completely white textures, like in this screenshot.
I am using Unity Version 2021.3.5f1.
I think it has something to do with the build process / shader setup when going from desktop to Android, but I am not sure where or what I can change to make it run.
Does anyone have an idea?
After one day it suddenly worked. I think it was the answer from Philipp, about the Graphics settings:
Try this: Click on your relevant surface in Play mode and check what Shader the Material is using. Then stop, and go to Edit -> Project Settings -> Graphics. Scroll down to the Always Included Shaders list and add the Shader you noted before to that list. Now compile again and see if the issue persists. (If it does, you may want to look into e.g. the Player -> Color Space setting, which can be Gamma or Linear.) –
Philipp Lenssen
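
A small diagnostic along these lines can confirm whether the shader was stripped from the build, which is a common cause of white or magenta materials (a sketch; the ShaderPresenceCheck name and the "Standard" default are placeholders for the shader you noted in Play mode):

using UnityEngine;

// Sketch: Shader.Find returns null in a player build when the shader was
// stripped from the build.
public class ShaderPresenceCheck : MonoBehaviour
{
    [SerializeField] string shaderName = "Standard"; // replace with the shader you noted

    void Start()
    {
        if (Shader.Find(shaderName) == null)
            Debug.LogWarning("Shader '" + shaderName + "' is missing from the build; " +
                             "add it to Edit -> Project Settings -> Graphics -> Always Included Shaders.");
    }
}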

URP Renderer Feature works on mobile?

I'm developing a VR project using Universal Render Pipeline.
I'm using a Renderer Feature to draw screen-space outlines in my project.
It works well on PC, but it does not work on mobile (Android). Other custom Renderer Features don't work either.
Tested on Unity 2019.3.7f1 with Universal RP 7.1.8 and 7.4.1.
Is there a way to make the renderer feature work on mobile?
Make sure you have enabled the "Opaque Texture" option in all your UniversalRP assets. That worked for me.
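
If you have several quality levels, each with its own URP asset, an editor helper along these lines can flip the toggle everywhere at once (a sketch; the class name and menu path are made up). supportsCameraOpaqueTexture is the scripted counterpart of the asset's "Opaque Texture" checkbox:

using UnityEditor;
using UnityEngine.Rendering.Universal;

// Sketch: enable "Opaque Texture" on every URP asset in the project.
public static class EnableOpaqueTexture
{
    [MenuItem("Tools/Enable Opaque Texture On All URP Assets")] // hypothetical menu path
    static void Run()
    {
        foreach (string guid in AssetDatabase.FindAssets("t:UniversalRenderPipelineAsset"))
        {
            var asset = AssetDatabase.LoadAssetAtPath<UniversalRenderPipelineAsset>(
                AssetDatabase.GUIDToAssetPath(guid));
            asset.supportsCameraOpaqueTexture = true;
            EditorUtility.SetDirty(asset);
        }
        AssetDatabase.SaveAssets();
    }
}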

Using HDRP Rendering Pipeline

I just tried to create a smoke and fire particle system in Unity version 2019.1.14f1. I installed Shader Graph and followed a tutorial by Brackeys. I couldn't get far because I didn't have HDRP on the project; it was simply a 3D project. I used this website to figure out how to change to HDRP. I did so, but it won't render the project. It says "Platform WebGL with device Open GLES3 is not supported, no rendering will occur". I then tried to switch back to the normal rendering system, and now it won't let me share my WebGL game. I don't know for sure that these two problems are connected, but it seems like they are. I don't know if this is a problem with my computer, but I have added the specs below. How do I switch my current rendering system to HDRP so I can create a nice fire/smoke particle system?
WebGL is based on OpenGL ES, and unfortunately HDRP doesn't support OpenGL ES devices, so there is no way to run it on WebGL.
However, URP (Universal Render Pipeline) would be a much better fit for WebGL.
And yes, URP supports Shader Graph.
Here is the URP page: https://unity.com/srp/universal-render-pipeline
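
For completeness, switching pipelines boils down to assigning a pipeline asset in the Graphics settings. A rough editor sketch of the scripted equivalent (the asset path, class name, and menu entry are hypothetical; create the URP asset first via Assets -> Create -> Rendering -> Universal Render Pipeline -> Pipeline Asset):

using UnityEditor;
using UnityEngine.Rendering;

// Sketch: scripted equivalent of setting the pipeline asset in
// Edit -> Project Settings -> Graphics -> Scriptable Render Pipeline Settings.
public static class AssignUrpAsset
{
    [MenuItem("Tools/Assign URP Asset")] // hypothetical menu path
    static void Run()
    {
        var asset = AssetDatabase.LoadAssetAtPath<RenderPipelineAsset>(
            "Assets/Settings/UniversalRP-Asset.asset"); // hypothetical asset path
        GraphicsSettings.renderPipelineAsset = asset;
    }
}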

Oculus Quest Single-Pass and Multi-Pass not working?

I am using the following configuration:
Unity: 2019.2.13f1
Device: Oculus Quest, using LWRP
Issues:
(a) When I change the "Stereo Rendering Mode" to "Single-Pass", the rendering of the screen is too small and too far away.
(b) When I change the "Stereo Rendering Mode" to "Multi-Pass", the rendering is only visible in the left eye.
(c) The only mode that works is "Multi-View". Unfortunately, there is also jittery motion when this is used. Objects near the user start to jitter, and this is very noticeable.
Issue (c) is the reason I would like to use single-pass or multi-pass rendering, since that would overcome the problem.
Has anyone faced these similar issues?
This is a recurring problem with LWRP/URP because it uses post-processing effects, and single-pass stereo rendering requires you to tweak your shaders to support it.
Thus, unless absolutely necessary, it is best to stick with the standard (Built-in) render pipeline.
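
As a first diagnostic step, it can help to log which stereo mode the player actually ended up with at runtime, since the Player Settings choice can be overridden per platform. A minimal sketch (the StereoModeLogger name is made up):

using UnityEngine;
using UnityEngine.XR;

// Sketch: log the effective stereo rendering mode and eye texture size.
public class StereoModeLogger : MonoBehaviour
{
    void Start()
    {
        Debug.Log("Stereo rendering mode: " + XRSettings.stereoRenderingMode +
                  ", eye texture: " + XRSettings.eyeTextureWidth + "x" + XRSettings.eyeTextureHeight);
    }
}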

How to access target texture for rendering from native world in Unity (VR)?

I want to render VR from a native plugin. Is this possible?
So far, what I have is this:
[DllImport("NativeRenderPlugin")] // hypothetical plugin name; the original post doesn't name it
static extern System.IntPtr getRenderCallback();

void OnRenderObject()
{
    // run the native plugin's callback on the render thread
    GL.IssuePluginEvent(getRenderCallback(), 1);
}
This calls my native plugin and renders something, but it makes this mess here.
It seems that I am rendering to the wrong texture. How do I get the correct texture/framebuffer, or whatever trick is necessary, to render VR from a native OpenGL plugin? For example, I need information about which eye's camera is being rendered (left or right) in order to render properly.
Any hints? I am trying with both Gear VR and Google VR; any hint about either of them is welcome.
My interpretation is that, instead of Unity's default rendering from the MeshRenderer, you want to overwrite it with your own OpenGL calls.
Without knowing what you are rendering and what the correct behavior should be, I can't help more.
I can imagine that if you set up your own state, such as turning particular depth tests or blends on/off, it would overwrite whatever state Unity set up for you prior to your custom function. And if you don't restore that state, Unity is not aware of the change and produces incorrect results.
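
On the C# side, one way to give the plugin the per-eye context it asks about is to pass it along before issuing the plugin event. A hedged sketch (the setEye entry point and the "NativeRenderPlugin" library name are hypothetical):

using System;
using System.Runtime.InteropServices;
using UnityEngine;

// Sketch: forward per-eye info to the native plugin before the render callback.
public class NativeVrRender : MonoBehaviour
{
    [DllImport("NativeRenderPlugin")] static extern IntPtr getRenderCallback(); // hypothetical
    [DllImport("NativeRenderPlugin")] static extern void setEye(int eye);       // hypothetical

    void OnRenderObject()
    {
        // In multi-pass VR, Camera.current identifies the camera (and eye)
        // being rendered when Unity invokes this callback.
        var cam = Camera.current;
        int eye = (cam != null && cam.stereoActiveEye == Camera.MonoOrStereoscopicEye.Right) ? 1 : 0;
        setEye(eye);
        GL.IssuePluginEvent(getRenderCallback(), 1);
    }
}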
As a side note, you may want to check out
https://github.com/Samsung/GearVRf/
It's Samsung's open source framework engine for Gear VR. Being similar to Unity but open source, it may let you do something similar to what you posted while knowing what happens underneath.