Unity bloom effect not working (Cinemachine, URP)

I've been working on a fairly large project in Unity. Until recently we weren't using the Universal Render Pipeline (URP), and we made that change. When we did, all the bloom effects we already had in place to create a neon glow look for our game disappeared, and everything now looks flat and opaque. All the tutorials I've found say it's fairly simple to add this effect, but I can't make it work. I'm starting to get suspicious of the Cinemachine setup on the main camera, since it has caused us problems in the past. Here are some of the things I've tried:
The Post Processing checkbox is ticked on the camera.
My materials have emission turned on, with colors that used to work before URP.
The camera has a Volume component with a Bloom override, and the intensity and threshold values are set super high.
I've also dropped a global Volume into the scene with a Bloom override (one suggested solution was to do that).
I think that's everything. Please let me know if you have any suggestions!
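For reference, here is a minimal sketch (not from the original post) of how emission is typically set from script so URP bloom can pick it up. It assumes the URP/Lit shader and HDR enabled on the camera/pipeline asset; the component and field names are made up for illustration.

```csharp
// Minimal sketch: push a material's emission into HDR range so URP bloom can pick it up.
// Assumes the URP/Lit shader; NeonEmission and its fields are hypothetical.
using UnityEngine;

public class NeonEmission : MonoBehaviour
{
    [SerializeField] private Color emissionColor = Color.cyan;
    [SerializeField] private float intensity = 4f; // HDR intensity > 1 so it clears the bloom threshold

    private void Start()
    {
        var mat = GetComponent<Renderer>().material;

        // _EMISSION and _EmissionColor are the standard URP/Lit property names.
        mat.EnableKeyword("_EMISSION");
        mat.SetColor("_EmissionColor", emissionColor * intensity);
    }
}
```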

Related

Pink/Blue Tint on Build Unity URP Post-Processing Bloom

[screenshot: pink build]
[screenshot: blue build]
To preface this, we are using Unity 2019.3.0f6 and URP 7.2.1.
About a third of the time we load a scene of our game (in builds only), it is tinted pink with an orange strip at the top (pictured in “pink build”) or blue with a green strip at the top (pictured in “blue build”), until we enter the next scene. The other two thirds of the time the screen is fine.
After looking through the shader compilation entries in the log, it appears that all of the shaders are loading (nothing in the output changes between runs where the pink/blue tint happens and runs where it doesn't). We then tried turning different components on the camera on and off to isolate what might be causing the problem (the tint only appears below the Unity canvas, so we suspected the camera/post-processing).
We've narrowed the problem down to the Bloom override on the Volume Profile of the Volume component used for post-processing. Turning off the Bloom override makes the problem go away in the build, but we would like to keep the bloom effect.
[screenshot: the Volume component]
We've tried printing all of the values under the Bloom override to see if there is an anomaly when the pink/blue tint appears vs. when it doesn't, but there is no difference; literally nothing differs in our logs between when it works and when it doesn't. The only thing we know for sure is that turning off Bloom fixes the problem. If anybody has run into this, this is a desperate call for help, because we have absolutely no idea where to go from here.
[screenshots: graphics settings 1 and 2]
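As an aside, here is a minimal diagnostic sketch (not from the original post) for toggling the Bloom override at runtime in a build to confirm it really is the trigger; the key binding and serialized field are made up.

```csharp
// Hypothetical diagnostic: toggle the Bloom override at runtime in a build to confirm
// it is what triggers the pink/blue tint. The key binding and serialized field are made up.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class BloomToggle : MonoBehaviour
{
    [SerializeField] private Volume postProcessVolume; // the Volume shown in the screenshots

    private void Update()
    {
        if (Input.GetKeyDown(KeyCode.B) &&
            postProcessVolume.profile.TryGet(out Bloom bloom))
        {
            bloom.active = !bloom.active;
            Debug.Log($"Bloom override active: {bloom.active}");
        }
    }
}
```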
This is probably a rather individual problem. Since nobody with a similar problem has answered yet, I'll suggest a general and simple approach: create a new project with fresh URP settings and add a bloom effect there. If everything works and bloom behaves correctly, transfer those settings to your project (preferably after creating a backup, just in case).
If that doesn't help, try changing other settings one at a time to see what affects whether the scene loads correctly. It may be some individual effect causing the problem, perhaps the vignette. It could also be a sprite drawn on top of the bloom (again, do all of this in a separate backup so you don't accidentally break the project). The problem could also lie with the camera, if the URP settings, sprites, etc. are fine and the bloom effect itself shouldn't be failing.
As a last resort, if none of that works, try filing a report with Unity support; it's possible this is a bug in the bloom effect itself.
My guess is that turning off bloom only hides the problem: the error is in something else and only shows up when the bloom effect is enabled. But since URP is quite new, I don't rule out a bug in the bloom effect itself.
I had a similar problem in the past; it was fixed by changing the LUT size (to 16, 32, 64, ...) under Post-processing in the URP asset settings.
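For reference, a sketch (not from the original answer) of checking or changing that LUT size from script rather than the inspector; it assumes the active pipeline asset is a UniversalRenderPipelineAsset exposing the colorGradingLutSize property, as recent URP versions do.

```csharp
// Hypothetical sketch: read/adjust the color grading LUT size on the active URP asset.
// Assumes UniversalRenderPipelineAsset exposes colorGradingLutSize (true in recent URP versions).
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public static class LutSizeCheck
{
    public static void SetLutSize(int size = 32)
    {
        if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urp)
        {
            Debug.Log($"Current LUT size: {urp.colorGradingLutSize}");
            urp.colorGradingLutSize = size; // typical values: 16, 32, 64
        }
        else
        {
            Debug.LogWarning("Active render pipeline is not URP.");
        }
    }
}
```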

Unity2D bloom doesn't work although the object emits light

I'm working on a 2D game and I want to add neon effects to certain objects. I tried using the Universal Render Pipeline; however, I still can't seem to get bloom to... bloom.
I will try my best to describe what I tried before:
Before: I followed a tutorial on using the Universal Render Pipeline. I made a lit 2D material, created a point light, and increased its intensity. I couldn't get bloom working, so I reverted to the built-in render pipeline.
Currently: Using the built-in render pipeline, I installed the post-processing package from the Package Manager. I added Post-process Layer and Post-process Volume components to the main camera. I set up a post-processing profile and added the Bloom effect to it. In the Post-process Layer, I set the layer to "Glowing Object" and changed the layers of the objects I want the neon effect on accordingly. I dragged the profile into the appropriate field of the Post-process Volume. For the objects, I made separate materials that have emission enabled. However, I still don't get a blurry glow on those objects.
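A minimal sketch (not from the original post) of the same built-in-pipeline bloom created from script with PPv2's QuickVolume helper, which can help rule out a layer mismatch between the Post-process Layer and the Volume; the component name is made up.

```csharp
// Hypothetical sketch: create a global Bloom volume from script with the PPv2 QuickVolume helper.
// Useful to rule out a layer mismatch between the Post-process Layer and the Post-process Volume.
// Assumes the Post Processing package (com.unity.postprocessing) is installed.
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

public class BloomSanityCheck : MonoBehaviour
{
    private void Start()
    {
        var bloom = ScriptableObject.CreateInstance<Bloom>();
        bloom.enabled.Override(true);
        bloom.intensity.Override(10f);
        bloom.threshold.Override(0.8f); // low threshold so emissive sprites clearly bloom

        // QuickVolume creates a global volume on the given layer with the supplied overrides;
        // the Post-process Layer on the camera must include that layer.
        PostProcessManager.instance.QuickVolume(gameObject.layer, 100f, bloom);
    }
}
```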
Here is the game view when the directional light is disabled (although I added a culling mask to the "Glowing Object" layer, I disabled it for clarity):
The objects GLOW, but they don't BLOOM which is very confusing to me. I apologize if I'm missing something obvious but I really tried to search around for the solution. I will now provide images of each relevant game object in case I couldn't explain it well.
The object materials:
The main camera (includes post-processing components):
Thank you so much for your help.

How do I use different Post-Processing effects on different cameras in Unity 2017.4.2f2?

Before I explain my situation, it's important to mention that I'm using an older version of Unity, 2017.4.2f2 (for PS Vita support). For this reason, I'm also using the old Post-Processing Stack.
I'm making a game in which I want a lot of bloom on the UI, but not as much on the rest of the game. I used one camera to render the UI and gave it post-processing (the Canvas is set to Screen Space - Camera),
and another camera to render the rest of the game.
Each camera is given a different profile so they use different effects.
My expectation was that the UI renderer camera would only apply its effects to the Canvas. Instead, it also applies them to the camera beneath it: the game renderer camera.
As you can see, I used the Don't Clear clear flag. I also tried Depth Only to see if it would make a difference.
I'm lost as to what I could do. Grain and bloom get applied to everything,
yet the profile with those effects is only assigned to the UI renderer camera's Post Processing Behaviour script.
Does anyone have any suggestions or ideas? I'm lost.
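For context, a minimal sketch (not from the original post) of the per-camera setup described above using the old stack's PostProcessingBehaviour; the wrapper component and field names are made up.

```csharp
// Hypothetical sketch of the two-camera setup described above (old Post-Processing Stack v1).
// Assumes the legacy PostProcessing package (UnityEngine.PostProcessing namespace) is imported.
using UnityEngine;
using UnityEngine.PostProcessing;

public class SplitPostProcessing : MonoBehaviour
{
    [SerializeField] private Camera uiCamera;                   // renders only the UI/Canvas layer
    [SerializeField] private Camera gameCamera;                 // renders everything else
    [SerializeField] private PostProcessingProfile uiProfile;   // heavy bloom
    [SerializeField] private PostProcessingProfile gameProfile; // subtler effects

    private void Awake()
    {
        uiCamera.GetComponent<PostProcessingBehaviour>().profile = uiProfile;
        gameCamera.GetComponent<PostProcessingBehaviour>().profile = gameProfile;
    }
}
```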

Using Minecraft character with skin into a unity game

I've decided to use Minecraft-like characters in my small game, since I don't know how to make 3D models (nor do I want to learn how to do that in the near future).
However, the task now seems a little harder than expected:
I've tried looking in the Asset Store for prefabs to use, but without success.
So I've decided to try to make a model in Blender (knowing next to nothing about non-parametric 3D modeling, my knowledge of Blender is extremely limited) and import it into my Unity game.
Surprisingly, I managed to create the model using MCprep, export it, and import it into Unity while keeping the objects that drive the bones (the output is a bit messy, but I think I can clean it up a little).
However, the imported version does not have any skin and appears in a gray shade.
It turns out that the export does not keep materials/textures with it!
I've checked the texture used by Blender and it is the same skin I fed into MCprep. So, using that same skin, I've tried creating a material from the .png by using it as the texture in an Unlit/Texture material.
However, the result is a bit messy, as shown here (left is Blender, right is Unity):
How can I make the texture look right in Unity? (I've heard there is a way using a C# script, but I really don't know what it is nor how to do it.)
EDIT:
Thanks for the answers so far. I set the Filter Mode to Point, which made the texture a bit better. However, the part that should be transparent is displayed in black on top of the other parts (I think).
The image on the right uses only Point filtering; the one on the left uses Point filtering plus "Alpha Is Transparency" and the Unlit/Transparent shader.
ANSWER FOUND:
As Bart said, make sure the textures' Filter Mode is set to Point, but additionally:
Minecraft player characters are made of 2 layers; the second layer usually has lots of transparency and is used for clothing or other relief detail. So you need to use a transparency-capable shader on your material in Unity.
You're running into a filtering issue. In your case you want no filtering to happen. So select your texture, and in the inspector change the import settings so that your "Filter Mode" is set to "Point". In this case it will do no filtering of the input and your large pixels should appear sharp as you want.
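A minimal editor-side sketch (not from the original answer) of applying those import settings automatically; the "Assets/Skins" folder name is made up, and the script must live in an Editor folder.

```csharp
// Hypothetical editor script: apply Point filtering and transparency-friendly import settings
// to every skin texture automatically. Place this file in an Editor folder.
// The "Assets/Skins" path is an assumption; adjust to wherever the skins live.
using UnityEditor;
using UnityEngine;

public class SkinImportSettings : AssetPostprocessor
{
    private void OnPreprocessTexture()
    {
        if (!assetPath.StartsWith("Assets/Skins"))
            return;

        var importer = (TextureImporter)assetImporter;
        importer.filterMode = FilterMode.Point;                               // keep the big pixels sharp
        importer.textureCompression = TextureImporterCompression.Uncompressed;
        importer.alphaIsTransparency = true;                                  // second skin layer relies on alpha
    }
}
```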

3D AR Markers with Project Tango

I'm working on a project for an exhibition where an AR scene is supposed to be layered on top of a 3D printed object. Visitors will be given a device with the application pre-installed. Various objects should be seen around / on top of the exhibit, so the precision of tracking is quite important.
We're using Unity to render the scene, this is not something that can be changed as we're already well into development. However, we're somewhat flexible on the technology we use to recognize the 3D object to position the AR camera.
So far we've been using Vuforia. The 3D target feature didn't scan our object very well, so we're resorting to printing 2D markers and placing them on the table that the exhibit sits on. The tracking is precise enough, the downside is that the scene disappears whenever the marker is lost, e.g. when the user tries to get a closer look at something.
Now we've recently gotten our hands on a Lenovo Phab 2 pro and are trying to figure out if Tango can improve on this solution. If I understand correctly, the advantage of Tango is that we can use its internal sensors and motion tracking to estimate its trajectory, so even when the marker is lost it will continue to render the scene very accurately, and then do some drift correction once the marker is reacquired. Unfortunately, I can't find any tutorials on how to localize the marker in the first place.
Has anyone used Tango for 3D marker tracking before? I had a look at the Area Learning example included in the Unity plugin, by letting it scan our exhibit and table in a mostly featureless room. It does recognize the object in the correct orientation even when it is moved to a different location; however, the scene is always off by a few centimeters, which is not precise enough for our purposes. There is also a 2D marker detection API for Tango, but it looks like it only works with QR codes or AR tags (like this one), not arbitrary images the way Vuforia does.
Is what we're trying to achieve possible with Tango? Thanks in advance for any suggestions.
Option A) Sticking with Vuforia.
As Hristo points out, your marker-loss problem should be fixable with Extended Tracking. That definitely sounds worth testing.
Option B) Tango
Tango doesn't natively support markers other than AR tags and QR codes.
It also doesn't handle the area-learnt scene moving (much). If your 3D printed objects stay stationary, you can scan an ADF and should get good-quality tracking; with everything still there should be a little drift, but not too much.
However, if you move those 3D printed objects, it will definitely throw the tracking off, so moving objects shouldn't be part of the scanned scene.
You could make an ADF scan without the 3D objects present to track the user's position, and then track the 3D printed objects with AR markers using Tango's AR marker detection (unsure: is that what you tried already?). If that approach doesn't work, I think your only Tango option is to add more features, lighting, etc. to the space to make the tracking more solid.
Overall, natural feature tracking with Vuforia (or marker tracking, for robustness) sounds better suited to what I think your project is doing, as users will mostly be looking at the AR tag/NFT objects. However, if its robustness is not up to scratch, Tango could provide a similar solution.