Skybox affecting sprites - unity3d

The default skybox affects the sprite's colour. I'm not sure how to stop this from happening. You can see below that the outline of the sprite is brown instead of grey.
If I set the main camera's Clear Flags to Solid Color and its Background to black, the brown from the skybox still shows through.
I'm not sure what information would be useful for others to assist me with this, so if there is any info I can add, just let me know and I'll update accordingly.

What you're seeing here is alpha blending: your 'outline' pixels have low alpha, so when they are rendered, Unity blends their colour with the background. Alpha generally represents transparency, but in this case it's acting as a kind of anti-aliasing (it makes your sprites look smoother and less pixelated). In that sense, it's working as intended. Look at your sprites at normal resolution and you'll notice they should look good.
If you prefer the pixelated look and really don't want this behaviour, you need to edit your sprites to have full alpha on those pixels.
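If you go that route, here is a rough editor-side sketch of the fix, assuming the texture asset is marked Read/Write enabled and that hard-thresholding the alpha is acceptable:

```csharp
using UnityEngine;

// Rough sketch: snap each pixel's alpha to 0 or 1 so no partially
// transparent "outline" pixels are left to blend with the background.
// Assumes the texture is marked Read/Write enabled in its import settings.
public static class AlphaHardener
{
    public static void HardenAlpha(Texture2D tex, float threshold = 0.5f)
    {
        Color[] pixels = tex.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
            pixels[i].a = pixels[i].a >= threshold ? 1f : 0f;
        tex.SetPixels(pixels);
        tex.Apply();
    }
}
```

Run it once on the sprite texture: any pixel at or above the threshold becomes fully opaque, everything below it fully transparent.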

I'm not sure about Empty's answer:
Are you using the Standard shader? That shader uses the environment map as an ambient color on top of the diffuse color (albedo). This is perfectly normal behaviour and makes the colors of everything blend nicely with each other (with an orange sky, white objects will look orange-ish). This is part of "physically correct lighting" (PBL). It is not at all physically correct, but it comes close enough for now.
If you don't want the environment to affect the color that much, you could either change the environment map to a less dramatic color (a normal blue/white sky) or use an unlit shader/material.
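For the unlit route, a minimal sketch; note that "Unlit/Texture" is a built-in pipeline shader name, and URP/HDRP use different ones:

```csharp
using UnityEngine;

// Sketch: swap a renderer's shader to an unlit one so environment
// lighting / the skybox no longer tints it. Shader names vary by
// render pipeline; "Unlit/Texture" is the built-in pipeline name.
public class MakeUnlit : MonoBehaviour
{
    void Start()
    {
        var rend = GetComponent<Renderer>();
        var unlit = Shader.Find("Unlit/Texture");
        if (unlit != null)
            rend.material.shader = unlit;
    }
}
```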

Window->Rendering->Lighting Settings
Under Debug Settings, select Generate Lighting.
De-select Auto Generate.
Under Baked Lightmaps, set the Lighting Data Asset to None.
Remove the generated lighting data from the project. (A scripted equivalent is sketched below.)
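A hedged scripted equivalent of those steps, using the UnityEditor.Lightmapping API (exact API availability varies by Unity version, so verify against your version's docs; the script must live in an Editor folder):

```csharp
using UnityEditor;

// Editor-only sketch: stop auto-generation and drop baked lighting data.
public static class LightingCleanup
{
    [MenuItem("Tools/Clear Baked Lighting")]
    public static void ClearBakedLighting()
    {
        // Switch from continuous (auto) baking to on-demand baking.
        Lightmapping.giWorkflowMode = Lightmapping.GIWorkflowMode.OnDemand;

        // Clear the current bake and detach the Lighting Data Asset.
        Lightmapping.Clear();
        Lightmapping.ClearLightingDataAsset();
    }
}
```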

Related

All shaders render black while using Shader Graph in Unity 2020.1.10f

I am doing a simple animation in Unity and I wanted to add a portal effect using Shader Graph 8.2.0, but everything renders black
like this
shader settings
I have no clue whatsoever what it is
I'm still learning the pipeline, but I think you need to scale up the texture at the top left (I can't read its name from the image). You are getting the red/yellow mix and the swirl, but the area that's being blacked out corresponds to that texture: its white-to-grey-to-black gradient doesn't cover a large enough area, so only the centre renders correctly. Expanding the white/grey region, for example by scaling that texture up, should produce the effect you want.

Unity 2D shader graph not correctly colored

I tried to create outline shader for my Sprite:
I watched tutorials on YouTube (CodeMonkey and Brackeys) about this, but it only half worked. In Shader Graph I see the preview:
But the outline color is so faded:
And I also can't see this shader in the Scene preview:
What am I doing wrong?
Thanks for your attention
P.S.: Of course I set up the render pipeline as shown in this video
P.P.S: Material settings:
P.P.P.S: Shader file
OK, first of all, you hadn't set the color in the corresponding Outline Color slot in the Material settings.
Then, what you are using is Add, which combines colors additively, pushing the result towards white.
Since your original texture isn't fully black but grey-ish, the outline color is added to the already existing color(s), making them lighter too!
Instead you could use a Blend node and re-use the output of the Subtract node as the blend texture (Opacity).
Then you have to play a bit with the mode, but I think Overwrite or PinLight may be what you want.
(Actually you wouldn't need the additional Multiply node for the Outline Color.)
Btw, before you added your file I had already reproduced one from scratch, and it is way less complex than yours ;)
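To see why Add washes the result out while an overwrite-style Blend doesn't, here is the node arithmetic written out in plain C#, purely as illustration (not Unity API):

```csharp
using UnityEngine;

// Illustrative node math for the two approaches discussed above.
public static class OutlineBlendDemo
{
    // "Add": base colour plus outline colour, so a grey-ish base
    // only ever gets lighter, drifting towards white.
    public static Color AddNode(Color baseCol, Color outlineCol, float mask) =>
        baseCol + outlineCol * mask;

    // Overwrite-style blend: the mask (the Subtract output used as
    // opacity) picks the outline colour instead of adding to the base.
    public static Color OverwriteBlend(Color baseCol, Color outlineCol, float mask) =>
        Color.Lerp(baseCol, outlineCol, mask);
}
```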

Unity - Render Texture from Camera's targetTexture produces seams

I am attempting to render a specific section of my scene using a separate camera and a render texture. That object is on a separate layer that the main camera is not rendering, but the second camera is. The secondary camera has its target texture set to a render texture that I have created. Everything is working as intended except that the object, when rendered to a texture, has a bunch of seams that are not present when rendering directly to the screen.
What it looks like when rendered directly to the screen:
Correct
What it looks like when rendered to a texture, and then displayed on a quad in the scene:
Incorrect
Notice how the second image has a bunch of transparent "lines" in between the sprites where there shouldn't be any.
I am using a basic transparent shader to display the render texture on the quad (since the background isn't part of the render texture, just the black crowd part). I have tried a number of different shaders, and none of them seem to make a difference.
The render texture's settings are: Width: Screen.width, Height: Screen.height, Format: RenderTextureFormat.ARGBFloat.
Unity Version: 5.2.3f1 - iOS Platform
Note: The reason I am doing this is so that I can apply a "Blur" image effect to the texture, and make the crowd in the foreground appear to be out of focus. Any alternative suggestions for how to do this are also welcome.
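For context, a minimal sketch of the setup described above (the "Crowd" layer and the field names are illustrative):

```csharp
using UnityEngine;

// Sketch of the described setup: a second camera renders one layer
// into a RenderTexture, which is then displayed on a quad.
public class CrowdCapture : MonoBehaviour
{
    public Camera crowdCamera;   // renders only the crowd layer
    public Renderer displayQuad; // quad that shows the captured texture

    void Start()
    {
        var rt = new RenderTexture(Screen.width, Screen.height, 24,
                                   RenderTextureFormat.ARGBFloat);
        crowdCamera.targetTexture = rt;
        crowdCamera.cullingMask = LayerMask.GetMask("Crowd"); // hypothetical layer
        displayQuad.material.mainTexture = rt;
    }
}
```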
I'm not quite sure, but it almost sounds like you have line ghosting. You may want to give this a read and let me know if that's what you're dealing with or not:
This is caused by how the texture image was authored, combined with the filtering that most 3d engines use when textures are displayed at different sizes on screen.
Your image may have coloured areas which are completely opaque, coloured areas which are partially transparent, and areas which are completely transparent. However, the areas where your alpha channel is completely transparent (0% opacity) actually still have a colour value too. PNGs (or at least the way Photoshop exports PNGs) seem to default to using white for the completely transparent pixels. With other formats or editors, this may be black. Both are equally undesirable when it comes to use in a 3d engine.
You may think, "why is the white colour a problem if it's completely transparent?". The problem occurs because when your texture appears on screen, it's usually either upscaled or downscaled depending on whether the pixels in the texture's image are appearing larger or smaller than actual size. For downsizing, a series of downscaled versions gets created during import. These downscaled versions get used when the texture is displayed at smaller sizes or steeper angles in relation to the view, which is intended to improve visual quality and make rendering faster. This process is called "mip-mapping"; read more about mip-mapping here. For upscaling, simple bilinear interpolation is normally used.
The scaled versions are usually created using simple bilinear interpolation, which means that the transparent pixels are mixed with the neighbouring visible pixels. With the mipmaps, for each smaller level, the problem of the invisible pixels mixing with the visible pixel colours increases (with the result that your nasty white edges become more apparent at further distances).
The solution is to ensure that these completely transparent pixels have a colour value which matches their neighbouring visible pixels, so that when the interpolation occurs, the colour 'bleed' from the invisible pixels is of the appropriate colour.
To solve this (in Photoshop) I always use the free "Solidify" tool from the Flaming Pear Free Plugins pack, like this:
Download and install the Flaming Pear "Free Plugins" pack (near the bottom of that list)
Open your PNG in photoshop.
Go to Select -> Load Selection and click OK.
Go to Select -> Save Selection and click OK. This will create a new alpha channel.
Now Deselect all (Ctrl-D or Cmd-D)
Select Filter -> Flaming Pear -> Solidify B
Your image will now appear to be entirely made of solid colour, with no transparent areas, however your transparency information is now stored in an explicit alpha channel, which you can view and edit by selecting it in the channels palette.
Now re-save your image, and you should find your white fuzzies have disappeared! (A scripted alternative is sketched after the source link below.)
Source: http://answers.unity3d.com/questions/10302/messy-alpha-problem-white-around-edges.html
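If you'd rather do this in code than in Photoshop, here is a rough sketch of the same "solidify" idea: bleed the colour of visible pixels into fully transparent ones so filtering and mipmaps don't mix in white. This is an illustrative dilation pass, not the plugin's exact algorithm, and it assumes the texture is marked Read/Write enabled:

```csharp
using UnityEngine;

// Illustrative "solidify" pass: grow the coloured region into fully
// transparent pixels, leaving alpha untouched, so bilinear filtering
// and mipmapping no longer bleed white into the edges.
public static class TextureSolidify
{
    public static void Solidify(Texture2D tex, int passes = 4)
    {
        int w = tex.width, h = tex.height;
        Color[] pixels = tex.GetPixels();

        // Track which pixels already carry meaningful colour.
        bool[] filled = new bool[pixels.Length];
        for (int i = 0; i < pixels.Length; i++)
            filled[i] = pixels[i].a > 0f;

        // Each pass pushes colour one pixel further into transparent areas.
        for (int p = 0; p < passes; p++)
        {
            var nextPixels = (Color[])pixels.Clone();
            var nextFilled = (bool[])filled.Clone();

            for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
            {
                int i = y * w + x;
                if (filled[i]) continue;

                // Average the colour of any filled 4-neighbours.
                Color sum = Color.clear;
                int n = 0;
                if (x > 0     && filled[i - 1]) { sum += pixels[i - 1]; n++; }
                if (x < w - 1 && filled[i + 1]) { sum += pixels[i + 1]; n++; }
                if (y > 0     && filled[i - w]) { sum += pixels[i - w]; n++; }
                if (y < h - 1 && filled[i + w]) { sum += pixels[i + w]; n++; }

                if (n > 0)
                {
                    Color c = sum / n;
                    c.a = 0f;            // keep it transparent, fix only RGB
                    nextPixels[i] = c;
                    nextFilled[i] = true;
                }
            }

            pixels = nextPixels;
            filled = nextFilled;
        }

        tex.SetPixels(pixels);
        tex.Apply();
    }
}
```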
Turns out that the shader I was using for my scene was using "Blend SrcAlpha OneMinusSrcAlpha" for some reason, when it should have been using "Blend One OneMinusSrcAlpha". This was causing objects with alpha less than 1 to make the objects under them semi-transparent as well, exposing the camera's clear colour background.
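For anyone wondering why that matters: the two directives correspond to different blend arithmetic. Written out in plain C# purely for illustration (this mirrors what the GPU does; it is not Unity API):

```csharp
using UnityEngine;

// Conceptual comparison of the two ShaderLab blend directives above.
public static class BlendMath
{
    // "Blend SrcAlpha OneMinusSrcAlpha": straight (non-premultiplied)
    // alpha; the source colour is multiplied by its alpha at blend time.
    public static Color StraightAlpha(Color src, Color dst) =>
        src * src.a + dst * (1f - src.a);

    // "Blend One OneMinusSrcAlpha": premultiplied alpha; the source
    // colour already carries its alpha, so it is added as-is.
    public static Color Premultiplied(Color src, Color dst) =>
        src + dst * (1f - src.a);
}
```

With premultiplied blending the source colour already carries its alpha, so alpha isn't applied a second time at blend time, which is what was thinning out the overlapping sprite edges here.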

Unity - How can I control fog color based on skybox color?

I have a gradient on a skybox, and I want the terminal fog color to be the same as the sky color behind the object that is being occluded. But it still needs to be fog and respect distance.
I see one way of doing it without actually using fog, with a gradient of my choosing on a skybox. The skybox gets masked out by the depth map, and then overlaid onto the non-sky geometry. You would be able to look down into the depths, and the blackness below gets darker gradually like a "fog" because it is masked by the depth operation. Look up and see the same phenomenon with the lighter color. Basically control "fog" terminal color by camera angle.
What would be the best way to implement this, and could it be performant on mobile devices?
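Shader details aside, one hedged way to prototype this in the built-in pipeline is a full-screen image effect: C# requests a depth texture and blits the scene through a material whose shader samples _CameraDepthTexture and lerps each pixel towards the sky gradient by depth. The shader itself is omitted here, the component name is made up, and mobile performance would need profiling:

```csharp
using UnityEngine;

// Hypothetical scaffolding for a depth-masked "sky fog" overlay
// (built-in render pipeline only; OnRenderImage does not run in URP/HDRP).
[RequireComponent(typeof(Camera))]
public class DepthFogOverlay : MonoBehaviour
{
    // Material whose shader samples _CameraDepthTexture and blends
    // the skybox gradient over distant fragments.
    public Material fogMaterial;

    void OnEnable()
    {
        // Ask Unity to render a depth texture we can sample in the shader.
        GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
    }

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        Graphics.Blit(src, dst, fogMaterial);
    }
}
```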

OpenGL ES flash of bright light

I'm having some trouble achieving my desired effect in an OpenGL ES app I'm working on. I can use OpenGL ES 1.1 or 2.0. What I'm trying to achieve is the following...
In a 2D ortho scene (black clear color), render a red square to the screen with some transparency so it is a darker red (or just set it to a darker red color). This is no problem for me. Then, when a user clicks in the region of the square, I want it to quickly flash in a bright flash of light (just in the region of the square). This flash doesn't have to persist long at all, just enough that if the user was in a completely dark room, this flash of light would create a brief noticeable flash in the user's face. I've been having some trouble getting a "light bloom" or glow effect to work efficiently, and was wondering if anyone had ideas for a quick, efficient way to make the color flash brightly for a split second, possibly through the use of some kind of texturing trick that I don't know of. Also, the flash doesn't have to blur outside of the region; it can be fully contained within the region with sharp edges. Really all I'm after is the aesthetic of the flash lighting the immediate area around the screen.
Disable texturing, set glColor3ub(255, 255, 255), render the square, wait a bit, then redraw the square normally.