I have a bunch of models that use inverted-normal (flipped hull) copies for outlines. However, I want the meshes to render in front of the outlines so there's no glitchy clipping. Does anyone know of a shader or another way to render the meshes in front of the outlines?
Changing the render queue for the outline shader should fix your issues.
In the outline shader:
Tags { "Queue" = "Geometry-1" }
Just make sure the outline's queue value is lower than (i.e. renders before) the queue used by the other objects. An example of where the tags go is below.
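For reference, here is a minimal ShaderLab sketch showing where the tag sits; the shader name and pass contents are placeholders, not the asker's actual shader:

Shader "Custom/InvertedHullOutline" // hypothetical name
{
    SubShader
    {
        // "Geometry-1" resolves to queue 1999, one step before the default
        // Geometry queue (2000), so this outline is drawn before objects
        // that use the default opaque queue.
        Tags { "Queue" = "Geometry-1" }

        Pass
        {
            Cull Front // typical for inverted-hull outlines: keep only the flipped faces

            // ... outline pass: push vertices out along their normals and
            //     output the outline colour ...
        }
    }
}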
I have this texture, and when I add it to a "Sample Texture 2D" node it is also drawn where there is simply no sprite (the transparent background). It looks like this. I made an outline with a shader, and because of this, when I combine the outline with the texture, they blend together where there is no texture. How do I solve this problem?
Shader Graph's "Sample Texture 2D" Node doesn't preview the alpha channel (transparency) correctly, although it does keep the alpha value (since it returns a vec4, rather than a vec3).
It's unclear exactly what you want to do here. If you want the outline to be a solid color rather than using the texture, use a "Blend" node instead of the "Add" node you're currently using, with the "Overlay" mode selected.
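As a separate sketch: if the goal is that the outline colour only blends in where the sprite actually has pixels, you can mask the combination by the sampled alpha, for example in a Custom Function node. The function name and inputs below are made up for illustration:

// Hypothetical Custom Function node body (HLSL).
// spriteCol = output of Sample Texture 2D, outlineCol = your outline colour.
void CombineOutline_float(float4 spriteCol, float4 outlineCol, out float4 result)
{
    // Where the sprite is opaque, show the sprite; where it is fully
    // transparent, only the outline colour remains.
    result = lerp(outlineCol, spriteCol, spriteCol.a);
}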
Many image editors save a background colour even for fully transparent pixels. If the texture's import type isn't set to Sprite, Unity can display those hidden "artist smear" areas.
If the preview bugs you, you can re-export the image from GIMP with "Save background color" disabled. Unity will then display the transparent areas as black, which might be more pleasant to look at.
I made a 2D outline shader, but the outline does not render beyond the boundary of the sprite.
So I need to enlarge the area the shader can draw into.
Is there a simple way to solve this, other than remaking the sprites or changing vertex positions in the vertex shader?
In principle you can only draw within the geometry of your object.
In theory, you could tweak the geometry in the vertex shader to expand the borders, but I don't know how to do that reliably.
Almost all of the edge/outline shaders I'm aware of instead use a full-screen post-processing pass to render the outlines.
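Just to illustrate the first idea (a rough sketch only, assuming a simple full-rect sprite with a centred pivot; _Expand is a made-up property, and this is not claimed to work reliably in general):

#include "UnityCG.cginc"

struct v2f
{
    float4 vertex : SV_POSITION;
    float2 uv : TEXCOORD0;
};

float _Expand; // hypothetical property, e.g. 1.2 = 20% extra room around the sprite

v2f vert (appdata_base v)
{
    v2f o;
    // Scale the quad outward around the sprite pivot so there is empty space
    // around the sprite for the outline to be drawn into.
    v.vertex.xy *= _Expand;
    o.vertex = UnityObjectToClipPos(v.vertex);
    // Expand the UVs around their centre by the same factor so the sprite keeps
    // its on-screen size; the border region now samples outside 0..1 and must be
    // treated as transparent in the fragment shader.
    o.uv = (v.texcoord.xy - 0.5) * _Expand + 0.5;
    return o;
}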
I created an HLSL shader that renders a Sierpinski fractal using raymarching. Currently I have assigned the shader to a material, and that material is assigned to a cube placed in the scene, so the Sierpinski fractal is displayed/rendered on the cube's geometry.
How can I use the whole screen / camera view to display my shader? I don't want to add my shader to a material that I assign to geometry.
In case someone comes to this question with the same problem I had, you can do the following (a rough C# sketch follows the list):
- use Graphics.Blit to execute a certain shader,
- store the result in a RenderTexture,
- assign that RenderTexture to e.g. a Canvas RawImage.texture.
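A minimal sketch of that setup, assuming the built-in render pipeline; the class and field names are placeholders, and the material is one created from the raymarching shader:

using UnityEngine;
using UnityEngine.UI;

public class FullscreenShaderBlit : MonoBehaviour
{
    public Material raymarchMaterial; // material created from the raymarching shader
    public RawImage targetImage;      // RawImage on a Canvas that will show the result
    private RenderTexture rt;

    void Start()
    {
        // Create a screen-sized RenderTexture to hold the shader's output.
        rt = new RenderTexture(Screen.width, Screen.height, 0);
        targetImage.texture = rt;
    }

    void Update()
    {
        // Run the shader over the whole RenderTexture; the source can be null
        // because the raymarching shader generates the image by itself.
        Graphics.Blit(null, rt, raymarchMaterial);
    }

    void OnDestroy()
    {
        if (rt != null) rt.Release();
    }
}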
I am new to writing shaders. I want to use a texture for a 6-sided skybox in Unity, and I want that texture to be repeated several times (tiling).
But the default 6-sided skybox shader in Unity doesn't have a tiling option. Can anyone write a custom shader for a 6-sided skybox that has an option to tile the textures? I also want an option to apply a color tint to the texture, if possible. Thanks in advance.
Tiling can be achieved by multiplying the texcoord by the number of tiles you want. In a Surface shader it's uv_YourTex (likely uv_MainTex) instead of texcoord. Writing from a phone so I can't post an example, but it's really just one multiplication.
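For illustration, that multiplication could look something like this in the fragment shader of a 6-sided-skybox-style shader (a rough sketch; _Tiling and _Tint are assumed extra properties, not part of the built-in 6 Sided shader):

// Fragment sketch for one of the six face textures (_FrontTex here), tiled and tinted.
sampler2D _FrontTex;
float _Tiling;  // e.g. 4 repeats the texture 4x4 per face
fixed4 _Tint;   // tint colour

fixed4 frag (v2f i) : SV_Target
{
    // frac() wraps the scaled UV back into 0..1 so the texture repeats even if
    // its wrap mode is not set to Repeat.
    return tex2D(_FrontTex, frac(i.uv * _Tiling)) * _Tint;
}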
I don't know your specific scenario, but I needed a more detailed sky from a not-very-detailed texture, and instead of the UVs I used the view direction to sample the texture. It made clouds in the distance look further away, and the clouds shift as you move. It's in this asset.
View-direction sampling won't help if you are trying to make a space skybox though, which seems to be the case here.
Also, IMHO, tiling on a skybox might be too visible.
I am importing an FBX model, but parts of it are invisible in the Scene and Game views at certain angles.
Attached image SS-1 = the full model; you can see that some parts of the mesh are invisible
Attached image SS-2 = close-up of an invisible mesh, selected
Attached image SS-3 = Main Camera Settings
Attached image SS-4 = Model import settings
Any ideas what's going on?
The normals of your mesh are not set properly, so the culling algorithm treats it as a back-face that should not be rendered.
If you can edit the model and invert the normals, that would work; most modeling tools have convenient tools or direct commands for "flipping normals". However, if that is not possible, a trick is to change the culling setting used by your material's shader: when the culling mode is set to Cull Back (the default), polygons that are not facing the camera are not rendered. For the mesh that is not visible, you can change the culling mode from Cull Back to Cull Front, and this way it will be visible.
The caveat is that this is easy to overlook later, since Cull Front and Cull Off settings are not as common as Cull Back. Also, performance-wise, you will have a different shader running just for that mesh.
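For reference, the culling mode is set in the shader's ShaderLab code, at SubShader or Pass level (a minimal sketch, not the asker's shader):

SubShader
{
    // Cull Back (default): faces pointing away from the camera are skipped.
    // Cull Front: faces pointing toward the camera are skipped, which renders
    //             the "inside" and helps when the normals are flipped.
    // Cull Off:   both sides are rendered (two-sided).
    Cull Front

    Pass
    {
        // ... rest of the shader ...
    }
}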
Try a two-sided shader, as suggested here.
The easiest solution I found for this problem is:
1. Select the object that turns invisible at certain angles.
2. In the object's material (shown under the Mesh Renderer in the Inspector), set Render Face to Both.
The problem should be solved now.
My Unity version is 2019.3.