I have two cylinders in my scene. The outer cylinder (Cylinder A) has planes facing the outside and planes facing the inside to represent glass.
Cylinder A has a Standard Shader on it, set to Transparent mode, with an albedo alpha of 143. It also has a shininess, represented by its Smoothness, which is set to 1.
Cylinder B has a Standard Shader on it, set to Transparent mode. It has an albedo alpha that animates from 255 to 0 over a short period of time, a normal map strength of 0.3, and no smoothness.
Here is an image to act as a guide.
The problem is that Cylinder B is not 'playing ball'. It should appear inside Cylinder A, and it would look right if Cylinder A's shininess were visible to the viewer.
I have provided a second example where Cylinder B is in Opaque mode (Standard Shader). It looks right, except it cannot animate a fade.
I have looked into this. From what I have experienced, Cutout mode renders each pixel either fully transparent or fully opaque, nothing in between. Fade mode has the same issue as Transparent.
I can see that this has to do with Z ordering and that ZWrite may be the way to fix it. I am unfamiliar with its placement within shaders, or with whether it will still work while an animation fades out the alpha of Cylinder B. Can anyone point me in the right direction for further understanding?
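For reference, here is roughly what I understand the C# side of a ZWrite override would look like. The hidden _ZWrite property and the render queue value are assumptions I picked up while reading about the Standard shader, so treat this as an untested sketch rather than a working fix:

using UnityEngine;

// Attach to Cylinder B. Forces its transparent material to write depth
// and to draw before other transparent objects (both are assumptions).
public class ForceTransparentZWrite : MonoBehaviour
{
    void Start()
    {
        Material mat = GetComponent<Renderer>().material;
        mat.SetInt("_ZWrite", 1);   // write to the depth buffer while fading
        mat.renderQueue = 2999;     // draw just before the default transparent queue (3000)
    }
}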
Any help is greatly appreciated.
I have these two meshes:
In my game, I put the hat on the hair at runtime:
As you can see, as expected, the hair is visible outside the hat part.
How can I achieve this in Unity (what kind of mask shader should I use?):
I've tried to make a depth mask, but it hides every mesh in my scene. I just want to hide the hair, not the other meshes.
And what if I have two players with the same setup? Would player 1's mask hide player 2's hair? How can I avoid that?
What I would do:
Write a C# script that gets the pivot position (the bottom of the hat) and its up vector every frame.
Build a plane from these values: the up vector would be the normal vector of the plane, and a plane can be defined by a point and a normal vector.
Pass the plane equation to the shader (via Material.SetFloat or Material.SetVector) and evaluate whether the world position of each hair vertex is on the correct or the wrong side of the plane, as in the sketch below.
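A minimal sketch of the C# side, assuming the hat has a pivot transform at its bottom edge and the hair material exposes a _ClipPlane vector (the property name is made up for this example; the hair shader would have to read it and discard fragments on the wrong side):

using UnityEngine;

public class HairClipPlane : MonoBehaviour
{
    public Transform hatPivot;      // empty transform placed at the bottom of the hat
    public Material hairMaterial;   // material of the hair mesh

    void Update()
    {
        // A plane is defined by a point and a normal: here the hat pivot
        // position is the point and its up vector is the normal.
        Vector3 n = hatPivot.up.normalized;
        float d = -Vector3.Dot(n, hatPivot.position);

        // Pack the plane equation (n.x, n.y, n.z, d) into one vector.
        // "_ClipPlane" is a made-up property name; the hair shader must read
        // it and discard fragments whose world position gives
        // dot(n, worldPos) + d > 0, i.e. above the plane, inside the hat.
        hairMaterial.SetVector("_ClipPlane", new Vector4(n.x, n.y, n.z, d));
    }
}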
I'm trying to achieve an effect on single 2D sprites similar to the one used in anime when characters are moving fast.
My starting point was the Tiling and Offset node in URP Shader Graph to distort the sprite. I could change the tiling based on variables such as time, but that didn't achieve the desired effect. The main problem with that node is that it distorts the whole sprite by the same amount, while the desired effect is a distortion that varies along the height of the sprite.
Anyone got any insights on this?
Here's my reference point,
base sprite:
distorted (I would like a more detailed, less distorted effect, but I hope you get the idea):
Edit 1: My current progress
I am using Unity 5 to develop a game. I'm still learning, so this may be a dumb question. I have read about Depth Buffer and Depth Texture, but I cannot seem to understand if that applies here or not.
My setup is simple: I create a grid using several quads (40x40) which I use to snap buildings. Those buildings also have a base, made with quads. Every time I put one on the map, the quads overlap and they look like the picture.
As you can see, the red quad is "merging" with the floor (white quads).
How can I make sure Unity renders the red one on top and treats the white ones as background? Of course, I could change the red quad's Y position, but that seems like the wrong way of solving this.
This is a common issue, called Z-Fighting.
Usually you can reduce it by reducing the range of “Clipping Planes” of the camera, but in your case the quads are at the same Y position, so you can’t avoid it without changing the Y position.
I don't know if it is an option for you, but if you use a SpriteRenderer (Unity 2D) you don’t have that problem, and you can just set “Sorting Layer” or “Order in Layer” if you want to modify the rendering order.
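For example, a rough sketch of setting the order from code, assuming both the floor tiles and the building base use a SpriteRenderer:

using UnityEngine;

// Attach to the building base sprite so it draws on top of the floor,
// which is assumed to keep the default sorting order of 0.
public class BuildingBaseSorting : MonoBehaviour
{
    void Start()
    {
        GetComponent<SpriteRenderer>().sortingOrder = 1;
    }
}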
I have a Projector component and I need to find the angle that the projected texture falls at, so I can exclude projection onto vertical faces.
My projector is under the mouse pointer and works fine when it is over a horizontal face:
I would like the projector to switch off on vertical faces to avoid this bad effect:
If possible, I would like to do it in the shader code, so that the projected image is suppressed on vertical faces even when the cursor is located near the corner of a horizontal face and part of the projection 'spills over' onto a vertical face.
I found this solution in C#:
Ray mouseRay = Camera.main.ScreenPointToRay(Input.mousePosition);
RaycastHit hitInfo;

if (Physics.Raycast(mouseRay, out hitInfo)) {
    // The surface under the cursor points upwards: draw the projection.
    if (hitInfo.normal.y > 0) {
        // draw
    } else {
        // not draw
    }
}
But it only works on curved surfaces and not, for example, on the faces of cubes.
How can I do this properly?
Normally you would use an image on a quad with TGA transparency, which rotates itself to the face that the middle of the object is aligned to, using a raycast to find the hit point and taking its surface normal.
Other ways of doing it would be quite tricky, perhaps using decals. If you did it in a shader it would take much more time; it's a case of problem solving not being ordered by importance for fast development. Technically you can project a volumetric texture onto whatever object you are using: that way you can add your barred circle, projected from a point in space towards the object, as a mathematical formula. It takes a while to do; check out volumetric textures. I have written some, and in your case it needs the mouse position sent to the texture and some maths to add the transparent zone and the red zone to the texture. It takes all day.
It's fine to have a flat circle that flips around when you move the pointer onto a different face; it will just look like a physical card, and it's much easier to code: 10 minutes instead of many hours.
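A rough sketch of that simpler approach, assuming the marker is a separate quad you position and orient from the raycast hit (the 0.01 offset and the 0.5 normal threshold are arbitrary values for the example):

using UnityEngine;

public class CursorMarker : MonoBehaviour
{
    public Transform markerQuad;   // quad carrying the barred-circle texture

    void Update()
    {
        Ray mouseRay = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;

        if (Physics.Raycast(mouseRay, out hit) && hit.normal.y > 0.5f)
        {
            // Lift the quad slightly off the surface to avoid z-fighting.
            markerQuad.position = hit.point + hit.normal * 0.01f;
            // Unity's built-in Quad faces its local -Z, so point +Z into the surface.
            markerQuad.rotation = Quaternion.LookRotation(-hit.normal);
            markerQuad.gameObject.SetActive(true);
        }
        else
        {
            // Hide the marker on vertical faces or when nothing is hit.
            markerQuad.gameObject.SetActive(false);
        }
    }
}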
I am trying to add a transparent texture on top of a cube. Only the front face is not transparent; the other sides are transparent. What could be the problem? Any help is appreciated.
EDIT: I found that the face which is drawn first is opaque.
Three faces of the cube are drawn:
the opaque face (this face's indices are given first in glDrawElements),
and the transparent faces.
You have most probably run into a sorting problem. To display transparent geometry correctly, the faces of the object have to be sorted from back to front.
Unfortunately there is no built-in support for that in OpenGL ES (or in any graphics library in existence). The only possibility is to sort your polygons, recreate your object each frame, and draw it with correctly ordered faces.
A workaround would be using additive transparency instead of normal transparency. Additive blending is an order-independent calculation. You have to remember to turn off z-buffer writes while drawing, because otherwise some geometry may be occluded.
Additive transparency is achieved by setting both blendfunc values to GL_ONE.
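A minimal sketch of the state changes around the transparent draw call, written with OpenTK-style ES 2.0 bindings for C# (the exact enum names vary between binding versions; in plain C it is glEnable/glBlendFunc/glDepthMask with the same arguments):

using OpenTK.Graphics.ES20;

static class AdditiveBlendPass
{
    // Call after the opaque faces have been drawn.
    public static void Begin()
    {
        GL.Enable(EnableCap.Blend);
        // Additive transparency: result = source * 1 + destination * 1.
        GL.BlendFunc(BlendingFactorSrc.One, BlendingFactorDest.One);
        // Stop writing depth so the transparent faces cannot occlude each other.
        GL.DepthMask(false);
    }

    // Call after the transparent faces have been drawn.
    public static void End()
    {
        GL.DepthMask(true);
        GL.Disable(EnableCap.Blend);
    }
}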