Hide parts of mesh overlapping another mesh in Unity - unity3d

I have these two meshes:
In my game, I put the hat on the hair at runtime:
As you can see, as expected, the hair is visible outside the hat.
How can I achieve this in Unity (what kind of mask shader should I use?):
I've tried to make a depth mask, but it hides every mesh in my scene. I just want to hide the hair, not the other meshes.
And what if two players have the same setup? Would player 1's mask hide player 2's hair? How can I avoid that?

What I would do:
Write a C# script that gets the pivot position (bottom part of the hat) and its up vector every frame.
Build a plane from these values: a plane can be defined by a point and a normal vector, and the up vector would serve as the plane's normal.
Pass the plane equation to the hair shader (via Material.SetFloat or Material.SetVector) and check whether the world position of each hair vertex is on the correct or the wrong side of the plane, hiding the parts on the wrong side.
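A minimal C# sketch of that approach, assuming the hair material's shader reads a _ClipPlane vector (the shader-side clip/discard is not shown); HairClipPlane, hatPivot and hairRenderer are placeholder names, not part of the original answer:

using UnityEngine;

public class HairClipPlane : MonoBehaviour
{
    public Transform hatPivot;    // empty transform at the bottom of the hat (assumed setup)
    public Renderer hairRenderer; // renderer of the hair mesh (assumed setup)

    void Update()
    {
        // A plane is defined by a point (the hat pivot) and a normal (the hat's up vector).
        Vector3 n = hatPivot.up;
        float d = -Vector3.Dot(n, hatPivot.position);

        // Pass the plane equation (n.x, n.y, n.z, d) to this hair's material only.
        hairRenderer.material.SetVector("_ClipPlane", new Vector4(n.x, n.y, n.z, d));
    }
}

Because Renderer.material returns a per-renderer instance, each player's hair gets its own plane, so one player's hat cannot hide another player's hair; the hair shader (not shown) would then clip or discard fragments whose world position is on the hat's side of the plane.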

Related

Why is my Unity plane seemingly 10 times too big

I'm a relative Unity noob. I have a fairly simple scene. Currently in the following you will see a plane (object WorldTilemapGfx) and 2 sprites (Tile C: 0 R: 0, and Tile C: 1 R: 0).
In the following picture you see I've selected one of the sprites. Its scale is 1 x 1, and it's at position 1, 0.
Now I select the other sprite.
So far the positions and sizes seem ok.
Now if I select the game object with a "plane" mesh it shows in the inspector as scale 2, 1. This is the scale I expect since it is supposed to be as wide as two of the tiles above, and as high as only 1 of them.
However, it's visually 10 times too big.
If I increase the X scale of one of my tiles by 10, then the relative sizes between the tile and the plane look OK.
Also the image used for my tile is 256 x 256.
Can someone suggest what I am missing? Thanks.
See Unity Mesh Primitives
Plane
This is a flat square with edges ten units long oriented in the XZ plane of the local coordinate space. It is textured so that the whole image appears exactly once within the square. A plane is useful for most kinds of flat surface, such as floors and walls. A surface is also needed sometimes for showing images or movies in GUI and special effects. Although a plane can be used for things like this, the simpler quad primitive is often a more natural fit to the task.
whereas
Quad
The quad primitive resembles the plane but its edges are only one unit long and the surface is oriented in the XY plane of the local coordinate space. Also, a quad is divided into just two triangles whereas the plane contains two hundred. A quad is useful in cases where a scene object must be used simply as a display screen for an image or movie. Simple GUI and information displays can be implemented with quads, as can particles, sprites and "impostor" images that substitute for solid objects viewed at a distance.
OK, confirmed: using a Quad gave me the scale I expected. I now understand that the underlying Plane mesh is actually 10 x 10 units in size.
https://forum.unity.com/threads/really-dumb-question-scale-of-plane-compared-to-cube.33835/#:~:text=aNTeNNa%20trEE%20said%3A-,The%20plane%20is%20a%2010x10%20unit%20mesh.,a%20quick%20floor%20or%20wall.
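For example, a rough sketch of the arithmetic under the assumption that you want a surface covering 2 x 1 world units (planeTransform and quadTransform are placeholder references):

using UnityEngine;

public class SurfaceSizer : MonoBehaviour
{
    public Transform planeTransform; // a GameObject using Unity's Plane mesh (assumed)
    public Transform quadTransform;  // a GameObject using Unity's Quad mesh (assumed)

    void Start()
    {
        float width = 2f, height = 1f; // target size in world units

        // Unity's Plane primitive is 10 x 10 units in the XZ plane, so divide by 10.
        planeTransform.localScale = new Vector3(width / 10f, 1f, height / 10f);

        // Unity's Quad primitive is 1 x 1 unit in the XY plane, so the scale maps directly.
        quadTransform.localScale = new Vector3(width, height, 1f);
    }
}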

ARKit use Lidar mesh to smooth estimated planes

I'm trying to use ARKit's mesh scene reconstruction (with LiDAR) data to improve the detected planes' geometry.
Right now, when pointing at a surface, ARKit gives me a very rough rectangle (far from the actual surface's dimensions). It appears almost instantly, but it remains far from the actual shape.
I'm trying to use this plane info, hit detection, and mesh data, to actually draw a smoothed rectangle around the detected surface. I don't expect full code, but rather just some hints of what to do.
Note: I'm using SceneKit (not RealityKit).
This is what I have so far for visualization:
Basically, I want the blue rectangle to better adjust to the real world shape by using the already available mesh data.
Instead of using the plane anchor's extent, use anchor.geometry (an ARPlaneGeometry): its boundary vertices follow the detected surface much more closely than the estimated bounding rectangle.

How Does Unity Assign Pivot Point Location on Script Generated Meshes

I have tried to find information on how Unity assigns pivot points to objects, but all I keep finding is threads on how to move pivot points and answers saying it can't be done. I am creating a 2D game with a background that is randomly generated from meshes wrapped in empty GameObjects. These objects are organically shaped, but they have a property that returns a rectangle bounding the object so they can be placed without overlapping. The trouble is that the algorithm assumes the pivot point is at the center of the object. What I would like to know is how Unity decides where the pivot point is set, so that I can predict how much I need to move my mesh inside the parent object for the pivot point to end up at the center of the bounding rectangle.
Possible fix:
Try creating the meshes at runtime and see if the pivot point always ends up at a certain corner, or at least at the same relative location.
If it does, you know where the pivot point is and can take it into account in your code, provided you also know the size of the mesh you spawn.
So the most general and correct answer I can come up with is that Unity places the pivot point at the origin of the GameObject that you apply the Mesh to. Depending on how you create them, the local coordinates of the mesh's vertices might place the mesh so that its logical center is not the same as that of the empty GameObject it is attached to. What I did to fix the issue was to compute the vector from local point (0,0,0) to the center of the bounding rectangle and translate the vertices of my mesh by that vector, inverted. It wasn't perfect, but by far close enough to ensure that I won't have any overlapping meshes.
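A minimal sketch of that recentering step for a procedurally generated mesh (the component name is a placeholder, and the mesh is assumed to already have its vertices assigned):

using UnityEngine;

public class RecenterMesh : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] vertices = mesh.vertices;

        // Vector from local (0,0,0) to the center of the mesh's bounding box.
        mesh.RecalculateBounds();
        Vector3 offset = mesh.bounds.center;

        // Translate every vertex by that vector, inverted, so the bounds center
        // coincides with the GameObject's pivot.
        for (int i = 0; i < vertices.Length; i++)
            vertices[i] -= offset;

        mesh.vertices = vertices;
        mesh.RecalculateBounds();
    }
}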

Visible surface in some angle in Unity

I have a floor surface like in this screenshot: http://prntscr.com/amqstw. If I move the camera to a certain angle, I don't see the floor: http://prntscr.com/amqt19. How can I resolve this problem?
That effect is due to backface culling.
At that angle the camera is (probably) inside the floor mesh, so the normal vectors of the cube (I presume) face away from the camera, and those faces get "culled" (become invisible).
You can work around it in two ways:
By editing the mesh in your modeling software so that it becomes a "double-sided mesh" (a runtime sketch of the same idea follows below), or
By finding a shader which, once applied to the floor object, disables its backface culling (for example one that uses "Cull Off"; harder to do without breaking something else).
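As a rough runtime variant of the first option (a sketch only: the duplicated triangles reuse the original normals, so lighting on the back side will not be correct):

using UnityEngine;

public class MakeDoubleSided : MonoBehaviour
{
    void Start()
    {
        // Append the same triangles with reversed winding so both sides get drawn.
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        int[] tris = mesh.triangles;
        int n = tris.Length;
        int[] doubled = new int[n * 2];
        tris.CopyTo(doubled, 0);

        for (int i = 0; i < n; i += 3)
        {
            doubled[n + i]     = tris[i];
            doubled[n + i + 1] = tris[i + 2]; // swapping two indices flips the face
            doubled[n + i + 2] = tris[i + 1];
        }

        mesh.triangles = doubled;
    }
}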

Find angle face under mouse pointer in Unity 3d

I have a Projector component and I need to find the angle at which the projected texture falls, so I can exclude projection onto vertical faces.
My projector is under the mouse pointer and works fine when it is over a horizontal face:
I would like the projector to switch off on vertical faces to avoid this bad effect:
If possible, I would like to do it in the shader code, so the vertical projection is avoided even when the cursor sits on the corner of a horizontal face and part of the image "spills" onto a vertical face.
I found this solution in C#:
// Cast a ray from the camera through the mouse position.
Ray mouseRay = Camera.main.ScreenPointToRay(Input.mousePosition);
if (Physics.Raycast(mouseRay, out RaycastHit hitInfo)) {
    // Only draw when the hit surface faces (roughly) upwards.
    if (hitInfo.normal.y > 0) {
        // draw
    } else {
        // not draw
    }
}
But it only works on curved surfaces and not, for example, on the faces of cubes.
How can I do this properly?
The usual approach is an image on a quad with TGA transparency, which rotates itself to match the face under the cursor, using a ray to find the hit point and aligning the quad to that face's normal.
Other ways of doing it would be quite tricky, perhaps using decals. Doing it in a shader would take much longer; it's a case of problem solving not being prioritized for fast development. Technically you can project a volumetric texture onto whatever object you are using: that way you can add your barred circle, projected from a point in space towards the object, as a mathematical formula. It takes a while to do; look into volumetric textures (I have written some). In your case it needs the mouse position sent to the texture, plus the maths to add the transparent zone and the red zone. It takes all day.
It's fine to have a flat circle that flips around when the pointer moves onto a different face; it will just look like a physical card, and it's much easier to code: 10 minutes instead of many hours.
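A hedged sketch of that simpler approach, assuming cursorQuad holds Unity's Quad primitive (whose visible side faces its local -Z axis); the class and field names are placeholders:

using UnityEngine;

public class CursorDecal : MonoBehaviour
{
    public Transform cursorQuad; // quad carrying the cursor image (assumed setup)

    void Update()
    {
        // Ray from the camera through the mouse pointer.
        Ray mouseRay = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(mouseRay, out RaycastHit hit))
        {
            // Sit just above the surface to avoid z-fighting.
            cursorQuad.position = hit.point + hit.normal * 0.01f;
            // Point the quad's visible (-Z) side along the surface normal.
            cursorQuad.rotation = Quaternion.LookRotation(-hit.normal);
            // Hide it entirely on vertical or downward-facing faces.
            cursorQuad.gameObject.SetActive(hit.normal.y > 0f);
        }
    }
}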