Any way to get a URP shader to tile at a constant size, regardless of face orientation?

I need a shader that I can apply to a surface and have it tile a texture at a constant size. Think 'stretchable brick wall'. I looked at this world-space shader hoping I could adapt it: https://www.youtube.com/watch?v=vIh_6xtBwsI&ab_channel=JustinFoley
The problem with world UVs is that they only project along the major axes. I need the texture to follow the rotation of the object, just not be affected by its scale.
This is what I was trying:
But as you can imagine, it is still affected by scale and appears to align with the Y plane:
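One way to get this behaviour is object-space triplanar mapping with the object's world scale multiplied back in: object space follows the rotation, and re-applying the scale keeps the tile size constant in world units. Below is a minimal HLSL sketch for URP (a hand-written shader or a Custom Function node), not the setup from the video; _BaseMap, _TileSize and the function name are assumptions.

// Needs the URP core include for the texture macros and unity_ObjectToWorld, e.g.
// #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);
float _TileSize;   // world-space size of one tile (assumed property)

// Per-axis scale of the object, taken from the object-to-world matrix columns.
float3 ObjectScale()
{
    return float3(
        length(unity_ObjectToWorld._m00_m10_m20),
        length(unity_ObjectToWorld._m01_m11_m21),
        length(unity_ObjectToWorld._m02_m12_m22));
}

// positionOS / normalOS are the object-space position and normal from the mesh.
half4 SampleConstantTile(float3 positionOS, float3 normalOS)
{
    // Object space rotates with the object; multiplying by the scale puts the
    // coordinates back into world-sized units, so tiling ignores the scale.
    float3 p = positionOS * ObjectScale() / _TileSize;

    // Triplanar blend, but in the object's rotated frame instead of world axes.
    float3 blend = pow(abs(normalOS), 4.0);
    blend /= (blend.x + blend.y + blend.z);

    half4 x = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, p.zy);
    half4 y = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, p.xz);
    half4 z = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, p.xy);
    return x * blend.x + y * blend.y + z * blend.z;
}

Because everything is done in the object's frame, the bricks stay glued to the wall as it rotates, while stretching the wall only reveals more bricks instead of stretching them.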

Related

ARKit use Lidar mesh to smooth estimated planes

I'm trying to use ARKit's mesh scene reconstruction (with LiDAR) data to improve plane/geometry detection.
Right now, when pointing at a surface, ARKit gives me a very rough rectangle (far from the actual surface's dimensions). It appears almost instantly, but is still far from the actual shape.
I'm trying to use this plane info, hit detection, and mesh data to draw a smoothed rectangle around the detected surface. I don't expect full code, just some hints of what to do.
Note: I'm using SceneKit (not RealityKit).
This is what I have so far for visualization:
Basically, I want the blue rectangle to better adjust to the real world shape by using the already available mesh data.
Instead of using the plane extents, use anchor.geometry.

How to achieve variable distortion along height on single 2D Sprites

I'm trying to achieve an effect on single 2D sprites similar to the one used in anime when characters are moving fast.
My starting point was the Tiling And Offset node in URP Shader Graph to distort the sprite. I could change the tiling based on variables such as time, but that didn't achieve the desired effect. The main problem with that node is that it distorts the whole sprite by the same amount, while the desired effect is a distortion that varies along the height of the sprite.
Anyone got any insights on this?
Here's my reference point,
base sprite:
distorted (I would like a more detailed, less distorted effect, but I hope you get the idea):
Edit 1: My current progress
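In case it helps, here is a minimal sketch of the kind of height-dependent UV distortion described above, written as HLSL for a Custom Function node that feeds the sprite's Sample Texture 2D; DistortByHeight and its parameters are made-up names.

// Offsets uv.x by an amount that grows with uv.y, so the top of the sprite is
// distorted more than the bottom. strength, frequency and time are assumed inputs
// (time can come from the Time node in Shader Graph).
float2 DistortByHeight(float2 uv, float strength, float frequency, float time)
{
    // 0 at the bottom of the sprite, 1 at the top.
    float heightFactor = uv.y;

    // A simple animated wave; swap in a noise texture or flow map for a less
    // regular, more "speed line" look.
    float wave = sin(uv.y * frequency + time * 10.0);

    uv.x += wave * strength * heightFactor;
    return uv;
}

Sampling the sprite with the returned UV gives a distortion that fades to zero at the base, because heightFactor scales the offset.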

Cull off parts above the mesh

So, I want to make a scene like this Sphere Scene.
I have a randomly generated mesh as the ground, and a sphere. But I don't know how to cull off the sphere's geometry above the mesh. I tried using a stencil and a height map. The stencil rendered the ground in front, but the sphere above the ground is still rendered. Using the height map to decide whether to render (I compared the height map and worldPos) is problematic, because the texture is superimposed over the whole sphere rather than projected onto it. Can you help? Is there any shader function to cull off everything above the mesh?
I did something similar for an Asteroids demo a few years ago. Whenever an asteroid was hit, I used a height map - really, just a noise map - to offset half of the vertices on the asteroid model to give it a broken-in-half look. For the other half, I just duplicated the asteroid model and offset it using the same noise map. The effect was that the two "halves" matched perfectly.
Here's what I'd try:
Your sphere model should be a complete sphere.
You'll need a height map for the terrain.
In your sphere's vertex shader, for any vertex north of the equator:
Sample the height map.
Set the vertex's Y coordinate to the height from the height map. This will effectively flatten the top of the sphere and then offset it based on your height map. You will likely have to scale the height value here to get something reasonable.
Transform the new x,y,z as usual.
Note that you are not texturing the sphere. You're modifying the geometry. This needs to happen in the geometry part of the pipeline, not in the fragment shader.
The other thing you'll need to consider is how to add the debris - rocks, etc. - so that it matches the geometry offset on the sphere. Since you've got a height map, that should be straightforward.
To start with, I'd just get your vertex shader to flatten the top half of the sphere. Once that works, add in the height map.
For this to look convincing, you'll need a fairly high-resolution sphere and height map. To cut down on geometry, you could use a plane for the terrain and a hemisphere for the bottom part. Just discard any fragment for the plane that is not within the spherical volume you're interested in. (You could also use a circular "plane" rather than a rectangular plane, but getting the vertices to line up with the sphere and filling in holes at the border can be tricky.)
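Here is a rough URP-style vertex-shader sketch of the idea above, assuming made-up properties _HeightMap, _HeightScale and _TerrainSize, the usual Attributes/Varyings structs, and a planar XZ projection for the height-map lookup.

// Inside the sphere's shader pass (URP HLSL). Vertex texture fetches must use
// the explicit-LOD sampling macro.
TEXTURE2D(_HeightMap);
SAMPLER(sampler_HeightMap);
float _HeightScale;   // vertical scale applied to the height map
float _TerrainSize;   // world-space extent covered by the height map

Varyings vert(Attributes IN)
{
    Varyings OUT;
    float3 positionWS = TransformObjectToWorld(IN.positionOS.xyz);

    // Only vertices north of the equator (object-space y > 0) get snapped to
    // the terrain; the southern hemisphere is left untouched.
    if (IN.positionOS.y > 0.0)
    {
        // Planar projection: world XZ mapped into [0, 1] height map UVs.
        float2 uv = positionWS.xz / _TerrainSize + 0.5;
        float height = SAMPLE_TEXTURE2D_LOD(_HeightMap, sampler_HeightMap, uv, 0).r;

        // Flatten the top of the sphere, then offset it by the terrain height.
        positionWS.y = height * _HeightScale;
    }

    OUT.positionHCS = TransformWorldToHClip(positionWS);
    return OUT;
}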
As I realised, there's no standard way to cull it without artifacts. The only way it can be done is with raymarched rendering.

HLSL lighting based on texture pixels instead of screen

In HLSL, how can I calculate lighting based on pixels of a texture, instead of pixels that make up the object?
In other words, if I have a 64x64px texture being rendered on a 1024x768px screen, I want to calculate the lighting as it affects the 64x64px space, resulting in jagged pixels instead of a smooth line.
I've researched dozens of answers, but I'm not sure how I can determine at all times whether a fragment is part of a texture pixel (texel) that should be fully lit or not. Maybe this is the wrong approach?
The current implementation uses a diffuse texture and a normal map. It results in what appear to be artifacts (diagonal lines) in the output:
Note: The reason it almost looks correct is because of the normal map, which causes some adjacent pixels to have normals that are angled just enough to light some pixels and not others.
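One way to get per-texel rather than per-fragment lighting is to snap the UV to the centre of its texel before sampling the maps and evaluating the light, so every screen fragment inside the same texel computes the same result. A minimal sketch in Unity-flavoured HLSL; _LightDirTS (a light direction already in tangent space), the Varyings struct and the other names are assumptions.

TEXTURE2D(_MainTex);    SAMPLER(sampler_MainTex);
TEXTURE2D(_NormalMap);  SAMPLER(sampler_NormalMap);
float4 _NormalMap_TexelSize;  // Unity convention: (1/width, 1/height, width, height)
float3 _LightDirTS;           // assumed: light direction in tangent space

float2 SnapToTexelCenter(float2 uv, float4 texelSize)
{
    // floor() picks the texel index, +0.5 moves to its centre,
    // then multiplying by 1/size converts back to UV space.
    return (floor(uv * texelSize.zw) + 0.5) * texelSize.xy;
}

half4 frag(Varyings IN) : SV_Target
{
    float2 snappedUV = SnapToTexelCenter(IN.uv, _NormalMap_TexelSize);

    half3 normalTS = UnpackNormal(SAMPLE_TEXTURE2D(_NormalMap, sampler_NormalMap, snappedUV));
    half4 albedo   = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, snappedUV);

    // The diffuse term is constant across every screen fragment that falls inside
    // the same texel, so the lit/unlit edge follows the texture's pixel grid.
    half ndotl = saturate(dot(normalize(normalTS), normalize(_LightDirTS)));
    return albedo * ndotl;
}

Because all fragments covered by one texel share the same snapped normal and albedo, the lighting boundary comes out jagged at the 64x64 resolution instead of smooth at screen resolution.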

Find angle face under mouse pointer in Unity 3d

I have a Projector component and I need to find the angle at which the projected texture falls, so I can exclude projection onto vertical faces.
My projector is under the mouse pointer and works OK when it is over a horizontal face:
I would like the projector to switch off on vertical faces to avoid this bad effect:
If possible, I would like to do it in the shader code, to avoid the vertically projected image even when the cursor is located on the corner of a horizontal face and part of the projection "goes out" onto a vertical face.
I found this solution in C#:
RaycastHit hitInfo;
if (Physics.Raycast(MouseRay, out hitInfo))
{
    if (hitInfo.normal.y > 0)
    {
        // draw
    }
    else
    {
        // don't draw
    }
}
But it only works on curved surfaces and not, for example, on the faces of a cube.
How can I do this properly?
Normally you would use an image on a quad with TGA transparency, which rotates itself to the face that the middle of the object is aligned to, using a ray to find the vertex and taking its absolute normal.
Other ways of doing it would be quite tricky, perhaps using decals... If you did it in a shader it would take a lot of time; it's a case of problem solving not being ordered by importance for fast development. Technically you can project a volumetric texture onto whatever object you are using; that way you can add your barred circle, projected from a point in space towards the object, as a mathematical formula. It takes a while to do; check out volumetric textures. I have written some, and in your case it needs the mouse position sent to the texture, plus maths to add the transparent zone and the red zone to the texture. It takes all day.
It's fine to have a flat circle that flips around when you move the pointer onto a different face; it will just look like a physical card, and it's much easier to code: 10 minutes instead of many hours.
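For the shader-side route the asker mentions (built-in render pipeline, since Projector is a built-in component), one option is to pass the receiver's world-space normal through the projector shader and clip the projection on steep faces; the Projector re-renders the receiving geometry, so its normals are available. A hedged sketch; _VerticalCutoff is a made-up property.

// Vertex/fragment portion of a projector shader (inside CGPROGRAM, with
// #include "UnityCG.cginc").
sampler2D _ShadowTex;       // the projected texture (projector-shader convention)
float4x4  unity_Projector;  // filled in automatically for Projector materials
float     _VerticalCutoff;  // e.g. 0.5: reject faces steeper than roughly 60 degrees

struct v2f
{
    float4 pos      : SV_POSITION;
    float4 uvProj   : TEXCOORD0;  // projector-space texture coordinate
    float3 normalWS : TEXCOORD1;
};

v2f vert(appdata_base v)
{
    v2f o;
    o.pos      = UnityObjectToClipPos(v.vertex);
    o.uvProj   = mul(unity_Projector, v.vertex);
    o.normalWS = UnityObjectToWorldNormal(v.normal);
    return o;
}

fixed4 frag(v2f i) : SV_Target
{
    // normalWS.y is near 1 on horizontal faces and near 0 on vertical ones;
    // clip() discards the fragment (no projection) below the threshold.
    clip(i.normalWS.y - _VerticalCutoff);

    return tex2Dproj(_ShadowTex, UNITY_PROJ_COORD(i.uvProj));
}

Because the check happens per fragment, the projection is cut exactly where the geometry turns vertical, instead of being switched on or off for the whole cursor position as with the raycast approach.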