Shader is not working properly in my 2d game - unity3d

I wanted to make a sprite wave like water on a lake/ocean. I found a tutorial online, but it was written for a 3D game. I applied it anyway, and the shader moves the sprite along the Z axis, so the wave effect isn't noticeable from the 2D view (in 3D view it looks fine). Since it's a 2D game, I need the motion to go in the correct direction. This is my first shader, so I don't understand it very well.

Use an (x=0, y=1, z=0) vector instead of the Normal Vector node.
That will make the wave motion go in the "upward" direction on the object instead of the "outward" direction.
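In shader code (rather than Shader Graph), the same fix looks roughly like this. A hypothetical vertex-function sketch, assuming made-up material properties `_WaveAmp`, `_WaveFreq` and `_WaveSpeed`, and the usual `appdata_base`/`v2f` structs: the displacement is applied along a constant (0, 1, 0) vector instead of the vertex normal.

```hlsl
// Sketch of the equivalent vertex displacement in a Cg/HLSL shader.
// _WaveAmp, _WaveFreq and _WaveSpeed are assumed material properties.
v2f vert (appdata_base v)
{
    v2f o;
    float wave = sin(_Time.y * _WaveSpeed + v.vertex.x * _WaveFreq) * _WaveAmp;
    // float3(0, 1, 0) plays the role of the constant vector that
    // replaces the Normal Vector node: motion goes "up", not "out".
    v.vertex.xyz += float3(0, 1, 0) * wave;
    o.pos = UnityObjectToClipPos(v.vertex);
    o.uv = v.texcoord;
    return o;
}
```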

Related

3d vectors from a Unity Plane to a flat 2D point as a pixel on a texture

If I had a sharp sword and I were to perfectly slice an object in half, I would like to sample the colours at various points along this flat, freshly cut face, and place these colours on a texture.
Imagine the face is a Unity Plane defined by its Vector3 normal that goes through a location Vector3 p.
Let the texture be a 100 x 100 sized image.
Let's say the samples I want to take are three 3D points, all on this plane, defined as Vector3 A, B and C.
How do I go about converting the 3D points (x,y,z) from the defined plane into a 2D pixel (x,y) of this texture?
I have read many similar questions but honestly could not understand the answers. I don't know, in my scenario, whether I'm dealing with an orthographic or a perspective projection, whether I need to create a "conversion matrix", whether I need to be concerned about rotations, or if there is just a simpler solution.
I appreciate any tips or suggestions. Thanks
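For what it's worth, there is a simpler solution here that needs no camera, projection or rotation handling, because everything already lives on one plane: build an orthonormal basis (u, v) inside the plane, express each point in that basis, and rescale to pixels. A minimal standard-library Python sketch (not Unity code; `world_width`, the number of world units the 100-pixel texture spans, is a free parameter I made up):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def normalize(a):
    l = math.sqrt(dot(a, a))
    return (a[0] / l, a[1] / l, a[2] / l)

def plane_to_pixel(q, normal, p, tex_size=100, world_width=1.0):
    """Map a 3D point q lying on the plane (normal, through p) to
    (x, y) pixel coordinates of a tex_size x tex_size texture.
    world_width (world units spanned by the texture) is a free choice."""
    n = normalize(normal)
    # Seed with any vector not parallel to n, then build an
    # orthonormal in-plane basis (u, v) with cross products.
    seed = (1.0, 0.0, 0.0) if abs(n[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = normalize(cross(n, seed))
    v = cross(n, u)  # already unit length
    d = (q[0] - p[0], q[1] - p[1], q[2] - p[2])
    # In-plane coordinates, rescaled so p lands at the texture centre.
    px = (dot(d, u) / world_width + 0.5) * tex_size
    py = (dot(d, v) / world_width + 0.5) * tex_size
    return (px, py)
```

In Unity you would do the same with Vector3.Cross, Vector3.Dot and Vector3.Normalize; rotations of the plane only change where the shape lands on the texture, not whether the mapping works.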

Is it possible to pass a custom position to calculate lighting in unity surface shaders?

I want to create a shader that uses different coordinates for light calculations than for what's being displayed. This probably sounds strange, but I would like to do this for lighting a top-down 2D game.
I want to write a vertex shader that offsets the Z coordinate by the value of the Y coordinate for display, but uses unmodified coordinates for lighting calculation.
Is this possible to do, and if so, where would I start?
So far I have a surface shader that offsets the Z coordinate by the value of the Y coordinate, but Unity uses the modified coordinates to calculate lighting; I would like Unity to use the unmodified coordinates for the light calculations instead.

How to set direction of arrows in shadergraph

I'm pretty new to shader graph and shaders in general. I'm working on a 2D project and I'm trying to make a shader that rotates an arrow to make a flow-like material and use it on a sprite shape.
Basically what I want to do is make a proper version of this:
What I'm currently doing is multiplying the Y position of the Position node by an exposed Vector1 and using it in a Rotate node (which I know is pretty hacky and won't work if the shape is not an arc).
Aligning UVs with an arbitrary mesh seems a bit hard. Why not bend a pre-made mesh instead? The graph below bends vertex positions around the Z axis at a given point and strength (though a strength of 0 makes the mesh invisible). You can easily replace that Position node with UV and plug the result into a Sample Texture 2D node instead, but I'd guess bending the mesh will give you better/easier results.
Create a subdivided and well UV-mapped rectangle plane
Bend that plane with a vertex shader (attached graph bends around Z axis)
The graph is based on code from the Blender source.
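The bend the graph performs can be sketched outside Shader Graph too. A minimal Python version of a Barr-style bend around the Z axis (my own reconstruction of the math, not Blender's exact code): `strength` is the bend rate, and the bend radius is `1/strength`, which is why a strength of 0 is degenerate, the same reason it makes the mesh disappear in the graph.

```python
import math

def bend_around_z(p, strength):
    """Bend point p = (x, y, z) around the Z axis (Barr-style simple
    deform sketch). strength is the bend rate; the bend radius is
    1/strength, so strength == 0 is degenerate."""
    x, y, z = p
    theta = strength * x          # bend angle grows with x
    r = 1.0 / strength            # bend radius (undefined at 0)
    xb = math.sin(theta) * (r - y)
    yb = r - math.cos(theta) * (r - y)
    return (xb, yb, z)            # z is the bend axis, left unchanged
```

As a sanity check, a tiny strength leaves points almost where they were, and a strength of pi/2 wraps the unit segment along X into a quarter circle.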

How to transform a non-planar surface on a plane using a pair of 2D and 3D control points?

I have a set of control point pairs. One point of each pair is in world coordinates (3D); the other is in pixel coordinates of the image (2D).
My goal is to transform the surface you can see in this image onto a flat plane. The problem is that the surface is not perfectly flat; it looks a bit like a ribbon. Otherwise I could have used OpenCV's getPerspectiveTransform() or Matlab's fitgeotrans().
I know that I can use OpenCV's solvePnP() or Matlab's estimateWorldCameraPose() to get the pose of the camera. The camera matrix is known and the image is rectified. But what is the next step? How can I transform my ribbon-shaped surface onto a flat plane, i.e. get an orthographic top view? That is the step I'm stuck on.
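Since the surface is curved, no single homography can do this (a homography only maps plane to plane). One alternative, assuming the world-space control points can be ordered along the ribbon: unroll it by cumulative arc length, which preserves distances measured along the surface. A minimal Python sketch of that idea for the ribbon's centerline (my own construction, not an OpenCV or Matlab call; rows and columns of a control-point grid could be unrolled the same way to get both axes of the flattened view):

```python
import math

def unroll_polyline(points3d):
    """Cumulative arc length along a 3D polyline. Flattening a curved
    ribbon this way preserves distances measured along the surface,
    which a single perspective transform cannot do."""
    s = [0.0]
    for a, b in zip(points3d, points3d[1:]):
        s.append(s[-1] + math.dist(a, b))
    return s
```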

How do you calculate the 2D Plane on which an array of Vector3 points are sitting on?

Assume you have a large array of Vector3 points. These points were plotted by hand by the player when asked to draw a 2D shape (e.g. a 2D triangle) in a 3-dimensional environment (drawing 2D shapes with an HTC Vive controller in a virtual environment).
I would like to project these points on to a plane to 'flatten' the shape they drew. To do this properly I need to know the average plane they're sitting on so that there is minimal 'squishing/distorting' when projecting them on to it.
Basically, I want to ask them to draw any 2D Shape in the air (which won't be 2D because humans) rotated any way they want and convert that shape into a 2d picture with a known plane for further manipulation.
If you would like to know more specifics I would be happy to provide them.
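One common way to get that average plane, sketched in plain Python rather than Unity code: since the player draws an ordered loop of points, Newell's method gives an area-weighted best-fit normal (so it tolerates hand-drawn wobble), and projecting each point onto the plane through the centroid flattens the shape with minimal distortion. In Unity you would do the same arithmetic with Vector3.Cross and Vector3.Dot.

```python
import math

def best_fit_plane(points):
    """Average plane of an ordered loop of 3D points via Newell's
    method, plus the centroid as a point on that plane."""
    count = len(points)
    nx = ny = nz = 0.0
    cx = cy = cz = 0.0
    for i in range(count):
        x0, y0, z0 = points[i]
        x1, y1, z1 = points[(i + 1) % count]
        nx += (y0 - y1) * (z0 + z1)
        ny += (z0 - z1) * (x0 + x1)
        nz += (x0 - x1) * (y0 + y1)
        cx += x0; cy += y0; cz += z0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    normal = (nx / length, ny / length, nz / length)
    return normal, (cx / count, cy / count, cz / count)

def project_to_plane(q, normal, origin):
    # Remove the component of (q - origin) along the plane normal.
    d = (q[0] - origin[0], q[1] - origin[1], q[2] - origin[2])
    k = d[0] * normal[0] + d[1] * normal[1] + d[2] * normal[2]
    return (q[0] - k * normal[0], q[1] - k * normal[1], q[2] - k * normal[2])
```

Once the points are projected, expressing them in a 2D basis inside the plane gives you the flat picture for further manipulation.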