Texture stretching when applied to sphere - unity3d

I am completely new to this, so I expect I may be doing something trivial incorrectly. I have a regular sphere in my Unity3D scene and a .jpeg image with a number in various places and orientations. When I apply the image to the sphere as a texture, the numbers located centrally in the image display fine on the sphere, but those closer to the top or the bottom of the image file appear warped. For example, with the number 12, the bases of the 1 and the 2 are bigger and the number tapers the further up you go when rendered on the sphere.

This is not an error on your part; this is how a texture is mapped onto a sphere by default. The default sphere uses a latitude/longitude (equirectangular) UV mapping, so a fixed horizontal span of the texture is squeezed into ever smaller circles toward the poles - that is exactly the tapering you see.
To 'fix' this you would have to compensate for that distortion in the texture itself, or modify the UV coordinates of the sphere.
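For reference, a default UV sphere's latitude/longitude mapping looks roughly like this (a minimal Python sketch; the exact convention can differ per engine):

```python
import math

def sphere_uv(x, y, z):
    """Equirectangular (latitude/longitude) UV for a point on a unit sphere.
    A fixed horizontal span of texture covers a smaller and smaller circle
    near the poles, which is what makes features there look tapered."""
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)  # longitude drives U
    v = 0.5 + math.asin(y) / math.pi              # latitude drives V
    return u, v
```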

How to set direction of arrows in shadergraph

I'm pretty new to Shader Graph and shaders in general. I'm working on a 2D project, and I'm trying to make a shader that rotates an arrow to create a flow-like material to use on a sprite shape.
Basically what I want to do is make a proper version of this:
What I'm currently doing is multiplying the Y position from the Position node by an exposed Vector1 and feeding that into a Rotate node (which I know is pretty hacky and won't work if the shape is not an arc).
Aligning UVs with an arbitrary mesh seems a bit hard. Why not bend a pre-made mesh instead? The graph below bends vertex positions around the Z axis at a given point and strength (a strength of 0 makes the mesh invisible, though), but you can easily replace that Position node with UV and plug the result into a Sample Texture 2D node. I'd guess bending the mesh will give you better/easier results:
Create a subdivided and well UV-mapped rectangle plane
Bend that plane with a vertex shader (attached graph bends around Z axis)
The graph is based on code from the Blender source.
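For the curious, here is roughly the arithmetic that bend performs per vertex, as a Python sketch adapted from the simple-deform bend in Blender's source (the graph wires up the same operations with nodes):

```python
import math

def bend_around_z(x, y, z, strength):
    """Bend a point around the Z axis: the bend angle grows linearly with X.
    Adapted from Blender's simple-deform bend. Note that strength == 0 is
    degenerate (the 1/strength radius blows up), which matches the note
    above that a strength of 0 makes the mesh invisible."""
    theta = x * strength
    r = 1.0 / strength  # radius of the bending arc
    bx = -(y - r) * math.sin(theta)
    by = (y - r) * math.cos(theta) + r
    return bx, by, z
```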

Rotate vertices selected using weight map on UVs in Unity3D's Shader Graph around pivot point

TLDR: Can't figure out the correct Shader Graph setup for using UV and vertex displacement to cheaply animate an (unrigged) mesh.
I am trying to rotate a part of the mesh based on its UV coordinates, e.g. fromX 0 toX 0.4, fromY 0 toY 0.6. The mesh is created UV-mapped with this in mind.
I have no problem getting the affected vertices in this area. The problem is that I want to rotate these verts around a customizable axis, e.g. axis(X:1, Y:0, Z:1), using a weight so that the rotation takes place around a pivot point. I want the bottom selection to stay connected to the rest of the mesh while the other affected vertices neatly rotate around this point.
The weight can be painted by using split UV channels as seen in the picture:
I multiply the weighted area with a rotation node to rotate it.
And I add that to the position multiplied by the inverted weight (the rest of the verts, excluding the rotated area) to get the final output displacement.
But the rotated mesh is bent. I need it to be stiff, as in the whole part rotating with weight = 1 except for the pivot vertex itself.
I can get it as described using a weight = 1 based rotation, but then the pivot point becomes the center of the mesh, not the desired point.
How can I do this correctly?
Been at it for days, please help :')
I started using Unity about a month ago, and this is one of the first issues I faced.
The node you are using will always transform the vertices around the origin.
I think you have two options available:
Translate the vertices by the offset of where you want to rotate the wings. This would require storing the pivot point of the wings in the mesh somehow - this could be done by utilizing a spare UV channel, or by using the vertex color channel.
Use bones and paint the weights in your chosen 3D package. This way, you can record the animation, and use Unity's skinned mesh shader to render it.
Hope that helps.
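To make option 1 concrete, here is a hedged Python/NumPy sketch of the translate-rotate-translate idea (the names and the Rodrigues-based rotation are my own illustration, not the exact node chain; in Shader Graph this corresponds to Subtract -> Rotate About Axis -> Add):

```python
import numpy as np

def rotate_about_pivot(vertices, pivot, axis, angle, weights):
    """Rotate vertices about 'axis' through 'pivot' (Rodrigues' formula),
    masked by per-vertex weights. With a hard 0/1 weight mask the rotated
    part stays stiff; fractional weights linearly blend positions, which
    is exactly the bending described in the question."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    cos_a, sin_a = np.cos(angle), np.sin(angle)

    out = []
    for v, w in zip(vertices, weights):
        p = np.asarray(v, dtype=float) - pivot        # move pivot to origin
        p_rot = (p * cos_a
                 + np.cross(axis, p) * sin_a
                 + axis * np.dot(axis, p) * (1.0 - cos_a))
        out.append(pivot + (1.0 - w) * p + w * p_rot)  # back, then blend
    return np.array(out)
```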
Try this:
I've used the UV ranges from your example, applied to a sphere of unit size. The sphere's original pivot is in the centre, and its adjusted pivot is shifted 0.5 on the Y axis.
The only variable the shader doesn't know is the adjusted pivot position, so I pass this in through the material.
I've not implemented your weight in the graph, as I just wanted to show you the process. You can easily plug that in.
The color output is just being used for debug purposes.
The first image is with the default object pivot.
The second image is with the adjusted pivot.
The final image is the graph. (Note the logic group is driving the vertex rotation based on the UV mask).
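The masking half of that graph reduces to a box test on the UVs. Here is a tiny sketch using the ranges from the question (in the graph this is typically built from Step/Comparison nodes):

```python
def uv_mask(u, v, from_x=0.0, to_x=0.4, from_y=0.0, to_y=0.6):
    """1.0 inside the UV rectangle, 0.0 outside -- the 'logic group'
    that decides whether a vertex gets the pivot rotation."""
    return 1.0 if (from_x <= u <= to_x and from_y <= v <= to_y) else 0.0
```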

How to transform a non-planar surface on a plane using a pair of 2D and 3D control points?

I have a set of control point pairs. One part of each pair is in world coordinates (3D). The other is in pixel coordinates of the image (2D).
My goal is to transform the surface you can see in this image onto a flat plane. The problem is that the surface is not perfectly flat; it looks somewhat like a ribbon. Otherwise I could have used OpenCV's getPerspectiveTransform() or Matlab's fitgeotrans().
I know that I can use OpenCV's solvePnP() or Matlab's estimateWorldCameraPose() to get the pose of the camera. The camera matrix is known and the image is rectified. But what is the next step? How can I transform my ribbon-shaped surface onto a flat plane, i.e. get an orthographic top view? That is the step I'm stuck on.
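For reference, the pose-recovery step mentioned above looks like this in Python with OpenCV; every number below is a hypothetical placeholder, not data from the actual setup:

```python
import cv2
import numpy as np

# Hypothetical control-point pairs: 3-D world coordinates and their
# matching 2-D pixel coordinates (solvePnP needs at least 4 pairs).
object_points = np.array([[0.0, 0.0, 0.00], [1.0, 0.0, 0.10],
                          [1.0, 1.0, 0.05], [0.0, 1.0, 0.00],
                          [0.5, 0.5, 0.08], [0.2, 0.8, 0.02]],
                         dtype=np.float32)
image_points = np.array([[100, 120], [400, 130], [410, 420],
                         [ 95, 400], [250, 270], [160, 350]],
                        dtype=np.float32)
camera_matrix = np.array([[800, 0, 320],     # placeholder intrinsics
                          [0, 800, 240],
                          [0,   0,   1]], dtype=np.float32)

# Recover the camera pose; distortion is None since the image is rectified.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, None)
R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
```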

How to apply a texture image in GLGravity Teapot iphone?

As an OpenGL example, Xcode provides the GLGravity project. But instead of showing a plain yellow color, how can I apply a texture image without texture coordinates?
You need some kind of texture coordinates, otherwise the whole concept of textures makes no sense: a texture is a function mapping a set of n coordinates to some value (depth, luminance, alpha, colour, or a combination of those), defined by the data the samples are taken from and interpolated.
You can generate the texture coordinates, either statically from your mesh or in the vertex shader, or you can supply them directly. But you need some texture coordinates to make this work. A very cheap and simple texture coordinate generator is using the vertex position as the texture coordinate; this will project your texture along the coordinate axes onto the model. So if you've got a 2D texture, it will be applied in the XY plane, as if there were a parallel-projecting slide projector at coordinates (0, 0, ∞).
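As a sketch of that cheapest generator (planar projection), assuming the texture should land in the XY plane:

```python
def planar_uv(x, y, z, scale=1.0):
    """Generate a texture coordinate by reusing the vertex position.
    A 2-D texture sampled this way is projected along the Z axis onto
    the model, like a parallel slide projector at (0, 0, infinity)."""
    return x * scale, y * scale
```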

White edges appearing at edge of cube

When I render a cube and texture it, I end up with white edges along the cube. I've checked the vertex and texture coordinates and they look fine to me. My texture is a power of 2: it is a texture map containing 4x4 textures, in which each texture is 16x16 pixels. Does anyone have any suggestions?
I guess you are experiencing texture bleeding. You can solve it either by using GL_CLAMP on your textures, or by slightly adjusting your UV coordinates to 0.0005 and 0.9995 (for instance) instead of 0 and 1, to compensate for texture sampling artifacts.
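To illustrate the UV-adjustment option with the numbers from the question (a 4x4 atlas of 16x16-pixel tiles, i.e. a 64x64 texture), a half-texel inset per tile looks like this; the function name and layout are assumptions for the sketch:

```python
def inset_tile_uv(tile_x, tile_y, tiles_per_side=4, atlas_size=64):
    """UV rectangle for one tile of the atlas, inset by half a texel on
    each side so bilinear sampling doesn't bleed in neighbouring tiles."""
    half_texel = 0.5 / atlas_size
    u0 = tile_x / tiles_per_side + half_texel
    v0 = tile_y / tiles_per_side + half_texel
    u1 = (tile_x + 1) / tiles_per_side - half_texel
    v1 = (tile_y + 1) / tiles_per_side - half_texel
    return u0, v0, u1, v1
```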