When I render and texture a cube, I end up with white edges along the cube's faces. I've checked the vertex and texture coordinates and they look fine to me. My texture is a power of 2: it is a texture atlas containing a 4x4 grid of sub-textures, each 16x16 pixels. Does anyone have any suggestions?
I guess you are experiencing texture bleeding. You can solve it either by using GL_CLAMP_TO_EDGE on your textures or by nudging your UV coordinates slightly inward, e.g. to 0.0005 and 0.9995 instead of 0 and 1, to compensate for texture sampling artifacts.
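Since your atlas is 4x4 tiles in a 64x64 texture, the usual cure is a half-texel inset per tile rather than a fixed epsilon. A minimal HLSL-style sketch (the function and parameter names are mine, and the same arithmetic works if you bake the inset into the mesh UVs on the CPU instead):

// Hypothetical helper: returns atlas UVs for one tile of a 4x4 atlas,
// inset by half a texel so bilinear sampling never reaches a neighbouring tile.
float2 AtlasUV(float2 tileIndex, // which tile, e.g. (2, 1)
               float2 innerUV)   // 0..1 position within that tile
{
    const float tilesPerSide = 4.0;   // 4x4 atlas
    const float atlasSize    = 64.0;  // 64x64 pixels (4 * 16)
    const float halfTexel    = 0.5 / atlasSize;
    float2 tileMin = tileIndex / tilesPerSide + halfTexel;
    float2 tileMax = (tileIndex + 1.0) / tilesPerSide - halfTexel;
    return lerp(tileMin, tileMax, innerUV); // sample inside the inset corners
}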
TL;DR: Can't figure out the correct Shader Graph setup for using UV and vertex displacement to cheaply animate an (unrigged) mesh.
I am trying to rotate a part of the mesh based on the UV coordinates, e.g. fromX 0 toX 0.4, fromY 0 toY 0.6. The mesh is created UV-mapped with this in mind.
I have no problem getting the affected vertices in this area. The problem is that I want to rotate these verts around a customizable axis, e.g. axis (X: 1, Y: 0, Z: 1), using a weight so that the rotation takes place around a pivot point. I want the bottom selection to stay connected to the rest of the mesh while the other affected vertices neatly rotate around this point.
The weight can be painted by using split UV channels as seen in the picture:
I multiply the weighted area with a rotation node to rotate it.
And I add that to the inversely masked position (the rest of the verts, excluding the rotated area) to get the final output displacement.
But the rotated mesh is bent. I need it to be stiff, as in the whole part rotating with weight = 1 except for the pivot vertex itself.
I can get that using a rotation with weight = 1 everywhere, but then the pivot point becomes the center of the mesh, not the desired point.
How can I do this correctly?
Been at it for days, please help :')
I started using Unity about a month ago, and this is one of the first issues I faced.
The node you are using will always transform the vertices around the origin.
I think you have two options available:
Translate the vertices by the offset of where you want to rotate the wings. This would require storing the pivot point of the wings in the mesh somehow; this could be done by using a spare UV channel, or by using the vertex color channel (a minimal sketch of this follows the two options).
Use bones and paint the weights in your chosen 3D package. That way you can author the animation there and let Unity's skinned mesh renderer play it back.
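For the first option, here is a minimal sketch of the idea in HLSL (usable from a Custom Function node, for instance); the pivot-in-a-spare-UV-set layout and all names are my assumptions, not an established convention:

// Hypothetical sketch: the wing's pivot point is baked into a spare UV set
// when the mesh is authored, so every wing vertex knows where its hinge is.
float3 RotateAboutPivot(float3 positionOS, // object-space vertex position
                        float3 pivotOS,    // pivot read from e.g. UV2/UV3
                        float3 axis,       // normalised rotation axis
                        float  angle)      // radians
{
    float3 p = positionOS - pivotOS;       // move the pivot to the origin
    float  s = sin(angle), c = cos(angle);
    // Rodrigues' rotation formula: rotate p around 'axis' by 'angle'
    p = p * c + cross(axis, p) * s + axis * dot(axis, p) * (1.0 - c);
    return p + pivotOS;                    // move back
}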
Hope that helps.
Try this:
I've used the UV ranges from your example, applied to a sphere of unit size. The sphere's original pivot is in the centre, and its adjusted pivot is shifted 0.5 on the Y axis.
The only variable the shader doesn't know is the adjusted pivot position, so I pass this in through the material.
I've not implemented your weight in the graph, as I just wanted to show you the process. You can easily plug that in.
The color output is just being used for debug purposes.
The first image is with the default object pivot.
The second image is with the adjusted pivot.
The final image is the graph. (Note the logic group is driving the vertex rotation based on the UV mask).
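For reference, here is the same logic the graph implements, written out as HLSL (a hedged sketch: _AdjustedPivot mirrors the material property described above, and the mask uses the UV ranges from the question):

float3 _AdjustedPivot; // set from the material, e.g. material.SetVector(...)

float3 DisplaceVertex(float3 positionOS, float2 uv, float3 axis, float angle)
{
    // UV mask: 1 inside the rotating region (uv.x <= 0.4 and uv.y <= 0.6), else 0
    float mask = step(uv.x, 0.4) * step(uv.y, 0.6);

    // Rotate around the adjusted pivot instead of the object origin
    float3 p = positionOS - _AdjustedPivot;
    float  s = sin(angle), c = cos(angle);
    p = p * c + cross(axis, p) * s + axis * dot(axis, p) * (1.0 - c);
    p += _AdjustedPivot;

    // Blend rotated and original positions; swap 'mask' for your painted weight
    return lerp(positionOS, p, mask);
}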
I have recently been working on a voxel game that uses greedy meshing. Faces vary from 1*1 to 64*64 units. For the flat areas in the game it makes more sense to combine multiple smaller terrain tiles into bigger ones with a tiled texture, but this poses a problem for my sprite atlas. Each UV references a spot on the atlas, but for larger greedy faces the texture gets stretched. I want the UV to tile the correct number of times on a per-face basis, producing the same result as if the larger faces were a bunch of smaller ones, only without the extra geometry.
Here is an example of what I want, achieved in OpenGL:
OpenGL Face-Based Tiling
See how the larger faces are tiled to give the impression of smaller ones? The texture was from an atlas similar to the following:
Texture Atlas
I only have a basic knowledge of shaders in Unity, but how would I write a shader to accomplish this?
In order to do this, you first need to pass the shader a few pieces of information: the x and y dimensions of the face (in tiles), and the UV coordinates of the lower-left and upper-right corners of the tile in the atlas (llCorner and urCorner below). You can then calculate the tiled UV coordinates as follows, assuming the face's UV values (inputUV) range from the lower-left corner to the upper-right corner of the sprite:
float2 newUV = (inputUV - llCorner) / (urCorner - llCorner); // remap input UV to (0.0, 0.0)..(1.0, 1.0) within the tile
newUV.x = frac(newUV.x * xDim); // makes the newUV coordinates repeat xDim times on the x axis
newUV.y = frac(newUV.y * yDim); // makes the newUV coordinates repeat yDim times on the y axis
newUV = newUV * (urCorner - llCorner) + llCorner; // remap (0.0, 0.0)..(1.0, 1.0) back to the tile's span in the atlas
I haven't actually tested it, but I think this should work. I hope this helps!
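In case it helps to see it in context, one hedged way to wire this up is to pack the per-face values into spare UV channels when you build the greedy mesh (this channel layout is my assumption, not part of the answer above), then do the remap per pixel so the repeat survives across a single large face:

// Assumed vertex data, filled in by the greedy mesher:
//   TEXCOORD0: inputUV (runs llCorner..urCorner across the face)
//   TEXCOORD1: xy = llCorner, zw = urCorner
//   TEXCOORD2: x = xDim, y = yDim (tile repeats per axis)
sampler2D _MainTex;

float4 frag(float2 inputUV : TEXCOORD0,
            float4 corners : TEXCOORD1,
            float2 dims    : TEXCOORD2) : SV_Target
{
    float2 ll = corners.xy, ur = corners.zw;
    float2 uv = frac((inputUV - ll) / (ur - ll) * dims); // repeat within the face
    uv = uv * (ur - ll) + ll;                            // back into atlas space
    return tex2D(_MainTex, uv);
}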
I am completely new to this, so I expect I may be doing something trivial incorrectly. I have a regular sphere in my Unity3D scene and a .jpeg image with a number in various places and orientations. When I apply the image to the sphere as a texture, the numbers located centrally in the image display fine on the sphere, but those closer to the top or the bottom of the image file appear warped. For example, with the number 12, the bases of the 1 and 2 are bigger and the number tapers the further up you go when rendered on the sphere.
This is not an error on your part; this is the way the texture is mapped to a sphere by default.
To 'fix' this you would have to compensate for that distortion in your texture directly, or modify the UV coordinates of the sphere.
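For intuition, the default sphere mapping is essentially the latitude/longitude (equirectangular) projection sketched below; each horizontal row of the texture is wrapped around a circle of latitude that shrinks to a point at the poles, which is exactly the tapering you see. (A hedged illustration of the standard formula, not Unity's actual source:)

// Equirectangular UV for a point on a unit sphere:
// u sweeps around the equator, v runs from pole to pole.
float2 SphereUV(float3 n) // n: normalised object-space position on the sphere
{
    const float PI = 3.14159265;
    float u = atan2(n.z, n.x) / (2.0 * PI) + 0.5; // longitude -> 0..1
    float v = asin(n.y) / PI + 0.5;               // latitude  -> 0..1
    return float2(u, v);
}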
I am creating a plane in Unity. The thing is that this plane has 200 triangles and 100+ vertices. Because my plane is not altered (I only set a texture on it), I think the plane should have 2 triangles and 4 vertices. Can this be done?
UPDATE:
I think I need to change the mesh filter for my plane. I've created one with only 4 vertices in Blender, but after I delete the original plane mesh in Unity and place the new 4-vertex mesh, I only get a single color that can be found on my texture, not the texture itself. Can I place a texture on a 4-vertex mesh? If so, what am I doing wrong?
Using a 4-vertex/2-triangle plane should work, and is also a good idea :-)
It's probably only showing part of the texture because the mesh has wrong UV coordinates; for the full texture, the four corners should carry UVs of (0,0), (1,0), (0,1) and (1,1).
Try using this plane:
https://dl.dropbox.com/u/4375689/Permanent/Plane.3DS
I'm not sure if the size is correct, but it should have the correct UV coordinates.
That's right: the default Unity plane has more than 2 tris and 4 vertices. To get such a mesh you need to either import it or create it in code.
I've built this script called ProceduralPlane that allows you to set the number of segments in the editor: https://github.com/doukasd/Unity-Components/tree/master/UnityPackages
As an OpenGL example, Xcode provides the GLGravity sample project. But instead of showing a yellow color, how do I apply a texture image without texture coordinates?
You need some kind of texture coordinate, otherwise the whole concept of textures makes no sense: a texture is a function mapping a set of n coordinates to some value (depth, luminance, alpha, colour, or a combination of those), defined by the data the samples are taken from and interpolated.
You can generate the texture coordinates, either statically from your mesh or in the vertex shader, or you can supply them directly; but you need some texture coordinates to make this work. A very cheap and simple texture-coordinate generator is to use the vertex position as the texture coordinate; this will project your texture along the coordinate axes onto the model. So if you've got a 2D texture it will be applied in the XY plane, as if there were a parallel-projecting slide projector at coordinates (0, 0, ∞).
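The position-as-texcoord idea looks like this in shader terms (shown in the same HLSL-style syntax as the snippets above; a GLSL vertex shader would be line-for-line analogous, and the scale/offset knobs are hypothetical):

// Planar projection: reuse the object-space vertex position as the UV,
// projecting along the Z axis onto the XY plane.
float2 PositionToUV(float3 positionOS)
{
    const float2 scale  = float2(1.0, 1.0); // fit the texture to the model
    const float2 offset = float2(0.5, 0.5); // centre it
    return positionOS.xy * scale + offset;
}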