HLSL lighting based on texture pixels instead of screen - unity3d

In HLSL, how can I calculate lighting based on the pixels (texels) of a texture, instead of the screen pixels that make up the object?
In other words, if I have a 64x64px texture being rendered on a 1024x768px screen, I want to calculate the lighting as it affects the 64x64px space, resulting in jagged pixels instead of a smooth line.
I've researched dozens of answers, but I'm not sure how to determine at all times whether a fragment belongs to a texel that should be fully lit or not. Maybe this is the wrong approach?
The current implementation uses a diffuse texture and a normal map. It results in what appear to be artifacts (diagonal lines) in the output:
Note: The reason it almost looks correct is because of the normal map, which causes some adjacent pixels to have normals that are angled just enough to light some pixels and not others.
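One approach worth testing (a hedged sketch, not an established fix from this thread): snap the interpolated UV to the centre of the texel it falls in before sampling, so every fragment covered by a given texel computes lighting from identical inputs. The names below (uv, lightDir, _MainTex, _NormalMap) and the 64x64 size are assumptions taken from the question.
// Hypothetical HLSL fragment snippet: quantise UVs to texel centres so
// lighting is computed per texel rather than per screen pixel.
float2 texSize  = float2(64.0, 64.0);                     // texture size from the question
float2 texelUV  = (floor(uv * texSize) + 0.5) / texSize;  // centre of the containing texel
float3 normal   = normalize(tex2D(_NormalMap, texelUV).xyz * 2.0 - 1.0);
float  diffuse  = saturate(dot(normal, lightDir));        // identical across the whole texel
float4 litColor = tex2D(_MainTex, texelUV) * diffuse;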

Related

Cull off parts above the mesh

So, I want to make a scene like this Sphere Scene.
Right now I have a randomly generated mesh as the ground, and a sphere. But I don't know how to cull the sphere's geometry that sits above the mesh. I tried using the stencil buffer and a height map. The stencil renders the ground in front, but the part of the sphere above the ground is still rendered. Using the height map to decide whether a fragment should be rendered (I compared the height map against worldPos) is problematic, because the texture is superimposed over the whole sphere rather than projected onto it. Can you help? Is there a shader function to cull everything above the mesh?
I did something similar for an Asteroids demo a few years ago. Whenever an asteroid was hit, I used a height map - really, just a noise map - to offset half of the vertices on the asteroid model to give it a broken-in-half look. For the other half, I just duplicated the asteroid model and offset it using the same noise map. The effect was that the two "halves" matched perfectly.
Here's what I'd try:
Your sphere model should be a complete sphere.
You'll need a height map for the terrain.
In your sphere's vertex shader, for any vertex north of the equator:
Sample the height map.
Set the vertex's Y coordinate to the height from the height map. This effectively flattens the top of the sphere and then offsets it based on your height map. You will likely have to scale the height value here to get something reasonable.
Transform the new x,y,z as usual.
Note that you are not texturing the sphere. You're modifying the geometry. This needs to happen in the geometry part of the pipeline, not in the fragment shader.
The other thing you'll need to consider is how to add the debris - rocks, etc. - so that it matches the geometry offset on the sphere. Since you've got a height map, that should be straightforward.
To start with, I'd just get your vertex shader to flatten the top half of the sphere. Once that works, add in the height map.
For this to look convincing, you'll need a fairly high-resolution sphere and height map. To cut down on geometry, you could use a plane for the terrain and a hemisphere for the bottom part. Just discard any fragment for the plane that is not within the spherical volume you're interested in. (You could also use a circular "plane" rather than a rectangular plane, but getting the vertices to line up with the sphere and filling in holes at the border can be tricky.)
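A minimal vertex-shader sketch of the steps above, in Unity's HLSL/Cg; _HeightMap, _HeightScale, and the use of the model's UVs for the height lookup are all illustrative assumptions:
sampler2D _HeightMap;
float _HeightScale;

struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
struct v2f     { float4 pos : SV_POSITION; };

v2f vert (appdata v)
{
    v2f o;
    if (v.vertex.y > 0.0)   // only vertices north of the equator
    {
        // Sample the height map and replace Y: this flattens the top of the
        // sphere, then offsets it by the (scaled) height value.
        float h = tex2Dlod(_HeightMap, float4(v.uv, 0, 0)).r;
        v.vertex.y = h * _HeightScale;
    }
    o.pos = UnityObjectToClipPos(v.vertex);  // transform the new x,y,z as usual
    return o;
}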
As I realised, there's no standard way to cull it without artifacts; the only way it can be done is with raymarched rendering.

Seams on edges of voxel chunk meshes

I'm currently working on voxel terrain generation in Unity, and I've run into something annoying:
From certain camera angles, you can see seams between the edges of chunk meshes, as pictured below:
What I know:
This only occurs on the edge between two meshes.
This is not being caused by texture bleeding (The textures are solid colors, so I'm using a very large amount of padding when setting up the UVs).
The positions of all vertices and meshes are showing up as exact integers.
Disabling anti-aliasing almost entirely fixes this (You can still see the occasional speck along the edge).
I'm using Unity's default Standard shader.
Can someone explain what's causing this, and whether there's a way to solve this other than disabling AA?
Almost certainly the side faces are z-fighting with the top faces: depth precision is imperfect, so along the seams of your geometry rounding errors make the very top of the brown face of one cube appear closer to the camera than the very top of the green face of the next.
Ideally, don't draw the faces that definitely aren't visible: if a cube has a neighbour on face X, draw neither its face X nor the neighbour's adjoining face.
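A minimal C# sketch of that neighbour test during chunk meshing; ChunkMesher, IsSolid, and AddFace are hypothetical helpers, not Unity API:
using UnityEngine;

public partial class ChunkMesher
{
    static readonly Vector3Int[] Directions =
    {
        Vector3Int.up, Vector3Int.down, Vector3Int.left,
        Vector3Int.right, new Vector3Int(0, 0, 1), new Vector3Int(0, 0, -1)
    };

    void BuildCube(Vector3Int cell)
    {
        foreach (var dir in Directions)
        {
            // A face shared with a solid neighbour can never be seen; skipping
            // it removes the hidden geometry that z-fights along the seams.
            if (IsSolid(cell + dir)) continue;
            AddFace(cell, dir);
        }
    }
}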

Find angle of face under mouse pointer in Unity 3d

I have a Projector component, and I need to find the angle the projected texture falls at, so I can exclude projection onto vertical faces.
My projector sits under the mouse pointer and works fine when it is over a horizontal face:
I would like the projector to switch off on vertical faces to avoid this bad effect:
If possible, I would like to do it in the shader code, so the projected image is suppressed even when the cursor sits on the corner of a horizontal face and part of the image 'spills over' onto a vertical face.
I found this solution in C#:
if (Physics.Raycast(MouseRay, out hitInfo))
{
    if (hitInfo.normal.y > 0)
    {
        // draw
    }
    else
    {
        // don't draw
    }
}
But it only works on curved surfaces and not, for example, on the faces of cubes.
How can I do this properly?
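For the shader route, here is a hedged HLSL fragment sketch (not from the answers below): discard fragments whose surface normal is too far from vertical. The catch is that a Unity Projector shader doesn't receive the receiving surface's normal by itself, so i.worldNormal is assumed to be made available somehow; _ProjTex, i.uvProj, and the 0.5 threshold are likewise assumptions.
// Hypothetical fragment program for a projector-style shader.
fixed4 frag (v2f i) : SV_Target
{
    float up = dot(normalize(i.worldNormal), float3(0, 1, 0));
    clip(up - 0.5);   // discards the projection on vertical and steep faces
    return tex2Dproj(_ProjTex, UNITY_PROJ_COORD(i.uvProj));
}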
Normally you would use an image on a quad with TGA transparency, which rotates itself to the face that the middle of the object is aligned to, using a ray to find the hit point and taking its normal.
Other ways of doing it would be quite tricky, perhaps using decals. If you did it with a shader, it would take a lot of time; it's a case of problem solving not being ordered by importance for fast development. Technically you can project a volumetric texture on top of whatever object you are using; that way you can add your barred circle, projected from a point in space towards the object, as a mathematical formula. It takes a while to do; check out volumetric textures. I have written some, and in your case it needs the mouse position sent to the texture and maths to add the transparent zone and red zone to the texture. It takes all day.
It's fine to have a flat circle that flips around when you move the pointer onto a different face; it will just look like a physical card, and it's much easier to code: ten minutes instead of many hours.

OpenGL ES 2. Texture mapping to a quad (not square or rect)?

http://smotr.im/9KwB
There are some pictures:
1) a texture I have
2) a line I have already drawn with a triangle strip
3) the result I need to achieve
The question is: how do I draw a texture inside these segments, which are usually not regular quads?
What do you mean, "usually not regular quads"? Are they quads or not?
If they are quads, just assign texture coordinates normally. The hardware will interpolate correctly.
If they are not quads, why don't you make them quads?
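For the quad case, a minimal GL ES 2 sketch of "assign texture coordinates normally"; the attribute locations a_position and a_texcoord are illustrative names from a hypothetical shader:
// Interleaved position (x,y) and texcoord (s,t) for a triangle-strip quad.
const GLfloat quad[] = {
//    x      y     s     t
    -1.0f, -1.0f, 0.0f, 0.0f,   // bottom-left
     1.0f, -1.0f, 1.0f, 0.0f,   // bottom-right
    -1.0f,  1.0f, 0.0f, 1.0f,   // top-left
     1.0f,  1.0f, 1.0f, 1.0f,   // top-right
};
glVertexAttribPointer(a_position, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), quad);
glVertexAttribPointer(a_texcoord, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), quad + 2);
glEnableVertexAttribArray(a_position);
glEnableVertexAttribArray(a_texcoord);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);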

Problem with glTranslatef

I use glTranslatef to shift the position of a sprite that I load from a texture in my iPhone OpenGL app. My problem is that after I apply glTranslatef, the image appears a little blurred. When I comment out that line of code, the image is crystal clear. How can I resolve this issue?
You're probably not hitting the screen pixel grid exactly, which causes texture filtering to blur the result. The issue is a bit subtle: instead of seeing the screen and the texture as arrays of points, see them as sheets of grid-ruled paper (the texture sheet can be stretched, sheared, and scaled). For things to look crisp, the grids must align perfectly. The texture coordinates (0,0) and (1,1) don't hit the centers of the corner texels but the outer edges of the texture sheet, so you need a small offset and scale to address the texel centers. The same goes for placing the target quads on the screen: the vertex positions must be aligned with the edges of the pixel grid, not the pixel centers. If your projection and modelview matrices are not set up so that one unit in modelview space is one pixel wide and the projection fills the whole screen (or window viewport), it's difficult to get this right.
One normally starts with
glViewport(0,0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, width, 0, height, -1, 1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
// modelview XY range 0..width x 0..height now covers the whole viewport;
// (0,0) doesn't address the lower-left pixel but the lower-left corner of it,
// and (width,height) similarly addresses the upper-right corner
// drawing a 0..width x 0..height quad with texture coordinates 0..1 x 0..1
// will cover it perfectly
This will work as long as the quad has exactly the same dimensions as the texture it shows (i.e. its vertex positions match the texel grid) and the vertex positions are integers.
Now the interesting part: what if they don't meet those conditions? Then aliasing occurs. In GL_NEAREST filtering mode things still look crisp, but some lines/rows are simply missing. In GL_LINEAR filtering mode neighbouring pixels are interpolated, with the interpolation factor determined by how far off-grid they are (in layman's terms; the actual implementation looks slightly different).
So how to solve your issue: draw sprites with a projection/modelview that matches the viewport, use only integer values for the vertex coordinates, and make the texture cover the whole quad. If you use only part of the texture coordinate range, things get even more interesting, since texture coordinates address the texture grid, not the texel centers.
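Put concretely, with the pixel-aligned ortho setup above, the sprite's translation just needs to be rounded to whole pixels before drawing; draw_sprite below is a hypothetical helper:
void draw_sprite(GLfloat x, GLfloat y)
{
    glLoadIdentity();
    // Snap the translation to the pixel grid; a fractional offset is what
    // makes GL_LINEAR filtering smear the sprite. floorf is from <math.h>.
    glTranslatef(floorf(x + 0.5f), floorf(y + 0.5f), 0.0f);
    // ... bind the texture and draw the integer-sized quad here ...
}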
I would also recommend checking your modelview matrix setup and making sure glLoadIdentity() is called before applying the transform, so the matrix stack is clean.