How to do texture mapping in OpenGL ES? (Mapping a 2D face onto a 3D mesh) - iPhone

I need to convert a 2D face image into a 3D image. For this I thought of using texture mapping with OpenGL ES. I have googled a lot for samples but couldn't find any. Can someone guide me on how to do this?
Input: 2D image
Output: 3D image
Platform: iOS

As you know, OpenGL works with 3D or 2D vertices that have a few attributes such as position, normal, color, and texture coordinate. You need to set these values before you can render.
In ES 2.0 you have to pass these values to the vertex shader explicitly,
then hand the texture coordinate and the normal on to the fragment shader,
and in the fragment shader you sample the texture with those values to render your face object.
Since you are working on iOS, these resources should be very helpful:
Explanation:
http://ofps.oreilly.com/titles/9780596804824/chtextures.html
Source Code:
http://www.developers-life.com/iphone-3d-samples.html

Related

RenderSettings customReflection texture has invalid type, 2D given while only CUBE is supported. Custom reflection texture will not be used

So I started a Unity project with a PS1 graphics style, and for something like two days I've been getting this error. The game still works mostly fine, but can someone explain why this is happening? Thanks!
(I'm not using URP.)
Full error:
RenderSettings customReflection texture has invalid type, 2D given while only CUBE is supported. Custom reflection texture will not be used in UnityEditor.EditorApplication:Internal_RestoreLastOpenedScenes ()
In the Texture Import Settings, change Texture Shape from 2D to Cube.
In your Project window, click on your texture; the import settings are shown in the Inspector.
Cube defines the Texture as a cubemap. You could use this for Skyboxes or Reflection Probes, for example. This type is only available with the Default, Normal Map, and Single Channel Texture types.
From the Texture Shape reference in the Texture Importer documentation.
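If you'd rather enforce this from an editor script instead of clicking through the Inspector, here is a minimal sketch using an AssetPostprocessor; the "Assets/Textures/Cubemaps" folder filter and the class name are just made-up examples for illustration:

```csharp
using UnityEditor;

// Editor-only script: place it in an "Editor" folder.
// Forces textures imported from a chosen folder to be imported as cubemaps,
// equivalent to setting Texture Shape to "Cube" in the Inspector.
public class CubemapImportFixer : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        // Hypothetical folder path; adjust to wherever your cubemap textures live.
        if (!assetPath.StartsWith("Assets/Textures/Cubemaps"))
            return;

        TextureImporter importer = (TextureImporter)assetImporter;
        importer.textureShape = TextureImporterShape.TextureCube;
    }
}
```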

Evaluate depth for orthographic camera

I have a post-processing shader. For simplicity, my post-processing shader only shows the _CameraDepthTexture at the given UV. This shader is written in code.
I'm moving to Shader Graph and I want to have a material on all of my objects that achieves exactly the same effect (showing the same depth color), although I can't use the Scene Depth node. How can I generate exactly the same color for my objects in Shader Graph?
As the depth is related to the distance between the camera and the objects, I'm trying to compute the depth like this:
I take the vector (vertex world position - camera world position).
I project this vector onto the camera's forward direction vector.
I remap the length of this projection from (near plane, far plane) to (1, 0).
It looks like my depth is the same as _CameraDepthTexture, but when objects are too close to the camera, they are different (my version is darker).
How can I write a shader, without the Scene Depth node, that generates exactly the same color as _CameraDepthTexture? My camera is orthographic with orthographic size 10.4, near = -50 and far = 50.
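For reference, the computation described in the steps above, written out as a small hypothetical C# helper instead of Shader Graph nodes (the class and method names are made up for this sketch):

```csharp
using UnityEngine;

public static class OrthoDepthUtil
{
    // Sketch of the remap described in the question:
    // project (vertex - camera position) onto the camera's forward axis,
    // then remap that distance from (near, far) to (1, 0).
    public static float EvaluateDepth(Vector3 vertexWorldPos, Camera cam)
    {
        Vector3 toVertex = vertexWorldPos - cam.transform.position;

        // Signed distance along the camera's view direction.
        float viewDistance = Vector3.Dot(toVertex, cam.transform.forward);

        // Linear remap: near -> 1, far -> 0 (reversed, as in the question).
        // No clamping here, to match the plain remap described above.
        float t = (viewDistance - cam.nearClipPlane) /
                  (cam.farClipPlane - cam.nearClipPlane);
        return 1f - t;
    }
}
```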

Unity Point-cloud to mesh with texture/color

I have a point cloud and an RGB texture from a depth camera that fit together. I procedurally created a mesh from a selected part of the point cloud by implementing the 3D quickhull algorithm for mesh creation.
Now, somehow I need to apply the texture that I have to that mesh. Note that there can be multiple selected parts of the point-cloud thus making multiple objects that need the texture. The texture is just a basic 720p file that should be applied to the mesh material.
Basically I have to do this: https://www.andreasjakl.com/capturing-3d-point-cloud-intel-realsense-converting-mesh-meshlab/ but inside Unity. (I'm also using a RealSense camera)
I tried with a decal shader but the result is not precise. The UV map is completely twisted from the creation process, and I'm not sure how to generate a correct one.
(Image: the UVs and the mesh)
I only have two ideas, but I don't really know whether they'll work or how to do them:
Try to create a correct UV map and then wrap the texture around somehow (roughly sketched below).
Somehow bake the colors onto the vertices and then use vertex colors to create the desired effect.
What other things could I try?
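To make the first idea a bit more concrete, here is a rough C# sketch that generates UVs by projecting each mesh vertex into the color camera with a simple pinhole model. The intrinsics (fx, fy, cx, cy) are placeholder values, the real ones come from the RealSense SDK, and the extrinsic offset between the depth and color sensors is ignored here:

```csharp
using UnityEngine;

public static class PointCloudUvMapper
{
    // Hypothetical color-camera intrinsics for a 1280x720 stream; replace with
    // the calibrated values reported by the RealSense SDK.
    const float fx = 920f, fy = 920f;   // focal lengths in pixels
    const float cx = 640f, cy = 360f;   // principal point in pixels
    const float width = 1280f, height = 720f;

    // Assumes mesh vertices are expressed in the color camera's space
    // (z pointing forward, away from the camera).
    public static void AssignProjectedUvs(Mesh mesh)
    {
        Vector3[] vertices = mesh.vertices;
        Vector2[] uvs = new Vector2[vertices.Length];

        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 v = vertices[i];

            // Pinhole projection into pixel coordinates.
            float px = fx * v.x / v.z + cx;
            float py = fy * v.y / v.z + cy;

            // Normalize to 0..1 UVs; the vertical flip may or may not be needed
            // depending on your coordinate conventions.
            uvs[i] = new Vector2(px / width, 1f - py / height);
        }

        mesh.uv = uvs;
    }
}
```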
I'm working on quite a similar problem, but in my case I just want to create a complete mesh from the point cloud, not just a quickhull, because I don't want to lose any depth information.
I'm nearly done with the mesh algorithm (I just need to do some optimizations). The challenging part now is matching the RGB camera's texture to the depth sensor's point cloud, because the two of course have different viewpoints.
Intel RealSense provides an interesting whitepaper about this problem, and as far as I know the SDK corrects for these different perspectives with UV mapping and provides a red/green UV map stream for your shader.
Maybe the short report can help you out. Here's the link. I'm also very interested in what you are doing. Please keep us up to date.
Regards

Unity Geometry Shader: Dynamic Grass

So, I have a functioning voxel engine that creates smoothed terrain in chunks of 1x1x1 meter in a 1024 radius around my player.
I wanted to create a geometry shader that not only continues to texture the ground appropriately, but also creates grass (preferably waving with the wind).
I have found some basic billboard geometry shaders to get me started, but they seem to cause the mesh to stop texturing. Is there any way to do both from one shader?
Do I pass the mesh triangles and the new grass triangles on to the fragment shader with a flag? Thanks in advance!
As it turns out, you can do this by giving the shader two passes. My first pass is a simple surface shader, and my second pass is the geometry shader. The multi-pass approach still runs at 130 FPS, so it seems adequate.

Draw texture on a 4-vertex plane

I have a standard plane created with Unity and replaced its mesh filter (which had 121 triangles, 202 vertices) with a mesh made in Blender that has 2 triangles / 4 vertices.
If I set the material up with a texture, only a very small portion of the texture is drawn on the plane. How can I draw the full texture on the new plane?
You need to adjust your UV mapping so that the 4 vertices cover the whole image. Take a look at this demo file, especially at the UV scene layout.
If a texture shows up that way, it means either the UVs of the imported model are wrong or the texture tiling or offset in the material are wrong.
Instead of importing a mesh for such a simple shape, you can create one procedurally in code, like this: https://github.com/doukasd/Unity-Components/blob/master/ProceduralPlane/Assets/Scripts/Procedural/ProceduralPlane.cs
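For reference, a minimal sketch of such a procedural plane: four vertices, two triangles, with UVs spanning the full 0..1 range so the whole texture is shown (the class name SimpleQuad is just for illustration):

```csharp
using UnityEngine;

[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class SimpleQuad : MonoBehaviour
{
    void Start()
    {
        var mesh = new Mesh();

        // Four corners of a 1x1 plane lying in the XZ plane.
        mesh.vertices = new[]
        {
            new Vector3(0f, 0f, 0f),
            new Vector3(1f, 0f, 0f),
            new Vector3(0f, 0f, 1f),
            new Vector3(1f, 0f, 1f),
        };

        // UVs covering the whole texture, one per vertex.
        mesh.uv = new[]
        {
            new Vector2(0f, 0f),
            new Vector2(1f, 0f),
            new Vector2(0f, 1f),
            new Vector2(1f, 1f),
        };

        // Two triangles, wound so the face points up (+Y).
        mesh.triangles = new[] { 0, 2, 1, 2, 3, 1 };

        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```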