Unity: How to Create a projection such that the local coordinates match the monitor coordinates - unity3d

I'm learning about mesh rendering in Unity.
I followed the documentation and was able to draw a blue quad on the screen.
However, I can't figure out how to render things such that the entire monitor is used.
I think what I'm missing is setting some sort of projection so the two coordinate systems match. But how can I do that? Ideally I would like to do:
x_start = 0;
x_end = W;
y_start = 0;
y_end = H;
And have this quad cover the entire screen.
Here is the code:
using System.IO;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
public class MeshRendererTest : MonoBehaviour
{
// Start is called before the first frame update
static Material mat;
float z;
float W;
float H;
Matrix4x4 projectionMatrix;
void Start(){
W = 2480.0f;
H = 2416.0f;
//projectionMatrix = Matrix4x4.Ortho(0, (int)W, (int)H, 0, -1, 100);
Shader shader = Shader.Find("Hidden/Internal-Colored");
//Shader shader2 = Shader.Find("Standard");
//Shader shader2 = Shader.Find("Unlit/UnlitAlphaWithFade");
mat = new Material(shader);
mat.hideFlags = HideFlags.HideAndDontSave;
//texMat.hideFlags = HideFlags.HideAndDontSave;
// Turn backface culling off
mat.SetInt("_Cull", (int)UnityEngine.Rendering.CullMode.Off);
//texMat.SetInt("_Cull", (int)UnityEngine.Rendering.CullMode.Off);
// Turn off depth writes
mat.SetInt("_ZWrite", 0);
//texMat.SetInt("_ZWrite", 0);
z = 999.0f;
mat.SetColor("_Color", Color.blue);
mat.SetPass(0);
MeshRenderer meshRenderer = gameObject.AddComponent<MeshRenderer>();
if (meshRenderer == null){
Debug.Log("Mesh Renderer is null");
return;
}
//meshRenderer.sharedMaterial = new Material(Shader.Find("Standard"));
meshRenderer.sharedMaterial = mat;
MeshFilter meshFilter = gameObject.AddComponent<MeshFilter>();
if (meshFilter == null){
Debug.Log("Mesh Filter is null");
return;
}
Mesh mesh = new Mesh();
float x_start = 0;
float x_end = x_start + 100;
float y_start = 0;
float y_end = y_start + 600;
Vector3[] vertices = new Vector3[4]
{
new Vector3(x_start, y_start, z),
new Vector3(x_end, y_start, z),
new Vector3(x_start, y_end, z),
new Vector3(x_end, y_end, z)
};
mesh.vertices = vertices;
int[] tris = new int[6]
{
// lower left triangle
0, 2, 1,
// upper right triangle
2, 3, 1
};
mesh.triangles = tris;
Vector3[] normals = new Vector3[4]
{
-Vector3.forward,
-Vector3.forward,
-Vector3.forward,
-Vector3.forward
};
mesh.normals = normals;
Vector2[] uv = new Vector2[4]
{
new Vector2(0, 0),
new Vector2(1, 0),
new Vector2(0, 1),
new Vector2(1, 1)
};
mesh.uv = uv;
meshFilter.mesh = mesh;
}
void OnRenderObject(){
TestStuff();
}
void TestStuff(){
Camera.main.ResetProjectionMatrix();
Matrix4x4 newProj = Matrix4x4.identity;
newProj = newProj * transform.localToWorldMatrix;
newProj = newProj * Camera.main.projectionMatrix;
newProj = newProj * projectionMatrix;
Camera.main.projectionMatrix = newProj;
}
}
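
One approach (a sketch, not from the original thread) is to stop composing matrices by hand and instead give the camera an orthographic projection whose units are pixels. With left = 0, right = screen width, bottom = 0, top = screen height, a quad with local coordinates from (0, 0) to (W, H) at an identity transform fills the view. The component name below is illustrative:

```csharp
using UnityEngine;

public class PixelSpaceCamera : MonoBehaviour
{
    void Start()
    {
        Camera cam = Camera.main;
        cam.orthographic = true;
        // Map camera space directly to monitor pixels:
        // x in [0, Screen.width], y in [0, Screen.height].
        cam.projectionMatrix = Matrix4x4.Ortho(
            0, Screen.width, 0, Screen.height, 0.3f, 1000f);
    }
}
```

Note that the quad's z must fall between the near and far planes passed to `Matrix4x4.Ortho`; with the question's `z = 999` and a far plane of 100 (as in the commented-out line), the quad would be clipped.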

Related

How do I extrude a face of a mesh in Unity

I would like to see an example of how to procedurally extrude a face of a mesh, or parts of a mesh face (if it's tessellated), using Unity's Mesh API and C# scripting.
private static Mesh Testp(Mesh mesh)
{
List<Vector3> vertices = new List<Vector3>(mesh.vertices.ToArray());
List<Vector3> normals = new List<Vector3>(mesh.normals.ToArray());
List<int> triangles = new List<int>(mesh.triangles);
for (var i = 0; i < mesh.triangles.Length; i += 3) // 3 indices per face
{
var faceNormal = CalculateFaceNormal(normals, triangles, i);
if (Math.Abs(Vector3.Dot(Vector3.forward, faceNormal) - (-1f)) < 0.0001f)
{
triangles.Add(vertices.Count);
triangles.Add(vertices.Count+1);
triangles.Add(vertices.Count+2);
vertices.Add(vertices[triangles[i]] + faceNormal * 1f);
vertices.Add(vertices[triangles[i+1]] + faceNormal * 1f);
vertices.Add(vertices[triangles[i+2]] + faceNormal * 1f);
normals.Add(faceNormal);
normals.Add(faceNormal);
normals.Add(faceNormal);
}
}
mesh.Clear();
mesh.vertices = vertices.ToArray();
mesh.normals = normals.ToArray();
mesh.triangles = triangles.ToArray();
return mesh;
}
This is what I've tried so far, but it doesn't really extrude a face; it just creates a new face offset by a value.
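
That matches what the code does: it only adds an offset cap. A true extrusion also needs side quads connecting each edge of the original face to the corresponding edge of the offset copy. A minimal sketch (helper names are hypothetical, winding may need flipping for your mesh):

```csharp
using System.Collections.Generic;
using UnityEngine;

static class ExtrudeSketch
{
    // Extrude one triangle (i0,i1,i2) along its normal by `distance`.
    public static void ExtrudeTriangle(List<Vector3> vertices, List<int> triangles,
                                       int i0, int i1, int i2,
                                       Vector3 normal, float distance)
    {
        int baseIndex = vertices.Count;
        Vector3 offset = normal * distance;
        // Offset cap: copies of the three corners pushed along the normal.
        vertices.Add(vertices[i0] + offset);
        vertices.Add(vertices[i1] + offset);
        vertices.Add(vertices[i2] + offset);
        triangles.Add(baseIndex); triangles.Add(baseIndex + 1); triangles.Add(baseIndex + 2);
        // One side quad (two triangles) per edge of the original face.
        AddSideQuad(triangles, i0, i1, baseIndex, baseIndex + 1);
        AddSideQuad(triangles, i1, i2, baseIndex + 1, baseIndex + 2);
        AddSideQuad(triangles, i2, i0, baseIndex + 2, baseIndex);
    }

    static void AddSideQuad(List<int> triangles, int a, int b, int a2, int b2)
    {
        // a,b: original edge; a2,b2: offset edge. Flip winding if walls face inward.
        triangles.Add(a); triangles.Add(b); triangles.Add(b2);
        triangles.Add(a); triangles.Add(b2); triangles.Add(a2);
    }
}
```

For hard-edged extrusions you would also duplicate the side-wall vertices so they can carry their own normals instead of sharing the cap's.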

Bad usage of Physics.OverlapBox

I don't understand how to use the method Physics.OverlapBox.
I want to put ten walls on a 4x4 plane in my scene. The width and location of each wall are calculated randomly. On odd iterations of the creation loop, the wall is rotated 90 degrees.
The walls placed in the scene should not collide with other walls... but it's not working.
void Reset()
{
int walls = 10;
for (int w = 0; w < walls; w++)
{
for (int i = 0; i < 5; i++)
{
float x = Random.Range(-20f, 20f);
float z = Random.Range(-20f, 20f);
Vector3 center = new Vector3(x, 1.51f, z);
int scalex = Random.Range(4, 13);
Quaternion quaternion = Quaternion.identity;
if (w % 2 == 1)
quaternion=Quaternion.Euler(0, 0, 0);
else
quaternion=Quaternion.Euler(0, 90, 0);
Collider[] colliders = Physics.OverlapBox(center, new Vector3(scalex, 3, 1) / 2, quaternion);
Debug.Log(colliders.Length);
if (colliders.Length == 0)
{
GameObject wall = GameObject.CreatePrimitive(PrimitiveType.Cube);
wall.transform.position = center;
wall.transform.localScale = new Vector3(scalex, 3, 1);
wall.transform.rotation = quaternion;
wall.tag = "wall";
break;
}
}
}
}
After your comment I think I now know what the issue is:
The physics engine simply doesn't "know" about your walls' colliders yet, since the physics engine is only updated on the next FixedUpdate.
You might want to call Physics.Simulate after each wall in order to manually trigger a physics update.
The docs are a bit unclear on whether it is necessary, but you might have to disable Physics.autoSimulation before the loop and turn it back on after you are finished with the wall creation.
The solution proposed by derHugo (many thanks) works from the Start method. I added rotation to the walls. No collisions.
The code:
public void Start()
{
Physics.autoSimulation = false;
for (int i = 0; i < 200; i++)
{
createWall();
Physics.Simulate(Time.fixedDeltaTime);
}
Physics.autoSimulation = true;
}
void createWall()
{
float x = Random.Range(-20f, 20f);
float z = Random.Range(-20f, 20f);
Vector3 position = new Vector3(x, 1.51f, z);
Quaternion rotation = Quaternion.Euler(0, Random.Range(0,360), 0);
int scalex = Random.Range(2, 5);
Collider[] colliders = Physics.OverlapBox(position, new Vector3(scalex, 1, 1)/2, rotation);
if (colliders.Length==0)
{
GameObject wall = GameObject.CreatePrimitive(PrimitiveType.Cube);
wall.transform.localPosition = position;
wall.transform.localScale = new Vector3(scalex, 3, 1);
wall.transform.rotation = rotation;
wall.tag = "wall";
}
}

Sewing up two meshes

Good afternoon! I'm trying to stitch two meshes together. I do it as follows: first I convert the sprite into a mesh, then I duplicate the resulting mesh, shift it along the z axis, invert it, and then stitch the two together. But I ran into a problem: it stitches rectangular meshes well, but on circular meshes there are defects along the sides. How can I stitch up these sides? (Materials and code below.)
public class ConvertSpriteInMesh : MonoBehaviour
{
public Sprite sprite;
private MeshDraft meshDraft = new MeshDraft();
private Mesh mesh;
void Start()
{
GetComponent<MeshFilter>().mesh = SpriteToMesh(sprite);
SewingUp();
}
/// <summary>
/// Stitch the two mesh halves together
/// </summary>
private void SewingUp()
{
mesh = GetComponent<MeshFilter>().mesh;
meshDraft = new MeshDraft(mesh);
int leftVertical = mesh.vertices.Length / 2; // getting the beginning of the left vertical of the mesh
int index = mesh.vertices.Length;
for (int i = 0; i < leftVertical - 1; i++)
{
meshDraft.AddQuad(mesh.vertices[i], mesh.vertices[i+1], mesh.vertices[i + leftVertical + 1],mesh.vertices[i+leftVertical],
index);
index += 4;
}
GetComponent<MeshFilter>().mesh = meshDraft.ToMesh(); // assign the resulting mesh
}
/// <summary>
/// Convert Sprite to Mesh
/// </summary>
/// <param name="_sprite"></param>
/// <returns></returns>
private Mesh SpriteToMesh(Sprite _sprite)
{
// declaring variables
Mesh mesh = new Mesh();
Vector3[] _verticles;
int[] _triangle;
// assigning values
_verticles = Array.ConvertAll(_sprite.vertices, i => (Vector3)i);
_triangle = Array.ConvertAll(_sprite.triangles, i => (int)i);
// changing the size of the array
Array.Resize(ref _verticles, _verticles.Length * 2);
Array.Resize(ref _triangle, _triangle.Length * 2);
// adding another side
for (int i = 0; i < _verticles.Length / 2; i++)
{
_verticles[_verticles.Length / 2 + i] = new Vector3(_verticles[i].x, _verticles[i].y, 0.5f);
}
for (int i = 0; i < _triangle.Length / 2; i++)
{
_triangle[_triangle.Length / 2 + i] = _triangle[i] + (_verticles.Length / 2);
}
// invert the second side
for(int i = _triangle.Length / 2; i < _triangle.Length; i += 3) {
var temp = _triangle[i];
_triangle[i] = _triangle[i + 1];
_triangle[i + 1] = temp;
}
// assigning the mesh
mesh.vertices = _verticles;
mesh.triangles = _triangle;
mesh.RecalculateBounds();
mesh.RecalculateNormals();
return mesh;
}
}
public partial class MeshDraft {
public string name = "";
public List<Vector3> vertices = new List<Vector3>();
public List<int> triangles = new List<int>();
public List<Vector3> normals = new List<Vector3>();
public List<Vector4> tangents = new List<Vector4>();
public List<Vector2> uv = new List<Vector2>();
public List<Vector2> uv2 = new List<Vector2>();
public List<Vector2> uv3 = new List<Vector2>();
public List<Vector2> uv4 = new List<Vector2>();
public List<Color> colors = new List<Color>();
public MeshDraft(Mesh mesh) {
name = mesh.name;
vertices.AddRange(mesh.vertices);
triangles.AddRange(mesh.triangles);
normals.AddRange(mesh.normals);
tangents.AddRange(mesh.tangents);
uv.AddRange(mesh.uv);
uv2.AddRange(mesh.uv2);
uv3.AddRange(mesh.uv3);
uv4.AddRange(mesh.uv4);
colors.AddRange(mesh.colors);
}
public void AddQuad(Vector3 v0, Vector3 v1, Vector3 v2, Vector3 v3, int index, Color color = default(Color)) {
vertices.Add(v0);
vertices.Add(v1);
vertices.Add(v2);
vertices.Add(v3);
Vector3 normal0 = Vector3.Cross(v2 - v1, v3 - v1).normalized;
Vector3 normal1 = Vector3.Cross(v1 - v0, v2 - v0).normalized;
normals.Add(normal0);
normals.Add(normal0);
normals.Add(normal1);
normals.Add(normal1);
colors.Add(color);
colors.Add(color);
colors.Add(color);
colors.Add(color);
triangles.Add(index);
triangles.Add(index + 1);
triangles.Add(index + 2);
triangles.Add(index);
triangles.Add(index + 2);
triangles.Add(index + 3);
}
public Mesh ToMesh() {
var mesh = new Mesh { name = name };
mesh.SetVertices(vertices);
mesh.SetTriangles(triangles, 0);
mesh.SetNormals(normals);
mesh.SetTangents(tangents);
mesh.SetUVs(0, uv);
mesh.SetUVs(1, uv2);
mesh.SetUVs(2, uv3);
mesh.SetUVs(3, uv4);
mesh.SetColors(colors);
return mesh;
}
Successful stitching (screen)
Bad stitching (screen)
I was given an answer on another forum; for anyone interested, here is the link - https://www.cyberforum.ru/unity/thread2823987.html
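
One likely cause of the side defects (an observation, not from the thread): `SewingUp` assumes consecutive vertex indices form the outline, which holds for a simple quad sprite but not for a triangulated circular sprite. A more robust sketch is to collect the boundary edges (edges referenced by exactly one triangle) of the front half and stitch each one to its offset twin; `frontCount` below is assumed to be the vertex count of the front side:

```csharp
using System.Collections.Generic;

static class BoundarySketch
{
    // Return all edges that belong to exactly one triangle (the outline).
    public static List<(int, int)> FindBoundaryEdges(int[] triangles)
    {
        var edgeUse = new Dictionary<(int, int), int>();
        for (int i = 0; i < triangles.Length; i += 3)
        {
            AddEdge(edgeUse, triangles[i], triangles[i + 1]);
            AddEdge(edgeUse, triangles[i + 1], triangles[i + 2]);
            AddEdge(edgeUse, triangles[i + 2], triangles[i]);
        }
        var boundary = new List<(int, int)>();
        foreach (var kv in edgeUse)
            if (kv.Value == 1) boundary.Add(kv.Key);
        return boundary;
    }

    static void AddEdge(Dictionary<(int, int), int> edgeUse, int a, int b)
    {
        // Sort endpoints so (a,b) and (b,a) count as the same edge.
        var key = a < b ? (a, b) : (b, a);
        edgeUse.TryGetValue(key, out int n);
        edgeUse[key] = n + 1;
    }
}
```

For each boundary edge (a, b) of the front half you would then add a quad over vertices a, b, b + frontCount, a + frontCount, which closes the sides regardless of the sprite's shape.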

Mesh is getting black on change

I have written a small script which moves vertices down on mouse click. The problem is that the mesh turns black in some places when it changes.
Here's the code
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class MinerScript : MonoBehaviour
{
public MeshFilter filter;
public MeshCollider col;
public GameObject Player;
public float mineSpeed = 200;
Mesh mesh;
Vector3[] vertices;
void Update()
{
filter = transform.parent.GetComponent<PlayerGravity>().mesh;
mesh = filter.GetComponent<MeshFilter>().sharedMesh;
vertices = mesh.vertices;
Ray ray = new Ray(transform.position, transform.forward * 2);
if (Physics.Raycast(ray))
{
if (Input.GetMouseButton(1))
{
int index = Mathf.RoundToInt(ClosestIndexToPoint(ray));
vertices[index] += (-Player.transform.eulerAngles) * Time.deltaTime * mineSpeed;
mesh.vertices = vertices;
mesh.RecalculateBounds();
mesh.RecalculateNormals();
Color[] colors = new Color[mesh.vertices.Length];
colors[index] = Color.red;
mesh.SetColors(colors);
//col.sharedMesh = mesh;
}
}
}
public float ClosestIndexToPoint(Ray ray)
{
if (Physics.Raycast(ray, out RaycastHit hit))
{
Mesh m = hit.transform.GetComponent<MeshFilter>().sharedMesh;
col = hit.transform.GetComponent<MeshCollider>();
int[] tri = new int[3] {
m.triangles[hit.triangleIndex * 3 + 0],
m.triangles[hit.triangleIndex * 3 + 1],
m.triangles[hit.triangleIndex * 3 + 2],
};
float closestDistance = Vector3.Distance(m.vertices[tri[0]], hit.point);
int closestVertexIndex = tri[0];
for (int i = 0; i < tri.Length; i++)
{
float dist = Vector3.Distance(m.vertices[tri[i]], hit.point);
if (dist < closestDistance)
{
closestDistance = dist;
closestVertexIndex = tri[i];
}
}
return closestVertexIndex;
}
else
return -1;
}
I don't think the problem is the color, as I've also tried applying a texture to it.
About the code:
It casts a ray, and if it hits any mesh, it calls the function "ClosestIndexToPoint", which returns a value of type float. This function gets the triangle the ray hits, and then finds the closest vertex to the hit point. Then it just moves it down (or along the rotation of our Player object, because I have a non-flat mesh).
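
Two hedged observations on the code above. First, `-Player.transform.eulerAngles` is a rotation in degrees, not a direction vector, so the displacement it produces is essentially arbitrary; a direction such as `-Player.transform.up` may be what was intended. Second, if the material's shader reads vertex colors at all, note that `new Color[n]` defaults every element to transparent black, so `SetColors` paints every vertex except `index` black. A sketch of a safer version of that section:

```csharp
// Sketch: preserve existing vertex colors, or fill with white,
// instead of the transparent-black default of new Color[n].
Color[] colors = mesh.colors.Length == mesh.vertexCount
    ? mesh.colors
    : new Color[mesh.vertexCount];
if (mesh.colors.Length != mesh.vertexCount)
    for (int i = 0; i < colors.Length; i++)
        colors[i] = Color.white;
colors[index] = Color.red;
mesh.SetColors(colors);
```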

how to apply texture on 3d model, dynamically?

I have a file containing arrays of vertices, indices, UVs, textures and so on; in short, everything needed to draw a model with a texture on it. For example, it should produce a 3D cube with a wood-like texture.
So, what I have for now is: I can present the vertices of the cube (I see my model), but I don't know how to apply a texture to it.
Here is my code -
public void Start()
{
m_stream = DecoderAPI.create_stream_decoder_obj();
string pathToFile = "path_to_my_file";
bool isInitialized = DecoderAPI.stream_init_model(m_stream, pathToFile);
if (isInitialized)
{
m_curFrame = DecoderAPI.stream_get_frame_obj(m_stream, 1);
MeshRenderer meshRenderer = gameObject.AddComponent<MeshRenderer>();
meshRenderer.sharedMaterial = new Material(Shader.Find("Standard"));
Mesh mesh = new Mesh();
//Vertices***
int vertexCount = DecoderAPI.frame_get_vertex_count(m_curFrame);
int xyzArrSize = vertexCount * 3;
float[] xyzArray = new float[xyzArrSize];
IntPtr xyz = DecoderAPI.frame_get_vertex_xyz(m_curFrame);
Marshal.Copy(xyz, xyzArray, 0, xyzArrSize);
Vector3[] vertices = new Vector3[vertexCount];
for (int i = 0; i < vertexCount; i++)
{
vertices[i] = new Vector3(xyzArray[i * 3], xyzArray[i * 3 + 1], xyzArray[i * 3 + 2]);
}
mesh.vertices = vertices;
//***
//Faces***
int faceCount = DecoderAPI.frame_face_count(m_curFrame);
int trisArrSize = faceCount * 3;
int[] tris = new int[trisArrSize];
IntPtr indices = DecoderAPI.frame_face_indices(m_curFrame);
Marshal.Copy(indices, tris, 0, trisArrSize);
mesh.triangles = tris;
//***
mesh.RecalculateNormals();
MeshFilter meshFilter = gameObject.AddComponent<MeshFilter>();
meshFilter.mesh = mesh;
//TEXTURE ****
int uvCount = DecoderAPI.frame_get_uv_count(m_curFrame);
IntPtr uvData = DecoderAPI.frame_get_uv_data(m_curFrame);
IntPtr textureObj = DecoderAPI.frame_get_texture_obj(m_curFrame);
DecoderAPI.TextureInfo textureInfo = DecoderAPI.texture_get_info(textureObj);
int width = textureInfo.width;
int height = textureInfo.height;
int channels = textureInfo.channels;
int stride = textureInfo.stride;
DecoderAPI.ColorType color_type = textureInfo.color_type;
IntPtr pixels = textureInfo.pixels;
// HOW TO APPLY THIS TEXTURE DATA TO MY MODEL?
//***
DecoderAPI.frame_release(m_curFrame);
}
}
I found this answer - https://answers.unity.com/questions/390878/how-do-i-apply-a-texture-to-a-3d-model.html
but I need to know how to apply it dynamically.
Any suggestions? Or maybe some links to tutorials?
EDIT
public void Start()
{
m_stream = DecoderAPI.create_stream_decoder_obj();
string pathToFile = "my_path_to_file";
bool isInitialized = DecoderAPI.stream_init_model(m_stream, pathToFile);
if (isInitialized)
{
m_curFrame = DecoderAPI.stream_get_frame_obj(m_stream, 1);
MeshRenderer meshRenderer = gameObject.AddComponent<MeshRenderer>();
meshRenderer.sharedMaterial = new Material(Shader.Find("Standard"));
Mesh mesh = new Mesh();
//Vertices***
int vertexCount = DecoderAPI.frame_get_vertex_count(m_curFrame);
int xyzArrSize = vertexCount * 3;
float[] xyzArray = new float[xyzArrSize];
IntPtr xyz = DecoderAPI.frame_get_vertex_xyz(m_curFrame);
Marshal.Copy(xyz, xyzArray, 0, xyzArrSize);
Vector3[] vertices = new Vector3[vertexCount];
for (int i = 0; i < vertexCount; i++)
{
vertices[i] = new Vector3(xyzArray[i * 3], xyzArray[i * 3 + 1], xyzArray[i * 3 + 2]);
}
mesh.vertices = vertices;
//***
//Faces***
int faceCount = DecoderAPI.frame_face_count(m_curFrame);
int trisArrSize = faceCount * 3;
int[] tris = new int[trisArrSize];
IntPtr indices = DecoderAPI.frame_face_indices(m_curFrame);
Marshal.Copy(indices, tris, 0, trisArrSize);
mesh.triangles = tris;
//***
mesh.RecalculateNormals();
//UV***
int uvCount = DecoderAPI.frame_get_uv_count(m_curFrame);
IntPtr uvData = DecoderAPI.frame_get_uv_data(m_curFrame);
int uvArrSize = uvCount * 2;
float[] uvArr = new float[uvArrSize];
Vector2[] uv = new Vector2[uvCount];
Marshal.Copy(uvData, uvArr, 0, uvArrSize);
for(int i = 0; i < uvCount; i++)
{
uv[i] = new Vector2(uvArr[i * 2], uvArr[i * 2 + 1]);
}
mesh.uv = uv;
//***
MeshFilter meshFilter = gameObject.AddComponent<MeshFilter>();
meshFilter.mesh = mesh;
//TEXTURE ****
IntPtr textureObj = DecoderAPI.frame_get_texture_obj(m_curFrame);
DecoderAPI.TextureInfo textureInfo = DecoderAPI.texture_get_info(textureObj);
int width = textureInfo.width;
int height = textureInfo.height;
int channels = textureInfo.channels;
int stride = textureInfo.stride;
DecoderAPI.ColorType color_type = textureInfo.color_type;
IntPtr pixels = textureInfo.pixels;
Texture2D texture = new Texture2D(width, height);
texture.LoadRawTextureData(pixels, width * channels *height);
texture.Apply();
meshRenderer.material.SetTexture("_MainText", texture);
//***
DecoderAPI.frame_release(m_curFrame);
}
}
But now I am getting this error:
UnityException: LoadRawTextureData: not enough data provided (will result in overread).
UnityEngine.Texture2D.LoadRawTextureData (System.IntPtr data, System.Int32 size) (at <a9810827dce3444a8e5c4e9f3f5e0828>:0)
Model.Start () (at Assets/Scripts/Model.cs:98)
What am I doing wrong?
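
One plausible cause of that exception (an assumption based on the variables in the question, not a confirmed fix): `new Texture2D(width, height)` defaults to the RGBA32 format, which expects 4 bytes per pixel, so if `channels` is 3 the buffer passed is smaller than the texture expects. A sketch that matches the format to the channel count, assuming tightly packed 8-bit channels:

```csharp
// Pick a format matching the decoder's channel count (assumed 8-bit channels).
TextureFormat format = channels == 3 ? TextureFormat.RGB24 : TextureFormat.RGBA32;
Texture2D texture = new Texture2D(width, height, format, false);
// stride is assumed to be bytes per row, i.e. width * channels with no padding.
int byteCount = stride * height;
texture.LoadRawTextureData(pixels, byteCount);
texture.Apply();
// The Standard shader's main texture property is "_MainTex", not "_MainText".
meshRenderer.material.SetTexture("_MainTex", texture);
```

If the decoder pads rows (stride > width * channels), the buffer would need to be repacked row by row before loading.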
First off, your code to construct the cube is missing the UV part, so even if you assign the texture to the material the result is undefined. Look at the code samples on the Mesh manual page about adding the UVs as well: https://docs.unity3d.com/ScriptReference/Mesh.html
Once you have the UVs, all you have to do is set the texture using SetTexture (see https://docs.unity3d.com/ScriptReference/Material.SetTexture.html).
On a separate note, in your code you are using sharedMaterial instead of material: that is not advisable unless you have many objects all using the same material and you want to change them all.
EDIT:
To get a texture from a pixels buffer you create a Texture2D object of the given size and colour type, then you apply the data like this:
myTexture.LoadRawTextureData(myPixels);
myTexture.Apply();
~Pino
If you have a texture and a MeshRenderer, it works like this:
void SetYourTexture()
{
MeshRenderer yourMeshRenderer = GetComponent<MeshRenderer>();
//If you only need one texture on the material, Unity understands _MainTex as the mainTexture;
yourMeshRenderer.material.SetTexture("_MainTex", yourTexture);
}