I'm a bit stuck with a Tango device. I want to export a mesh built with the Dynamic Mesh prefab in Unity. I have seen the Tango3DRExtractWholeMesh function, but it's not working for me.
I have declared the output variables to fill, and then I call the function, saving the status in another variable. Something like this:
Vector3[] verts;
Vector3[] normals;
Color32[] colors;
int[] indices;
int numV, numT;
Tango3DReconstruction.Status status = m_tangoApplication.Tango3DRExtractWholeMesh(verts, normals, colors, indices, out numV, out numT);
But Tango does not do anything at this point. I have checked the Tango Manager parameters and activated all the 3D reconstruction options.
Should I do something else?
I know you got it working, but for other people:
Use TangoApplication.Tango3DRExtractWholeMesh()
from:
https://developers.google.com/project-tango/apis/unity/unity-meshing#tango_application_settings
Don't forget to initialize your arrays (vertices, normals, triangles and colors) so they are big enough to contain the data from the mesh; then it works.
The problem is we don't yet know what size to initialize the arrays with.
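In the meantime, a minimal sketch of the over-allocation workaround, using the same call as in the question; the capacity constants here are guesses, not documented limits:
// Over-allocate with a guessed capacity (assumption, not a documented limit).
const int MAX_VERTICES = 500000;
Vector3[] verts = new Vector3[MAX_VERTICES];
Vector3[] normals = new Vector3[MAX_VERTICES];
Color32[] colors = new Color32[MAX_VERTICES];
int[] indices = new int[MAX_VERTICES * 3];
int numV, numT;
Tango3DReconstruction.Status status = m_tangoApplication.Tango3DRExtractWholeMesh(
    verts, normals, colors, indices, out numV, out numT);
// numV and numT report how many vertices/triangles were actually written;
// if the status is not a success, retry with larger arrays.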
Everything is fixed in the Mira release, which has new exporting functions. It is now possible to export the mesh as an .obj model.
I'm trying to create a shader that tiles a texture on an object, but I am running into an issue. When trying to tile with this Shader Graph, it looks good from the front but looks wrong from the side. This is a 3x3x1 object, and all of the squares should be the same size. Thanks in advance.
I think it's because you're trying to input a Vector3 into a UV input node, which only takes a Vector2, so it's not using the Z component, which is the dimension that isn't working for you.
I haven't looked too deeply into it, but maybe this script will let you accomplish tiling in all three dimensions? https://github.com/Dsphar/Cube_Texture_Auto_Repeat_Unity/blob/master/ReCalcCubeTexture.cs
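For reference, a minimal sketch in the spirit of that script (the class and field names are mine, not from the repo): reassign each vertex's UV from the two axes perpendicular to its normal's dominant axis, so the texture tiles at the same density on every face:
using UnityEngine;

// Recompute UVs so a texture tiles uniformly on all three axes.
// This is a simple per-face planar mapping, not full triplanar blending.
[RequireComponent(typeof(MeshFilter))]
public class WorldScaleUVs : MonoBehaviour
{
    public float tilesPerUnit = 1f; // assumed tiling density

    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] vertices = mesh.vertices;
        Vector3[] normals = mesh.normals;
        Vector2[] uvs = new Vector2[vertices.Length];

        for (int i = 0; i < vertices.Length; i++)
        {
            // Scale by lossyScale so a 3x3x1 object gets 3x3 and 3x1 tiles.
            Vector3 v = Vector3.Scale(vertices[i], transform.lossyScale);
            Vector3 n = normals[i];

            // Project onto the plane perpendicular to the dominant normal axis.
            if (Mathf.Abs(n.x) >= Mathf.Abs(n.y) && Mathf.Abs(n.x) >= Mathf.Abs(n.z))
                uvs[i] = new Vector2(v.z, v.y) * tilesPerUnit; // X-facing face
            else if (Mathf.Abs(n.y) >= Mathf.Abs(n.z))
                uvs[i] = new Vector2(v.x, v.z) * tilesPerUnit; // Y-facing face
            else
                uvs[i] = new Vector2(v.x, v.y) * tilesPerUnit; // Z-facing face
        }

        mesh.uv = uvs;
    }
}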
I have a point cloud and an RGB texture from a depth camera that fit together. I procedurally created a mesh from a selected part of the point cloud by implementing the quickhull 3D algorithm for mesh creation.
Now, somehow I need to apply the texture that I have to that mesh. Note that there can be multiple selected parts of the point cloud, thus making multiple objects that need the texture. The texture is just a basic 720p file that should be applied to the mesh material.
Basically I have to do this: https://www.andreasjakl.com/capturing-3d-point-cloud-intel-realsense-converting-mesh-meshlab/ but inside Unity. (I'm also using a RealSense camera)
I tried with a decal shader but the result is not precise. The UV map is completely twisted from the creation process, and I'm not sure how to generate a correct one.
[Image: the UV map and the mesh]
I only have two ideas but don't really know if they'll work/how to do them.
1. Try to create a correct UV map and then wrap the texture around it somehow.
2. Somehow bake colors to vertices and then use vertex colors to create the desired effect.
What other things could I try?
I'm working on quite a similar problem, but in my case I just want to create a complete mesh from the point cloud, not just a quickhull, because I don't want to lose any depth information.
I'm nearly done with the mesh algorithm (I just need to do some optimizations). What's quite challenging now is matching the RGB camera's texture to the depth sensor's point cloud, because they of course have different viewports.
Intel RealSense provides an interesting whitepaper about this problem, and as far as I know the SDK corrects these different perspectives with UV mapping and provides a red/green UV map stream for your shader.
Maybe the short report can help you out. Here's the link. I'm also very interested in what you are doing. Please keep us up to date.
Regards
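In case it helps, here is a minimal sketch of the projection idea behind that kind of correction: compute a UV per vertex by projecting it into the color camera with a pinhole model. The intrinsics (fx, fy, cx, cy) are placeholders here; read the real values from the RealSense SDK, and the vertices are assumed to already be in the color camera's coordinate space:
using UnityEngine;

public static class ProjectiveUVs
{
    // fx, fy, cx, cy: color camera intrinsics (placeholders; query the SDK).
    public static void Apply(Mesh mesh, float fx, float fy, float cx, float cy,
                             int imageWidth, int imageHeight)
    {
        Vector3[] vertices = mesh.vertices;
        Vector2[] uvs = new Vector2[vertices.Length];

        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 v = vertices[i];
            if (v.z <= 0f) continue; // behind the camera; leave UV at zero

            // Pinhole projection to pixel coordinates.
            float px = (v.x / v.z) * fx + cx;
            float py = (v.y / v.z) * fy + cy;

            // Normalize to 0..1 and flip v (image rows run top-down,
            // Unity UVs run bottom-up).
            uvs[i] = new Vector2(px / imageWidth, 1f - py / imageHeight);
        }

        mesh.uv = uvs;
    }
}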
I was looking at this tutorial: https://docs.unity3d.com/Manual/InverseKinematics.html
In that tutorial, they change the body position of the hands, head, etc., by setting a target object to hold or look at.
In this project: https://hackaday.com/2016/01/23/amazing-imu-based-motion-capture-suit-turns-you-into-a-cartoon/
The guy accesses the Blender API and directly sets the transforms of several bones.
Is it possible to do the same in Unity? I do not need any assistance with getting data from sensors, etc. I'm just looking for information on the equivalent API in Unity to directly set the orientation of specific body parts of a skeleton at runtime.
You are probably looking for SkinnedMeshRenderer.
When you import a model from some 3D software, such as Blender, it will have a SkinnedMeshRenderer component.
What you would want to check out is SkinnedMeshRenderer.bones, which gets you the array of bones (as an array of Transform) used to control its pose. You can modify its elements, thus affecting the pose. So you can do stuff like this:
// Grab the bone Transforms driving this skinned mesh.
var bones = this.GetComponent<SkinnedMeshRenderer>().bones;
// Rotate the first bone by 45 degrees around its local Y axis.
bones[0].localRotation = bones[0].localRotation * Quaternion.Euler(0f, 45f, 0f);
Just play around with it, it is the best way to see.
For more advanced manipulations, you can also assign your own array of bones, or drive blend shapes with SetBlendShapeWeight / GetBlendShapeWeight, but this is probably more than what you need.
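One caveat worth adding (general Unity behavior, not specific to this answer): if an Animator is also animating the rig, it rewrites the bone transforms every frame, so manual overrides are usually applied in LateUpdate, after the animation pass. A minimal sketch, with the bone index and rotation source as placeholders:
using UnityEngine;

public class BoneOverride : MonoBehaviour
{
    public int boneIndex = 0;                                // which bone to drive
    public Quaternion sensorRotation = Quaternion.identity;  // e.g. fed from an IMU

    private Transform[] bones;

    void Start()
    {
        bones = GetComponent<SkinnedMeshRenderer>().bones;
    }

    // LateUpdate runs after animation, so this override wins over the Animator.
    void LateUpdate()
    {
        bones[boneIndex].localRotation = sensorRotation;
    }
}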
I am currently developing an asteroid mining/exploration game with fully deformable, smooth voxel terrain using marching cubes in Unity 3D. I want to implement an "element ID" system that is kind of similar to Minecraft's, as in each type of material has a unique integer ID. This is simple enough to generate, but right now I am trying to figure out a way to render it, so that each individual face represents the element its voxel is assigned to. I am currently using a triplanar shader with a texture array, and I have gotten it set up to work with pre-set texture IDs. However, I need to be able to pass in the element IDs into this shader for the entire asteroid, and this is where my limited shader knowledge runs out. So, I have two main questions:
1. How do I get data from a 3D array in an active script to my shader, or otherwise how can I sample points from this array?
2. Is there a better/more efficient way to do this? I thought about creating an array with only the surface vertices and their corresponding IDs, but then I would have trouble sampling them correctly. I also thought about possibly bundling an extra variable in with the vertices themselves, but I don't know if this is even possible. I appreciate any ideas, thanks.
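On the second question: bundling an extra value in with the vertices is possible in Unity through a spare UV channel. A minimal sketch, assuming one element ID per vertex coming out of the marching cubes step (the names are illustrative):
using UnityEngine;
using System.Collections.Generic;

public static class ElementIdMesh
{
    // elementIds must have one entry per vertex of the mesh.
    public static void Apply(Mesh mesh, int[] elementIds)
    {
        var uvs = new List<Vector2>(elementIds.Length);
        for (int i = 0; i < elementIds.Length; i++)
            uvs.Add(new Vector2(elementIds[i], 0f)); // ID stored in the x component

        // UV channel 2 (TEXCOORD2 in the shader) is usually free; the
        // triplanar shader can read it and use x to index the texture array.
        mesh.SetUVs(2, uvs);
    }
}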
As you can see, the pivot is outside my mesh. I want to use the pivot for rotating around; for that, I need to set the pivot to the real "rotating point". I know that there's the SetPivot script, but it only works with pivots inside meshes.
This mesh is part of an object which contains several meshes; I created it with Wings3D. The problem appears with both .obj and .3ds as file extensions.
1. How can I fix this?
2. Is there a possibility to define a second pivot which can be used in scripts to "rotate around" (maybe a Vector3 which can be set in "Designer")?
I found a YouTube video which might help. I don't know whether it will help, though.
Does THIS help?
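On question 2, a script-only workaround (separate from the links above) is Transform.RotateAround, which rotates an object around an arbitrary world-space point without touching the mesh's real pivot. A minimal sketch, with the pivot exposed as a Vector3 you can set in the Inspector:
using UnityEngine;

public class RotateAroundPoint : MonoBehaviour
{
    public Vector3 pivotPoint;          // the "second pivot", set in the Inspector
    public Vector3 axis = Vector3.up;   // rotation axis
    public float degreesPerSecond = 45f;

    void Update()
    {
        transform.RotateAround(pivotPoint, axis, degreesPerSecond * Time.deltaTime);
    }
}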