How to add ILContourPlot to custom Z-axis location in ILPlotCube - ilnumerics

I have three 2D arrays that all share the same X and Y coordinates but hold different values.
Each array represents the temperature of a room at a certain height (e.g. array1 = 10 m, array2 = 5 m, array3 = 3 m).
I have created an ILPlotCube with three ILContourPlots for these arrays, but all of them get positioned at the same Z position of the cube (0):
this.scene = new ILScene();
ILPlotCube pc = new ILPlotCube(twoDMode: false);
ILArray<float> TA = tempRoom.GetILArray("TA");
ILContourPlot cpa = new ILContourPlot(TA, create3D: false);
ILArray<float> TB = tempRoom.GetILArray("TB");
ILContourPlot cpb = new ILContourPlot(TB, create3D: false);
ILArray<float> TC = tempRoom.GetILArray("TC");
ILContourPlot cpc = new ILContourPlot(TC, create3D: false);
pc.Add(cpa);
pc.Add(cpb);
pc.Add(cpc);
scene.Add(pc);
ilPanel1.Scene = this.scene;
ilPanel1.Refresh();
Image of result: http://i.stack.imgur.com/VNLgB.jpg
How can I manually set the range of the cube's Z-axis shown in the picture, and manually set the Z position of each ILContourPlot, without messing up the contours in the ILContourPlots?

Contour plots in ILNumerics are regular scene graph objects - just like every other plot. You can transform them in arbitrary ways using regular group objects:
ILArray<float> A = ILMath.tosingle(ILSpecialData.terrain);
ilPanel1.Scene.Add(new ILPlotCube(twoDMode: false) {
    new ILGroup(translate: new Vector3(0, 0, 10)) {
        new ILContourPlot(A["0:50;0:50"])
    },
    new ILGroup(translate: new Vector3(0, 0, 5)) {
        new ILContourPlot(A["50:100;50:100"], lineWidth: 3)
    },
    new ILGroup(translate: new Vector3(0, 0, 3)) {
        new ILContourPlot(A["150:200;150:200"])
    }
});
Gives:
But if you use this in a 2D setup it gets confusing quickly. You will have to use the configuration options for lines and contour levels to distinguish the individual contour plots, or switch to a 3D view.

Related

Optimizing Unity mesh generation | Faster than SetVertexBufferData/ SetIndexBufferData

I’m trying to squeeze every bit of performance out of some voxel mesh generation, and at the moment it takes about 1.1 ms per chunk (average measurements):
~0.04 ms for vertex, normal, and triangle index calculation in a Burst job
~0.61 ms for the SetVertexBufferData call
~0.45 ms for the SetIndexBufferData call
Is there a better way to update a mesh than this? Maybe one that works inside a Burst job? I know 1.1 ms is pretty fast, but I can't help feeling I'm missing out on performance when the actual data generation is an order of magnitude faster.
Here's what I have:
// Input heightmap
// arr is an int[] created from perlin noise
NativeArray<int> heights = new NativeArray<int>(arr, Allocator.TempJob);
int numVerts = heights.Length * 20;
// VData is the struct: { float3 Vert; float3 Norm; }
NativeArray<VData> verts = new NativeArray<VData>(numVerts, Allocator.TempJob);
// Triangle indices
NativeArray<ushort> tris = new NativeArray<ushort>(heights.Length * 30, Allocator.TempJob);
// create Verts, Tris, and Norms in IJob
Job job = new Job
{
Heights = heights,
Verts = verts,
Tris = tris
};
// Calculate the values
job.Schedule().Complete();
int indices = heights.Length * 30;
Mesh mesh = new Mesh();
mesh.SetVertexBufferParams(numVerts,
new VertexAttributeDescriptor(VertexAttribute.Position),
new VertexAttributeDescriptor(VertexAttribute.Normal)
);
// slow
mesh.SetVertexBufferData(verts, 0, 0, numVerts, 0, MeshUpdateFlags.DontValidateIndices);
mesh.SetIndexBufferParams(indices, IndexFormat.UInt16);
// also slow
mesh.SetIndexBufferData(tris, 0, 0, indices, MeshUpdateFlags.DontValidateIndices);
mesh.SetSubMesh(0, new SubMeshDescriptor(0, indices));
If you have any alternatives or if you see any ways I could improve this, I'm all ears!

Convert object rotation from Three.js to Unity3D

I'm creating a scene exporter from Three.js to Unity3D. My problem is converting Euler angles from Three.js to Unity.
I know that:
Three.js uses a right-handed coordinate system while Unity3D uses a left-handed one;
In Unity3D a plane is constructed lying flat on the floor, while in Three.js it stands upright, facing positive Z.
Can somebody please give me an example on how to do that?
UPDATE
I tried to follow @StefanDragnev's advice but I can't make it work. This is my Three.js code to obtain the matrix for Unity:
var originalMatrix = object3D.matrix.clone();
var mirrorMatrix = new THREE.Matrix4().makeScale(1, 1, -1);
var leftHandMatrix = new THREE.Matrix4();
leftHandMatrix.multiplyMatrices(originalMatrix,mirrorMatrix);
var rotationMatrix = new THREE.Matrix4().makeRotationX(Math.PI / 2);
var unityMatrix = new THREE.Matrix4();
unityMatrix.multiplyMatrices(leftHandMatrix,rotationMatrix);
jsonForUnity.object.worldMatrix = unityMatrix.toArray();
I tried mirrorMatrix(-1, 1, 1) too, and the other makeRotationX sign, but it didn't work either. Unity doesn't allow setting an object's transformation from its world matrix directly, so I had to extract a quaternion from the matrix. This is my Unity code:
Vector4 row0 = new Vector4 (threeObject.matrix[0],threeObject.matrix[4],threeObject.matrix[8],threeObject.matrix[12]);
Vector4 row1 = new Vector4 (threeObject.matrix[1],threeObject.matrix[5],threeObject.matrix[9],threeObject.matrix[13]);
Vector4 row2 = new Vector4 (threeObject.matrix[2],threeObject.matrix[6],threeObject.matrix[10],threeObject.matrix[14]);
Vector4 row3 = new Vector4 (threeObject.matrix[3],threeObject.matrix[7],threeObject.matrix[11],threeObject.matrix[15]);
Matrix4x4 matrix = new Matrix4x4();
matrix.SetRow (0,row0);
matrix.SetRow (1,row1);
matrix.SetRow (2,row2);
matrix.SetRow (3,row3);
Quaternion qr = Quaternion.LookRotation(matrix.GetColumn(2), matrix.GetColumn(1));
gameObject.transform.localRotation = qr;
Where am I failing?
It's not just the angles that you need to convert; you'll also need to convert the translations. Euler angles are not very comfortable to work with when doing general transformations. It's much easier to work with the object's world matrix directly.
Converting from right-handed to left-handed: you need to mirror the object's matrix along an axis, say Z in your case. Multiply the object's matrix by new THREE.Matrix4().makeScale(1, 1, -1).
Then, to go from XY to XZ being parallel to the viewport, you need to rotate the object around the X axis by 90 degrees (or -90 degrees, if rotations are clockwise). Multiply the object's matrix by new THREE.Matrix4().makeRotationX(Math.PI / 2).
Then you need to import the final matrix into Unity. In case you can't import the matrix wholesale, you can try to decompose it into scaling, rotation and translation parts first, but if at all possible, avoid that.
You can do it like this:
let position = new THREE.Vector3();
let rotation = new THREE.Quaternion();
let scale = new THREE.Vector3();
new THREE.Matrix4().makeScale(1, 1, -1)
    .multiply(this.el.object3D.matrix.clone())
    .multiply(new THREE.Matrix4().makeRotationX(Math.PI / 2))
    .decompose(position, rotation, scale);
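For a quick sanity check of the mirror-then-rotate composition, the same matrix algebra can be sketched without Three.js at all. This is only an illustration: `multiply`, `apply`, and the hard-coded matrices below are hand-rolled stand-ins for `THREE.Matrix4`, using the same column-major layout as `Matrix4.toArray()`, and the rotation sign is exactly the one under debate in this question:

```javascript
// Hand-rolled stand-ins for THREE.Matrix4, column-major like Matrix4.toArray().
function multiply(a, b) { // returns the matrix product a * b
  const out = new Array(16).fill(0);
  for (let col = 0; col < 4; col++) {
    for (let row = 0; row < 4; row++) {
      for (let k = 0; k < 4; k++) {
        out[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
      }
    }
  }
  return out;
}

function apply(m, v) { // transform the point [x, y, z] by m
  return [
    m[0] * v[0] + m[4] * v[1] + m[8]  * v[2] + m[12],
    m[1] * v[0] + m[5] * v[1] + m[9]  * v[2] + m[13],
    m[2] * v[0] + m[6] * v[1] + m[10] * v[2] + m[14],
  ];
}

// makeScale(1, 1, -1): mirror along Z
const mirrorZ = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, -1, 0,  0, 0, 0, 1];
// makeRotationX(Math.PI / 2), written out column-major
const rotX90 = [1, 0, 0, 0,  0, 0, 1, 0,  0, -1, 0, 0,  0, 0, 0, 1];

// Compose S * M * R as in the answer, with M = identity, then push a probe
// point through it to see where Three.js "up" (0, 1, 0) lands.
const combined = multiply(mirrorZ, rotX90);
console.log(apply(combined, [0, 1, 0]));
```

Feeding probe points like this through each candidate sign combination makes it easy to see which convention lands the geometry where Unity expects it, before wiring the exporter end to end.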

Unity3D: only first submesh is rendered

I am trying to create my whole mesh from 5 submeshes via script in Unity. Each submesh has a separate index array and material assigned. Curiously, Unity only renders the first submesh, but if I inspect the mesh assigned to the MeshFilter it says there are more vertices and triangles than are actually rendered.
GameObject go = new GameObject("Island Prototype");
Mesh mesh = new Mesh();
mesh.vertices = this.vertices.ToArray();
mesh.subMeshCount = this.indices.Count;
int c = 0;
foreach (List<int> l in this.indices)
{
Debug.Log(l.Count);
mesh.SetTriangles(l.ToArray(), c);
c++;
}
mesh.RecalculateNormals();
List<Material> materials = new List<Material>();
materials.Add(fieldMaterial);
foreach (TileSettings ts in tiles)
{
materials.Add(fieldMaterial);
}
Debug.Log("Number of materials: " + materials.Count);
//mesh.RecalculateBounds();
//mesh.RecalculateNormals();
MeshRenderer mr = go.AddComponent<MeshRenderer>();
mr.sharedMaterials = materials.ToArray();
MeshFilter mf = go.AddComponent<MeshFilter>();
mf.mesh = mesh;
In the screenshot you can see that the mesh inspector reports the correct count of submeshes. There are also 5 materials attached to the renderer.
In the console I've printed the vertex counts; submeshes 3-5 don't own any triangles at the moment, but that shouldn't be a problem, should it? At least submesh 2 should be rendered...

Combining several iterations of the same object in different locations using .add() or .merge()

I am trying to make a coil with several small loops. I have a custom function that creates a single helix for each loop; at first I was calling it within a for loop several hundred times, but that took too long to render and slowed down the scene.
I tried the merge function several different ways to no avail, so now I'm simply trying to combine two meshes using the .add command. Here is my process:
(1) add the helix mesh to the total mesh
(2) move the position of the helix mesh
(3) try to add it again so that the total mesh will include both helixes
Only the second (moved) helix shows up when I call scene.add(createCoil()); in my init() function, though. How do I add, or merge, several differently positioned helices into one object, geometry, or mesh without calling a function that creates a new Geometry for every iteration of the for loop?
Here is the code (I took the for loop out just to try one iteration):
function createCoil() {
    var geometry = new THREE.TorusGeometry( 11, 0.5, 16, 100 );
    var material = new THREE.MeshBasicMaterial( { color: 0x017FFF } );
    mesh = new THREE.Mesh( geometry, material );
    var clockwise = false;
    var radius = 10;
    var height = 3.4;
    var arc = 1;
    var radialSegments = 24;
    var tubularSegments = 2;
    var tube = 0.1;
    var bottom = new THREE.Vector3();
    bottom.set( radius, -height / 2, 0 );
    mesh2 = createHelix( clockwise, radius, height, arc, radialSegments, tubularSegments, tube, material, bottom );
    mesh2.position.set( 1, 1, 1 );
    mesh.add( mesh2 );
    for ( var i = 1; i <= 50; i++ ) {
        mesh2.position.y = 3.4 * i;
        mesh.add( mesh2 );
    }
    return mesh;
}
createHelix(...) creates a new THREE.Geometry. I have also tried this, and the merge function, with the helix as a THREE.Object3D.
Please don't point to an answer that includes
THREE.GeometryUtils.merge(geometry, otherGeometry);
(...it's obsolete)
I used another link that was helpful, but within the for loop I can only (1) change the position of a mesh (not a geometry), and (2) merge geometries (not meshes).
How do I get 500 loops of a coil into a scene without a terrible frame rate?
Please and Thanks!
Use the Matrix4 toolset for translation (and rotation, if you want), then merge your geometries:
var geometry = new THREE.TorusGeometry( 11, 0.5, 16, 100 );
var material = new THREE.MeshBasicMaterial( { color: 0x017FFF } );
var mergeGeometry = new THREE.Geometry();
var matrix = new THREE.Matrix4();
for ( var i = 1; i <= 50; i++ ) {
    matrix.makeTranslation( 0, 3.4 * i, 0 );
    mergeGeometry.merge( geometry, matrix );
}
var mesh = new THREE.Mesh( mergeGeometry, material );
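The core idea - baking each copy's translation into the geometry before merging, so the scene holds one mesh instead of 500 - can be sketched without Three.js at all. `mergeTranslatedCopies` below is a hypothetical, dependency-free stand-in for the `mergeGeometry.merge( geometry, matrix )` loop, operating on plain vertex arrays:

```javascript
// Bake each copy's translation into its vertices and append them to one
// combined list - the same thing Geometry.merge(geometry, matrix) does
// internally for a pure-translation matrix.
function mergeTranslatedCopies(vertices, copies, step) {
  const merged = [];
  for (let i = 1; i <= copies; i++) {
    for (const [x, y, z] of vertices) {
      // equivalent of matrix.makeTranslation(0, step * i, 0) applied to the copy
      merged.push([x, y + step * i, z]);
    }
  }
  return merged; // one flat vertex list -> a single geometry, a single draw call
}

// e.g. three vertices of a triangle, stacked 50 times at 3.4 units apart
const tri = [[0, 0, 0], [1, 0, 0], [0, 1, 0]];
const coil = mergeTranslatedCopies(tri, 50, 3.4);
console.log(coil.length); // 150 vertices in one combined list
```

Note that THREE.Geometry was removed in later three.js releases; with a BufferGeometry the same pattern applies the matrix to a clone via geometry.applyMatrix4(matrix) before merging (or uses the BufferGeometryUtils merge helpers) - worth checking against the three.js version in use.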

Why does plotting data in ILNumerics produce a result that is not symmetric? (2D plot)

I have ILNumerics code like this:
var scene = new ILScene();
ILColormap cm = new ILColormap(Colormaps.Hot);
ILArray<float> data = cm.Data;
data[":;0"] = ILMath.pow(data[":;0"], 3.0f);
cm.Data = data;
ILArray<float> contoh = ILMath.zeros<float>(15, 15);
contoh[7, 14] = 1;
scene.Add(
    new ILPlotCube(twoDMode: false) {
        new ILSurface(contoh) {
            Wireframe = { Color = Color.FromArgb(50, Color.LightGray) },
            Colormap = new ILColormap(data),
            Children = { new ILColorbar() }
        }
    }
);
ilPanel1.Scene = scene;
Roughly speaking, I want to plot a 2D model. In the matrix "contoh" I have one value that differs from all its neighbors. I want the plot of that matrix to look something like this figure:
But this is what I got:
If we look at the white area, it is not symmetric. Why does this happen? When I rotate the model slightly, we can see more clearly that even though I have the data at position [14,7], the white area stretches to [13,6] but not to [13,8].
And lastly: can anyone show me how to write code that generates a figure like Figure 1?
Both plots differ because they are of different types: the MATLAB one is an imagesc plot, the ILNumerics one is a surface. Imagesc plots will be available in ILNumerics with the next release.