Unity3d terrain: Only a sub-area of the terrain is generated

I want to generate a Terrain in Unity from SRTM data. The width and length of the terrain are specified; the terrain itself is 3061 x 2950 (width x length). Only a smaller area, from (0, 0) to (~2300, ~2130), gets the correct terrain. The remainder is a flat surface with height 0.
Here is an image of the problem.
The relevant code:
LocalizedMercatorProjection mercator; // basic Mercator projection
SRTM_Reader srtm; // wrapper around https://github.com/itinero/srtm with SRTM data "N49E009.hgt"
// coordinate bounds of Flein, Germany
public float max_lat = 49.1117000f;
public float min_lat = 49.0943000f;
public float min_lon = 9.1985000f;
public float max_lon = 9.2260000f;
public void GenerateTerrain()
{
    ConfigureTerrainData();
    float[,] heights = GenerateTerrainData();
    terrain.terrainData.SetHeights(0, 0, heights);
}

void ConfigureTerrainData()
{
    this.depth = Math.Abs(Convert.ToInt32(mercator.latToY(max_lat) - mercator.latToY(min_lat)));
    // this.depth = 2950;
    this.width = Math.Abs(Convert.ToInt32(mercator.lonToX(max_lon) - mercator.lonToX(min_lon)));
    // this.width = 3061;
    this.height = 400;
    this.terrain.terrainData.heightmapResolution = width + 1;
    this.terrain.terrainData.size = new Vector3(width, height, depth);
}

float[,] GenerateTerrainData()
{
    float[,] heights = new float[depth, width];
    for (int x = 0; x < width; x += 1)
    {
        for (int y = 0; y < depth; y += 1)
        {
            heights[y, x] = CalculateHeight(x, y);
        }
    }
    return heights;
}

float CalculateHeight(int x, int y)
{
    // uses SRTMData.GetElevationBilinear(lat, lon) under the hood
    float elevation = srtm.GetElevationAtSync(mercator.yToLat(y), mercator.xToLon(x));
    if (elevation >= height) return 1.0f;
    return elevation / height;
}
Does anyone have an idea why only a smaller area of the terrain is filled?
Edit 1: Setting the values for depth and width to 4097 mitigates the problem. That is not a satisfying solution to me, though, so the question still stands.
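Edit 2: A likely explanation (I am not 100% sure) is that Unity snaps heightmapResolution to a power of two plus one (33, 65, ..., 2049, 4097), so the assigned width + 1 = 3062 becomes 4097, and SetHeights then only fills the first 3061 x 2950 samples of a 4097 x 4097 heightmap, which matches the ratio of the filled area. A minimal sketch of sizing the data from the resolution Unity actually applied (the nearest-neighbor resampling and the method name are just for illustration):
// Sketch, not the original code: read back the resolution Unity actually applied
// and size the heights array from it so SetHeights covers the whole heightmap.
void ConfigureAndFillTerrainSketch()
{
    this.terrain.terrainData.heightmapResolution = width + 1;
    // Unity snaps the assigned value to a power of two plus one (here apparently 3062 -> 4097)
    int res = this.terrain.terrainData.heightmapResolution;
    this.terrain.terrainData.size = new Vector3(width, height, depth);

    float[,] heights = new float[res, res];
    for (int x = 0; x < res; x++)
    {
        for (int y = 0; y < res; y++)
        {
            // map the heightmap sample back onto the original 0..width-1 / 0..depth-1 grid
            // (nearest neighbor, purely for illustration)
            int srcX = Mathf.RoundToInt((float)x / (res - 1) * (width - 1));
            int srcY = Mathf.RoundToInt((float)y / (res - 1) * (depth - 1));
            heights[y, x] = CalculateHeight(srcX, srcY);
        }
    }
    this.terrain.terrainData.SetHeights(0, 0, heights);
}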

Related

Unity loop Application.EnterPlayMode

When I hit Play, Unity makes me wait an infinite amount of time before starting the scene. The message in the "Hold On" window is "Application.EnterPlayMode Waiting for Unity's code to finish executing". There is only one scene in URP with a global volume, a directional light and a plane with this script:
using UnityEngine;

public class PerlinNoise : MonoBehaviour
{
    [Header("Resolution")]
    [Space]
    public int width = 256;
    public int heigth = 256;

    [Space]
    [Header("Adjustments")]
    [Space]
    public float scale = 20;
    public float xOffset = 10;
    public float yOffset = 10;

    void Update()
    {
        Renderer renderer = GetComponent<Renderer>();
        renderer.material.mainTexture = GenerateTexture();
    }

    Texture2D GenerateTexture()
    {
        Texture2D texture = new Texture2D(width, heigth);
        for (int x = 0; x < width; x++)
        {
            for (int y = 0; x < heigth; y++)
            {
                Color color = GenerateColor(x, y);
                texture.SetPixel(width, heigth, color);
            }
        }
        texture.Apply();
        return texture;
    }

    Color GenerateColor(int x, int y)
    {
        float xCoord = (float)x / width * scale + xOffset;
        float yCoord = (float)y / width * scale + yOffset;
        float perlinNoise = Mathf.PerlinNoise(xCoord, yCoord);
        return new Color(perlinNoise, perlinNoise, perlinNoise);
    }
}
I tried killing the Unity Editor task in Task Manager and restarting Unity, but the same issue repeats. Please help me.
You have an infinite loop inside your GenerateTexture method. Specifically, the condition in the nested loop (for y) is accidentally checking for x:
for (int x = 0; x < width; x++)
{
    for (int y = 0; x < heigth; y++) // should be: y < heigth
    {
        Color color = GenerateColor(x, y);
        texture.SetPixel(width, heigth, color); // this should probably be SetPixel(x, y, color)
    }
}
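For reference, here is a corrected sketch of GenerateTexture with both fixes applied (keeping your field names, including the heigth spelling):
Texture2D GenerateTexture()
{
    Texture2D texture = new Texture2D(width, heigth);
    for (int x = 0; x < width; x++)
    {
        for (int y = 0; y < heigth; y++)   // compare y, not x
        {
            Color color = GenerateColor(x, y);
            texture.SetPixel(x, y, color); // write the pixel at (x, y)
        }
    }
    texture.Apply();
    return texture;
}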

Unity Mesh UV problem on the positive and negative x-axis on perpendicular walls

I am trying to make a 3D voxel game that uses the marching cubes algorithm to procedurally generate a game world. This is working fine so far, except that on exactly perpendicular faces on the positive and negative x sides of a chunk mesh, the UV coordinates don't seem quite right: those faces just display a solid color instead of the texture. [![view of debug-chunks and the buggy sides][1]][1]
At <2.> you can see how the chunk wall is supposed to look, and at <1.> you can see the weird bug. This ONLY occurs on exactly perpendicular x-side triangles! The meshes in the image are debug chunks that show the problem.
All the noise-to-terrain translation works fine and I don't get any bugs there. It's only the UVs that cause problems.
I am using the following code to populate the Mesh.vertices, Mesh.triangles and Mesh.uv:
void MakeChunkMeshData()
{
    for (int x = 0; x < Variables.chunkSize.x - 1; x++)
    {
        for (int y = 0; y < Variables.chunkSize.y - 1; y++)
        {
            for (int z = 0; z < Variables.chunkSize.z - 1; z++)
            {
                byte[] cubeCornerSolidityValues = new byte[8]{
                    chunkMapSolidity[x, y, z],
                    chunkMapSolidity[x + 1, y, z],
                    chunkMapSolidity[x + 1, y + 1, z],
                    chunkMapSolidity[x, y + 1, z],
                    chunkMapSolidity[x, y, z + 1],
                    chunkMapSolidity[x + 1, y, z + 1],
                    chunkMapSolidity[x + 1, y + 1, z + 1],
                    chunkMapSolidity[x, y + 1, z + 1]
                };
                MarchOne(new Vector3Int(x, y, z), cubeCornerSolidityValues);
            }
        }
    }
}

void DrawChunk()
{
    chunkMesh.vertices = vertices.ToArray();
    chunkMesh.triangles = triangles.ToArray();
    chunkMesh.SetUVs(0, uvs.ToArray());
    chunkMesh.RecalculateNormals();
}

void ClearMesh()
{
    chunkMesh.Clear();
}

void ClearChunkMeshAndData()
{
    chunkMesh.Clear();
    uvs = new List<Vector2>();
    vertices = new List<Vector3>();
    triangles = new List<int>();
}

/// <summary>
/// cube contains bytes for each corner of the cube
/// </summary>
/// <param name="cube"></param>
/// <returns></returns>
int GetCubeConfiguration(byte[] cube)
{
    int u = 0;
    int result = 0;
    for (int corner = 0; corner < 8; corner++)
    {
        if (cube[corner] < Variables.solidityThreshold)
        {
            u++;
            result |= 1 << corner;
        }
    }
    return result;
}

Vector2[] getUvsPerTriangle(byte[] voxelTypes)
{
    int resId = voxelTypes[0];
    if (voxelTypes[1] == voxelTypes[2])
        resId = voxelTypes[1];
    resId = 1;
    Vector2 normalized = getUvCoordFromTextureIndex(resId) / Constants.TextureAtlasSizeTextures;
    float textureLength = 1f / Constants.TextureAtlasSizeTextures;
    Vector2[] result = new Vector2[3] {
        normalized + new Vector2(1, 0) * textureLength,
        normalized + new Vector2(0, 1) * textureLength,
        normalized + new Vector2(1, 1) * textureLength
    };
    //Debug.Log(result);
    return result;
}

/// <summary>
/// returns the absolute x and y coordinates of the given texture in the atlas (example: [4, 1])
/// </summary>
/// <param name="textureIndex"></param>
/// <returns></returns>
Vector2 getUvCoordFromTextureIndex(int textureIndex)
{
    int x = textureIndex % Constants.TextureAtlasSizeTextures;
    int y = (textureIndex - x) / Constants.TextureAtlasSizeTextures;
    return new Vector2(x, y);
}

/// <summary>
/// takes the chunk-wide mesh data and adds its results after marching one cube to it.
/// </summary>
/// <returns></returns>
void MarchOne(Vector3Int offset, byte[] cube)
{
    int configuration = GetCubeConfiguration(cube);
    byte[] voxelTypes = new byte[3];
    int edge = 0;
    for (int i = 0; i < 5; i++) // loop at max 5 times (max number of triangles in one cube config)
    {
        for (int v = 0; v < 3; v++) // loop 3 times (number of vertices in a triangle)
        {
            int cornerIndex = VoxelData.TriangleTable[configuration, edge];
            if (cornerIndex == -1) // indicates the end of the list of vertices/triangles
                return;
            Vector3 vertex1 = lwTo.Vec3(VoxelData.EdgeTable[cornerIndex, 0]) + offset;
            Vector3 vertex2 = lwTo.Vec3(VoxelData.EdgeTable[cornerIndex, 1]) + offset;
            Vector3Int vertexIndex1 = lwTo.Vec3Int(VoxelData.EdgeTable[cornerIndex, 0]) + offset;
            Vector3Int vertexIndex2 = lwTo.Vec3Int(VoxelData.EdgeTable[cornerIndex, 1]) + offset;
            Vector3 vertexPosition;
            if (Variables.badGraphics)
            {
                vertexPosition = (vertex1 + vertex2) / 2f;
            }
            else
            {
                // currently using this "profile"
                // this code determines the position of the vertices per triangle based on the value in chunkSolidityMap[,,]
                float vert1Solidity = chunkMapSolidity[vertexIndex1.x, vertexIndex1.y, vertexIndex1.z];
                float vert2Solidity = chunkMapSolidity[vertexIndex2.x, vertexIndex2.y, vertexIndex2.z];
                float difference = vert2Solidity - vert1Solidity;
                difference = (Variables.solidityThreshold - vert1Solidity) / difference;
                vertexPosition = vertex1 + ((vertex2 - vertex1) * difference);
            }
            vertices.Add(vertexPosition);
            triangles.Add(vertices.Count - 1);
            voxelTypes[v] = chunkMapVoxelTypes[vertexIndex1.x, vertexIndex1.y, vertexIndex1.z];
            edge++;
        }
        uvs.AddRange(getUvsPerTriangle(voxelTypes));
    }
}
EDIT:
When I rotate the chunks only slightly, the weird problem immediately disappears. I don't know why, but at least I now have some clue as to what's going on here.
[1]: https://i.stack.imgur.com/kYnkl.jpg
Well, it turns out that this is probably some weird behavior from Unity. When I explicitly set the mesh as the MeshFilter's mesh after the mesh population process, it works just fine. I also had to call mesh.RecalculateTangents() to make it work.
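Roughly, the fix looks like this (a sketch, not the exact original code; the GetComponent<MeshFilter>() lookup assumes the script sits on the chunk object):
void DrawChunk()
{
    chunkMesh.vertices = vertices.ToArray();
    chunkMesh.triangles = triangles.ToArray();
    chunkMesh.SetUVs(0, uvs.ToArray());
    chunkMesh.RecalculateNormals();
    chunkMesh.RecalculateTangents(); // was missing before

    // explicitly (re)assign the populated mesh to the MeshFilter
    GetComponent<MeshFilter>().mesh = chunkMesh;
}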

Object with many children does not show up in the middle although it has coordinates set to 0, 0, 0

I am making a Rubik's Cube generator in Unity. Each piece is basically a 1x1 cube, which is repeated in my code in the shape of a bigger cube, with the pieces as children of an empty object. The empty object is in the exact middle of the pieces, and all the pieces have their origins in their exact middle. However, when I put the empty at the center of the scene (0, 0, 0), it shows up in a different place.
Here are some pictures from the editor:
As you can see, the empty is in the center with coordinates set to 0, 0, 0
Now, when it has children and the coordinates are all still 0, it shows up in a different place.
Edit:
#derHugo helped me out, but now my code that creates the cubes and sets the empty object to the middle of them does not work.
Here is the full code:
public GameObject PiecePrefab;
public int CubeSize;
Vector3 avg;
Vector3 ijk;
int cubeCount = 0;

// Start is called before the first frame update
void Start()
{
    //Vector3 orgpos = gameObject.transform.position;
    if (CubeSize <= 0)
    {
        CubeSize = 1;
        Debug.LogError("The cube can not be smaller than 1!");
    }
    else if (CubeSize > 30)
    {
        CubeSize = 30;
        Debug.LogError("The cube should not be bigger than 30!");
    }

    avg = new Vector3(0, 0, 0);
    for (float k = 0; k < CubeSize; k++)
    {
        for (float j = 0; j < CubeSize; j++)
        {
            for (float i = 0; i < CubeSize; i++)
            {
                if (i == CubeSize - 1 || i == 0)
                {
                    CreatePiece(i, j, k);
                }
                else if (j == CubeSize - 1 || j == 0)
                {
                    CreatePiece(i, j, k);
                }
                else if (k == CubeSize - 1 || k == 0)
                {
                    CreatePiece(i, j, k);
                }
            }
        }
    }

    avg /= cubeCount;
    gameObject.transform.position = avg;

    var _Go = GameObject.FindGameObjectsWithTag("KuutionPala");
    foreach (GameObject KuutionPala in _Go)
    {
        KuutionPala.transform.SetParent(transform);
    }
    //gameObject.transform.localPosition = orgpos;

    void CreatePiece(float x, float y, float z)
    {
        ijk = new Vector3(x, y, z);
        avg += ijk;
        cubeCount++;
        Vector3 offset3D;
        offset3D = new Vector3(x / CubeSize, y / CubeSize, z / CubeSize);
        var Piece = Instantiate(PiecePrefab, offset3D, transform.rotation);
        Piece.transform.localScale /= CubeSize;
        //Debug.LogFormat("x:" + x);
        //Debug.LogFormat("y:" + y);
        //Debug.LogFormat("z:" + z);
    }
}
}
I think the error is in this line:
gameObject.transform.position = avg;
(Sorry if the code is bad.)
As said, there are two pivot modes in Unity (see Positioning GameObjects → Gizmo handle position toggles):
Pivot: positions the Gizmo at the actual pivot point of the GameObject, as defined by the Transform component.
Center: positions the Gizmo at a (geometrical) center position based on the selected GameObjects.
Yours is set to Center, so in order to change that, click on the button that says Center.
Then, to your code:
You are currently just hoping/assuming that your parent is correctly placed at 0,0,0.
Then you spawn all tiles in a range from 0 to (CubeSize - 1)/2 and afterwards want to shift the center back.
I would rather go the other way round: calculate the correct local offset beforehand and directly spawn the tiles as children of the root with the correct offset, in both the positive and negative directions.
Step 1: What is that local position?
To figure the general maths out, just look at two examples.
Let's say you have 3 cubes with indices 0, 1, 2. They have extents of 1/3, so their positions would need to look like
-0.5 0 0.5
| . | . | . |
Let's say you have 4 cubes with indices 0, 1, 2, 3 and extents of 1/4; then the positions would need to look like
-0.5 0 0.5
| . | . | . | . |
So as you can see, the simplest way to go would be to:
start with the minimum position (e.g. -0.5f * Vector3.one)
always add half of the extents for the first offset (e.g. 1/CubeSize * 0.5f * Vector3.one)
add an offset of the extents multiplied by the indices on top (e.g. 1/CubeSize * new Vector3(x,y,z))
So together, something like:
// be sure to cast to float here, otherwise you get integer division
var extends = 1 / (float)CubeSize;
var offset = (-0.5f + extends * 0.5f) * Vector3.one + extends * new Vector3(x, y, z);
Step 2: Directly spawn as children with correct offset
void CreatePiece(float x, float y, float z)
{
    var extends = 1 / (float)CubeSize;
    var offset = (-0.5f + extends * 0.5f) * Vector3.one + extends * new Vector3(x, y, z);
    var Piece = Instantiate(PiecePrefab, transform, false);
    // This basically equals doing something like
    //var Piece = Instantiate(PiecePrefab, transform.position, transform.rotation, transform);
    Piece.transform.localPosition = offset;
    Piece.transform.localScale = extends * Vector3.one;
}
Then you can reduce your code to
// Use a range so you directly clamp the value in the Inspector
[Range(1, 30)]
public int CubeSize = 3;

// Start is called before the first frame update
void Start()
{
    UpdateTiles();
}

// Using this you can already test the method without entering playmode
// via the context menu of the component
[ContextMenu(nameof(UpdateTiles))]
public void UpdateTiles()
{
    // Destroy current children before spawning the new ones
    // (the .Where call requires "using System.Linq;")
    foreach (var child in GetComponentsInChildren<Transform>().Where(c => c != transform))
    {
        if (!child) continue;

        if (Application.isPlaying)
        {
            Destroy(child.gameObject);
        }
        else
        {
            DestroyImmediate(child.gameObject);
        }
    }

    if (CubeSize < 1)
    {
        CubeSize = 1;
        Debug.LogError("The cube can not be smaller than 1!");
    }
    else if (CubeSize > 30)
    {
        CubeSize = 30;
        Debug.LogError("The cube should not be bigger than 30!");
    }

    // For making things easier to read I would use x,y,z here as well ;)
    for (float x = 0; x < CubeSize; x++)
    {
        for (float y = 0; y < CubeSize; y++)
        {
            for (float z = 0; z < CubeSize; z++)
            {
                if (x == CubeSize - 1 || x == 0)
                {
                    CreatePiece(x, y, z);
                }
                else if (y == CubeSize - 1 || y == 0)
                {
                    CreatePiece(x, y, z);
                }
                else if (z == CubeSize - 1 || z == 0)
                {
                    CreatePiece(x, y, z);
                }
            }
        }
    }
}

private void CreatePiece(float x, float y, float z)
{
    var extends = 1 / (float)CubeSize;
    var offset = (-0.5f + extends * 0.5f) * Vector3.one + extends * new Vector3(x, y, z);
    var Piece = Instantiate(PiecePrefab, transform, false);
    Piece.transform.localPosition = offset;
    Piece.transform.localScale = extends * Vector3.one;
}

Unity perlin noise having repeating patterns

I made a Noise class using Unity's Perlin noise, like this:
public static float[,] GetNoise(Vector2Int initialOffset, float scale, float persistance, float lacunarity, int octaves)
{
    float[,] noiseMap = new float[Chunk.width, Chunk.height];
    float maxHeight = 0;
    float minHeight = 0;
    for (int y = 0; y < Chunk.height; y++)
    {
        for (int x = 0; x < Chunk.width; x++)
        {
            float amplitude = 1;
            float frequency = 1;
            float noiseHeight = 0;
            for (int oc = 0; oc < octaves; oc++)
            {
                float coordX = (x + initialOffset.x) / scale * frequency;
                float coordY = (y + initialOffset.y) / scale * frequency;
                float perlin = Mathf.PerlinNoise(coordX, coordY) * 2 - 1;
                noiseHeight += perlin * amplitude;
                amplitude *= persistance;
                frequency *= lacunarity;
            }
            if (noiseHeight < minHeight)
            {
                minHeight = noiseHeight;
            }
            if (noiseHeight > maxHeight)
            {
                maxHeight = noiseHeight;
            }
            noiseMap[x, y] = noiseHeight;
        }
    }
    for (int y = 0; y < Chunk.height; y++)
    {
        for (int x = 0; x < Chunk.width; x++)
        {
            noiseMap[x, y] = Mathf.InverseLerp(minHeight, maxHeight, noiseMap[x, y]);
        }
    }
    return noiseMap;
}
However, this code is giving me repeating patterns like this:
What am I doing wrong? Or is there no way to get rid of the patterns?
I got it working; not very well, but working. The way I did it was to generate the height map for every tile in the chunk and then place tiles somewhat randomly while taking the height map into account. Something like this:
if (heightMap[x, y] < 0.3 && Random.value < 0.5)
// Add tile
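Expanded a bit, the placement pass looks something like this (the 0.3 and 0.5 thresholds are the values from above, and AddTile is a placeholder for whatever actually spawns the tile):
// place tiles randomly, biased by the generated height map
for (int y = 0; y < Chunk.height; y++)
{
    for (int x = 0; x < Chunk.width; x++)
    {
        if (heightMap[x, y] < 0.3f && Random.value < 0.5f)
        {
            AddTile(x, y); // placeholder
        }
    }
}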
This way I got this result:
EDIT:
Doing some more research about Perlin noise, I found out that it just doesn't like negative coordinates for some reason, so I did it this way; hope this helps someone!
So, I fixed the negative coords like this:
// account for negatives (ex. -1 % 256 = -1, needs to loop around to 255)
if (noiseOffset.x < 0)
    noiseOffset = new Vector2(noiseOffset.x + noiseRange.x, noiseOffset.y);
if (noiseOffset.y < 0)
    noiseOffset = new Vector2(noiseOffset.x, noiseOffset.y + noiseRange.y);
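The same fix as a small helper, assuming noiseRange holds the period to wrap around (e.g. 256 x 256):
// wrap a possibly negative offset into [0, range) before sampling,
// since e.g. -1 % 256 is -1 in C# and needs to loop around to 255
static Vector2 WrapOffset(Vector2 noiseOffset, Vector2 noiseRange)
{
    float x = noiseOffset.x % noiseRange.x;
    if (x < 0) x += noiseRange.x;
    float y = noiseOffset.y % noiseRange.y;
    if (y < 0) y += noiseRange.y;
    return new Vector2(x, y);
}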

Change the colors of a texture based on another texture in Unity?

So I'm trying to use the function GetPixels, basically to change the colors of a second texture based on the pixel colors of a first (or source) texture.
I didn't try much; I was looking for ways to approach this or for some documentation, but the Unity Scripting API is pretty messed up.
So far, I've got pseudocode that isn't working at all; if any of you could help me get it working, that would be really nice.
var secondTexture = new Texture2D(width, height);
Texture2D source = sourceTexture;
var pixels = new Color[width * height];
for (var x = 0; x < width; x++)
{
    for (var y = 0; y < height; y++)
    {
        Color pixels2 = source.GetPixels(x, y);
        if (pixels2 == Color.white)
        {
            //Paint the pixels in secondTexture that match X and Y of sourceTexture with blue per example
            pixels[x + y * width] = Color.blue;
        }
        else
        {
            //Paint the rest of the pixels with black
            pixels[x + y * width] = Color.black;
        }
    }
}
secondTexture.SetPixels(pixels);
secondTexture.wrapMode = TextureWrapMode.Clamp;
secondTexture.Apply();
return secondTexture;
So basically, what should happen is that whenever a pixel of the sourceTexture is white, that pixel in the secondTexture is changed to blue, and when a pixel in sourceTexture is NOT white, the corresponding pixel in secondTexture is black. That's more or less the idea, but no matter what the sourceTexture colors are, all the pixels in my second texture immediately come out black. Any idea why?
GetPixels() returns an array of Color. If you want to get the color of a single pixel, use GetPixel() instead. In your code, try to change Color pixels2 = source.GetPixels(x, y); to Color pixels2 = source.GetPixel(x, y);
Update 1:
Try this code; it works perfectly for me.
int width = 150;
int height = 150;
Texture2D secondTexture;
public Texture2D source; // texture for a test, add in inspector
public Renderer rend;    // object for a test, add in inspector

void Start()
{
    secondTexture = new Texture2D(width, height);
    var pixels = new Color[width * height];
    for (var x = 0; x < width; x++)
    {
        for (var y = 0; y < height; y++)
        {
            Color pixels2 = source.GetPixel(x, y);
            if (pixels2 == Color.white)
            {
                //Paint the pixels in secondTexture that match X and Y of sourceTexture with blue per example
                pixels[x + y * width] = Color.blue;
            }
            else
            {
                //Paint the rest of the pixels with black
                pixels[x + y * width] = Color.black;
            }
        }
    }
    secondTexture.SetPixels(pixels);
    secondTexture.wrapMode = TextureWrapMode.Clamp;
    secondTexture.Apply();
    rend.material.mainTexture = secondTexture;
}
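One more thing worth checking if the result still comes out black (this is an assumption on my part, not something from the question): GetPixel/GetPixels only work on textures imported with Read/Write enabled, otherwise Unity reports that the texture is not readable. A quick guard at the top of Start():
// abort early if the source texture was not imported as readable
if (!source.isReadable)
{
    Debug.LogError("Enable Read/Write in the import settings of " + source.name);
    return;
}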