I'd like to define an array of floats in my shader, something like this:
Properties
{
_TilesX ("Tiles X", Int) = 10
_TilesY ("Tiles Y", Int) = 10
_TileData1 ("Tile data", Float[]) = {} // THIS!!!
_Texture1 ("Texture odd", 2D) = "white" {}
_Texture2 ("Texture even", 2D) = "white" {}
}
I'm trying to create a single plane that I'll use as a grid, and I want to modify _TileData1 at run-time to change the Y offset of a tile. I'm using _TilesX and _TilesY to get a 2D position of the tile from the 1D array.
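For reference, a hedged sketch of that 1D/2D index math (the helper names are illustrative, not from the question):

// 2D tile coordinates to 1D array index, and back again
static int TileToIndex(int x, int y, int tilesX) => y * tilesX + x;
static (int x, int y) IndexToTile(int index, int tilesX) => (index % tilesX, index / tilesX);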
Just to be clear, I just want to find out how to define a property type of float[] since I couldn't find how to do so on Unity's manual pages or forums.
You can't use properties for float[], but you can declare the array as a plain variable in your shader and set it from script:
In your shader:
CGPROGRAM
int _SegmentsCount = 0;
float _Segments[1000];

void surf (Input IN, inout SurfaceOutput o) {
    for (int i = 0; i < _SegmentsCount; i++) {
        // This is obviously just an example;
        // avoid loops in shaders if you can help it.
    }
}
ENDCG
Then in your script:
float[] array = new float[] { 0.25f, 0.75f };
material.SetFloatArray("_Segments", array);
material.SetInt("_SegmentsCount", 2);
renderer.material = material;
Apparently not.
I didn't think so, as I'd never seen it, but I thought I'd search around, and I ran across this thread where the person answering the question says this (emphasis mine):
You can set arrays from script using SetFloatArray, SetVectorArray, and SetColorArray as of 5.4, but you can't define an array in the properties. Basically this means you can still set the value and have it defined in the CGPROGRAM block to be used, but it won't be serialized / saved by the material asset or show up in the material editor. It's an odd omission, especially since texture arrays are supported as properties (though texture arrays are a specific type of texture sampler rather than an array of texture properties like color arrays).
So you would be able to use it in the calculation, but you would only be able to modify the value via MonoBehaviour script (and would need to).
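To illustrate the scripting side, here is a minimal sketch of driving such an array at run-time (the component name and sizes are illustrative, not from the quoted answer; note that Unity fixes the maximum array length the first time the property is set):

using UnityEngine;

public class TileDataUploader : MonoBehaviour
{
    [SerializeField] private Material material; // assign the grid material in the Inspector
    private float[] tileData = new float[100];  // e.g. 10 x 10 tiles worth of Y offsets

    void Update()
    {
        // modify tileData here at run-time, then re-upload it;
        // the first call also determines the maximum array length for this property
        material.SetFloatArray("_TileData1", tileData);
    }
}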
Hi, I am making a generation game in Unity with custom meshes, and I want a function to convert Meshes to MeshFilters in the fastest way possible, because this function will get an array of Meshes as input and the output will be an array of MeshFilters.
Thanks :)
This should be pretty straightforward.
In general you don't simply "create" MeshFilters; you rather attach them to a GameObject. There are basically three ways to do so:
use AddComponent in order to attach it to an existing object
use Instantiate in order to create a clone instance of an existing prefab
use the constructor of GameObject and pass the desired component type(s) in as parameters
Then just assign your Mesh to MeshFilter.sharedMesh or MeshFilter.mesh; in this case, where you assign the entire mesh, it shouldn't really make a difference.
So you could e.g. simply do
// Needed for option B - see below
//[SerializeField] private MeshFilter preparedPrefab;

public MeshFilter[] CreateObjects(Mesh[] meshes)
{
    var amount = meshes.Length;
    var meshFilters = new MeshFilter[amount];

    for (var i = 0; i < amount; i++)
    {
        // here you have multiple options

        // A - create a new empty GameObject with the MeshFilter component
        meshFilters[i] = new GameObject("someName" /*, typeof(MeshRenderer), etc*/).AddComponent<MeshFilter>();

        // B - rather prepare a prefab/template which already contains all the components you need;
        // in particular you might also want a MeshRenderer in order to see your objects
        // and e.g. a MeshCollider in order to apply physics and raycasting to your objects
        //meshFilters[i] = Instantiate(preparedPrefab);

        meshFilters[i].sharedMesh = meshes[i];
    }

    return meshFilters;
}
Note that a Mesh itself has no information about any position, rotation, scale and child-parent relationships in your scene. For this you would need more information.
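If you do have such information, a minimal sketch of applying it afterwards (the positions array is hypothetical, not part of the question):

var meshFilters = CreateObjects(meshes);
for (var i = 0; i < meshFilters.Length; i++)
{
    // a Mesh can't store placement, so apply it to the created object's Transform
    meshFilters[i].transform.position = positions[i];
}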
In a Unity ShaderLab shader you can expose shader properties to the material inspector in the editor. This is done by placing the properties you want to expose in the Properties section, like so:
Properties
{
_SomeFloat("A Float", float) = 5
}
Unity defines a list of properties in the documentation here.
However, this list does not include any form of float2 or vector2, just a single Float or a Vector, which consists of xyzw.
I tried setting the property type to float2 and Vector2:
_SomeFloat("A Float", float2) = (5,5)
_SomeFloat2("A Float2", Vector2) = (5,5)
which both return the error Parse error: syntax error, unexpected TVAL_ID at line 7
I also tried cutting the Vector in half by setting only half of its members:
_SomeFloat("A Float", Vector) = (5,5)
which returns the error Parse error: syntax error, unexpected ')', expecting ','
I could just use the Vector type and only use its xy, but that makes for an unclear UI, as there are now two unused elements in the inspector, and I could not find a Property Attribute or Drawer (such as HideInInspector) that allows you to hide the zw values from the inspector.
So is there a way to expose a float2 using a property type? Or maybe an alternative where you can place two float properties next to each other in the editor, like the Tiling/Offset drawer in the standard 2D property type (maybe something similar to EditorGUILayout.BeginHorizontal)?
From a quick search I've found that there's MaterialPropertyDrawer, which can be extended to add custom tags to shader inspectors (ref: https://docs.unity3d.com/ScriptReference/MaterialPropertyDrawer.html).
Thus, you could use a Vector property in the shader, create a custom attribute, let's say [ShowAsVector2], and make a MaterialPropertyDrawer for it which only shows two input fields and assigns their values to the vector's x and y. This would result in a shader property written as:
[ShowAsVector2] _Position2D("Position", Vector) = (0, 0, 0, 0)
This is an extension to @tsvedas's answer.
using UnityEngine;
using UnityEditor;

/// <summary>
/// Draws a Vector2 field for vector properties.
/// Usage: [ShowAsVector2] _Vector2("Vector 2", Vector) = (0,0,0,0)
/// </summary>
public class ShowAsVector2Drawer : MaterialPropertyDrawer
{
    public override void OnGUI(Rect position, MaterialProperty prop, GUIContent label, MaterialEditor editor)
    {
        if (prop.type == MaterialProperty.PropType.Vector)
        {
            EditorGUIUtility.labelWidth = 0f;
            EditorGUIUtility.fieldWidth = 0f;

            if (!EditorGUIUtility.wideMode)
            {
                EditorGUIUtility.wideMode = true;
                EditorGUIUtility.labelWidth = EditorGUIUtility.currentViewWidth - 212;
            }

            EditorGUI.BeginChangeCheck();
            EditorGUI.showMixedValue = prop.hasMixedValue;
            Vector4 vec = EditorGUI.Vector2Field(position, label, prop.vectorValue);
            if (EditorGUI.EndChangeCheck())
            {
                prop.vectorValue = vec;
            }
        }
        else
        {
            editor.DefaultShaderProperty(prop, label.text);
        }
    }
}
Simply put this script in an Editor folder, and you should only see the x and y coordinates in the inspector.
This is a bit complicated, but it boils down to quite a simple problem, I hope. Here is how it goes: I am using Unity to generate a map GameObject at runtime from a BSP file which has a whole bunch of vertices, faces, UVs, texture references, and so on. The meshes created come out exactly as they should, and all the textures come out fine. There is one problem though: so many meshes with so many materials are created that the many draw calls make the program slow.

So I searched for a way to reduce the draw calls and found a solution: combine all the meshes into one big mesh and create a texture atlas by combining all the textures used. Combining the meshes works fine, and combining the textures comes out great as well. Then I faced the problem of UV mapping. I found a solution in an NVidia white paper: write a custom shader which uses the tex2D function to interpolate the texel from the texture using the UV positions with their derivatives. I think this would have worked, but my meshes have really weird triangles and I think they are ruining this solution. In the images below you can see the difference between the combined meshes and the separate ones:
Combined Meshes with Changed UVs and Custom Shader
Separate Meshes with original UVs
This is the code I am using in the shader to set the color of the model:
o.Albedo = tex2D (_MainTex, IN.uv2_BlendTex, ddx(IN.uv_MainTex), ddy(IN.uv_MainTex)).rgb;
As you can see, I have added a second UV set, which is the non-tiled version of the original UVs. I do that by using the frac() function, but in the C# code rather than in the shader. Since the textures can be different sizes, I had to calculate the UVs before getting to the shader, because I have access to the texture sizes at that time.
Here is the code I used to calculate the 2 UVs:
Rect surfaceTextureRect = uvReMappers[textureIndex];
Mesh surfaceMesh = allFaces[i].mesh;
Vector2[] atlasTiledUVs = new Vector2[surfaceMesh.uv.Length];
Vector2[] atlasClampedUVs = new Vector2[surfaceMesh.uv.Length];

for (int j = 0; j < atlasClampedUVs.Length; j++)
{
    Vector2 clampedUV = new Vector2(surfaceMesh.uv[j].x - Mathf.Floor(surfaceMesh.uv[j].x), surfaceMesh.uv[j].y - Mathf.Floor(surfaceMesh.uv[j].y));
    float atlasClampedX = (clampedUV.x * surfaceTextureRect.width) + surfaceTextureRect.x;
    float atlasClampedY = (clampedUV.y * surfaceTextureRect.height) + surfaceTextureRect.y;
    atlasTiledUVs[j] = new Vector2((surfaceMesh.uv[j].x * surfaceTextureRect.width) + surfaceTextureRect.x, (surfaceMesh.uv[j].y * surfaceTextureRect.height) + surfaceTextureRect.y);
    atlasClampedUVs[j] = new Vector2(atlasClampedX, atlasClampedY);
    if (i < 10) { Debug.Log(i + " Original: " + surfaceMesh.uv[j] + " ClampedUV: " + clampedUV); }
}

surfaceMesh.uv = atlasTiledUVs;
surfaceMesh.uv2 = atlasClampedUVs;
The array uvReMappers is an array of Rect created when using the Texture2D function PackTextures().
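For context, a hedged sketch of how such an array could be produced (the sourceTextures array is illustrative):

Texture2D atlas = new Texture2D(2, 2);
// PackTextures resizes the atlas, packs the source textures into it, and
// returns one Rect per source texture in normalized atlas UV space
Rect[] uvReMappers = atlas.PackTextures(sourceTextures, 2, 4096);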
Sorry for taking so long, but here is my question: Why do the textures come out contorted? Is it because of the way the meshes are triangulated, or is it because of the way I wrote the custom shader? And finally, how can I fix it?
Thank you for your time. I am sorry for writing so much, but I have never posted a question before. I always find answers to almost all my problems online, but I have been searching for days on how to fix this problem. I feel it might be too specific to be able to find an answer for. I hope I have provided enough information.
I finally solved the problem! It turns out I should not calculate the UVs before the shader. Instead, I passed the information the shader needs through the UVs so that it can calculate the new texel positions directly.
Here is the code before the shader:
Rect surfaceTextureRect = uvReMappers[textureIndex];
Mesh surfaceMesh = allFaces[i].mesh;
Vector2[] atlasTexturePosition = new Vector2[surfaceMesh.uv.Length];
Vector2[] atlasTextureSize = new Vector2[surfaceMesh.uv.Length];

for (int j = 0; j < atlasTexturePosition.Length; j++)
{
    atlasTexturePosition[j] = new Vector2(surfaceTextureRect.x, surfaceTextureRect.y);
    atlasTextureSize[j] = new Vector2(surfaceTextureRect.width, surfaceTextureRect.height);
}

surfaceMesh.uv2 = atlasTexturePosition;
surfaceMesh.uv3 = atlasTextureSize;
Here is the shader code:
tex2D(_MainTex, float2((frac(IN.uv.x) * IN.uv3.x) + IN.uv2.x, (frac(IN.uv.y) * IN.uv3.y) + IN.uv2.y));
I took a different approach and created a texture atlas on the CPU. From there, UV mapping was just like normal UV mapping; all I had to do was assign a texture to the vertex info from my atlas.
My scenario is a custom voxel engine that can handle anything from Minecraft-style terrain to rendering voxel-based planets, and I haven't found a scenario it can't handle yet.
Here's my code for the atlas ...
using UnityEngine;
using Voxels.Objects;

namespace Engine.MeshGeneration.Texturing
{
    /// <summary>
    /// Packed texture set to be used for mapping texture info on
    /// dynamically generated meshes.
    /// </summary>
    public class TextureAtlas
    {
        /// <summary>
        /// Texture definitions within the atlas.
        /// </summary>
        public TextureDef[] Textures { get; set; }

        public TextureAtlas()
        {
            SetupTextures();
        }

        protected virtual void SetupTextures()
        {
            // default for the base atlas is a material with a single texture in the atlas
            Textures = new TextureDef[]
            {
                new TextureDef
                {
                    VoxelType = 0,
                    Faces = new[] { Face.Top, Face.Bottom, Face.Left, Face.Right, Face.Front, Face.Back },
                    Bounds = new[] {
                        new Vector2(0, 1),
                        new Vector2(1, 1),
                        new Vector2(1, 0),
                        new Vector2(0, 0)
                    }
                }
            };
        }

        public static TextureDef[] GenerateTextureSet(IntVector2 textureSizeInPixels, IntVector2 atlasSizeInPixels)
        {
            int x = atlasSizeInPixels.X / textureSizeInPixels.X;
            int z = atlasSizeInPixels.Z / textureSizeInPixels.Z;
            int i = 0;

            var result = new TextureDef[x * z];
            var uvSize = new Vector2(1f / x, 1f / z);

            for (int tx = 0; tx < x; tx++)
                for (int tz = 0; tz < z; tz++)
                {
                    // for perf, types are limited to 255 (1 byte)
                    if (i < 255)
                    {
                        result[i] = new TextureDef
                        {
                            VoxelType = (byte)i,
                            Faces = new[] { Face.Top, Face.Bottom, Face.Left, Face.Right, Face.Front, Face.Back },
                            Bounds = new[] {
                                new Vector2(tx * uvSize.x, (tz + 1f) * uvSize.y),
                                new Vector2((tx + 1f) * uvSize.x, (tz + 1f) * uvSize.y),
                                new Vector2((tx + 1f) * uvSize.x, tz * uvSize.y),
                                new Vector2(tx * uvSize.x, tz * uvSize.y)
                            }
                        };
                        i++;
                    }
                    else
                        break;
                }

            return result;
        }
    }
}
And for a texture definition within the atlas ...
using UnityEngine;
using Voxels.Objects;

namespace Engine.MeshGeneration.Texturing
{
    /// <summary>
    /// Represents an area within the atlas texture
    /// from which a single texture can be pulled.
    /// </summary>
    public class TextureDef
    {
        /// <summary>
        /// The voxel block type to use this texture for.
        /// </summary>
        public byte VoxelType { get; set; }

        /// <summary>
        /// Faces this texture should be applied to on voxels of the above type.
        /// </summary>
        public Face[] Faces { get; set; }

        /// <summary>
        /// UV corner coordinates of this texture within the atlas.
        /// </summary>
        public Vector2[] Bounds { get; set; }
    }
}
For custom scenarios where I need direct control of the UV mappings, I inherit from TextureAtlas and override the SetupTextures() method. In pretty much all cases, though, I create atlases where the textures are all the same size, so simply calling GenerateTextureSet will do the UV mapping calculations I believe you need.
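For example, a hedged sketch of generating the set for a 256x256 atlas of 16x16 tiles (assuming IntVector2 takes its two components in the constructor):

var textures = TextureAtlas.GenerateTextureSet(new IntVector2(16, 16), new IntVector2(256, 256));
// textures[i].Bounds now holds the four UV corners of tile i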
The UV coords for a given face of a given voxel type are then ...
// note: Where/Contains require using System.Linq;
IEnumerable<Vector2> UVCoords(byte voxelType, Face face, TextureAtlas atlas)
{
    return atlas.Textures
        .Where(a => a.VoxelType == voxelType && a.Faces.Contains(face))
        .First()
        .Bounds;
}
In your case you probably have a different way to map from your pack to the texture of choice, but essentially, in my case, the combination of a face and a voxel type determines which UV mapping set I want.
This then allows you to use your mesh with any standard shader instead of relying on custom shader logic.
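As a minimal, hypothetical sketch of using those bounds while building a quad face (mesh, voxelType and atlas are illustrative):

// requires: using System.Collections.Generic; using System.Linq;
var uvs = new List<Vector2>(UVCoords(voxelType, Face.Top, atlas));
mesh.SetUVs(0, uvs); // four corners, matching the quad's vertex order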
You have to turn the TEXCOORD0 passed in from a percentage of the image space into a pixel value, use the modulus to figure out which pixel it lands on in the tiled texture, and then turn it back into a percentage of the image.
Here's the code:
You need the 2D variables _MainTex and _PatternTex to be defined, along with their float4 _MainTex_TexelSize and _PatternTex_TexelSize counterparts, which Unity fills with (1/width, 1/height, width, height) for each texture.
struct v2f
{
    float2 uv : TEXCOORD0;
    float4 vertex : SV_POSITION;
};

float modFunction(float number, float divisor)
{
    //2018-05-24: copied from an answer by Nicol Bolas: https://stackoverflow.com/questions/35155598/unable-to-use-in-glsl
    return (number - (divisor * floor(number/divisor)));
}

fixed4 frag (v2f i) : SV_Target
{
    fixed4 curColor = tex2D(_MainTex, i.uv);
    fixed4 pattern = tex2D(_PatternTex,
        float2(
            modFunction(i.uv.x * _MainTex_TexelSize.z, _PatternTex_TexelSize.z) * _PatternTex_TexelSize.x,
            modFunction(i.uv.y * _MainTex_TexelSize.w, _PatternTex_TexelSize.w) * _PatternTex_TexelSize.y
        )
    );
    fixed4 col = curColor * pattern;
    col.rgb *= col.a;
    return col;
}
I have a main_mesh that has 10 submeshes, and I wonder how I can change the color of each of these submeshes to a different color (e.g. submesh1 will be red, submesh2 will be blue, ...etc). Any advice please?
UPDATE:
This is how I'm getting my mesh which has 10 submeshes:
SkinnedMeshRenderer smr = gameobject1.GetComponent<SkinnedMeshRenderer>();
Mesh main_mesh = smr.sharedMesh;
SkinnedMeshRenderer smr = gameobject1.GetComponent<SkinnedMeshRenderer>();
Mesh main_mesh = smr.sharedMesh;
smr.materials[0].color = Color.red; // Change submesh1 to red color
smr.materials[1].color = Color.blue; // Change submesh2 to blue color
...
smr.materials[n].color = ... // Change submesh n to whatever color
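One caveat worth knowing: accessing smr.materials at run-time instantiates per-renderer copies of the materials, so tinting this renderer won't recolor other objects sharing the same material asset. A hedged sketch:

// .materials returns instantiated copies unique to this renderer;
// use .sharedMaterials instead if you deliberately want to edit the shared assets
Material[] mats = smr.materials;
for (int i = 0; i < mats.Length; i++)
{
    mats[i].color = Color.red;
}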
Since you've added the tag Unityscript, I'll assume that you want to be able to change the submeshes inside a script.
Assignment as a parameter
The first solution would be to add a public parameter to your script that is an array of MeshRenderer. Then assign each submesh manually to the array through the Inspector. Now you can access the material of each renderer and change its color.
public class MyScript : MonoBehaviour
{
    public MeshRenderer[] submeshes;

    // Update is called once per frame
    void Update()
    {
        for (int i = 0; i < submeshes.Length; i++)
        {
            // Returns the first material of the mesh renderer; use .materials if multiple materials are applied
            submeshes[i].material.color = Color.red;
        }
    }
}
Note that I used MeshRenderer as the type for my array, but you could directly use Material if you only want to change the color.
Also, if submeshes share the exact same material, this will change the color of all of them, not just one. You need one material per submesh.
While this solution is not viable if your number of submeshes changes dynamically, it is pretty simple and straightforward.
Use children
Instead of assigning every submesh manually, you can dynamically change the colors by accessing the children:
public class MyScript : MonoBehaviour
{
    public GameObject myObject;

    // Update is called once per frame
    void Update()
    {
        Renderer[] array = myObject.GetComponentsInChildren<Renderer>();
        for (int i = 0; i < array.Length; i++)
        {
            array[i].material.color = Color.red;
        }
    }
}
This solution allows you to have N materials assigned to your submeshes.
Here is the documentation for GetComponentsInChildren
Edit
Short answer: if you want a specific answer for your case, it depends on the materials and shaders assigned to your Skinned Mesh Renderer, because they can override or alter your children's materials. If not, the code below should work.
SkinnedMeshRenderer smr = gameobject1.GetComponent<SkinnedMeshRenderer>();
Mesh main_mesh = smr.sharedMesh;

Renderer[] submeshes = gameobject1.GetComponentsInChildren<Renderer>();
for (int i = 0; i < submeshes.Length; i++)
{
    // If your submesh already has a material, remove the first line below!
    submeshes[i].material = new Material(Shader.Find("Diffuse"));
    submeshes[i].material.color = Color.red;
}
This solution creates a new material for each submesh, which is quite brutal.
In the Inspector, you should assign one material to each submesh and then always use the same material with different colors.
In case it doesn't work
When you want to change the color of one specific mesh, that mesh needs to have its own material. The color of the mesh will depend on this material and its properties (shaders, textures, colors).
With a Skinned Mesh Renderer, you generally use a diffuse material with textures to apply colors to one complex mesh. In some cases, this mesh applies the color to its children.
When using a Skinned Mesh Renderer, you usually use a UV texture. This particular texture is created based on your 3D object and is used to apply multiple colors to it (sometimes also to its children). Here is a simple example of a UV texture and here is a more complex example.
Note that, like a Mesh Renderer, a Skinned Mesh Renderer can have multiple materials, which makes the situation more complex, but the principle remains the same.
SkinnedMeshRenderer smr = gameobject1.GetComponent<SkinnedMeshRenderer>();
Mesh main_mesh = smr.sharedMesh;
With your code, if main_mesh uses a UV texture, you have two solutions:
Remove the texture, then apply a color to its children (see the sketch after this list)
Create a specific UV texture which applies the colors you want
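A minimal sketch of the first option, assuming the children carry standard renderers with a color-tintable material (names are illustrative):

foreach (Renderer r in gameobject1.GetComponentsInChildren<Renderer>())
{
    r.material.mainTexture = null; // drop the UV texture
    r.material.color = Color.red;  // then tint with a plain color
}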
I'm building a simple framework for OpenGL UIs for MonoTouch. I set everything up and also succeeded in rendering 3D models, but a simple 2D textured object fails. The texture has a size of 256x256, so it's not too large, and it's a power of two.
Here is some rendering code (note: I removed the existing and working code):
// Render the gui objects ( flat )
Projection = Matrix4x4.Orthographic(0, WindowProperties.Width, WindowProperties.Height, 0);
View = new Matrix4x4();
GL.Disable(All.CullFace);
GL.Disable(All.DepthTest);
_Stage.RenderGui();
Stage:
public void RenderGui()
{
    Draw(this);
    // Renders every child control; all of them call "DrawImage" when rendering something
}

public void DrawImage(Control caller, ITexture2D texture, PointF position, SizeF size)
{
    PointF gposition = caller.GlobalPosition; // Resulting position is 0,0 in my tests
    gposition.X += position.X;
    gposition.Y += position.Y;

    // Renders the UI model; this is done by using an existing (and working) vertex buffer.
    // The shader gets some parameters (this works in 3D space too).
    _UIModel.Render(new RenderParameters() {
        Model = Matrix4x4.Scale(size.Width, size.Height, 1) * Matrix4x4.Translation(gposition.X, gposition.Y, 0),
        TextureParameters = new TextureParameter[] {
            new TextureParameter("texture", texture)
        }
    });
}
The model uses a vector2 for positions; no other attributes are given to the shader.
The shader below should render the texture.
Vertex:
attribute vec2 position;
uniform mat4 modelViewMatrix;
varying mediump vec2 textureCoordinates;

void main()
{
    gl_Position = modelViewMatrix * vec4(position.xy, -3.0, 1.0);
    textureCoordinates = position;
}
Fragment:
varying mediump vec2 textureCoordinates;
uniform sampler2D texture;

void main()
{
    gl_FragColor = texture2D(texture, textureCoordinates) + vec4(0.5, 0.5, 0.5, 0.5);
}
I found out that the drawing issue is caused by the shader. This line produces a GL_INVALID_OPERATION (it works with other shaders):
GL.UniformMatrix4(uni.Location, 1, false, (parameters.Model * _Device.View * _Device.Projection).ToArray());
EDIT:
It turns out that the shader uniform locations changed (yes, I'm wondering about this too, because I query them only after the shader is completely initialized). I changed it, and now everything works.
As mentioned in the other thread the texture is wrong, but this is another issue ( OpenGL ES 2.0 / MonoTouch: Texture is colorized red )
The shader initialization with the GL.GetUniformLocation problem mentioned above:
[... Compile shaders ...]

// Attach vertex shader to program.
GL.AttachShader(_Program, vertexShader);
// Attach fragment shader to program.
GL.AttachShader(_Program, pixelShader);

// Bind attribute locations
for (int i = 0; i < _VertexAttributeList.Length; i++) {
    ShaderAttribute attribute = _VertexAttributeList[i];
    GL.BindAttribLocation(_Program, i, attribute.Name);
}

// Link program
if (!LinkProgram(_Program)) {
    GL.DeleteShader(vertexShader);
    GL.DeleteShader(pixelShader);
    GL.DeleteProgram(_Program);
    throw new Exception("Shader could not be linked");
}

// Get uniform locations
for (int i = 0; i < _UniformList.Length; i++) {
    ShaderUniform uniform = _UniformList[i];
    uniform.Location = GL.GetUniformLocation(_Program, uniform.Name);
    Console.WriteLine("Uniform: {0} Location: {1}", uniform.Name, uniform.Location);
}

// Detach shaders
GL.DetachShader(_Program, vertexShader);
GL.DetachShader(_Program, pixelShader);
GL.DeleteShader(vertexShader);
GL.DeleteShader(pixelShader);

// Shader is initialized; add it to the device
_Device.AddResource(this);
I don't know what Matrix4x4.Orthographic uses as its near-far range, but if it's something simple like [-1,1], the object may just be outside the near-far interval, since you set its z value explicitly to -3.0 in the vertex shader (and neither the scale nor the translation of the model matrix will change that). Try a z of 0.0 instead. Why is it -3, anyway?
EDIT: So if the GL.UniformMatrix4 call throws a GL_INVALID_OPERATION, it seems you didn't retrieve the corresponding uniform location successfully. So the code where you do this might also help to find the issue.
Or it may also be that you call GL.UniformMatrix4 before the corresponding shader program is used. Keep in mind that uniforms can only be set once the program is active (GL.UseProgram or something similar was called with the shader program).
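A minimal sketch of that ordering, reusing the names from the code above (a sketch under that assumption, not the asker's actual implementation):

// the program must be active before setting its uniforms
GL.UseProgram(_Program);
GL.UniformMatrix4(uni.Location, 1, false, (parameters.Model * _Device.View * _Device.Projection).ToArray());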
And by the way, you're multiplying the matrices in the wrong order anyway (given your shader and matrix-setting code). If it really works this way for other renderings, then you either were just lucky or you have some severe conceptual and mathematical inconsistency in your matrix library.
It turns out that the shader uniform locations change at an unknown time. Everything is created and initialized when I ask OpenGL ES for the uniform location, so it must be a bug in OpenGL.
Calling GL.GetUniformLocation(...) each time I set the shader uniforms solves the problem.
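A hedged sketch of that workaround (names mirror the initialization code above; matrixArray is illustrative):

// re-resolve the location at set time instead of caching it at init time
int location = GL.GetUniformLocation(_Program, uniform.Name);
if (location != -1)
    GL.UniformMatrix4(location, 1, false, matrixArray);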