I'm trying to change the tiling of the metallic texture in the Standard shader at runtime. In the process of testing, the operative piece of code has ended up looking like this:
mr.material.SetTextureScale("_MetallicGlossMap", new Vector2(Random.Range(0f, 100f), Random.Range(0f, 100f)));
This produces no errors but does nothing at all.
I'm at a loss.
I've found the answer to this.
The texture scale of the metallic map is controlled by _MainTex.
Material.SetTextureScale("_MainTex", scale) will set the metallic map scale as well. However, Material.mainTextureScale will not.
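A minimal sketch of the behaviour described above (the component name and tiling values are made up; it assumes a Renderer whose material uses the Standard shader):

```csharp
using UnityEngine;

// Attach to an object with a Renderer using the Standard shader.
public class MetallicTiling : MonoBehaviour
{
    void Start()
    {
        var mat = GetComponent<Renderer>().material;

        // Works: the metallic/smoothness map shares the _MainTex tiling.
        mat.SetTextureScale("_MainTex", new Vector2(4f, 4f));

        // Per the answer above, this does NOT affect the metallic map,
        // even though it also targets the main texture's scale:
        // mat.mainTextureScale = new Vector2(4f, 4f);
    }
}
```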
Not at all confusing.
I'm making a simple 3D game using my own shader, and I want to emulate an 8-bit pixel-art style. To do this, I need to sample a low-resolution texture without the sampler interpolating between texels. Hopefully this will also reduce processing time, since it won't need to calculate the interpolation. Is there a way I can accomplish this?
OK, it turns out there was something I didn't realise about 2D textures. I'd already set the texture's Filter Mode to 'Point', which turns off interpolation during sampling, but I hadn't noticed the import setting marked 'Non-Power of 2'. What this does is, if your texture has dimensions that aren't a power of 2 (e.g. 1024), it upscales the texture so that they are. It does this so that it can apply compression to the texture, since texture compression in Unity only works on power-of-2 images. However, the upscale is filtered regardless of the Filter Mode, so the interpolation was actually being 'baked in' by the texture importer and had nothing to do with the sampler or the shader whatsoever.
Turning this off solved the problem; however, once I'm happy with my textures I will be expanding them to a power-of-2 size with trailing padding in order to take advantage of compression.
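For textures created at runtime (where there is no importer involved), the same no-interpolation setup can be done in code. A small sketch, with made-up dimensions and component name:

```csharp
using UnityEngine;

public class PixelArtTexture : MonoBehaviour
{
    void Start()
    {
        // A runtime-created texture; import settings don't apply here,
        // so the filter mode is set directly on the Texture2D.
        var tex = new Texture2D(96, 64, TextureFormat.RGBA32, false);
        tex.filterMode = FilterMode.Point;     // no interpolation when sampling
        tex.wrapMode = TextureWrapMode.Clamp;
        tex.Apply();

        GetComponent<Renderer>().material.mainTexture = tex;
    }
}
```

For imported assets, the equivalent is setting Filter Mode to 'Point' and 'Non-Power of 2' to 'None' in the texture's import settings, as described above.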
I have a 3D model (part of a heart), and I have also created a texture to apply to it.
Unfortunately, when I apply it to the 3D model it doesn't look good. But when I did the same for a cube, it worked nicely, as I expected. Below is the figure.
You can see the cube is more realistic; however, if I apply the texture to my model, it does not look very good. Any suggestions as to why this is happening?
The cube is "unwrapped" - it has a UV map. Your heart mesh does not.
You need to UV-unwrap your heart mesh.
In Blender:
For this, you could try "Smart UV Project" in Edit Mode, but that will create small islands and you will get a lot of seams.
By hand, you could mark seams and choose "Unwrap", which can result in a better UV map.
Alternative: Use a Triplanar Shader. Probably a good idea for a repeating texture like yours.
(I got that image from this reddit post: https://www.reddit.com/r/Unity3D/comments/ndh9ll/simple_triplanar_shader_in_unity/)
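A minimal sketch of the triplanar idea in HLSL, to show why it needs no UV map: the texture is projected along each world axis and the three samples are blended by the surface normal. The `_Tiling` property is an assumed name, not from the linked post:

```hlsl
sampler2D _MainTex;
float _Tiling; // assumed property controlling world-space tiling

float4 TriplanarSample(float3 worldPos, float3 worldNormal)
{
    // Blend weights from the normal, normalised so they sum to 1.
    float3 blend = abs(worldNormal);
    blend /= (blend.x + blend.y + blend.z);

    // One planar projection per world axis.
    float4 x = tex2D(_MainTex, worldPos.yz * _Tiling); // projected along X
    float4 y = tex2D(_MainTex, worldPos.xz * _Tiling); // projected along Y
    float4 z = tex2D(_MainTex, worldPos.xy * _Tiling); // projected along Z

    return x * blend.x + y * blend.y + z * blend.z;
}
```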
I have a point cloud and an RGB texture from a depth camera that fit together. I procedurally created a mesh from a selected part of the point cloud by implementing the quickhull 3D algorithm for mesh creation.
Now I somehow need to apply the texture I have to that mesh. Note that there can be multiple selected parts of the point cloud, thus making multiple objects that need the texture. The texture is just a basic 720p file that should be applied to the mesh material.
Basically I have to do this: https://www.andreasjakl.com/capturing-3d-point-cloud-intel-realsense-converting-mesh-meshlab/ but inside Unity. (I'm also using a RealSense camera)
I tried a decal shader, but the result is not precise. The UV map is completely twisted by the creation process, and I'm not sure how to generate a correct one.
UV and the mesh
I only have two ideas, but I don't really know if they'll work or how to do them:
1. Try to create a correct UV map and then wrap the texture around somehow.
2. Somehow bake colours to the vertices and then use vertex colours to create the desired effect.
What other things could I try?
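Idea 2 from the question could be sketched roughly like this. The `projectToUV` delegate is entirely hypothetical: it stands in for whatever mapping from a world-space vertex back into the RGB image's UV space your camera calibration gives you (this depends on the RealSense intrinsics/extrinsics, which are not shown here):

```csharp
using UnityEngine;

// Sketch of "bake colours to vertices": sample the RGB image once per
// vertex and store the result in the mesh's vertex colours.
public static class VertexColorBaker
{
    public static void Bake(Mesh mesh, Texture2D rgb,
                            System.Func<Vector3, Vector2> projectToUV)
    {
        var verts = mesh.vertices;
        var colors = new Color[verts.Length];
        for (int i = 0; i < verts.Length; i++)
        {
            // projectToUV is an assumed mapping from vertex to image UV.
            Vector2 uv = projectToUV(verts[i]);
            colors[i] = rgb.GetPixelBilinear(uv.x, uv.y);
        }
        mesh.colors = colors; // a vertex-colour shader then displays these
    }
}
```

The mesh would then need a shader that reads vertex colours (e.g. an unlit vertex-colour shader), since the Standard shader ignores them.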
I'm working on quite a similar problem, but in my case I want to create a complete mesh from the point cloud, not just a quickhull, because I don't want to lose any depth information.
I'm nearly done with the mesh algorithm (I just need to do some optimisations). The challenging part now is matching the RGB camera's texture to the depth sensor's point cloud, because the two of course have different viewports.
Intel RealSense provides an interesting whitepaper about this problem, and as far as I know the SDK corrects for these different perspectives with UV mapping and provides a red/green UV map stream for your shader.
Maybe the short report can help you out. Here's the link. I'm also very interested in what you are doing, so please keep us up to date.
Regards
There are some things I would like to do in a shader that require an unusual texture topology, texture atlas packing for example. These can't be wrapped with the common wrap modes; there would be glitches or bleeding at the edges of UV seams.
So I'm thinking: what if I just use tex2D() or tex2Dlod() for texture lookups with point filtering, and rewrite all the sampling and blending logic in the shader itself, looking up multiple texels with a custom wrapper and blending them in shader code?
Is this possible, and what problems or disadvantages might this method have?
Yes, this is possible and common. You will need to set the filter and/or wrap mode on the texture asset itself, in the project (if you're using Shader Graph, you have the option to specify a custom sampler state inside the shader itself). You can certainly modify the UV coordinates passed to your shader and use those modified values to sample the texture(s).
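A small HLSL sketch of the idea for the atlas case. It assumes the texture asset's wrap mode is set to Clamp and its filter mode to Point, and `_AtlasRect` is a made-up property holding one tile's rectangle in UV space (x, y, width, height):

```hlsl
sampler2D _MainTex;
float4 _AtlasRect; // assumed: one atlas tile as (x, y, width, height) in UVs

float4 SampleAtlasRepeat(float2 uv)
{
    // Manual repeat-wrap done in shader code, confined to the atlas tile,
    // so the sample never crosses the tile border and bleeds into a
    // neighbouring tile at a UV seam.
    float2 wrapped = frac(uv);
    float2 atlasUV = _AtlasRect.xy + wrapped * _AtlasRect.zw;
    return tex2D(_MainTex, atlasUV);
}
```

The main disadvantage is that you give up hardware filtering across the wrap boundary; if you want bilinear-looking results you have to fetch the neighbouring texels yourself and blend them in shader code.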
So, I have a functioning voxel engine that creates smoothed terrain in chunks of 1x1x1 meter in a 1024 radius around my player.
I wanted to create a geometry shader that not only continues to texture the ground appropriately, but also creates grass (preferably waving with the wind).
I have found some basic billboard geometry shaders to get me started, but they seem to cause the mesh to stop texturing. Is there any way to do both in one shader?
Do I pass the mesh triangles and the new grass triangles on to the fragment shader with a flag? Thanks in advance!
It turns out you can do this by adding two passes to the shader. My first pass is a simple surface shader, and my second pass is the geometry shader. The multi-pass shader still runs at 130 FPS, so performance seems adequate.
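A structural sketch of that two-pass layout, not the poster's actual shader. The surface-shader block and a hand-written pass sit side by side in one SubShader; the geometry stage here is only a pass-through stub (real grass generation would emit billboard quads instead), and the shader name and flat green colour are made up:

```hlsl
Shader "Custom/TerrainWithGrass"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        // Pass(es) 1: generated by the surface shader - textures the mesh.
        CGPROGRAM
        #pragma surface surf Standard
        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };
        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG

        // Pass 2: a hand-written pass with a geometry stage.
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma geometry geom
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2g { float4 pos : SV_POSITION; };
            struct g2f { float4 pos : SV_POSITION; };

            v2g vert (appdata_base v)
            {
                v2g o;
                o.pos = v.vertex; // stay in object space; transform in geom
                return o;
            }

            [maxvertexcount(3)]
            void geom (triangle v2g input[3], inout TriangleStream<g2f> stream)
            {
                // Stub: re-emit the source triangle. Grass generation
                // would append billboard vertices here instead.
                for (int i = 0; i < 3; i++)
                {
                    g2f o;
                    o.pos = UnityObjectToClipPos(input[i].pos);
                    stream.Append(o);
                }
            }

            float4 frag (g2f i) : SV_Target
            {
                return float4(0.2, 0.6, 0.2, 1); // placeholder grass colour
            }
            ENDCG
        }
    }
}
```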