UNITY Vertex Color from obj or dae (fbx) - unity3d

I have been googling hard for days to find a solution for this, without luck.
I am basically trying to import vertex colours from one of the above formats and interpolate between them to get a smooth colour profile across the surface.
The obj and dae files have a vertex position (xyz) as well as 3 other values (I assume RGB) associated with each vertex (when viewed in a text editor). However, when I import the asset, it comes in without any colour. The material is set and uneditable by default, and I just can't work out how to get these colours displayed.
Any help is greatly appreciated

OK, I found a way around it.
Basically, if you import a dae file with vertex colors, a Vertex Shader (Standard) is available, which seems to work very well.
Initially I was messing around with objs, and it looks like those do not work as well, but I will have to try again tonight!
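For context on why the imported colours can look smooth at all: the GPU blends each triangle's three vertex colours with barycentric weights at every pixel, which is what a vertex-colour shader relies on. A minimal sketch of that blend (the colours are hypothetical, and this is plain arithmetic, not Unity API):

```python
# Barycentric interpolation of per-vertex RGB colours across a triangle,
# the same blend the rasterizer performs for every fragment.

def interpolate_color(c0, c1, c2, w0, w1, w2):
    """Blend three vertex colours with barycentric weights (w0 + w1 + w2 == 1)."""
    return tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(c0, c1, c2))

# Hypothetical triangle with red, green and blue corners.
red, green, blue = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)

# At a corner, the result is exactly that vertex's colour...
print(interpolate_color(red, green, blue, 1.0, 0.0, 0.0))  # (1.0, 0.0, 0.0)

# ...and at the centroid it is the even mix of all three.
print(interpolate_color(red, green, blue, 1/3, 1/3, 1/3))
```

This is exactly the "smooth colour profile across the surface" asked for above: the shader only needs per-vertex colours, and the hardware does the interpolation.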

Related

How to create a Sprite Color Mask in Unity

I'm trying to use the Among Us Spritesheets to get crewmates in Unity. The spritesheet looks like this: https://www.reddit.com/r/AmongUs/comments/ir6nl0/main_player_sprite_sheet_for_those_who_wanted_it/
Each sprite is a blue/red color. Somehow the devs get every color of crewmate from these sprites, and I'm wondering how they did it.
How can I get every color of crewmate from this sprite sheet?
Thanks!
EDIT: Solution
I edited the title to be more accurate to my problem. Thanks to #G Kalendar for mentioning shaders; I hadn't thought about that.
What I ended up doing was creating a Shader Graph, extracting each color channel, multiplying it by a color value, and recombining them into a texture.
I followed this helpful and straightforward tutorial: https://www.youtube.com/watch?v=4dAGUxvsD24
This is what my Shader Graph ended up looking like:
"Secondary" and "Primary" are color properties.
Hope this helps somebody!
If you want to store multiple pieces of information in a single image, it's common practice to use the channels: red, green, blue. Horizon Zero Dawn, for example, uses that technique to make its environment effects as efficient as possible.
Here it looks like blue and red are used as placeholders to mark an area. So in Unity's Shader Graph, when you use this image in a Sample Texture 2D node, you can use a Split node to get the different channels of the image and isolate the parts you want to color in.
Then just multiply the different channels by the color you want, add them together and use that as the base color.
Edit: Or use the "Replace Color Node" I just learned about.
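The split/multiply/add step is just per-pixel arithmetic, so it is easy to sanity-check outside the graph. A minimal sketch in plain Python (the mask and color values are made-up placeholders; in the real shader this runs on the GPU per fragment):

```python
def recolor(pixel, primary, secondary):
    """Recolor a mask pixel: the red channel selects `primary`, the blue
    channel selects `secondary`; the two tinted parts are added together."""
    r, g, b = pixel
    return tuple(r * p + b * s for p, s in zip(primary, secondary))

# Hypothetical crewmate colors.
primary = (1.0, 0.5, 0.0)    # orange suit
secondary = (0.2, 0.2, 0.6)  # darker shade

print(recolor((1.0, 0.0, 0.0), primary, secondary))  # pure-red mask pixel -> (1.0, 0.5, 0.0)
print(recolor((0.0, 0.0, 1.0), primary, secondary))  # pure-blue mask pixel -> (0.2, 0.2, 0.6)
```

In the graph, `recolor` corresponds to the Split node, two Multiply nodes (one per channel, each with a color property), and an Add node feeding the base color.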

Unity Line Mesh Inward Shader

I am pretty new to shaders in Unity and am stuck on a problem.
I am using Shader Graph and Unity 2021. I understood the Shader Graph examples from the Unity page and "implemented" an outline shader, thinking I could simply "invert" the direction (i.e. not multiply by some value to get a bigger object, then coloring that object and laying the original object on top). Yet I am nowhere near what I want.
Basically, what I am trying to achieve is a stepped color gradient toward the inside of the object. As an image, it should look something like this:
Starting from color0 (green), going inward to color1 (yellow), until I reach the center with colorN (white), and the same repeated for the other side. The shader is then used on a material applied to a MeshLineRenderer (LineRendererPro from the Asset Store, with minor modifications for custom behavior), which kind of looks like this:
In the end the line should be colored according to the specified colors and the direction the line is going.
An explanation of a simple "inline" shader (similar to an outline) would help a lot. I think I could adapt that concept and implement multiple colors with (maybe) percentage-based widths. I don't want to use a fixed image/texture, since I want to change the widths and colors.
Any input is welcome and thank you in advance.
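One way to think about the stepped inward gradient, independent of Shader Graph: take the distance from the line's center, mirror it so both edges behave the same, quantize it into N bands, and look each band up in a color list. A minimal sketch, assuming a hypothetical cross-section coordinate v that runs from 0 at one edge to 1 at the other:

```python
def band_color(v, colors):
    """Map cross-section coordinate v in [0, 1] to a stepped color.
    The distance from the center (v = 0.5) is mirrored, so both edges
    get colors[0] and the center gets colors[-1]."""
    d = 1.0 - abs(v - 0.5) * 2.0            # 0 at the edges, 1 at the center
    i = min(int(d * len(colors)), len(colors) - 1)  # quantize into bands
    return colors[i]

# Hypothetical bands: green edges, yellow, white center.
colors = ["green", "yellow", "white"]
print(band_color(0.0, colors))  # edge   -> green
print(band_color(0.3, colors))  # middle -> yellow
print(band_color(0.5, colors))  # center -> white
```

In Shader Graph the same logic maps onto Subtract, Absolute, and Multiply nodes for the distance, a Floor (or Posterize) node for the banding, and the result driving a color lookup; band widths become properties instead of a fixed texture.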

Unity Point-cloud to mesh with texture/color

I have a point cloud and an RGB texture that fit together, from a depth camera. I procedurally created a mesh from a selected part of the point cloud by implementing the quickhull 3D algorithm.
Now I somehow need to apply the texture I have to that mesh. Note that there can be multiple selected parts of the point cloud, and thus multiple objects that need the texture. The texture is just a basic 720p file that should be applied to the mesh material.
Basically I have to do this: https://www.andreasjakl.com/capturing-3d-point-cloud-intel-realsense-converting-mesh-meshlab/ but inside Unity. (I'm also using a RealSense camera)
I tried with a decal shader but the result is not precise. The UV map is completely twisted from the creation process, and I'm not sure how to generate a correct one.
(Screenshot: the UV map and the mesh.)
I only have two ideas but don't really know if they'll work/how to do them.
1. Try to create a correct UV map and then wrap the texture around it somehow.
2. Somehow bake colors onto the vertices and then use vertex colors to create the desired effect.
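The second idea, baking colors onto vertices, amounts to projecting each mesh vertex back into the color camera and sampling the texture there. A minimal pinhole-projection sketch (the intrinsics below are hypothetical placeholders; the RealSense SDK exposes the real ones):

```python
def project_to_uv(point, fx, fy, cx, cy, width, height):
    """Project a 3D point (camera space, metres) to normalized UV coordinates
    using a pinhole model. Returns None for points behind the camera."""
    x, y, z = point
    if z <= 0:
        return None
    u = (fx * x / z + cx) / width
    v = (fy * y / z + cy) / height
    return (u, v)

# Hypothetical 720p intrinsics: focal lengths and principal point in pixels.
fx = fy = 900.0
cx, cy = 640.0, 360.0

# A point straight ahead of the camera lands at the image center.
print(project_to_uv((0.0, 0.0, 1.0), fx, fy, cx, cy, 1280, 720))  # (0.5, 0.5)
```

Each vertex's color, sampled at its projected UV, could then go into Unity's `Mesh.colors` array and be rendered with a vertex-color shader, sidestepping the twisted UVs entirely.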
What other things could I try?
I'm working on quite a similar problem, but in my case I want to create a complete mesh from the point cloud, not just a quickhull, because I don't want to lose any depth information.
I'm nearly done with the mesh algorithm (I just need to do some optimizations). The challenging part now is matching the RGB camera's texture to the depth sensor's point cloud, because they of course have different viewports.
Intel RealSense provides an interesting whitepaper about this problem, and as far as I know the SDK corrects these different perspectives with UV mapping and provides a red/green UV-map stream for your shader.
Maybe the short report can help you out. Here's the link. I'm also very interested in what you are doing, so please keep us up to date.
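If such a red/green UV-map stream is available, texturing reduces to a per-pixel lookup: the R and G channels of the map hold the (u, v) coordinates at which to sample the color image. A toy sketch with made-up values:

```python
def sample_color(uv_map, color_image, x, y):
    """Look up a depth pixel's color: uv_map[y][x] holds (u, v) in [0, 1]
    pointing into color_image (a rows-by-cols grid of RGB tuples)."""
    u, v = uv_map[y][x]
    rows, cols = len(color_image), len(color_image[0])
    cu = min(int(u * cols), cols - 1)   # clamp to valid texel indices
    cv = min(int(v * rows), rows - 1)
    return color_image[cv][cu]

# Toy 2x2 color image and a 1x1 UV map pointing at its bottom-right texel.
color_image = [[(255, 0, 0), (0, 255, 0)],
               [(0, 0, 255), (255, 255, 255)]]
uv_map = [[(0.9, 0.9)]]
print(sample_color(uv_map, color_image, 0, 0))  # (255, 255, 255)
```

A shader would do the same thing with two texture samples: read the UV map, then sample the color texture at the coordinates it provides.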
Regards

Blender model turning black when scaled down in Unity

I made a model in Blender, which I would like to import into Unity.
When imported into Unity at full scale, the colours show up correctly:
However, I need to scale it down (the original model is very large), and when I do so, all the colours turn black, all of a sudden, once past a seemingly arbitrary threshold:
I tried UV unwrapping in Blender, as well as fixing normals (inside and outside). But it doesn't fix this problem.
What could be the cause of this?
Please try using 0.1 (the default) for the Scale Factor in the model's Import Settings.
If that doesn't work, how about changing the shader to Standard/Color?
Maybe the issue is that the shader can't handle the tiny scale because of floating-point precision.

How to import a colored mesh from MeshLab into Unity (maybe through 3ds Max)

I'm very new. I have a ply file with some faces and a colored point cloud. My final aim is to import this mesh into Unity with its colors!
I found a way that seems to need 3ds Max, but I can't even make the mesh show its colors in 3ds Max.
I tried:
1. In MeshLab, Filters -> Texture -> Per Vertex Texture Function, then Convert PerVertex UV into PerWedge UV. I get an obj with an mtl (only 1 KB). Of course, it doesn't work.
2. Filters -> Texture -> Transfer Vertex Color to Texture. I get a png, but it seems to cover only one side of the mesh.
3. Filters -> Color Creation and Processing -> Transfer Color: Vertex to Face. I get an obj with a rather large mtl file, which makes the obj very slow to open in MeshLab, 3ds Max, and Unity. And it doesn't work.
I think the color system of 3ds Max and Unity is different from MeshLab's, but I have no idea what to do. Any suggestion is welcome! Thanks!
It seems that after Filters -> Texture -> Transfer Vertex Color to Texture and saving, it does get a texture, but a wrong one, like this: pictures here
I think something is going wrong when generating the texture. Any suggestions?
I found the answer on YouTube:
https://www.youtube.com/watch?v=6wP_e37t7PI
1. Filters -> Texture -> Parametrization: Trivial Per-Triangle
2. Save the project
3. Filters -> Texture -> Transfer Vertex Attributes to Texture
4. Export the mesh as an obj file
I don't know the reason, but it really works.