When importing a model to Unity from Blender, I get odd material results

I imported my Blender backrooms model to Unity and got a weird mess (this is the model in Unity). When it's in Blender it looks exactly as intended, and even when I render an image of the model in Blender all the textures look like they should (this is the model in Blender, not a rendered image). I've browsed countless "solutions" and I can't find one that applies to mine. I have tried changing Unity import settings, extracting both materials and images, and so on, but nothing makes the materials look proper. My guess is that Unity isn't accounting for the UV mapping, so it's stretching the image so far that it becomes blurred beyond recognition. If anybody can help me out it would be much appreciated. Also, as a side note, I don't think displacement maps work in Unity like they do in Blender.
Edit: the model has multiple objects; the floor, ceiling, and walls are three separate objects.
Edit again: here is the node layout for the floor; every other node layout (walls and ceiling) is just the floor's layout but without a displacement map.

It's been a minute since I've used Unity, but without looking at the Blender file, I would suspect there are two UV maps on your model and Unity is using the wrong one. I believe Unity uses the first UV map for texture info, while the others are used for lightmap info. If that's the case, just delete the wrong one in your model. Hopefully that helps!
Edit (more info): I created a basic model with walls, ceiling, and floor. The walls and ceiling only have a UV map called UVMap. However, the floor has two UV maps: UVMap and UVMap.001. Like so:
Now, if you click the little camera icon next to UVMap/UVMap.001, you will likely see the overall size of the bricks change. If so, delete the wrongly scaled UV map so you only have one UV map in that window. After that, make sure the UV maps on all your other meshes share the same name, which is typically UVMap by default. You will also want to be sure all of your images are mapped by UV, not Object or Generated, as Unity cannot use that information. From there you should be good to go!
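If you'd rather verify on the Unity side which UV channels actually came through the FBX, a small editor script can log them. This is just a minimal sketch; the menu path and class name are arbitrary:

```csharp
using UnityEngine;
using UnityEditor;

// Minimal editor utility: logs which UV channels exist on the selected
// mesh assets, so you can confirm whether a stray second UV map came
// through the FBX import. Unity samples textures from UV0; UV1
// (mesh.uv2) is conventionally reserved for lightmaps.
public static class UvChannelInspector
{
    [MenuItem("Tools/Log UV Channels of Selected Meshes")]
    private static void LogUvChannels()
    {
        foreach (Object obj in Selection.objects)
        {
            var mesh = obj as Mesh;
            if (mesh == null) continue;

            Debug.Log($"{mesh.name}: UV0 = {mesh.uv.Length} coords, " +
                      $"UV1 = {mesh.uv2.Length} coords");
        }
    }
}
```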

Related

Import model from Blender to Unity with single texture

So I spent hours googling and "trial and error"-ing, but I still could not figure this out.
I have (half of) a chessboard modeled in Blender. I bake all my materials to a single PNG with UV mapping. It looks like this:
I export the model as an .fbx and the baked texture as a .png. Both are imported into Unity.
My goal is to have a single texture/material for the model.
The problem is that Unity still recognizes all the squares etc. as separate objects with their own standard UVs. When I map the combined texture to the model, it is all messed up.
I can see why this happens, but not how to change the mapping so it picks out the right pixels of the texture. Is there something I can do to fix this?
Here is a picture of the model in Unity.
Thanks in advance.
I finally found a workaround that works perfectly. You can merge objects in Blender. The steps I took:
merge all squares and other objects into one. I used this site as a tutorial
create a new uv map for this combined object
bake the textures to the new uv map
export model as .fbx
export texture as .png
import all to unity
create material from texture
apply material to model
Works perfectly fine for me :D
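For what it's worth, the last two Unity-side steps (create a material from the texture, apply it to the model) can also be scripted. A minimal sketch, assuming the baked texture lives in a Resources folder under a placeholder name and the Standard shader is acceptable:

```csharp
using UnityEngine;

// Sketch of the final two steps above: build one material from the
// baked texture and assign it to every renderer under the model.
// "BakedChessboard" is a placeholder asset name in a Resources folder.
public class ApplyBakedMaterial : MonoBehaviour
{
    void Start()
    {
        Texture2D baked = Resources.Load<Texture2D>("BakedChessboard");

        var material = new Material(Shader.Find("Standard"));
        material.mainTexture = baked;

        // One shared material across the whole merged model.
        foreach (Renderer r in GetComponentsInChildren<Renderer>())
            r.sharedMaterial = material;
    }
}
```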

Using a Minecraft character with a skin in a Unity game

I've decided to use Minecraft-like characters in my small game, since I do not know how to make 3D models (nor do I want to learn how in the near future).
However, the task now seems a little harder than expected:
I've tried looking in the asset store for prefabs to use but without success.
So I've decided to try to make a model in Blender (knowing nothing about non-parametric 3D modeling, my knowledge of Blender is extremely limited) and import it into my Unity game.
And surprisingly, I managed to create the model using MCprep, export it, and import it into Unity while keeping the objects that drive the bones (the output is a bit messy, but I think I can manage to clean it up a little).
However the imported version does not have any skin and appears in a gray shade.
Turns out that the output does not keep materials/textures with it!
I've tried getting the texture used by Blender, and it returns the same skin I fed into MCprep. So, using the same skin, I tried creating a material from the .png, using it as the texture in an unlit texture material.
However, the result is a bit messy as shown here (left is Blender, right is Unity):
How can I make the texture in Unity look right? (I've heard there is a way using a C# script, but I really don't know what it is nor how to do it.)
EDIT:
Thanks for the answers; I set the filter to Point, which made the texture a bit better. However, the part that should be transparent is displayed in black on top of the other part (I think).
The image on the right uses only the Point filter; the one on the left uses Point plus "alpha is transparency" and the Unlit/Transparent shader.
ANSWER FOUND:
As Bart said, make sure the textures' Filter Mode is set to Point, but additionally:
Minecraft player characters are made of two layers; the second layer usually has lots of transparency and is used for clothing or other relief detail. So you need to use a transparency-capable shader on your material in Unity.
You're running into a filtering issue. In your case you want no filtering to happen, so select your texture and, in the Inspector, change the import settings so that "Filter Mode" is set to "Point". It will then do no filtering of the input, and your large pixels will appear sharp as you want.
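If you have many pixel-art textures, you can have Unity apply Point filtering automatically on import with an AssetPostprocessor. A minimal sketch; the folder filter and compression choice are just assumptions to adapt:

```csharp
using UnityEditor;
using UnityEngine;

// Forces Point filtering on every texture imported under a given
// folder, so pixel-art skins stay crisp. The "/PixelArt/" path check
// is only an example; adjust it to your project layout.
public class PixelArtTextureProcessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        if (!assetPath.Contains("/PixelArt/")) return;

        var importer = (TextureImporter)assetImporter;
        importer.filterMode = FilterMode.Point;
        // Uncompressed avoids compression artifacts on hard pixel edges.
        importer.textureCompression = TextureImporterCompression.Uncompressed;
    }
}
```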

Accessing a secondary texture normal map in shader graph?

I'm using Unity 2020.1.3f1's URP, with the new 2D renderer system.
As of right now, I have objects that change between the built in "Sprite-Lit-Default" material, and a material with the custom built pixel outline shader detailed here: https://danielilett.com/2020-04-27-tut5-6-urp-2d-outlines/
This worked well, but I recently added lights, and a normal map to my sprites as a secondary texture in the import settings. The default "Sprite-Lit-Default" material has no problem displaying the normal map, but when I modified my shader graph to include the normal map, it doesn't get imported the way the sprite texture does when _MainTex is set as the reference.
I've tried _NormalMap (which is the name of the secondary texture in the importer!) as well as _NormalTex, but it always ends up not importing the normal map. I even attempted changing _MainTex to a Texture2D, but since that kept sparking an error, I didn't think it was the right way to go about it. (This one, to be specific:)
Error assigning 2D texture to 2DArray texture property '_MainTex': Dimensions must match
UnityEditor.EditorApplication:Internal_CallUpdateFunctions()
Am I missing something here? All the tutorials I can find online only show people dragging the normal map in through the inspector, but this material is going to be used by many different sprites, so that seems...counterintuitive.
On top of this, the default material/shader has no issues with this, so I feel like I'm either missing something, or I'm going to end up having to change my sprites' material through code instead of the animator, just for this small, annoying quirk.
Blackboard Properties, and nodes. This just goes into the normals input.
Inspector panel showing the missing normal map slot.
And the Secondary Textures in case I somehow misnamed it, why not?
(EDIT)
So, an update on this, for anyone else who runs into this same issue.
I managed to find a section of the shadergraph documentation that seems to be the only part talking about this:
It is required to name the reference for MainTex as _MainTex to render Sprites. It is also recommended to name the references for Mask as _MaskTex and Normal as _NormalMap to match the Shader inputs used in this package.
So from what I gather from that, _MainTex is the only reference that's automatic in Shader Graph.
After a full day of looking up tutorials, I've noticed that every single one of them simply set the normal map and extra textures as the default textures so they'll show up without being assigned manually.
I think this is possible with hand-written shaders, but I've decided to just go with a simple unlit shadergraph on a hand-drawn sprite outline, displayed on a separate gameobject parented to the main object.
I'm not posting this as an answer, in case someone else finds a real solution in the future, since this isn't...really a solution in my eyes.
I don't know if you have figured this out yet, but I'll try to answer since I had the same problem. Create a new Texture2D node, convert it to a property, and set the reference to _NormalMap; connect that to a Sample Texture 2D node set to Normal, and plug that into the Sprite Lit master. Now go into the Sprite Editor, assign the normal map as a secondary texture, and make sure the name is the same as in your shader, _NormalMap (or something else, as long as it matches). This has worked for me; Shader Graph detects the normal map texture by reference automatically. Attached below are some images to help, and the finished result on the character sprite, which uses a custom shader I picked up from a tutorial and added the normal map to.
Images: Shader Graph, Sprite Editor, Sprite Editor 2, Normal Map result.
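As an aside, if you ever do need to assign the map per object from code rather than through the secondary-texture import, a MaterialPropertyBlock lets many sprites share one material while each overrides the texture. A minimal sketch, assuming the Shader Graph property reference is _NormalMap as described above:

```csharp
using UnityEngine;

// Sketch: override _NormalMap per sprite without duplicating the
// shared material. Assumes the Shader Graph exposes a Texture2D
// property whose reference is "_NormalMap", as described above.
public class SpriteNormalMapOverride : MonoBehaviour
{
    [SerializeField] private Texture2D normalMap;

    void Start()
    {
        var sr = GetComponent<SpriteRenderer>();
        var block = new MaterialPropertyBlock();
        sr.GetPropertyBlock(block);
        block.SetTexture("_NormalMap", normalMap);
        sr.SetPropertyBlock(block);
    }
}
```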

Unity is multiplying the vertex count of my mesh twentyfold when importing it from Blender?

This is my first time using such a massive mesh; it's meant to be terrain, but I cannot use Unity's terrain feature to make it. In Blender it shows as just under 20k vertices, yet when I bring it into Unity it shows a staggering 493k. I've never had this problem with a mesh before. Creating the terrain was the first step in this project, so there are absolutely no other meshes or objects in the scene beyond two planes used for water, yet they aren't the culprit, since disabling or deleting them has no noticeable effect on the vertex count.
As you can see in the images, the vertex count in Blender shows 19,129, while in Unity it is 492.2k. My hierarchy is empty except for a player prefab containing canvas elements, a camera, a directional light, and an event system.
I have really no idea what could be causing this.
The vertex count in the statistics window includes all vertices used in rendering. A lot of it is probably coming from the camera, which likely includes a skybox, shadows (as the statistics window shows: "Shadow casters: 2"), and a depth texture. Try deleting everything in your hierarchy except for the terrain, then check again.
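To compare the raw asset counts against what the statistics window reports, you can log each mesh's vertex count from a script. A minimal sketch:

```csharp
using UnityEngine;

// Logs the raw vertex count of every mesh in the scene, so you can
// compare against the Game view statistics window. The window counts
// vertices per rendering pass (shadow casting, depth, skybox), which
// is why its number can be many times the asset's own count.
public class VertexCountLogger : MonoBehaviour
{
    void Start()
    {
        int total = 0;
        foreach (MeshFilter mf in FindObjectsOfType<MeshFilter>())
        {
            if (mf.sharedMesh == null) continue;
            total += mf.sharedMesh.vertexCount;
            Debug.Log($"{mf.name}: {mf.sharedMesh.vertexCount} vertices");
        }
        Debug.Log($"Total mesh vertices in scene: {total}");
    }
}
```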

Prefab in Unity and HoloLens falling below the spatial mapping

I'm working on a Unity application for HoloLens, where I place different prefabs across a room and use TapToPlace to put them on a surface using SpatialMapping. It works great with some FBX models that I had, but then I received some models in .max format, which I converted to FBX using 3ds Max. They look good, but with the exact same setup as the other models, whenever I tap them, they are placed halfway below the surface.
I've noticed that the working prefabs put the cursor right at the bottom when in placing mode, while the "broken" ones put the cursor in the middle. Here is an example of how it looks in Unity:
Working:
Not working:
What is that circle there? It looks like the cursor, because every time I tap the object, the cursor is placed exactly there for each prefab; but for the ones with it in the center, half of the model shows below the surface, so it looks related. Is there a way to move that point to the bottom, like in the working one? If not, is there something I need to do in 3ds Max when exporting these new models to FBX? Both models use a box collider, TwoHandManipulatable, and TapToPlace.
Thank you, I appreciate the help.
The issue is with the axis of the model -- the pivot point of the model is not at the bottom of the model.
If at all possible, it is recommended that you fix the model in your 3D modelling application. If this is not possible, you can fix it in Unity by adding an extra parent transform. For more information, please see:
https://answers.unity.com/questions/62675/redefine-axis-of-an-object.html
https://answers.unity.com/questions/357698/can-we-change-the-pivot-points-of-any-gameobject.html
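The extra-parent trick from those links can also be done with a small editor script that wraps the selected model in a new parent whose origin sits at the bottom-center of its renderer bounds. A minimal sketch; the menu path and naming are just illustrative:

```csharp
using UnityEngine;
using UnityEditor;

// Editor sketch of the extra-parent workaround: creates an empty
// parent whose origin is at the bottom-center of the model's combined
// renderer bounds, so placement scripts like TapToPlace treat that
// point as the pivot.
public static class PivotFixer
{
    [MenuItem("Tools/Wrap Selection With Bottom Pivot")]
    private static void WrapWithBottomPivot()
    {
        GameObject model = Selection.activeGameObject;
        if (model == null) return;

        Renderer[] renderers = model.GetComponentsInChildren<Renderer>();
        if (renderers.Length == 0) return;

        // Combined world-space bounds of all renderers on the model.
        Bounds bounds = renderers[0].bounds;
        for (int i = 1; i < renderers.Length; i++)
            bounds.Encapsulate(renderers[i].bounds);

        Vector3 bottomCenter = new Vector3(
            bounds.center.x, bounds.min.y, bounds.center.z);

        var pivot = new GameObject(model.name + "_Pivot");
        pivot.transform.position = bottomCenter;
        model.transform.SetParent(pivot.transform, true); // keep world pose
    }
}
```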