I've decided to use Minecraft-like characters in my small game, since I don't know how to make 3D models (nor do I want to learn how in the near future).
However, the task now seems a little harder than expected:
I've tried looking in the Asset Store for prefabs to use, but without success.
So I've decided to try making a model in Blender (knowing nothing about non-parametric 3D modeling; my knowledge of Blender is extremely limited) and importing it into my Unity game.
Surprisingly, I managed to create the model using MCprep, export it, and import it into Unity while keeping the objects that drive the bones (the output is a bit messy, but I think I can clean it up a little).
However, the imported version does not have any skin and appears in a plain gray shade.
Turns out that the output does not keep materials/textures with it!
I've checked the texture Blender uses, and it's the same skin I fed into MCprep. So, using that same skin, I tried creating a material from the .png, using it as the texture in an Unlit/Texture material.
However, the result is a bit messy as shown here (left is Blender, right is Unity):
How can I make the texture look right in Unity3D? (I've heard there is a way using a C# script, but I really don't know what it is or how to do it.)
EDIT:
Thanks for the answers so far. I set the Filter Mode to Point and the texture looks a bit better. However, the part that should be transparent is displayed in black on top of the other parts (I think).
The image on the right uses only Point filtering; the one on the left uses Point filtering plus Alpha Is Transparency and the Unlit/Transparent shader.
ANSWER FOUND:
As Bart said, make sure the textures' Filter Mode is set to Point, but additionally:
Minecraft player characters are made of two layers; the second layer usually has lots of transparency and is used for clothing or other relief detail. So you need to use a transparency-capable shader on your material in Unity.
You're running into a filtering issue. In your case you want no filtering to happen. So select your texture, and in the Inspector change the import settings so that Filter Mode is set to Point. Unity will then do no filtering of the input, and your large pixels will appear sharp, as intended.
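For the scripted route the question mentions, here is a minimal sketch. It assumes the skin texture and the character's renderer are assigned in the Inspector; both settings can just as well be changed by hand in the texture import settings and the material.

    using UnityEngine;

    public class MinecraftSkinSetup : MonoBehaviour
    {
        public Texture2D skinTexture;      // the character's .png skin
        public Renderer characterRenderer; // renderer of the imported model

        void Start()
        {
            // No filtering: keeps the big Minecraft pixels sharp instead of blurry.
            skinTexture.filterMode = FilterMode.Point;

            // Transparency-capable shader so the overlay layer's alpha is honoured.
            var material = new Material(Shader.Find("Unlit/Transparent"));
            material.mainTexture = skinTexture;
            characterRenderer.material = material;
        }
    }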
Related
I just started using URP in Unity for a game in progress. I'm doing a sort of sprites-in-3d thing, so I'm rendering some sprite sheets on quads. To do this, I create a Material with the sprite sheet and use tiling/offset to render the proper frame of animation by making a call like:
CombatMaterial?.SetTextureOffset("_BaseMap", new Vector2( (AnimationDefinitions[animationDefinition] % 16) * .0625f, CombatMaterial.mainTextureOffset.y));
I'm currently trying to add some feedback to my game for when characters use abilities or get hit, by flickering the material. Because the base color starts at white and goes to black, that won't really work; the only other thing I seem to have available is emission, which looks great. Using a 0xAAA-ish color achieves the effect I'm looking for. I've been using the Feel Unity asset to do this, but I've also attempted using something like this:
CombatMaterial?.SetColor("_EmissionColor", Color.white);
The problem is, once I've set the _EmissionColor, the main texture offset no longer updates in game, thereby ruining all animations. If I change the texture offset manually through the Inspector at runtime, animations don't work AND the _EmissionColor flickering stops working. If I mess around with the color of the _BaseMap in the Inspector, the _EmissionColor flickering starts working again.
Before I start diving into some unsightly color adjustments in an attempt to make this work again, I would love to know if I'm doing something that is simply unsupported by URP/Materials/whatever, or if there is some alternative to what I'm doing that's a little more straightforward.
Thank you!
After trying a bunch of random stuff, I don't have a "real" solution, but the game IS working how I want it to.
What worked for me was setting the _EmissionColor on the Material to (1,1,1). For some reason, when the _EmissionColor is set to (0,0,0) it's a black (ha) hole and won't accept future changes to the _EmissionColor. I assume this is some shader nonsense (with the base Lit Shader that URP uses) that I am clearly unfamiliar with.
Hopefully this helps anyone doing something as pointlessly against the grain as I am!
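For reference, a minimal sketch of that workaround. Hedged: the material name comes from the question, and the EnableKeyword call is an extra precaution, since URP's Lit shader typically ignores _EmissionColor while the _EMISSION keyword is off.

    using UnityEngine;

    public class AbilityFlicker : MonoBehaviour
    {
        public Material CombatMaterial; // the URP/Lit material from the question

        void Awake()
        {
            // Make sure emission is actually active on this material.
            CombatMaterial.EnableKeyword("_EMISSION");
            // Start at (1,1,1), not (0,0,0): black seems to swallow later changes.
            CombatMaterial.SetColor("_EmissionColor", Color.white);
        }

        public void SetFlicker(bool on)
        {
            // Flick between the 0xAAA-ish gray and the plain white base.
            var flash = new Color(0.67f, 0.67f, 0.67f);
            CombatMaterial.SetColor("_EmissionColor", on ? flash : Color.white);
        }
    }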
I imported my Blender backrooms model into Unity and got a weird mess (this is the model in Unity). In Blender it looks exactly as intended, and even when I render an image of the model in Blender all the textures look like they should (this is the model in Blender, not a rendered image). I've browsed countless "solutions" and I can't find one that applies to mine. I have tried changing Unity import settings, extracting both materials and images, and so on; nothing makes the materials look proper. My guess is that Unity is not accounting for the UV mapping, so it's stretching the image so far that it becomes blurred beyond recognition. If anybody can help me out it would be much appreciated. Also, just a side note, but I don't think displacement maps work in Unity like they do in Blender.
Edit: the model has multiple objects; the floor, ceiling, and walls are 3 separate objects.
Edit again: here is the node layout for the floor; every other node layout for the walls and ceiling is just the floor's layout, but without a displacement map.
It's been a minute since I've used Unity, but without looking at the Blender file, I would suspect there are two UV maps on your model and Unity is using the wrong one. I believe Unity uses the first UV map for texture info, while the others are used for lightmap info. If that's the case, just delete the wrong one in your model. Hopefully that helps!
Edit (more info): I created a basic model with walls, ceiling, and floor. The walls and ceiling only have a UV map called UVMap. However, the floor has two UV maps: UVMap and UVMap.001. Like so:
Now, if you click on the little camera icon next to UVMap/UVMap.001, you will possibly see the overall size of the bricks change. If so, delete the wrongly scaled UV map so you only have one UV map in that window. After that, I would make sure the UV maps on all your other meshes have the same name, which is typically UVMap by default. You will also want to be sure all of your images are mapped by UV, not Object or Generated, as Unity cannot use that information. From there you should be good to go!
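On the Unity side, here is a small diagnostic sketch (a hypothetical helper, not part of either tool) that logs which meshes in the scene carry a second UV channel, so you can spot the object that has the stray UVMap.001 baked in:

    using UnityEngine;

    public class UvChannelCheck : MonoBehaviour
    {
        void Start()
        {
            // Unity exposes the second UV set of an imported mesh as uv2.
            foreach (var filter in FindObjectsOfType<MeshFilter>())
            {
                var mesh = filter.sharedMesh;
                if (mesh != null && mesh.uv2.Length > 0)
                    Debug.Log(filter.name + " has a second UV channel (uv2).");
            }
        }
    }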
I'm having trouble figuring out how to light up large area(s) of sprites in Unity 2D. My previous knowledge on Unity's lighting is zero.
I first tried using a large number of point lights with the "Sprites/Diffuse" material, but only about five would actually render at a time, so I guess there's a limit on that.
Then I tried putting in an area light. That didn't do anything, so that's when I started doing research about baked lighting on sprites (and baked lighting in general). I found stuff like this, but I couldn't get it to work, either because it's outdated or because I don't know what I'm doing. Other answers I've come across seem to assume the reader already knows something about lighting in Unity, which, to be honest, I don't. Unity's documentation website had some information on it, but no tutorials on how to set up baked lighting.
I've tried a bunch of different combinations of materials (like using the "Standard" shader for the sprites instead of "Sprites/Diffuse", emission, etc.) and I enabled "Baked Global Illumination" in Lighting > Settings.
If baked lighting isn't possible on sprites (or isn't worth the trouble), what are the alternatives?
Edit: I made sure not to have the lights pointing in the wrong direction, and I do realise that Unity 2D is just like painting onto a piece of paper in Unity 3D. I was able to get point lights to work, but only a few at a time. I don't need to light the entire screen at once, just a large specific area.
some tips...
Working with sprites, you're in 2D. When you add a light, switch to 3D mode and rotate to make sure your light is pointed at your objects, and oriented so as not to be on the same plane, or level, with them, as this will cast all the light behind them.
If you're trying to light up everything on screen (in camera), attach an area light to the camera at the camera's position, point it where the camera points, and then in the Inspector on the right you can change its variables: intensity, range, width, height, etc.
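A rough sketch of that camera-light idea follows. Hedged: Unity's Area lights are baked-only, so this stand-in uses a wide spot light for real-time use, and all the numbers are placeholders to tweak in the Inspector.

    using UnityEngine;

    public class CameraLight : MonoBehaviour
    {
        void Start()
        {
            // Parent a light to the camera so it always points where the camera points.
            var lightObject = new GameObject("CameraLight");
            lightObject.transform.SetParent(Camera.main.transform, false);

            var light = lightObject.AddComponent<Light>();
            light.type = LightType.Spot;
            light.spotAngle = 90f;  // wide cone to cover the view
            light.range = 50f;
            light.intensity = 1.5f;
        }
    }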
Emissive Texture:
https://www.youtube.com/watch?v=oa6kW5HhRd4
For some reason, I never even thought about going into the asset store. I found this for free, and it looks like it will work: Light2D.
Originally I wanted to make it in Unity3D, but I saw that in tutorials they made it in Blender.
This is what I did in unity:
I want to make this tall cube hollow on the inside, and add a small door so that I will be able to walk inside the cube. Later on I want to somehow add stairs, but the problem now is how to make it hollow on the inside.
I saw in some places the suggestion to use Blender, so I tried this tutorial in Blender:
Blender
But I got stuck after I checked Add Mesh: Extra Objects and clicked Save User Settings. I then tried to click Add > Mesh at the bottom, but I don't have Extra Objects; I have Extras objects, like in the tutorial video.
Anyway, my main goal is to make the tall cube hollow on the inside, with a small door at the bottom so I will be able to walk inside.
From what I know, Unity3D has no 3D modelling by default (maybe there is something for that in the Asset Store), so you need to use Blender or some other program (for example Maya). There is https://blender.stackexchange.com/ where you can ask the same question (since it is not really Unity3D-related).
Maybe this will be helpful: https://blender.stackexchange.com/questions/5849/how-do-i-create-a-solid-object-cube
Overall, what you could also do, though it is not the best way: build everything out of boxes yourself in Unity3D, as in the sketch below.
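A rough sketch of that boxes approach (all sizes are made-up placeholders): scaled cube primitives form the floor, ceiling, and walls, with the front wall split in three pieces to leave a doorway you can walk through.

    using UnityEngine;

    public class HollowRoom : MonoBehaviour
    {
        void Start()
        {
            // A 10 x 5 x 10 room built from thin stretched cubes.
            CreateBox("Floor",   new Vector3(0, 0, 0),     new Vector3(10, 0.2f, 10));
            CreateBox("Ceiling", new Vector3(0, 5, 0),     new Vector3(10, 0.2f, 10));
            CreateBox("Back",    new Vector3(0, 2.5f, 5),  new Vector3(10, 5, 0.2f));
            CreateBox("Left",    new Vector3(-5, 2.5f, 0), new Vector3(0.2f, 5, 10));
            CreateBox("Right",   new Vector3(5, 2.5f, 0),  new Vector3(0.2f, 5, 10));

            // Front wall in three pieces, leaving a 2-wide, 3-high doorway.
            CreateBox("FrontLeft",  new Vector3(-3, 2.5f, -5), new Vector3(4, 5, 0.2f));
            CreateBox("FrontRight", new Vector3(3, 2.5f, -5),  new Vector3(4, 5, 0.2f));
            CreateBox("Lintel",     new Vector3(0, 4, -5),     new Vector3(2, 2, 0.2f));
        }

        static void CreateBox(string name, Vector3 position, Vector3 size)
        {
            // Cube primitives come with a BoxCollider, so you can walk on and into them.
            var box = GameObject.CreatePrimitive(PrimitiveType.Cube);
            box.name = name;
            box.transform.position = position;
            box.transform.localScale = size;
        }
    }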
Try using the Boolean modifier. Take a look at this tutorial:
Blender 2.6 Tutorial 26 - Boolean Modifier
I'm starting to use Maya to do some bone animations of a 2D character, which is composed of several different parts (legs, body, head, weapon, etc.). I have each part in a separate .PNG image file. Right now I have a polygon for each part, with its own material and texture:
I was wondering if there's a way to automatically combine the textures into an atlas, make all the polygons use the same material with the atlas, and correct their UV maps so they still point to the right part of the atlas. Right now, I can do it manually in reverse: I can make the atlas outside Maya with other tools, then use the atlas on a material and manually correct the UV maps of each polygon. But it's a very long process and if I need to change a texture, I have to do it all over again. So I was wondering if there's a way to automate it.
The reason why I'm trying to do this is to save draw calls in Unity. From what I understand, Unity can batch objects as long as they share the same material. So instead of having a draw call for each polygon in the character, I'd like to have a single draw call for the whole character. I'm pretty new to Maya, so any help would be greatly appreciated!
If you want to do the atlassing in Maya, you can do it by duplicating your mesh and using the Transfer Maps tool to bake all of the different meshes onto the duplicate as a single model. The steps would be:
1) Duplicate the mesh
2) Use UV layout to make sure that the duplicated mesh has no overlapping UVs (or only has them where appropriate, like mirrored pieces).
3) Use the Transfer Maps... tool to project the original mesh onto the new one, using the "Use topology" option to ensure that the projection is clean.
The end result should be that the new model has the same geometry and appearance as the original, but with all of its textures combined onto a single sheet attached to a single material.
The limitation of this method is that some kinds of mesh (particularly meshes that self-intersect) may not project properly, leaving you to manually touch up the atlassed texture. As with any atlassing solution, you will probably see some softening in the textures, since the atlas texture is not a pixel-for-pixel copy but rather a projection, and thus a resampling.
You may find it more flexible to reprocess the character in Unity with a script or AssetPostprocessor. Unity has a native texture atlassing function, documented here. Unity comes with a script for combining static meshes, but for skinned meshes you'd need to implement your own; there's an example on the Unity wiki that probably does what you want: http://wiki.unity3d.com/index.php?title=SkinnedMeshCombiner (Caveat: we do something similar to this at work, but I can't share it; I have not used the one in this link). FWIW, Unity's native atlassing works only in rectangles, so it's not as memory-efficient as something you could do for yourself.
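For the native atlassing route, here is a hedged sketch using Texture2D.PackTextures. The remapping of each mesh's UVs into the returned rects is the part you would still have to write, and the input textures must have Read/Write enabled in their import settings.

    using UnityEngine;

    public class AtlasBuilder : MonoBehaviour
    {
        public Texture2D[] partTextures; // legs, body, head, weapon, ...

        void Start()
        {
            // Pack all part textures into one atlas; PackTextures returns the
            // UV rect each input occupies inside the atlas.
            var atlas = new Texture2D(1024, 1024);
            Rect[] uvRects = atlas.PackTextures(partTextures, 2, 1024);

            // One shared material means Unity can batch the parts together.
            var sharedMaterial = new Material(Shader.Find("Unlit/Texture"));
            sharedMaterial.mainTexture = atlas;

            // Next step (not shown): rescale each part's mesh UVs into its
            // uvRects[i] so every polygon samples the right atlas region.
        }
    }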