DrawString not drawing text in my game - MonoGame

I am new to this MonoGame stuff. I watched a few tutorials and figured out how to do some things. But here is the problem: when it comes to drawing a string on the game surface, it does not work.
I downloaded a MonoGame sample from one page (https://onedrive.live.com/redir?resid=24923400704D0887!3539&authkey=!AAwQxeoQwb_cmhk&ithint=file%2c.zip) and it suited the needs of my game. The game works well, with all the textures and graphics and everything, but when it comes to printing text it does nothing.
I have SpriteFont sf declared right after my class declaration and have it initialized:
sf = Content.Load<SpriteFont>("myFont");
And in the Draw method, right between spriteBatch.Begin and spriteBatch.End:
spriteBatch.DrawString(sf, "Score", new Vector2(100, 100), Color.Red);
but it still doesn't work.
The font loads fine; I tested it in another new MonoGame project for WP, and there it even draws!
So I blame the template/sample I downloaded. I don't know this stuff very well, so I hope someone can help me and explain it.

You should start by removing any other draw calls. If you can then see the string, it means the string is being rendered behind something else. Then you need to play with layerDepth values (from 0.0 to 1.0; which end counts as the front depends on the SpriteSortMode you pass to spriteBatch.Begin, and the default Deferred mode ignores layerDepth and just draws in call order).
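For reference, a minimal sketch of the above put together, assuming the font content item is named "myFont" as in the question; the explicit SpriteSortMode and the long DrawString overload are additions to illustrate layerDepth, not something from the original post:

// In LoadContent:
sf = Content.Load<SpriteFont>("myFont");

// In Draw, using a sort mode that actually honours layerDepth
// (the default SpriteSortMode.Deferred draws in call order and ignores it):
spriteBatch.Begin(SpriteSortMode.BackToFront, BlendState.AlphaBlend);
// ... other draw calls ...
spriteBatch.DrawString(sf, "Score", new Vector2(100, 100), Color.Red,
    0f,              // rotation
    Vector2.Zero,    // origin
    1f,              // scale
    SpriteEffects.None,
    0f);             // layerDepth: with BackToFront sorting, 0.0 ends up in front
spriteBatch.End();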

Related

Unity URP Material scripting

I just started using URP in Unity for a game in progress. I'm doing a sort of sprites-in-3d thing, so I'm rendering some sprite sheets on quads. To do this, I create a Material with the sprite sheet and use tiling/offset to render the proper frame of animation by making a call like:
CombatMaterial?.SetTextureOffset("_BaseMap", new Vector2( (AnimationDefinitions[animationDefinition] % 16) * .0625f, CombatMaterial.mainTextureOffset.y));
I'm currently trying to add some feedback into my game for when characters use abilities or get hit by flickering the material. Because the base color starts at white and goes to black, that won't really work; the only other thing I seem to have available to me is emission, which looks great. Using a 0xAAAish color achieves the effect I'm looking for. I've been using the Feel Unity asset to do this, but I've also attempted using something like this:
CombatMaterial?.SetColor("_EmissionColor", Color.white);
The problem is, once I've set the _EmissionColor, the main texture offset no longer updates in game, thereby ruining all animations. If I change the texture offset manually through the inspector at runtime, animations don't work AND the _EmissionColor flickering stops working. If I mess around with the color of the _BaseMap in the inspector, the _EmissionColor flickering starts working again.
Before I start diving into some unsightly color adjustments in an attempt to make this work again, I would love to know if I'm doing something that is simply unsupported by URP/Materials/whatever, or if there is some alternative to what I'm doing that's a little more straightforward.
Thank you!
After trying a bunch of random stuff, I don't have a "real" solution, but the game IS working how I want it to.
What worked for me was setting the _EmissionColor on the Material to (1,1,1). For some reason, when the _EmissionColor is set to (0,0,0) it's a black (ha) hole and won't accept future changes to the _EmissionColor. I assume this is some shader nonsense (with the base Lit Shader that URP uses) that I am clearly unfamiliar with.
Hopefully this helps anyone doing something as pointlessly against the grain as I am!
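A rough sketch of that workaround, assuming CombatMaterial is the URP Lit material from the question; the helper class name, the EnableKeyword call, and the Flash method are my own additions, not something from the original posts:

using UnityEngine;

// Hypothetical helper: applies the workaround once at startup.
public class EmissionSetup : MonoBehaviour
{
    public Material CombatMaterial;   // the URP Lit material used on the sprite quad

    void Start()
    {
        // Leaving _EmissionColor at its default (0,0,0) caused later changes to be ignored,
        // so initialise it to (1,1,1) as described above. EnableKeyword is an extra
        // precaution to make sure the Lit shader's emission path is active at all.
        CombatMaterial.EnableKeyword("_EMISSION");
        CombatMaterial.SetColor("_EmissionColor", Color.white);
    }

    // Per-hit feedback can then change the colour and have it take effect, e.g.:
    public void Flash()
    {
        CombatMaterial.SetColor("_EmissionColor", new Color(0.67f, 0.67f, 0.67f)); // ~0xAAA-ish grey
    }
}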

How can I tell what will actually be displayed in game?

I'm having a serious issue with Unity. It's most definitely a conceptual issue, and I would love some help understanding what I'm doing wrong.
The game window on the Unity editor shows my game working exactly as intended. The sizing of everything is perfect, the animations fit nicely, and everything is how I want it to be.
The problem, however, is that when I actually build the game, nothing is where it should be. The animations are off. The sizing of buttons and canvas UI elements is wonky. This is not what I see represented in the Unity editor, so I have no idea why it's behaving like that.
I've tried changing the Canvas to scale with screen size, because the canvas elements are what is misbehaving.
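For reference, that "scale with screen size" setting corresponds roughly to the following; this is only a sketch, and the reference resolution is a placeholder rather than a value from my project:

using UnityEngine;
using UnityEngine.UI;

// Rough code equivalent of setting the Canvas Scaler to "Scale With Screen Size".
public class CanvasScaleSetup : MonoBehaviour
{
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920, 1080);   // placeholder resolution
        scaler.screenMatchMode = CanvasScaler.ScreenMatchMode.MatchWidthOrHeight;
        scaler.matchWidthOrHeight = 0.5f;   // blend between matching width and height
    }
}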
I expect my built game to look exactly as it does in the Unity editor, but that's not happening.
Thanks!

Curled fingers: Unity3d

When I play an imported animation, the character's fingers get curled and his mouth opens. I might be able to fix that by adjusting the bones in Unity, but with so many characters, that means dragging each of them from the assets into the scene every time.
It's too much misery and it's slowing me down; I need to get this done fast.
Unity Masters, PLEASE HELP ME.
The rig of your imported model must contain bones for the fingers and any other parts that don't move correctly. If it does, assign them to your avatar while you are setting up the avatar (look here for avatar setup); after that, if your animation animates the fingers, your model will too.
So that's right, you don't always get timely help here. Anyway,
[SOLUTION]
The animations were imported from iClone 5. I had been importing the animations separately and then applying them to the .fbx character, so instead I imported the animations embedded with the character. Note that this only helps if the original character was also imported from iClone. Then, in the rig configuration, I did Reset Pose > Enforce T-Pose > Automap > Automap again > Enforce T-Pose. That worked for me.
[SOLUTION 2]
I had an alternative in which I could mask the hand animations (not use them) and remove the jaw in 3DXchange while exporting the FBX. But the animations looked so lovely that I had to keep looking.
Sometimes you can't find the solution; I guess you are the only solution then. That's how everybody learns. Hope it helps.

Unity3d animated cursor

To change the cursor I am using this:
UnityEngine.Cursor.SetCursor(CursorTexture,
new Vector2(CursorTexture.width, CursorTexture.height) * 0.5f,
CursorMode.ForceSoftware);
I want to animate the cursor when something happens.
Is it possible to animate the cursor using Cursor.SetCursor?
You can do it like LearnCocos2D says. The problem is that it will flicker a lot, and the other problem you will most likely have is that the mouse pointer will be really sluggish. This is because software mouse pointers are not rendered by hardware, so the pointer is always a couple of frames behind the user's actual input on the pointing device.
Then, for the animated texture to work in a web browser, you need to make sure you export the shaders you are using, if necessary by placing them in a Resources folder of your web player project, since many shaders are not included in the web build by default. It should work if you are using a standard diffuse shader, but for a mouse pointer, which most likely uses transparency, it may not work by default. You'll need to find the actual shader being used and export it manually for your build.
Unity should have support for hardware animated cursors at least on PC, but sadly it doesn't...
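A minimal sketch of the frame-swapping approach the answer refers to; the component name, field names, and frame rate are placeholders, and the hotspot is centred as in the question's snippet:

using UnityEngine;

// Hypothetical helper: cycles through cursor textures with Cursor.SetCursor.
public class AnimatedCursor : MonoBehaviour
{
    public Texture2D[] Frames;          // one texture per frame of the cursor animation
    public float FramesPerSecond = 10f;

    void Update()
    {
        if (Frames == null || Frames.Length == 0) return;

        int index = (int)(Time.time * FramesPerSecond) % Frames.Length;
        Texture2D frame = Frames[index];
        Cursor.SetCursor(frame,
            new Vector2(frame.width, frame.height) * 0.5f,   // hotspot in the centre, as above
            CursorMode.ForceSoftware);
    }
}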

Textures not drawing if multiple EAGLViews are used

I'm having a bit of a problem with Apple's EAGLView and Texture2D. If I create an instance of EAGLView and draw some textures, it works great. However, whenever I create a second instance of EAGLView, the textures in the new view(s) aren't drawn.
Being new to OpenGL, I've got absolutely no clue as to what is causing this behavior. If somebody would like to help, I've created a small project that reproduces the behavior. The project can be found at http://www.cocoabeans.se/OpenGLESBug.zip
Many thanks,
Tim Andersson
Update
I tried using sharegroups but I'm not really sure if I used them correctly. However, it did change the behavior slightly; instead of the texture drawing only in the first instantiated view, it now draws the texture in the last instantiated view and draws white rectangles in the other views. I don't know if that is better or worse, but at least something is showing up in the other views now.
This is driving me crazy and I would be very grateful if somebody could help me with this problem. I've updated the project at http://www.cocoabeans.se/OpenGLESBug.zip to reflect the changes.
Cheers,
Tim
Second Update
After trying some more things, it seems that the problem is related to Apple's Texture2D class, though I'm not sure exactly what is causing the behavior. I think the best thing to do is to write my own texture class (it will help me understand how OpenGL handles textures, which will probably come in handy).
(Haven't downloaded your code.)
The OpenGL drawing contexts are different if you just use two EAGLViews (the code in that base class creates and owns the GL context as well as the render/frame/depth buffers). If you generate/bind some textures in one context, they won't be available in the other. You can share resources between contexts using a sharegroup (see this question for more: How to use OpenGL ES on a separate thread on iPhone?). Or define the textures (if small) in both contexts, etc.