Sprite disappearing to the back of another sprite on moving to defined point via transform.position - unity3d

I have a difficulty with Unity2D's Sprite rendering.
Currently I have a sprite for a gameBoard, an empty GameObject holding the spawnPoint (with a placeholder sprite marking it), and a playerSprite to be instantiated as a prefab. If I just arrange things in Unity's Hierarchy, the playerSprite shows perfectly above the gameBoard, and "hard-coding" its position will always keep it above the gameBoard sprite, visible to the eye.
The problem comes when I want to instantiate the gameBoard and dynamically add the playerPrefabs into the game.
Here is the code snippet I am currently using:
gameBoard.SetActive(true); //gameBoard is defined as a public gameObject with its element defined in Unity as the gameBoard sprite.
Player.playerSprite = (GameObject)Instantiate(Resources.Load("playerSprite"));
Player.playerSprite.transform.localPosition = spawnPoint.transform.localPosition;
The result is that the instantiated playerSprite spawns exactly where I want, but behind the gameBoard sprite, making it hidden when the game runs.
The result is the same when using transform.position instead of transform.localPosition.
How should I code the transform part so that my playerSprite stays visible? Thanks.

It's most likely not an issue with the position, but rather the Sorting Order of your Sprite Renderers.
The default values for any SpriteRenderer are Layer = Default and Sorting Order = 0.
Sprite Renderers with a higher sorting order are rendered on top of those with a lower value.
Add the following lines to the end of your code, and try it out:
gameBoard.GetComponent<SpriteRenderer>().sortingOrder = 0;
Player.playerSprite.GetComponent<SpriteRenderer>().sortingOrder = 1; // higher value renders on top of the board
Obviously, you could do the same thing in the inspector as well.

Related

Adding child to rigidbody makes it move

I have a parent object with a rigidbody (gravity disabled). With a script I am adding simple cubes with box colliders as children to this parent.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class Test : MonoBehaviour
{
    public GameObject prefab;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            GameObject go = Instantiate(prefab, transform);
            go.transform.localPosition = new Vector3(10f, 10f, 10f);
        }
    }
}
This works fine, but a slight movement of the parent object is visible every time a cube is instantiated and parented. Eventually, after adding quite a few cubes, the parent has moved whole units from its original position, although its velocity is always Vector3.zero.
How does this movement happen and how do I prevent it?
So, it appears that the small movement of the rigidbody parent comes from a change in the centre of mass. Moving or instantiating objects as children changes this value, as stated in https://docs.unity3d.com/ScriptReference/Rigidbody-centerOfMass.html:
If you don't set the center of mass from a script it will be
calculated automatically from all colliders attached to the rigidbody.
After a custom center of mass set, it will no longer be recomputed
automatically on modifications such as adding or removing colliders,
translating them, scaling etc. To revert back to the automatically
computed center of mass, use Rigidbody.ResetCenterOfMass.
Finally, as a solution I just set the local center of mass to a fixed value:
rig.centerOfMass = Vector3.zero;
I am quite happy with this solution, because it also fixes the center of rotation. Still, I am not quite sure why a change in the center of mass adds movement in the first place; I don't see why that would be realistic behaviour.
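Applied to the Test script from the question, a minimal sketch of the fix could look like this (assuming the parent's Rigidbody has been cached in a `rig` field, which is not shown in the original script):

```csharp
// Sketch: pin the local centre of mass every time a child is added,
// so the automatic recomputation cannot shift the parent.
// `rig` is assumed to be the parent's Rigidbody, cached in Start().
void Update()
{
    if (Input.GetMouseButtonDown(0))
    {
        GameObject go = Instantiate(prefab, transform);
        go.transform.localPosition = new Vector3(10f, 10f, 10f);
        rig.centerOfMass = Vector3.zero; // fixed value, no automatic recomputation
    }
}
```

Setting the value once (e.g. in Start) should also be enough, since the docs state that a custom center of mass is no longer recomputed on collider changes.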
Also, there is a difference between rigidbody.position and transform.position. By using solutions such as those suggested in the comments under the question:
Unparent, move child, parent again
Disable child, move child, enable child again
Disable the child's collider, move child, enable it again
...it is possible to keep the transform.position the same, but rigidbody.position will still change. This difference between the two values persists as long as the rigidbody is asleep; waking it up, e.g. by calling Rigidbody.WakeUp(), will sync both values again. Because of the difference, even things like a raycast can fail: the ray might be hitting the spot where the meshes are actually drawn, while the physics engine thinks they are somewhere else. Waking up also looks ugly, because the mesh snaps back to where it should be to match the rigidbody.

"Attaching" instantiated prefab to a gameobject

I have 2 game objects (2D project):
GameObject enemy1 = GameObject.Find("Enemy1"); // Sprite Renderer "Order in Layer" 1
GameObject enemy2 = GameObject.Find("Enemy2"); // Sprite Renderer "Order in Layer" 2
A fire prefab is instantiated (just a fire animation):
GameObject go = Instantiate(Resources.Load<GameObject>("Animations/Fire1")); // Sprite Renderer "Order in Layer" 5
go.transform.SetParent(enemy1.transform);
go.transform.position = enemy1.transform.position;
Since the fire prefab's Sprite Renderer's Order in Layer is 5, it is always on top of both enemies.
I would like the fire to appear above enemy1 but behind enemy2, regardless of what their Order in Layer is changed to. Basically, it should look like the enemy is catching fire, even if it moves or its layer order changes.
How can this be achieved? I thought making the fire prefab a child of the enemy gameobject would do it, but it doesn't.
Edit:
Making the fire animation a child of the enemy manually in the editor works perfectly. How do I replicate that with code?
Instantiating the prefab in this way fixed the issue:
go = Instantiate(Resources.Load<GameObject>("Animations/Fire1"), enemy1.transform);
Changing the added prefab's sortingOrder was also necessary to make things work correctly:
go.GetComponent<SpriteRenderer>().sortingOrder = enemy1.GetComponent<SpriteRenderer>().sortingOrder;
After this there was a small issue - the instantiated prefab would randomly appear behind or in front of the enemy. This was fixed by adding a Sorting Group component to the fire prefab.
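If the Sorting Group also needs to be added from code rather than on the prefab, a sketch could look like this (the component lives in the UnityEngine.Rendering namespace):

```csharp
using UnityEngine.Rendering;

// Sketch: add a SortingGroup to the instantiated fire so that all of
// its child SpriteRenderers are sorted together as a single unit,
// relative to the enemy it is attached to.
SortingGroup group = go.AddComponent<SortingGroup>();
group.sortingOrder = enemy1.GetComponent<SpriteRenderer>().sortingOrder;
```

This way the fire's internal sprites keep their relative order, while the group as a whole sorts against the enemies.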

(Unity + 2D) Change UI Button position issue

The last few days I've been stuck with a headache of a problem in Unity.
Okay, I won't go into details with my game, but I've made a super-simple example which represents my issue.
I have a 2D scene with these components:
When the scene loads, and I tap the button this script executes:
Vector3 pos = transform.position;
pos.x -= 10;
transform.position = pos;
I have also tried this code:
transform.position = Camera.main.WorldToScreenPoint(new Vector3(0, 0, 0));
The problem is, that when I click the button, the x-pos of the object sets to -1536 which is not as expected. Picture shows the scene after the button has been clicked. Notice the Rect Transform values:
So I did a little Googling and found out about ScreenToWorldPoint, WorldToScreenPoint etc., but none of these conversions solves my problem.
I'm pretty sure I'm missing something here, which is probably right in front of me, but I simply can't figure out what.
I hope someone can point me in the right direction.
Best regards.
The issue is that you are using transform.position instead of rectTransform.anchoredPosition.
While it's true that UI elements are still GameObjects and do have the normal Transform component accessible in script, what you see in the editor is a representation of the RectTransform that is built on top. This is a special inspector window for UI elements, which use the anchored positioning system so you can specify how the edges line up with their parent elements.
When you set a GameObject's transform.position, you are providing a world space position specified in 3D scene units (meters by default). This is different from a local position relative to the canvas or parent UI element, specified in reference pixels (the reference pixel size is determined by the canvas "Reference Resolution" field).
A potential issue with your use of Camera.WorldToScreenPoint is that that function returns a position specified in pixels. Whereas, as mentioned before, setting the transform.position is specified in scene units (i.e. meters by default) and not relative to the parent UI element. The inspector, though, knows it's a UI element so instead of showing you that value, it is showing you the world position translated to the UI's local coordinates.
In other words, when you set the position to zero, you are getting the indices of whatever pixels happen to be over the scene's zero point in your main camera's view, and converting those pixel numbers to meters, and moving the element there. The editor is showing you a position in reference pixels (which are not necessarily screen pixels, check your canvas setting) relative to the object's parent UI element. To verify, try tilting your camera a bit and note that the value displayed will be different.
So again you would need to use rectTransform.anchoredPosition, and you would further need to ensure that the canvas resolution is the same as your screen resolution (or do the math to account for the difference). The way the object is anchored will also matter for what the rectTransform values refer to.
Try using transform.localPosition = new Vector3(0, 0, 0); since your button is a child of multiple game objects. You could also try transform.TransformPoint, which converts a local position to a world position.
The issue is that your button is inside another object. You want to change the local position: transform.localPosition -= new Vector3(10, 0, 0);
As #Joseph has clearly explained, you have to make changes on the RectTransform of your UI components, instead of changing the Transform.
To achieve what you want, do it like this:
RectTransform rectTransform = this.GetComponent<RectTransform>();
Vector2 anchoredPos = rectTransform.anchoredPosition;
anchoredPos.x -= 10;
rectTransform.anchoredPosition = anchoredPos;
Just keep in mind that this 10 is not in your 3D world-space units (like meters or centimeters).
Try these things, because I did not understand what you were trying to do:
Try using transform.deltaposition
Go to the Canvas and set it to scale with screen size; then you can use transform.position = new Vector3(transform.position.x - 10, transform.position.y, transform.position.z)
And if this doesn't work:
transform.Translate(new Vector3(transform.deltaposition.x - 10, transform.deltaposition.y, transform.deltaposition.z));
I have a better idea. Instead of changing the positions of the buttons, why not change the values that the buttons represent?
Moving Buttons
Remember that the buttons are GameObjects, and every gameobject has a position vector in its transform. So if your button is named ButtonA, then in your code you want to get a reference to that.
GameObject buttonA = GameObject.Find("ButtonA");
//or you can assign the game object from the inspector
Once you have a reference to the button, you can proceed in moving it. So let's imagine that we want to move ButtonA 10 units left.
Vector3 pos = buttonA.transform.position;
pos.x -= 10f;
buttonA.transform.position = pos;

Make rigid body bounce off the screen edge

I'm currently working on an object that involves bubble-like movement. This object has a rigidbody and a sphere collider.
I use the AddForce method every 1-3 seconds to make it move continuously and slowly.
Now I'd like to know how to make the rigidbody move in the opposite direction (a.k.a. bounce off) when it reaches the screen edge. I have already computed the screen edges using the ViewportToWorldPoint method.
One ineffective solution that I thought of is to put empty game objects with collider at the edges but that won't work in my case since I'm building for mobile devices which have different screen resolutions/sizes.
Anyone know a good solution for this?
I'm not sure I got the idea, but I think I had the same problem when I was writing an old mobile game.
I had the same idea you did, using empty game objects with box colliders on the edges, but then I thought this isn't responsive, so I wrote this code:
public class Walls_c : MonoBehaviour {

    public Transform righttop;
    public Transform rightbottom;
    public Transform lefttop;
    public Transform leftbottom;

    // Use this for initialization
    void Start () {
        righttop.transform.position = Camera.main.ViewportToWorldPoint(new Vector3(1, 1, 0));
        rightbottom.transform.position = Camera.main.ViewportToWorldPoint(new Vector3(1, 0, 0));
        lefttop.transform.position = Camera.main.ViewportToWorldPoint(new Vector3(0, 1, 0));
        leftbottom.transform.position = Camera.main.ViewportToWorldPoint(new Vector3(0, 0, 0));
    }
}
With this, I always get the corners of the screen. It's not fancy, but it works.
Let me know if it works for you.
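If you would rather avoid collider walls entirely, an alternative sketch (assuming this object's Rigidbody is cached in an `rb` field) is to reflect the velocity manually whenever the object leaves the viewport:

```csharp
// Sketch: bounce off the screen edges without collider walls.
// `rb` is assumed to be this object's Rigidbody, cached in Start().
void FixedUpdate()
{
    // Viewport coordinates run from (0,0) bottom-left to (1,1) top-right.
    Vector3 vp = Camera.main.WorldToViewportPoint(transform.position);
    Vector3 v = rb.velocity;
    // Only flip when moving further out, to avoid jittering at the edge.
    if ((vp.x < 0f && v.x < 0f) || (vp.x > 1f && v.x > 0f))
        v.x = -v.x;
    if ((vp.y < 0f && v.y < 0f) || (vp.y > 1f && v.y > 0f))
        v.y = -v.y;
    rb.velocity = v;
}
```

This stays resolution-independent for the same reason as the corner-transform approach: the viewport test adapts to whatever the camera currently shows.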
In order to get a reflection effect, you need a physics material. Attach a bouncy material to the edge gameObjects and let the physics engine calculate the collision response.
Create a Physics 2D material and set its bounciness to some appropriate value, say 0.2.
Regarding your issue:
One ineffective solution that I thought of is to put empty game
objects with collider at the edges but that won't work in my case
since I'm building for mobile devices which have different screen
resolutions/sizes.
If you are working with UI components then dealing with boundaries should not be a problem (since they anchor across different resolutions). If that is not the case, you can create a CameraResizer script and an Anchor enum (Left, Right, Top, Bottom), and using the same ViewportToWorldPoint approach you can align your empty boundary gameObjects to any screen size.
Hope it helps!

Why is the ParticleSystem position wrong while attached to a GameObject?

I'm working with Unity 5 (2D project).
I attached a ParticleSystem to a GameObject (a UI Button). Now I can see both of them, but they are not in the same position.
I moved the ParticleSystem manually and put both objects in the same position, but when I move my object they do not move together: the ParticleSystem moves lower and the objects end up in different positions.
Thank you to #Code Clown and #Noel Widmer.
As my friends said, I first have to transform the GUI's screen position to a world position, and then to the particle system's local position, on each update.
I did this with these two lines of code:
Vector3 p = Camera.main.ScreenToWorldPoint(GameObject.Find("MyObject").transform.position);
transform.localPosition = p;