(Unity + 2D) Change UI Button position issue

The last few days I've been stuck with a headache of a problem in Unity.
Okay, I won't go into details about my game, but I've made a super-simple example that represents my issue.
I have a 2D scene with these components:
When the scene loads and I tap the button, this script executes:
Vector3 pos = transform.position;
pos.x -= 10;
transform.position = pos;
I have also tried this code:
transform.position = Camera.main.WorldToScreenPoint(new Vector3(0, 0, 0));
The problem is that when I click the button, the object's x position is set to -1536, which is not what I expected. The picture shows the scene after the button has been clicked. Notice the Rect Transform values:
So I did a little googling and found out about ScreenToWorldPoint, WorldToScreenPoint, etc., but none of these conversions solves my problem.
I'm pretty sure I'm missing something here, which is probably right in front of me, but I simply can't figure out what.
I hope someone can point me in the right direction.
Best regards.

The issue is that you are using transform.position instead of rectTransform.anchoredPosition.
While it's true that UI elements are still GameObjects and do have the normal Transform component accessible in script, what you see in the editor is a representation of the RectTransform that is built on top. This is a special inspector window for UI elements, which use the anchored positioning system so you can specify how the edges line up with their parent elements.
When you set a GameObject's transform.position, you are providing a world space position specified in 3D scene units (meters by default). This is different from a local position relative to the canvas or parent UI element, specified in reference pixels (the reference pixel size is determined by the canvas "Reference Resolution" field).
A potential issue with your use of Camera.WorldToScreenPoint is that the function returns a position specified in pixels, whereas, as mentioned before, transform.position is specified in scene units (i.e. meters by default) and is not relative to the parent UI element. The inspector, though, knows it's a UI element, so instead of showing you that value, it shows you the world position translated to the UI's local coordinates.
In other words, when you set the position to zero, you are getting the indices of whatever pixels happen to be over the scene's zero point in your main camera's view, converting those pixel numbers to meters, and moving the element there. The editor is showing you a position in reference pixels (which are not necessarily screen pixels; check your canvas settings) relative to the object's parent UI element. To verify, try tilting your camera a bit and note that the displayed value will change.
So again you would need to use rectTransform.anchoredPosition, and you would further need to ensure that the canvas resolution is the same as your screen resolution (or do the math to account for the difference). The way the object is anchored will also matter for what the rectTransform values refer to.
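For illustration, a minimal sketch of that conversion (assumptions: the script sits on the button, the canvas is Screen Space - Overlay, and the button's anchors sit at its parent's pivot; worldTarget is a hypothetical world-space point):
RectTransform rt = GetComponent<RectTransform>();
RectTransform parentRt = rt.parent as RectTransform;
// World point -> screen pixels -> the parent's local (reference-pixel) space.
Vector2 screenPoint = Camera.main.WorldToScreenPoint(worldTarget);
Vector2 localPoint;
if (RectTransformUtility.ScreenPointToLocalPointInRectangle(parentRt, screenPoint, null, out localPoint))
{
    rt.anchoredPosition = localPoint; // reference pixels, relative to the anchors
}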

Try using transform.localPosition = new Vector3(0, 0, 0); as your button is a child of multiple game objects. You could also try using transform.TransformPoint, which converts a local position to a world position.
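A minimal sketch of the TransformPoint idea (assuming the script sits on the button and the parent is the element you want to position relative to):
// Convert a point given in the parent's local space into world space, then assign it.
Vector3 worldPos = transform.parent.TransformPoint(Vector3.zero);
transform.position = worldPos; // same effect as transform.localPosition = Vector3.zero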

The issue is that your button is inside another object. You want to be changing the local position: transform.localPosition -= new Vector3(10, 0, 0);

As Joseph has clearly explained, you have to make changes to the RectTransform of your UI components instead of changing the Transform.
To achieve what you want, do it like this:
RectTransform rectTransform = this.GetComponent<RectTransform>();
Vector2 anchoredPos = rectTransform.anchoredPosition;
anchoredPos.x -= 10;
rectTransform.anchoredPosition = anchoredPos;
Just keep in mind that this 10 is not in 3D world-space units (like meters or centimeters).

Try these things (I did not fully understand what you were trying to do):
Try using transform.deltaposition.
Go to the Canvas and set it to Scale With Screen Size; then you can use transform.position = new Vector3(transform.position.x - 10, transform.position.y, transform.position.z);
And if that doesn't work:
transform.Translate(new Vector3(transform.deltaposition.x - 10, transform.deltaposition.y, transform.deltaposition.z));

I have a better idea. Instead of changing the positions of the buttons, why not change the values that the buttons represent?
Moving Buttons
Remember that the buttons are GameObjects, and every GameObject has a position vector in its transform. So if your button is named ButtonA, then in your code you want to get a reference to it.
GameObject buttonA = GameObject.Find("ButtonA");
//or you can assign the game object from the inspector
Once you have a reference to the button, you can proceed in moving it. So let's imagine that we want to move ButtonA 10 units left.
Vector3 pos = buttonA.transform.position;
pos.x -= 10f;
buttonA.transform.position = pos;

Related

Unity Face Object With Off-center Pivot Point?

I'm trying to create a "billboard" effect where a quad/gameObject always faces a target object (the camera) around the Y axis. This is working fine as per the code below, but I want to add an optional offset to the pivot point on the X axis.
So rather than rotating around the center point (default behaviour), I want the quad to rotate around a new point that's n units off the center point, while still facing the target object.
This will run in Update().
Currently Working Code Without Offset
transform.LookAt(camera.transform, Vector3.up);
transform.localEulerAngles = new Vector3(0, transform.localEulerAngles.y, 0); // Only affects Y Axis.
The offset I need is calculated by the function below. I have verified it is correct by moving a child GameObject by that value.
Both the leftObj and rightObj are children of the GameObject I want to be rotating.
public float GetCenterPos()
{
    Vector3 left = leftObj.transform.localPosition;
    Vector3 right = rightObj.transform.localPosition;
    Vector3 center = (left + right) / 2f;
    return center.x;
}
Top Down View of My Problem
I have tried combinations of RotateAround, but I can't figure out how to get it to face the correct object and what the pivot should be relative to the offset.
I have also googled around, and I can't find a solution to this problem, which I feel should be relatively simple.
To recap: I don't need a rotational offset, and I don't want to add an extra parent to change the pivot like many other answers suggest. The offset gets calculated dynamically in Update.
Thank you for any help.
I've been tinkering with it for a while, and I came up with this solution. It's not ideal because it requires storing a reference to the starting position, which breaks some of the other movement functionality I need, but it does answer my original question.
Before running the code above (either in Start or before a bool flag is set, whatever), store the object's localPosition (startPos).
Then, before calling LookAt, adjust the position to take the offset into account.
transform.localPosition = new Vector3(startPos.x + offset, transform.localPosition.y, transform.localPosition.z);
transform.LookAt(camController.transform, Vector3.up);
transform.localEulerAngles = new Vector3(0, transform.localEulerAngles.y, 0);
To clarify, the reason I need a reference to startPos is that otherwise I would be adding the offset every frame, causing the object to drift constantly instead of using a consistent value.
I just set the startPos before and after toggling the "billboard" functionality to keep it updated. Not ideal, but it does work.
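Putting it together, a minimal sketch of that workaround (camController and startPos are field names assumed here; GetCenterPos is the function from the question):
private Vector3 startPos;        // cached localPosition, captured before billboarding is toggled on
public Transform camController;  // the target to face
void Update()
{
    float offset = GetCenterPos();
    // Re-apply the offset from the cached start position so it does not accumulate.
    transform.localPosition = new Vector3(startPos.x + offset, transform.localPosition.y, transform.localPosition.z);
    transform.LookAt(camController, Vector3.up);
    transform.localEulerAngles = new Vector3(0f, transform.localEulerAngles.y, 0f); // only rotate around Y
}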

Placing objects right in front of camera

I am trying to figure out how to modify HelloARController.cs from the example ARCore scene to place objects directly in front of the camera. My thinking is that we are raycasting from the camera to a Vector3 on an anchor or tracked plane, so can't we get the Vector3 of the start of that ray and place an object at or near that point?
I have tried a lot, and although I am somewhat of a beginner, I have come up with this:
From my understanding, ScreenToWorldPoint should output a vector3 of the screen position corresponding to the world, but it is not working correctly. I have tried other options instead of ScreenToWorldPoint, but nothing has presented the desired effect. Does anyone have any tips?
To place the object right at the middle of the camera's view, you would have to change the target gameObject's transform.position (as AlmightyR has said).
The resulting code would look something like this:
GameObject camera;
GameObject obj; // "object" is a reserved keyword in C#, so use a different name
float distance = 1;
obj.transform.position = camera.transform.position + camera.transform.forward * distance;
Since the camera's forward vector (its Z axis) always points in the direction the camera is looking, you take that direction and multiply it by the distance at which you want your object to be placed. If you want your object to always stay at that position no matter how the camera moves, you can make it a child of the camera's transform.
obj.transform.SetParent(camera.transform);
obj.transform.localPosition = Vector3.forward * distance;
Arman's suggestion works. Also giving credit to AlmightyR since they got me started in the right direction. Here's what I have now:
// Set a position in front of the camera
float distance = 1;
Vector3 cameraPoint = m_firstPersonCamera.transform.position + m_firstPersonCamera.transform.forward * distance;
// Instantiate an Andy Android object as a child of the anchor; its transform will now benefit
// from the anchor's tracking.
var andyObject = Instantiate(m_andyAndroidPrefab, cameraPoint, Quaternion.identity, anchor.transform);
The only problem with this is that because of the existing HelloAR example code, an object is only placed if you click on a point in the point cloud in my case (or a point on a plane by default). I would like it to behave so that you click anywhere on screen, and it places an object anchored to a nearby point in the point cloud, not necessarily one that you clicked. Any thoughts for how to do that?
Side tip for those who don't know: If you want to place something anchored to a point in the cloud, instead of on a plane, change
TrackableHitFlag raycastFilter = TrackableHitFlag.PlaneWithinBounds | TrackableHitFlag.PlaneWithinPolygon;
to
TrackableHitFlag raycastFilter = TrackableHitFlag.PointCloud;

Unity3d - Need to hide a group of objects in the area

I've already tried depth-mask shaders and examined some other ideas, but none of them seem to suit my case.
I'm making an AR game and I have a scene with a house and trees. All these objects are animated and do something like falling from the sky, not all at once but in sequence. For example, the house first, then the trees, then the fence, etc.
(Please look at my picture for details) http://f2.s.qip.ru/bVqSAgcy.png
If the user moves the camera too far, they will see all these objects stuck in the air, waiting for their turn to start falling, and that is not good. I want to hide this area from all sides (because in AR the camera can move around freely) and make each part visible only when it starts moving (falling down).
(One more screen) http://f3.s.qip.ru/bVqSAgcz.png
I thought about animation events, but there are too many objects (bricks, for example) and I can't handle all of them manually.
I look forward to your great advice ;)
P.S. Sorry for my bad English.
You can disable the mesh renderers of the objects that are going to fall and re-enable them when they are ready to fall.
See the MeshRenderer documentation for more details.
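A minimal sketch of that idea (piece and the timing of the calls are assumptions):
MeshRenderer pieceRenderer = piece.GetComponent<MeshRenderer>();
pieceRenderer.enabled = false; // hidden while the piece waits in the air
// ...later, right before this piece's fall animation starts:
pieceRenderer.enabled = true;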
Deactivate your object. You can use the camera viewport coordinates to get a y position outside the viewport. They start at the bottom left of the screen (0,0) and go to the top right of the screen (1,1). Convert them to world-space coordinates with Camera.ViewportToWorldPoint:
Vector3 outsideCamera = Camera.main.ViewportToWorldPoint(new Vector3(0.5f, 1.2f, 10.0f));
Now you can use the intended x and z positions of your object. Activate it when you want to drop it.
myObject.transform.position = new Vector3(myObject.transform.position.x, outsideCamera.y, myObject.transform.position.z);
Another thing you could additionally do is scale the object from very small up to its intended size while it is falling. This would prevent the object from being visible before falling when the user points the camera upwards.
1- Maybe you can use the Camera far clipping plane property.
Or you could even use two cameras: one that displays, say, the landscape (and does not render the house, trees, etc.) with a "big" far clipping plane, and a second one with Depth Only clear flags that renders only the items (this one can have a smaller far clipping plane, from what I understand).
2- Another suggestion I'd give you is adding the scale to your animation (see the sketch after this list):
set the scale to 0 at the beginning of the animation
wait until the item needs to fall down
set the scale to 1 (with a transition if needed)
make the item fall down
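A minimal sketch of suggestion 2, assuming a per-item script and a simple timed sequence (the class and field names are assumptions):
using System.Collections;
using UnityEngine;
public class DropInSequence : MonoBehaviour
{
    public float delayBeforeFall = 2f;  // how long this piece waits for its turn
    public float scaleDuration = 0.25f; // length of the 0 -> 1 scale transition
    IEnumerator Start()
    {
        transform.localScale = Vector3.zero; // invisible while waiting
        yield return new WaitForSeconds(delayBeforeFall);
        float t = 0f;
        while (t < scaleDuration) // scale back up to full size
        {
            t += Time.deltaTime;
            transform.localScale = Vector3.one * Mathf.Clamp01(t / scaleDuration);
            yield return null;
        }
        // ...trigger the falling animation here.
    }
}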
EDIT: the workaround you found is quite fine too! But tracking only the world position should be enough, I think (saving a tiny amount of memory).
Hope this helps,
Finally, the solution I chose. I've added this script to each object in the composition. It stores the object's position (in my case both world and local) in Start() and listens for changes in Update(). When the position changes, it stops monitoring and enables the MeshRenderer.
[RequireComponent(typeof(MeshRenderer))]
public class RenderScript : MonoBehaviour
{
    private MeshRenderer mr;
    private bool monitoring = true;
    private Vector3 posLocal;
    private Vector3 posWorld;

    // Use this for initialization
    void Start()
    {
        mr = GetComponent<MeshRenderer>();
        mr.enabled = false;
        posLocal = transform.localPosition;
        posWorld = transform.position;
    }

    // Update is called once per frame
    void Update()
    {
        if (monitoring)
        {
            if (transform.localPosition != posLocal || transform.position != posWorld)
            {
                monitoring = false;
                mr.enabled = true;
            }
        }
    }
}
Even my funny cheap Chinese smartphone is alive after this, so I guess it's OK.

Unity2D - uniform positioning of transforms with different sizes

I have a rectangle (sprite) and I need to place different game objects (sprites) inside that rectangle so that they are all "aligned" by their bottoms.
For the life of me, I cannot make it work in Unity.
Say that my box has a height of 5.
I want to place the different-sized objects so they are all "resting" at y = 2.5 inside the box.
Does anyone know how I can do that since transform.position measures from the center of the GameObject?
Thanks!
Don't use transform.position; use RectTransform properties, as they take anchor points into account. In particular, you need to set the anchor position for the sprite in the prefab / inspector and then use RectTransform.anchoredPosition to position it.
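As a rough sketch of that approach (assuming the items are UI elements under a common parent rect; item and xOffset are hypothetical):
RectTransform rt = item.GetComponent<RectTransform>();
rt.anchorMin = new Vector2(0.5f, 0f);           // anchor to the parent's bottom edge
rt.anchorMax = new Vector2(0.5f, 0f);
rt.pivot = new Vector2(0.5f, 0f);               // measure the item from its own bottom
rt.anchoredPosition = new Vector2(xOffset, 0f); // bottoms of all items line up at y = 0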

Sprite disappearing to the back of another sprite on moving to defined point via transform.position

I'm having difficulty with Unity2D's sprite rendering.
Currently I have a sprite for a gameBoard, an empty GameObject holding the spawnPoint, a random sprite marking it, as well as a playerSprite to be instantiated as a prefab. If I just use the hierarchy in Unity, the playerSprite shows perfectly above the gameBoard, and "hard-coding" its position will always keep it above the gameBoard sprite, visible to the eye.
The problem comes when I want to instantiate the gameBoard and dynamically add the playerPrefabs to the game.
Here is the current code snippet I am currently using:
gameBoard.SetActive(true); // gameBoard is a public GameObject field, assigned in the Unity inspector to the gameBoard sprite.
Player.playerSprite = (GameObject)Instantiate(Resources.Load("playerSprite"));
Player.playerSprite.transform.localPosition = spawnPoint.transform.localPosition;
The result is that the spritePrefab spawns at the place I want perfectly, but behind the gameBoard sprite, making it hidden when the game runs.
The result is the same when using transform.position instead of transform.localPosition.
How should I code the transform part so that my playerSprite is still visible? Thanks.
It's most likely not an issue with the position, but rather the Sorting Order of your Sprite Renderers.
The default values for any SpriteRenderer are Layer = Default and Sorting Order = 0.
Sprite Renderers with a higher sorting order are rendered on top of those with a lower value.
Add the following lines to the end of your code, and try it out.
gameBoard.GetComponent<SpriteRenderer>().sortingOrder = 0;
Player.playerSprite.GetComponent<SpriteRenderer>().sortingOrder = 1; // a higher value renders on top of the board
Obviously, you could do the same thing in the inspector as well.