Why is the ParticleSystem position wrong while attached to a GameObject? - unity3d

I'm working with Unity 5 (2D project).
I attached a ParticleSystem to a GameObject (a UI Button). Now I can see both of them, but they are not in the same position.
I moved the ParticleSystem manually and put both objects in the same position, but when I move my object they do not move together... the ParticleSystem moves lower and the objects end up in different positions.

Thank you to #Code Clown and #Noel Widmer.
As my friends said, I first had to transform the GUI's screen position to a world position, and then to the particle system's local position, in each update.
I did this with these two lines of code:
Vector3 p = Camera.main.ScreenToWorldPoint(GameObject.Find("MyObject").transform.position);
transform.localPosition = p;
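In case it helps, here is a rough sketch of the same idea as a small component (the class and field names are illustrative, and it assumes a Screen Space - Overlay canvas, so the Button's transform.position is already in screen pixels):
using UnityEngine;

// Attach this to the ParticleSystem object and assign the Button's transform in the Inspector.
public class FollowUIButton : MonoBehaviour
{
    [SerializeField] Transform uiTarget;   // the UI Button to follow
    [SerializeField] float depth = 10f;    // distance from the camera at which to place the particles

    void Update()
    {
        // The button's position is in screen pixels on an overlay canvas; convert it to a
        // world-space point at an explicit depth in front of the camera.
        Vector3 screenPos = new Vector3(uiTarget.position.x, uiTarget.position.y, depth);
        transform.position = Camera.main.ScreenToWorldPoint(screenPos);
    }
}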

(Unity + 2D) Change UI Button position issue

The last few days I've been stuck with a headache of a problem in Unity.
Okay, I won't go into details with my game, but I've made a super-simple example which represents my issue.
I have a 2D scene with these components:
When the scene loads and I tap the button, this script executes:
Vector3 pos = transform.position;
pos.x -= 10;
transform.position = pos;
I have also tried this code:
transform.position = Camera.main.WorldToScreenPoint(new Vector3(0, 0, 0));
The problem is that when I click the button, the x position of the object is set to -1536, which is not what I expected. The screenshot shows the scene after the button has been clicked; notice the Rect Transform values:
So I did a little Googling and found out about ScreenToWorldPoint, WorldToScreenPoint, etc., but none of these conversions solves my problem.
I'm pretty sure I'm missing something here, which is probably right in front of me, but I simply can't figure out what.
I hope someone can point me in the right direction.
Best regards.
The issue is that you are using transform.position instead of rectTransform.anchoredPosition.
While it's true that UI elements are still GameObjects and do have the normal Transform component accessible in script, what you see in the editor is a representation of the RectTransform that is built on top. This is a special inspector window for UI elements, which use the anchored positioning system so you can specify how the edges line up with their parent elements.
When you set a GameObject's transform.position, you are providing a world space position specified in 3D scene units (meters by default). This is different from a local position relative to the canvas or parent UI element, specified in reference pixels (the reference pixel size is determined by the canvas "Reference Resolution" field).
A potential issue with your use of Camera.WorldToScreenPoint is that that function returns a position specified in pixels, whereas, as mentioned before, transform.position is specified in scene units (i.e. meters by default) and is not relative to the parent UI element. The inspector, though, knows it's a UI element, so instead of showing you that value, it shows you the world position translated to the UI's local coordinates.
In other words, when you set the position to zero, you are getting the indices of whatever pixels happen to be over the scene's zero point in your main camera's view, and converting those pixel numbers to meters, and moving the element there. The editor is showing you a position in reference pixels (which are not necessarily screen pixels, check your canvas setting) relative to the object's parent UI element. To verify, try tilting your camera a bit and note that the value displayed will be different.
So again you would need to use rectTransform.anchoredPosition, and you would further need to ensure that the canvas resolution is the same as your screen resolution (or do the math to account for the difference). The way the object is anchored will also matter for what the rectTransform values refer to.
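For example, the "do the math to account for the difference" part could look roughly like this (only a sketch; it assumes the script sits on the UI element and that GetComponentInParent finds the intended Canvas):
RectTransform rt = GetComponent<RectTransform>();
Canvas canvas = GetComponentInParent<Canvas>();
// canvas.scaleFactor converts canvas units to screen pixels, so divide to go the other way.
float pixelsToCanvasUnits = 1f / canvas.scaleFactor;
// Move the element 10 screen pixels to the left, expressed in canvas units.
rt.anchoredPosition += new Vector2(-10f * pixelsToCanvasUnits, 0f);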
Try using transform.localPosition = new Vector3(0, 0, 0); as your button is a child of multiple game objects. You could also try using transform.TransformPoint, which should simply convert a local position to a world position.
The issue is that your button is inside another object. You want to change the local position: transform.localPosition -= new Vector3(10, 0, 0);
As #Joseph has clearly explained, you have to make changes to the RectTransform of your UI components, instead of changing the Transform.
To achieve what you want, do it like this:
RectTransform rectTransform = this.GetComponent<RectTransform>();
Vector2 anchoredPos = rectTransform.anchoredPosition;
anchoredPos.x -= 10;
rectTransform.anchoredPosition = anchoredPos;
Just keep in mind that this 10 is not in your 3D world space units (like meters or centimeters).
Try these things, because I did not fully understand what you were trying to do.
Try using transform.deltaposition.
Go to the Canvas and set it to scale with screen size. Then you can use transform.position = new Vector3(transform.position.x - 10, transform.position.y, transform.position.z);
And if this doesn't work:
transform.Translate(new Vector3(transform.deltaposition.x - 10, transform.deltaposition.y, transform.deltaposition.z));
I have a better idea. Instead of changing the positions of the buttons, why not change the values that the buttons represent?
Moving Buttons
Remember that the buttons are GameObjects, and every GameObject has a position vector in its Transform. So if your button is named ButtonA, then in your code you want to get a reference to it.
GameObject buttonA = GameObject.Find("ButtonA");
//or you can assign the game object from the inspector
Once you have a reference to the button, you can proceed in moving it. So let's imagine that we want to move ButtonA 10 units left.
Vector3 pos = buttonA.transform.position;
pos.x -= 10f;
buttonA.transform.position = pos;

Unity Question: Transforming world coordinates to my Radar's Coordinates for 3D Radar

I'm learning Unity3D and having some trouble with my 3D radar. My radar is a child of my Player game object, which rotates around as I fly. The radar itself is rotated 45 degrees so it faces the user. There is a Cube that is supposed to be the radar blip of the enemy plane. The Cube is a child of Radar, so it should inherit its rotation. There is a script on the Cube that updates it every Update(). Here is the hierarchy:
Enemy Plane
Player
-- Camera
-- Radar
------ Cube (radar representation of Enemy Plane)
The problem is that while the Cube itself is rotated with the Radar, its motion is not. As I get closer to the enemy plane, the Cube just gets closer to the camera (which is good), but I would expect its motion to follow the 45-degree rotation of the parent Radar object.
public class RadarGlyph : MonoBehaviour
{
    GameObject radarSource;
    GameObject trackedObject;
    Vector3 radarScaler;

    void Start()
    {
        this.radarSource = GameObject.Find("Radar");
        this.trackedObject = GameObject.Find("Enemy Fighter");
        this.radarScaler = new Vector3(0.001f, 0.001f, 0.001f);
    }

    void Update()
    {
        Vector3 vDelta = this.trackedObject.transform.position - this.radarSource.transform.position;
        vDelta.Scale(this.radarScaler);
        this.transform.localPosition = this.transform.InverseTransformDirection(vDelta);
    }
}
For a complete solution, you have to get the position of the target with respect to the ship first, and then recreate it within the context of the blip and the radar.
As a quick fix, you can try changing your last line like this:
this.transform.localPosition = this.transform.parent.localRotation * this.transform.InverseTransformDirection(vDelta);
or (apparently not good as you mentioned)
this.transform.localPosition = Quaternion.Inverse(this.transform.parent.localRotation) * this.transform.InverseTransformDirection(vDelta);
one of these is bound to work. (The first one did)
Edit: here's a third alternative
this.transform.localPosition = this.transform.parent.parent.InverseTransformDirection(vDelta);
This one gets the position in Player's space and applies it in radar's space.
The first and third are trying to do the same thing. Since you were transforming the direction into the blip's coordinate frame, any rotations that its parents have are canceled out. Instead, the correct thing to do is to get the position relative to the Player first. Then apply it to the blip in the radar. The third line of code I have here is attempting to do that.
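Put together, the corrected Update() might look roughly like this (a sketch assuming the hierarchy from the question, where the Cube's parent is the Radar and its grandparent is the Player):
void Update()
{
    Vector3 vDelta = this.trackedObject.transform.position - this.radarSource.transform.position;
    vDelta.Scale(this.radarScaler);
    // Express the offset in the Player's frame (transform.parent.parent), then assign it as the
    // blip's localPosition so the Radar's own 45-degree tilt is applied on top of it.
    this.transform.localPosition = this.transform.parent.parent.InverseTransformDirection(vDelta);
}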

Placing objects right in front of camera

I am trying to figure out how to modify HelloARController.cs from the example ARCore scene to place objects directly in front of the camera. My thinking is that we are raycasting from the camera to a Vector3 on an anchor or tracked plane, so can't we get the Vector3 of the start of that ray and place an object at or near that point?
I have tried a lot, and although I am somewhat of a beginner, I have come up with this.
From my understanding, ScreenToWorldPoint should output a Vector3 of the screen position corresponding to the world, but it is not working correctly. I have tried other options instead of ScreenToWorldPoint, but nothing has produced the desired effect. Does anyone have any tips?
To place the object right at the middle of the camera's view, you would have to change the target gameObject's transform.position (as AlmightyR has said).
The resulting code would look something like this:
GameObject camera;      // the camera to place the object in front of
GameObject obj;         // the object to place ("object" itself is a C# keyword)
float distance = 1;     // how far in front of the camera, in world units
obj.transform.position = camera.transform.position + camera.transform.forward * distance;
Since the camera's forward component (Z axis) always points in the direction the camera is looking, you take that vector's direction and multiply it by the distance you want your object to be placed at. If you want your object to always stay at that position no matter how the camera moves, you can make it a child of the camera's transform.
obj.transform.SetParent(camera.transform);
obj.transform.localPosition = Vector3.forward * distance;
Arman's suggestion works. Also giving credit to AlmightyR since they got me started in the right direction. Here's what I have now:
// Set a position in front of the camera
float distance = 1;
Vector3 cameraPoint = m_firstPersonCamera.transform.position + m_firstPersonCamera.transform.forward * distance;
// Instantiate an Andy Android object as a child of the anchor; its transform will now benefit
// from the anchor's tracking.
var andyObject = Instantiate(m_andyAndroidPrefab, cameraPoint, Quaternion.identity, anchor.transform);
The only problem with this is that because of the existing HelloAR example code, an object is only placed if you click on a point in the point cloud in my case (or a point on a plane by default). I would like it to behave so that you click anywhere on screen, and it places an object anchored to a nearby point in the point cloud, not necessarily one that you clicked. Any thoughts for how to do that?
Side tip for those who don't know: If you want to place something anchored to a point in the cloud, instead of on a plane, change
TrackableHitFlag raycastFilter = TrackableHitFlag.PlaneWithinBounds | TrackableHitFlag.PlaneWithinPolygon;
to
TrackableHitFlag raycastFilter = TrackableHitFlag.PointCloud;

Sprite disappearing behind another sprite when moving to a defined point via transform.position

I am having difficulty with Unity 2D's sprite rendering.
Currently I have a sprite for a gameBoard, an empty GameObject holding the spawnPoint, a random sprite marking it, as well as a playerSprite to be instantiated as a prefab. If I just use the hierarchy in Unity, the playerSprite shows perfectly above the gameBoard, and "hard-coding" its position will always keep it above the gameBoard sprite, visible to the eye.
The problem comes when I want to instantiate the gameBoard and dynamically add the playerPrefabs into the game.
Here is the code snippet I am currently using:
gameBoard.SetActive(true); // gameBoard is a public GameObject assigned in Unity to the gameBoard sprite
Player.playerSprite = (GameObject)Instantiate(Resources.Load("playerSprite"));
Player.playerSprite.transform.localPosition = spawnPoint.transform.localPosition;
The result is that the spritePrefab spawns exactly at the place I want, but behind the gameBoard sprite, making it hidden when the game runs.
The result is the same when using transform.position instead of transform.localPosition.
How should I code the transform part so that my playerSprite is still visible? Thanks.
It's most likely not an issue with the position, but rather the Sorting Order of your Sprite Renderers.
The default values for any SpriteRenderer are Sorting Layer = Default and Order in Layer = 0.
Sprite Renderers with a higher sorting order are rendered on top of those with a lower value.
Add the following lines to the end of your code, and try it out.
gameBoard.GetComponent<SpriteRenderer>().sortingOrder = 0;
Player.playerSprite.GetComponent<SpriteRenderer>().sortingOrder = 1; // higher value, so it renders on top of the board
Obviously, you could do the same thing in the inspector as well.
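If you prefer named layers over raw order numbers, the same idea can be expressed with sorting layer names; note that the "Foreground" layer here is only an illustrative example and would have to be created under Tags and Layers first:
// Sketch only: assumes a "Foreground" sorting layer exists in the project settings.
gameBoard.GetComponent<SpriteRenderer>().sortingLayerName = "Default";
Player.playerSprite.GetComponent<SpriteRenderer>().sortingLayerName = "Foreground"; // drawn on top of "Default"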

Transform object with gravity

I am having a problem with gravity vs. transform position.
I have an object falling down, but I would like to change its position while it falls. However, as soon as I set the transform position in a script attached to the GameObject, gravity stops working.
Here is how I intend to make it move (pos.x/pos.y are variables) in C#:
pointer.transform.localPosition = new Vector3(-pos.x-280, pos.y-50, 252);
Where in the script do you place that line of code?
If you are placing it inside the Update() function, you would be setting the position to Vector3(-pos.x - 280, pos.y - 50, 252) every frame!
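If the goal is to nudge the object sideways while gravity keeps pulling it down, one option (not from the original answer, just a sketch that assumes the pointer has a Rigidbody) is to move it through the physics engine instead of writing to transform.localPosition every frame:
using UnityEngine;

// Illustrative component: steers only the horizontal axes and leaves Y to gravity.
public class PointerMover : MonoBehaviour
{
    public float targetX;         // whatever -pos.x - 280 evaluates to in the question's code
    public float targetZ = 252f;  // fixed depth, matching the question's example

    Rigidbody rb;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // Keep the rigidbody's current Y so the fall caused by gravity is not overridden.
        rb.MovePosition(new Vector3(targetX, rb.position.y, targetZ));
    }
}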