Make Sprite Draggable When Touched

Is there a way to make a sprite draggable, but only when the sprite itself is touched? Currently my game, which uses AndEngine, is set up so that the sprite follows your finger every time the scene is touched. If you touch the opposite side of the scene, the sprite gets "teleported" there, which is what I don't want.
I tried overriding the onAreaTouched method of the sprite and made it set its coordinates to wherever your finger currently is, but this doesn't work very well: if you make sudden movements, the draggability wears off.
Is there any simple way to accomplish this?

Answering my own question... I used this code and it worked perfectly:
draggableSprite = new Sprite(CAM_WIDTH / 2, CAM_HEIGHT / 2,
        spriteTextureRegion, mVertexBufferObjectManager) {
    @Override
    public boolean onAreaTouched(TouchEvent pSceneTouchEvent,
            float pTouchAreaLocalX, float pTouchAreaLocalY) {
        // The sprite counts as "grabbed" only while the touch is moving over it.
        if (pSceneTouchEvent.isActionMove()) {
            spriteIsTouched = true;
        } else {
            spriteIsTouched = false;
        }
        return super.onAreaTouched(pSceneTouchEvent, pTouchAreaLocalX, pTouchAreaLocalY);
    }
};
scene.attachChild(draggableSprite);
scene.registerTouchArea(draggableSprite);

scene.setOnSceneTouchListener(new IOnSceneTouchListener() {
    @Override
    public boolean onSceneTouchEvent(Scene pScene, TouchEvent pSceneTouchEvent) {
        if (spriteIsTouched) {
            // Set the position of the sprite, offset so its center is at your finger.
            draggableSprite.setPosition(
                    pSceneTouchEvent.getX() - (draggableSprite.getWidth() / 2),
                    pSceneTouchEvent.getY() - (draggableSprite.getHeight() / 2));
        }
        return false;
    }
});

Related

Canvas rotating when I change it to worldSpace, Unity

I've got a canvas that changes from Overlay to worldSpace when an event occurs, but when it changes from worldSpace back to Overlay, the rotation of the canvas is changed, which I don't want.
Images: (screenshots not included)
As you can see from the order of the images, it starts with no rotation in worldSpace; then the event occurs, changes it to Overlay, and the rotation is now 140; after the event, when I'm back in worldSpace, it is still 140 degrees.
I don't know what is wrong with this. Please help if you can.
This happens because, internally, the Canvas is transformed into camera space when it is set to Overlay; that is how it renders. The solution is to cache the transform before changing the render mode and restore it afterwards.
Something like this, given a component:
using UnityEngine;

[RequireComponent(typeof(Canvas))]
public class CanvasRenderModeSwitcher : MonoBehaviour
{
    private Canvas canvas;
    private Vector3 position;
    private Vector3 scale;
    private Quaternion rotation;

    private void OnEnable()
    {
        canvas = GetComponent<Canvas>();
    }

    public void SetRenderMode(RenderMode renderMode)
    {
        if (renderMode == RenderMode.WorldSpace)
        {
            // Set the render mode before the values are reset.
            canvas.renderMode = renderMode;

            // Restore the cached values.
            transform.position = position;
            transform.rotation = rotation;
            transform.localScale = scale;
        }
        else
        {
            // Cache the values.
            position = transform.position;
            rotation = transform.rotation;
            scale = transform.localScale;

            // Set the render mode after the values are cached.
            canvas.renderMode = renderMode;
        }
    }
}
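For example, whatever currently sets canvas.renderMode directly when the event fires could go through this component instead. The handler and field names below are only placeholders:
using UnityEngine;

// Hypothetical caller: switch the canvas when some game event starts and ends.
public class ExampleEventHandler : MonoBehaviour
{
    [SerializeField] private CanvasRenderModeSwitcher switcher;

    public void OnEventStarted()
    {
        switcher.SetRenderMode(RenderMode.ScreenSpaceOverlay); // transform is cached here
    }

    public void OnEventEnded()
    {
        switcher.SetRenderMode(RenderMode.WorldSpace); // cached transform is restored
    }
}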

How can I stop getting mouse position of UI Elements in Unity2D?

I'm making a mobile game where there is a simple circle with a handle attached to it; the handle can be controlled and rotated with the mouse position. I'm using the code below for the handle control.
using UnityEngine;

public class Controller : MonoBehaviour
{
    private float m_force = 0.04f; // amount of force for the player (circle and handle both)

    public GameObject firePoint;   // the tip of the handle which will shoot bullets
    public float senstivity = 1f;  // sensitivity control

    public void ControlSenstivity(float index)
    {
        senstivity = index;
    }

    void Update()
    {
        // Get the screen position of the object
        Vector2 positionOnScreen = Camera.main.WorldToViewportPoint(transform.position);

        // Get the screen position of the mouse
        Vector2 mouseOnScreen = Camera.main.ScreenToViewportPoint(Input.mousePosition);

        // Get the angle between the two points
        float angle = AngleBetweenTwoPoints(positionOnScreen, mouseOnScreen);

        // This controls the rotation of the player (circle and handle both)
        transform.rotation = Quaternion.Euler(new Vector3(0f, 0f, angle) * senstivity);
    }

    float AngleBetweenTwoPoints(Vector3 a, Vector3 b)
    {
        return Mathf.Atan2(a.y - b.y, a.x - b.x) * Mathf.Rad2Deg;
    }

    // Adds force to the player (both the circle and the handle); called from the
    // on-screen button for mobile input whenever the button is pressed.
    public void AddForce()
    {
        transform.Translate(-firePoint.transform.localPosition.x, 0, 0 * m_force);
    }
}
In this mobile game the user can rotate the player by dragging on the screen, and can move the player by pressing the Add Force button, which pushes it in the opposite direction of the fire point. The problem is that whenever I press the button, the player takes the Input.mousePosition of the button's position and rotates towards the button, so it always moves in one direction, away from the button. What can I do so that the Input.mousePosition over a button on the canvas is ignored and only positions in camera space are used? Any response will be appreciated, thanks in advance!
You can use EventSystem.IsPointerOverGameObject. This method checks whether your pointer (mouse/touch) is over any UI element.
Add an if condition before getting the mouse position (this requires using UnityEngine.EventSystems;):
// This will ignore all UI game objects
if (!EventSystem.current.IsPointerOverGameObject())
{
    // Get the screen position of the object
    Vector2 positionOnScreen = Camera.main.WorldToViewportPoint(transform.position);
}
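Applied to the Controller class above, that means skipping the whole rotation step while the pointer is over the UI, so the Add Force button no longer pulls the rotation towards itself. A minimal sketch of the adjusted Update, assuming the rest of the class stays as posted and using UnityEngine.EventSystems; is added at the top:
// Only Update changes; the rest of the Controller class stays the same.
void Update()
{
    // Skip rotation entirely while the pointer is over any UI element,
    // e.g. the Add Force button on the canvas.
    if (EventSystem.current.IsPointerOverGameObject())
        return;

    // Get the screen positions of the object and the mouse.
    Vector2 positionOnScreen = Camera.main.WorldToViewportPoint(transform.position);
    Vector2 mouseOnScreen = Camera.main.ScreenToViewportPoint(Input.mousePosition);

    // Rotate the player towards the mouse as before.
    float angle = AngleBetweenTwoPoints(positionOnScreen, mouseOnScreen);
    transform.rotation = Quaternion.Euler(new Vector3(0f, 0f, angle) * senstivity);
}
On touch devices you may also want the overload that takes a pointer id, EventSystem.current.IsPointerOverGameObject(touch.fingerId), so the check applies to the actual touch rather than the mouse.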

How to make a Player Model with a Nav Mesh Agent Jump

I have a Player model with a Nav Mesh Agent that walks around the world when I click, but I'm trying to make it jump and haven't managed to achieve it.
This is my code; I don't know what to add or remove:
using UnityEngine;
using UnityEngine.AI;

public class WorldInteraction : MonoBehaviour {

    NavMeshAgent playerAgent;

    // Use this for initialization
    void Start () {
        playerAgent = GetComponent<NavMeshAgent> (); // grab the NavMeshAgent on the player
    }

    // Update is called once per frame
    void Update () {
        // Left click, and the pointer is not over a UI element
        if (Input.GetMouseButtonDown (0) && !UnityEngine.EventSystems.EventSystem.current.IsPointerOverGameObject ())
        {
            GetInteraction (); // call the interaction method
        }
        // Right click: attempt to jump
        if (Input.GetMouseButtonDown (1) && !UnityEngine.EventSystems.EventSystem.current.IsPointerOverGameObject ())
        {
            transform.Translate (Vector3.up);
        }
    }

    // Gets the point clicked in the world and moves the player to that point
    void GetInteraction () {
        Ray interactionRay = Camera.main.ScreenPointToRay (Input.mousePosition); // ray through the clicked screen point
        RaycastHit interactionInfo; // keeps track of the point clicked
        if (Physics.Raycast (interactionRay, out interactionInfo, Mathf.Infinity)) // store the hit in interactionInfo
        {
            GameObject interactedObject = interactionInfo.collider.gameObject;
            if (interactedObject.tag == "Interactable Item") // check if the clicked object is interactable (can't be moved over)
            {
                // Move playerAgent to the interactable item so they can interact
                // (calls the MoveToInteraction method in the Interactable class).
                interactedObject.GetComponent<Interactable> ().MoveToInteraction (playerAgent);
            } else {
                playerAgent.stoppingDistance = 0;
                playerAgent.destination = interactionInfo.point; // if it's a walkable point, set the destination to it
            }
        }
    }
}
The NavMeshAgent controls the object in all directions, so it will simply override your attempt to jump. Make the object with the NavMeshAgent a child of an empty object and Translate the empty object upwards instead. Hopefully that helps.
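As a hedged sketch of that separation (not the answerer's exact setup): one common variant keeps the NavMeshAgent driving a root object and moves a visible model, parented under it, up and down for the jump, so the agent never overwrites the vertical offset. The field names and jump values below are placeholders.
using UnityEngine;
using UnityEngine.AI;

[RequireComponent(typeof(NavMeshAgent))]
public class AgentJump : MonoBehaviour
{
    public Transform model;          // visible player model, a child of this object (hypothetical setup)
    public float jumpHeight = 1.5f;  // placeholder value
    public float jumpDuration = 0.6f;

    private float jumpTimer = -1f;   // negative while not jumping

    void Update()
    {
        // Start a jump on right click, matching the question's input.
        if (Input.GetMouseButtonDown(1) && jumpTimer < 0f)
            jumpTimer = 0f;

        if (jumpTimer >= 0f)
        {
            jumpTimer += Time.deltaTime;
            float t = Mathf.Clamp01(jumpTimer / jumpDuration);

            // Simple parabolic arc: zero at the start and end, peak in the middle.
            float height = 4f * jumpHeight * t * (1f - t);
            model.localPosition = new Vector3(0f, height, 0f);

            if (t >= 1f)
                jumpTimer = -1f; // jump finished, the model is back at local height 0
        }
    }
}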

Find all objects between player and camera

I am using an orthographic camera that follows the player. I would like to find all GameObjects between the player and the camera so I can change their opacity and make them partially transparent while they block the camera's view. I read about raycasting, but it seems it would only give the first object between the player and the camera. What approaches are there to accomplish this?
Just use Physics.RaycastAll like this:
using UnityEngine;

public class CameraScript : MonoBehaviour
{
    // Attach this script to the camera.
    public GameObject player;

    void Update()
    {
        // Cast from the camera towards the player, only as far as the player.
        Vector3 toPlayer = player.transform.position - transform.position;
        float dist = toPlayer.magnitude;
        RaycastHit[] hits = Physics.RaycastAll(transform.position, toPlayer.normalized, dist);
        foreach (RaycastHit h in hits)
        {
            // Change the opacity of each hit object to semitransparent.
        }
    }
}
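The loop body is left empty in the answer; one hedged way to fill it in, assuming the blocking objects have a Renderer on the same GameObject as their collider and use a material whose shader supports transparency (e.g. Standard in Fade mode), is to lower the alpha of each hit renderer's material:
foreach (RaycastHit h in hits)
{
    Renderer r = h.collider.GetComponent<Renderer>();
    if (r != null)
    {
        // Lower the alpha; this only shows up if the material's shader renders transparency.
        Color c = r.material.color;
        c.a = 0.3f; // placeholder alpha value
        r.material.color = c;
    }
}
You would also need to remember which renderers were faded last frame so their alpha can be restored once they stop blocking the view.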

Make character able to jump only when touching the ground

I want to make a character jump only when his feet are on the ground. I don't want him to be able to 'air-jump', so I came up with this solution:
if (JumpButtonPressed()) {
    // IsTouchingLayers expects a layer mask, so LayerMask.GetMask is used here
    // (LayerMask.NameToLayer returns a layer index, not a mask).
    if (GetComponent<BoxCollider2D>().IsTouchingLayers(LayerMask.GetMask("Ground"))) {
        velocity.y = jumpForce;
    }
}
The idea is that it can jump only when in touch with the "Ground" layer. But this is what happens:
It doesn't only work on his feet. If he is touching a platform from the side, he can jump as well. What could I do?
Use a circle overlap check at a position placed at the bottom of the player (his feet) and detect the ground layer from there, as in the sketch below.
You may need to use a different layer for platforms if you want to add features like ledge grabs and jumps.
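A minimal sketch of that check, assuming an empty groundCheck child transform placed at the character's feet; the field names and radius are placeholders:
using UnityEngine;

public class GroundCheck : MonoBehaviour
{
    public Transform groundCheck;     // empty child placed at the character's feet (hypothetical)
    public float checkRadius = 0.1f;  // placeholder radius
    public LayerMask groundLayer;     // set to the "Ground" layer in the Inspector

    public bool IsGrounded()
    {
        // True only when the small circle at the feet overlaps the ground layer,
        // so touching a platform from the side no longer counts as grounded.
        return Physics2D.OverlapCircle(groundCheck.position, checkRadius, groundLayer) != null;
    }
}
The jump code would then call IsGrounded() instead of checking the BoxCollider2D with IsTouchingLayers.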
Create a bool isJumping and set it to true while in the air. When the character reaches the ground, set it back to false. Something like this, if your snippet is in Update():
// isJumping should be a class-level field, not declared inside Update(),
// otherwise it would be reset every frame.
bool isJumping = false;

if (JumpButtonPressed() && !isJumping) {
    if (GetComponent<BoxCollider2D>().IsTouchingLayers(LayerMask.GetMask("Ground"))) {
        velocity.y = jumpForce;
        isJumping = true;
    }
}
if (GetComponent<BoxCollider2D>().IsTouchingLayers(LayerMask.GetMask("Ground"))) {
    isJumping = false;
}