Drag sprite with mouse/touch on UE4 - unreal-engine4

I want to move the sprite to the position where the left mouse button is released. The problem is that the OnRelease event only fires when the mouse button is released while the cursor is over the sprite. I want to somehow fire an event when the left mouse button is released anywhere in the scene, after I have actually clicked on the sprite.

You should record the fact that you have actually clicked on the sprite, for example in a boolean variable in your player controller. Then handle the left-mouse-button release in the controller, which receives the input regardless of what is under the cursor: if the flag is set, move the sprite to the release position and clear the flag.

Related

Unity scene view bug with panning and rotating

I can't rotate the camera with the right mouse button, I can't pan with the middle mouse button, and the gizmo arrows on a selected object don't move it; only the inner circle does. Thanks in advance.

Passing a touch gesture

I'm looking for a solution to help "pass" a touch gesture along.
Basically I have a menu, and I want users to be able to drag and drop items from the menu to the canvas in a single continuous drag.
I have already achieved a draggable image via pan gestures (we'll call these instances Sprites). I can also instantiate a Sprite anywhere on the UIView using a button or UIImageView with touch gestures.
However, this currently requires two touches: one to touch down on the menu item button and release, which creates the Sprite; a second to touch down on the Sprite, drag it, and release it where they want it. I would like to merge these into a single touch, so that when a user touches a menu item the Sprite is instantiated and already captured by the pan gesture, or something to that effect.
I've attached a visual description if that helps. Any help is appreciated. Thanks!
There is no way of artificially forcing UIGestureRecognizer to recognize touches that are passed to a different view.
From the Gesture Recognizer docs:
Delivery of events initially follows the usual path: from operating system to the application object to the window object representing the window in which the touches are occurring. But before sending an event to the hit-tested view, the window object sends it to the gesture recognizer attached to that view or to any of that view's subviews. Figure 3-1 illustrates this general path, with the numbers indicating the order in which touches are received.
Event delivery happens automatically; the system delivers each event to the appropriate view.
To do what you want, I would attach the gesture recognizer to the subview that contains your UIButton, create an instance of your Sprite object when the press begins, and manipulate that object from within that subview's gesture recognizer. Alternatively, you could use -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event to reposition the object yourself.
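A minimal sketch of that first approach, assuming ARC and hypothetical names (menuView for the container holding the menu items, canvasView for the drop target, spriteImage for the Sprite's image); none of these names come from the question:

    #import <UIKit/UIKit.h>

    @interface MenuViewController : UIViewController
    @property (nonatomic, strong) UIView *menuView;          // holds the menu item buttons
    @property (nonatomic, strong) UIView *canvasView;        // drop target
    @property (nonatomic, strong) UIImage *spriteImage;      // image used for new Sprites
    @property (nonatomic, strong) UIImageView *draggedSprite;
    @end

    @implementation MenuViewController

    - (void)viewDidLoad
    {
        [super viewDidLoad];
        // One pan recognizer on the menu container drives the whole drag.
        UIPanGestureRecognizer *pan =
            [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(handleMenuPan:)];
        [self.menuView addGestureRecognizer:pan];
    }

    - (void)handleMenuPan:(UIPanGestureRecognizer *)recognizer
    {
        CGPoint location = [recognizer locationInView:self.view];

        switch (recognizer.state) {
            case UIGestureRecognizerStateBegan:
                // Create the Sprite as soon as the drag starts, so the same
                // continuous gesture keeps moving it.
                self.draggedSprite = [[UIImageView alloc] initWithImage:self.spriteImage];
                self.draggedSprite.center = location;
                [self.view addSubview:self.draggedSprite];
                break;

            case UIGestureRecognizerStateChanged:
                self.draggedSprite.center = location;
                break;

            case UIGestureRecognizerStateEnded:
            case UIGestureRecognizerStateCancelled:
                // Re-parent the Sprite onto the canvas where the finger lifted.
                self.draggedSprite.center = [self.canvasView convertPoint:location
                                                                 fromView:self.view];
                [self.canvasView addSubview:self.draggedSprite];
                self.draggedSprite = nil;
                break;

            default:
                break;
        }
    }

    @end

One thing to keep in mind: a pan recognizer only begins once the finger actually moves. If you need the Sprite to appear on touch-down, a UILongPressGestureRecognizer with minimumPressDuration set to 0 behaves similarly but begins immediately.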

Detect when finger is dragged inside object (iOS touch)

I know I can use UIControlEventTouchDragEnter to tell when I have touched a button, dragged my touch outside its bounds, and then re-entered those bounds. However, I was wondering: would it be possible to touch the screen somewhere other than the button and detect when I have dragged over/inside that button?
Also, could someone tell me the difference between UIControlEventTouchDragExit and UIControlEventTouchDragOutside?
Thanks!
You would have to observe touch events on the button's superview and, whenever the user's finger moves, call hitTest:withEvent: to check whether the touch coordinates lie on top of the button.
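For example, a rough sketch assuming the superview is a custom UIView subclass and targetButton is a hypothetical reference to the button being watched:

    #import <UIKit/UIKit.h>

    @interface TrackingView : UIView
    @property (nonatomic, strong) UIButton *targetButton;   // the button to detect drags over
    @property (nonatomic, assign) BOOL fingerIsOverButton;
    @end

    @implementation TrackingView

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        CGPoint location = [[touches anyObject] locationInView:self];
        // hitTest:withEvent: returns the deepest subview under the point.
        UIView *hit = [self hitTest:location withEvent:event];
        BOOL nowOverButton = [hit isDescendantOfView:self.targetButton];

        if (nowOverButton && !self.fingerIsOverButton) {
            // The finger has just been dragged over/inside the button.
            NSLog(@"Dragged inside the button");
        }
        self.fingerIsOverButton = nowOverButton;
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
    {
        self.fingerIsOverButton = NO;
    }

    @end

Note that this only sees touches that begin outside the button, which is exactly the case you describe; touches that begin on the button itself are delivered to the button, not to its superview.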
I believe the difference between UIControlEventTouchDragExit and UIControlEventTouchDragOutside is this: when the finger moves from inside the control to the outside, UIControlEventTouchDragExit fires once. Then, as long as the finger remains outside, UIControlEventTouchDragOutside fires on each move. But you should test this yourself.

custom slide to reveal toolbar

I want to build a custom toolbar, something like the slide-to-unlock control on Android phones. In the idle state, the user sees a button at the bottom left of the page. The user would then tap it and drag towards the right; when it reaches the right end, the toolbar 'locks' in place. The buttons would be located on the toolbar.
I'm thinking of using a custom view and the touchesMoved methods, but what I don't know is how to make the view move with the touch and how to actually lock the bar.
Every time you move your finger, the touchesMoved method is called. In that method you have to redraw the whole view, or simply set a new frame for the view based on the touch position; once the bar reaches the right edge, snap it into its final position and stop tracking, which gives you the 'lock'. It's pretty simple, since you already know how to detect touches and react to them.
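A rough sketch of that idea, assuming the bar is a UIView subclass that slides horizontally inside its superview; the lock condition and snap-back animation are my own assumptions:

    #import <UIKit/UIKit.h>

    @interface SlideToolbar : UIView
    @property (nonatomic, assign) BOOL locked;
    @end

    @implementation SlideToolbar

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        if (self.locked) {
            return; // once locked, ignore further drags
        }

        // Track the finger in the superview's coordinates, since this view moves.
        CGPoint location = [[touches anyObject] locationInView:self.superview];

        // Follow the finger horizontally; clamp so the bar stays on screen.
        CGRect frame = self.frame;
        CGFloat maxX = CGRectGetWidth(self.superview.bounds) - CGRectGetWidth(frame);
        frame.origin.x = MAX(0, MIN(location.x, maxX));
        self.frame = frame;

        // "Lock" when the bar reaches the right edge.
        if (frame.origin.x >= maxX) {
            self.locked = YES;
            // Reveal the toolbar buttons here, e.g. by unhiding subviews.
        }
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
    {
        if (!self.locked) {
            // Snap back to the idle position if the user let go before the edge.
            CGRect frame = self.frame;
            frame.origin.x = 0;
            [UIView animateWithDuration:0.2 animations:^{ self.frame = frame; }];
        }
    }

    @end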

TouchesBegan and TouchesEnded with Multitouch issue

I'm using touchesBegan and touchesMoved with multitouch.
I have a manual implementation of what is essentially a button.
I bounds-test the point from touchesBegan to set the button as down, and do the same in touchesEnded to reset it.
The problem is that if the user moves the finger out of the button's bounds before lifting, the touchesEnded point falls outside the bounds of the button where the touch started.
I can't just reset everything on touchesended as the user might still be holding another button down with another finger.
What is the recommended solution to this? UIButton must be doing something similar somehow.
You need to watch touchesMoved: and "deactivate" your button when the touch moves outside its bounds, then "reactivate" it when the touch moves back in. See "Handling a Complex Multi-Touch Sequence" for an explanation of how to track individual touches across a multi-touch sequence (a fancy way of saying "which finger was that?").
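A sketch of one way to do that, keeping a reference to the touch that started on the button so other fingers can't reset it; buttonFrame, the trackedTouch property, and the buttonDown flag are illustrative names, not from the question:

    #import <UIKit/UIKit.h>

    // Make sure multipleTouchEnabled is YES on this view.
    @interface ManualButtonView : UIView
    @property (nonatomic, assign) CGRect buttonFrame;
    @property (nonatomic, weak) UITouch *trackedTouch;   // never retain UITouch objects
    @property (nonatomic, assign) BOOL buttonDown;
    @end

    @implementation ManualButtonView

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        for (UITouch *touch in touches) {
            CGPoint p = [touch locationInView:self];
            if (self.trackedTouch == nil && CGRectContainsPoint(self.buttonFrame, p)) {
                self.trackedTouch = touch;   // this finger owns the button
                self.buttonDown = YES;
            }
        }
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        for (UITouch *touch in touches) {
            if (touch != self.trackedTouch) {
                continue;                    // ignore other fingers
            }
            CGPoint p = [touch locationInView:self];
            // Deactivate when the finger leaves the button, reactivate when it returns.
            self.buttonDown = CGRectContainsPoint(self.buttonFrame, p);
        }
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
    {
        for (UITouch *touch in touches) {
            if (touch == self.trackedTouch) {
                CGPoint p = [touch locationInView:self];
                if (CGRectContainsPoint(self.buttonFrame, p)) {
                    // Fire the button's action only if the finger lifted inside it.
                }
                self.trackedTouch = nil;
                self.buttonDown = NO;
            }
        }
    }

    - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
    {
        if (self.trackedTouch != nil && [touches containsObject:self.trackedTouch]) {
            self.trackedTouch = nil;         // reset without firing the action
            self.buttonDown = NO;
        }
    }

    @end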