I have written a custom UIGestureRecognizer that handles rotations with one finger. It is designed to work exactly like Apple's UIRotationGestureRecognizer and return the same values.
Now I would like to implement the velocity, but I cannot figure out how Apple defines and calculates the velocity for its gesture recognizer.
Does anybody have an idea how Apple implements this in UIRotationGestureRecognizer?
You would have to keep a reference to the last touch position and its timestamp:
double last_timestamp;
CGPoint last_position;
Then you could do something like:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    last_timestamp = CFAbsoluteTimeGetCurrent();
    UITouch *aTouch = [touches anyObject];
    // Use the same coordinate space as in touchesMoved: below;
    // mixing spaces would make the first distance wrong.
    last_position = [aTouch locationInView:self.superview];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    double current_time = CFAbsoluteTimeGetCurrent();
    double elapsed_time = current_time - last_timestamp;
    last_timestamp = current_time;
    UITouch *aTouch = [touches anyObject];
    CGPoint location = [aTouch locationInView:self.superview];
    CGFloat dx = location.x - last_position.x;
    CGFloat dy = location.y - last_position.y;
    CGFloat path_travelled = sqrt(dx * dx + dy * dy);
    // Distance in points divided by elapsed seconds: a linear speed.
    CGFloat some_kind_of_velocity = path_travelled / elapsed_time;
    NSLog(@"v=%.2f", some_kind_of_velocity);
    last_position = location;
}
This should give you some kind of speed reference.
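Since the question is specifically about rotation, note that UIRotationGestureRecognizer reports its velocity in radians per second. A minimal sketch of the same bookkeeping applied to the touch angle around the view's center (this is an assumption about the approach, not Apple's actual implementation; last_angle and last_timestamp are illustrative ivars):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    CGPoint center = self.center;
    CGPoint location = [aTouch locationInView:self.superview];
    CGFloat angle = atan2(location.y - center.y, location.x - center.x);
    // UITouch carries its own timestamp, which is more accurate than
    // sampling the clock when the event is handled.
    NSTimeInterval now = aTouch.timestamp;
    NSTimeInterval elapsed = now - last_timestamp;
    if (elapsed > 0) {
        // Angular velocity in radians per second; a real implementation
        // would also unwrap the jump across the ±π boundary of atan2.
        CGFloat angular_velocity = (angle - last_angle) / elapsed;
        NSLog(@"angular v=%.2f rad/s", angular_velocity);
    }
    last_angle = angle;
    last_timestamp = now;
}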
Related
I want to throw an object in the touchesEnded method, with force, in a left-to-right direction. I created the world and body objects and calculated the swipe distance, angle, and force, but it is not working properly. Here is my code:
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touches began");
    UITouch *touch = [touches anyObject];
    point1 = [touch locationInView:[touch view]];
}
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    point2 = [touch locationInView:[touch view]];
    float distance = ccpDistance(point1, point2);
    int maxDistance = 50;
    CGFloat strength = distance / maxDistance;
    CGFloat angle = atan2f(point2.y - point1.y, point2.x - point1.x);
    angle = -1 * CC_DEGREES_TO_RADIANS(angle);
    // int force = strength * maxForce;
    _body->ApplyLinearImpulse(b2Vec2(10.0f + cos(angle) * 25.0f,
                                     10.0f + sin(angle) * 25.0f),
                              _body->GetPosition());
}
Please help!
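One thing that stands out: atan2f already returns radians, so passing its result through CC_DEGREES_TO_RADIANS shrinks the angle drastically. A sketch of one common approach, building the impulse directly from the swipe vector instead of going through an angle at all (maxImpulse is an illustrative tuning constant, not from the original code):

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    point2 = [touch locationInView:[touch view]];
    float dx = point2.x - point1.x;
    float dy = point2.y - point1.y;
    float distance = sqrtf(dx * dx + dy * dy);
    if (distance < 1.0f) return; // ignore taps
    // Clamp so long swipes don't produce runaway impulses.
    const float maxDistance = 50.0f;
    const float maxImpulse = 25.0f; // tune against the body's mass
    float strength = fminf(distance / maxDistance, 1.0f);
    // Unit swipe direction scaled by the impulse magnitude.
    // UIKit's y axis points down, so dy is negated for Box2D's y-up world.
    b2Vec2 impulse(dx / distance * strength * maxImpulse,
                   -dy / distance * strength * maxImpulse);
    _body->ApplyLinearImpulse(impulse, _body->GetWorldCenter());
}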
I am developing an application in which I am using a UIImageView and changing its position every 0.5 seconds with the code below.
[NSTimer scheduledTimerWithTimeInterval:0.5
                                 target:self
                               selector:@selector(moveImage)
                               userInfo:nil
                                repeats:YES];

- (void)moveImage
{
    //[image1 setCenter: CGPointMake(634, 126)];
    CGFloat x = (CGFloat)(arc4random() % (int)self.view.bounds.size.width);
    CGFloat y = (CGFloat)(arc4random() % (int)self.view.bounds.size.height);
    CGPoint squarePosition = CGPointMake(x, y);
    img.center = squarePosition;
}
Now, when I touch the screen, I need to find out whether my touch location and the image view's location match.
Use this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    if ([touch view] == photo1) {
        // photo1 is the image view; give it a tag
        if (photo1.tag == 3)
        {
            NSLog(@"You touched the image view");
        }
        photo1.center = touchLocation;
    }
}
// used to find the touch point in the key window
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint pos = [touch locationInView:[UIApplication sharedApplication].keyWindow];
    NSLog(@"%f, %f", pos.x, pos.y);
}
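If the goal is simply to verify that a touch landed on the randomly repositioned image view, a hit test against the image view's frame may be more direct than comparing raw coordinates. A sketch, assuming img is the image view from the question:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    // img.frame is expressed in its superview's coordinate space,
    // so convert the touch into that same space before testing.
    CGPoint location = [touch locationInView:img.superview];
    if (CGRectContainsPoint(img.frame, location)) {
        NSLog(@"Touched the image view at its current position");
    } else {
        NSLog(@"Missed: touch %@ vs frame %@",
              NSStringFromCGPoint(location),
              NSStringFromCGRect(img.frame));
    }
}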
Is it possible to drag a UIView around the iOS screen while it holds both an image and text, e.g. small cards? Could you point me to a similar (solved) topic? I haven't found any.
This is what a neat solution, based on pepouze's answer, would look like (tested, it works!)
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    CGPoint location = [aTouch locationInView:self];
    CGPoint previousLocation = [aTouch previousLocationInView:self];
    self.frame = CGRectOffset(self.frame,
                              location.x - previousLocation.x,
                              location.y - previousLocation.y);
}
While UIView has no built-in support for dragging, it is not difficult to implement. It is even easier when you only have to deal with dragging, and not with other actions such as tapping, double tapping, or multi-touch.
First, make a custom view, say DraggableView, by subclassing UIView. Then override UIView's touchesMoved:withEvent: method; there you can get the current drag location and move the DraggableView. Look at the following example.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    CGPoint location = [aTouch locationInView:self.superview];
    [UIView beginAnimations:@"Dragging A DraggableView" context:nil];
    self.frame = CGRectMake(location.x, location.y,
                            self.frame.size.width, self.frame.size.height);
    [UIView commitAnimations];
}
All subviews of the DraggableView object will be moved too, so put all your images and text in as subviews of the DraggableView object.
What I implemented here is very simple. However, if you want more complex dragging behavior (for example, the user has to hold a finger on the view for a moment before it can move), then you will have to override the other event handling methods (touchesBegan:withEvent: and touchesEnded:withEvent:) as well, as sketched below.
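A minimal sketch of such a "hold to drag" variant, assuming a BOOL ivar named dragEnabled (the name and the 0.5 s delay are illustrative choices, not from the original answer); touchesMoved:withEvent: would then bail out early while dragEnabled is NO:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Arm the drag only after the finger has rested on the view briefly.
    dragEnabled = NO;
    [self performSelector:@selector(enableDrag)
               withObject:nil
               afterDelay:0.5];
}

- (void)enableDrag {
    dragEnabled = YES;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Cancel the pending activation if the finger lifts early.
    [NSObject cancelPreviousPerformRequestsWithTarget:self
                                             selector:@selector(enableDrag)
                                               object:nil];
    dragEnabled = NO;
}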
An addition to MHC's answer.
If you don't want the upper left corner of the view
to jump under your finger, you can also override touchesBegan
like this:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    offset = [aTouch locationInView:self];
}
and change MHC's touchesMoved to:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    CGPoint location = [aTouch locationInView:self.superview];
    [UIView beginAnimations:@"Dragging A DraggableView" context:nil];
    self.frame = CGRectMake(location.x - offset.x, location.y - offset.y,
                            self.frame.size.width, self.frame.size.height);
    [UIView commitAnimations];
}
You should also declare the CGPoint offset in the interface:
@interface DraggableView : UIView
{
    CGPoint offset;
}
EDIT:
Arie Litovsky provides more elegant solution that allows you to ditch the ivar: https://stackoverflow.com/a/10378382/653513
Even though rokjarc's solution works, using
CGPoint previousLocation = [aTouch previousLocationInView:self.superview];
avoids creating the CGPoint offset ivar and overriding touchesBegan:withEvent: (see the sketch below).
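Putting that together, the entire drag can live in one overridden method; this is just the earlier snippet restated in superview coordinates:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    // The delta between the current and previous touch locations is
    // exactly how far the finger travelled since the last event.
    CGPoint location = [aTouch locationInView:self.superview];
    CGPoint previousLocation = [aTouch previousLocationInView:self.superview];
    self.frame = CGRectOffset(self.frame,
                              location.x - previousLocation.x,
                              location.y - previousLocation.y);
}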
Here is a solution for dragging a custom UIView (which can be scaled or rotated through its transform) that can hold images and/or text (just edit the Tile.xib as required):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    CGPoint previous = [touch previousLocationInView:self];
    if (!CGAffineTransformIsIdentity(self.transform)) {
        // Map the points through the view's transform so the deltas
        // account for the scale and rotation.
        location = CGPointApplyAffineTransform(location, self.transform);
        previous = CGPointApplyAffineTransform(previous, self.transform);
    }
    self.frame = CGRectOffset(self.frame,
                              location.x - previous.x,
                              location.y - previous.y);
}
This works for me; my UIView is rotated and scaled:
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    CGPoint previous = [touch previousLocationInView:self];
    if (!CGAffineTransformIsIdentity(self.transform)) {
        location = CGPointApplyAffineTransform(location, self.transform);
        previous = CGPointApplyAffineTransform(previous, self.transform);
    }
    CGRect newFrame = CGRectOffset(self.frame,
                                   location.x - previous.x,
                                   location.y - previous.y);
    // Setting center instead of frame is safer for transformed views,
    // since frame is undefined when the transform is not the identity.
    self.center = CGPointMake(CGRectGetMidX(newFrame), CGRectGetMidY(newFrame));
}
I am trying to implement a selection wheel in an iPhone application. I have got the wheel to track the user's finger, but now, for some reason, when the user touches the screen, the wheel jumps so that the same section of the wheel tracks the finger each time.
How would I make the wheel rotate as if the user were dragging it from any point?
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    int tInput = [allTouches count] - 1;
    UITouch *touch = [[allTouches allObjects] objectAtIndex:tInput];
    CGPoint location = [touch locationInView:self.view];
    float theAngle = atan2(location.y - imageX.center.y,
                           location.x - imageX.center.x);
    CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(theAngle);
    imageX.transform = cgaRotate;
}
Any suggestions?
Right - that makes perfect sense - this is indeed what your code does.
What you need to add is the initial value where you started your drag, as a relative rotation - or track your last rotation.
A very elaborate way is shown below - but it should help get the point across.
@interface UntitledViewController : UIViewController {
    CGPoint firstLoc;
    UILabel *fred;
    double angle;
}
@property (assign) CGPoint firstLoc;
@property (retain) UILabel *fred;
@end

@implementation UntitledViewController
@synthesize fred, firstLoc;

- (void)viewDidLoad {
    [super viewDidLoad];
    self.fred = [[UILabel alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
    fred.text = @"Fred!";
    fred.textAlignment = UITextAlignmentCenter;
    [self.view addSubview:fred];
    angle = 0; // we haven't rotated just yet...
}

// make sure we get those drag events.
- (BOOL)isFirstResponder { return YES; }

- (void)handleObject:(NSSet *)touches
           withEvent:(UIEvent *)event
              isLast:(BOOL)lst
{
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    CGPoint curLoc = [touch locationInView:self.view];
    float fromAngle = atan2(firstLoc.y - fred.center.y,
                            firstLoc.x - fred.center.x);
    float toAngle = atan2(curLoc.y - fred.center.y,
                          curLoc.x - fred.center.x);
    // So the angle to rotate to is relative to our current angle and the
    // angle through which our finger moved (to-from).
    float newAngle = angle + (toAngle - fromAngle);
    CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(newAngle);
    fred.transform = cgaRotate;
    // we only 'save' the current angle when we're done with the drag.
    if (lst)
        angle = newAngle;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[[event allTouches] allObjects] lastObject];
    // capture where we started - so we can later work out the
    // rotation relative to this point.
    firstLoc = [touch locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleObject:touches withEvent:event isLast:NO];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self handleObject:touches withEvent:event isLast:YES];
}
@end
Obviously you can do this a lot more elegantly - and the above misses the 0..2π capping you will need; a small helper for that is sketched below.
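A minimal sketch of that capping, wrapping an accumulated angle back into [0, 2π) (fmod alone is not enough, since it can return a negative remainder for negative input):

// Wrap an angle into [0, 2*M_PI).
static double normalizeAngle(double radians) {
    double wrapped = fmod(radians, 2.0 * M_PI);
    if (wrapped < 0.0)
        wrapped += 2.0 * M_PI;
    return wrapped;
}

In handleObject:withEvent:isLast: above, the saved angle would then become angle = normalizeAngle(newAngle);.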
Because the view itself is still the old one. You may want to fetch the zip example file at
http://bynomial.com/blog/?p=77
and in PuttyView.m change:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    ....
    else {
        self.center = CGPointMake(self.center.x + touchPoint.x - touchStart.x,
                                  self.center.y + touchPoint.y - touchStart.y);
into
    else {
        NSSet *allTouches = [event allTouches];
        int tInput = [allTouches count] - 1;
        UITouch *touch = [[allTouches allObjects] objectAtIndex:tInput];
        CGPoint location = [touch locationInView:self];
        float theAngle = atan2(location.y - self.center.y,
                               location.x - self.center.x);
        CGAffineTransform cgaRotate = CGAffineTransformMakeRotation(theAngle);
        self.transform = cgaRotate;
    }
so that you stay within the view and its relative angle. Alternatively, in your own code, keep track of the relative angle itself.
I am trying to rotate an image, and that part works. The problem is that when I touch down and drag slightly, the image first snaps all the way around to the point where I touched, and only then rotates slowly as I drag it clockwise. Why does it rotate completely to the place I am dragging? I want the rotation to continue from the image's current position, NOT to jump to where my finger lands and start from there.
This is my code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    int len = [allTouches count] - 1;
    UITouch *touch = [[allTouches allObjects] objectAtIndex:len];
    CGPoint location = [touch locationInView:[self superview]];
    float theAngle = atan2(location.y - self.center.y,
                           location.x - self.center.x);
    totalRadians = theAngle;
    [self rotateImage:theAngle];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}

- (void)rotateImage:(float)angleRadians {
    self.transform = CGAffineTransformMakeRotation(angleRadians);
    CATransform3D rotatedTransform = self.layer.transform;
    self.layer.transform = rotatedTransform;
}
Am I doing anything wrong?
Thanks!
Instead of creating the transformation from the angle that you're getting from the touch, try incrementing totalRadians by its difference from the new angle, and then create the transform from that.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    int len = [allTouches count] - 1;
    UITouch *touch = [[allTouches allObjects] objectAtIndex:len];
    CGPoint location = [touch locationInView:[self superview]];
    float theAngle = atan2(location.y - self.center.y,
                           location.x - self.center.x);
    totalRadians += fabs(theAngle - totalRadians);
    totalRadians = fmod(totalRadians, 2 * M_PI);
    [self rotateImage:totalRadians];
}
My first advice would be to switch to UIRotationGestureRecognizer if your app targets iOS 4.0 or higher. It does the right thing and provides you with a rotation property; a minimal usage sketch follows.
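A sketch of that route (imageView is an illustrative name; rotation is reported in radians, and the recognizer also exposes a velocity property in radians per second, which is the value the very first question asked about):

// Attach the recognizer, e.g. in viewDidLoad:
UIRotationGestureRecognizer *rotationRecognizer =
    [[UIRotationGestureRecognizer alloc] initWithTarget:self
                                                 action:@selector(handleRotation:)];
imageView.userInteractionEnabled = YES;
[imageView addGestureRecognizer:rotationRecognizer];

// Apply the reported rotation on top of the view's existing transform.
- (void)handleRotation:(UIRotationGestureRecognizer *)recognizer {
    recognizer.view.transform =
        CGAffineTransformRotate(recognizer.view.transform, recognizer.rotation);
    // Reset so the next callback reports only the incremental change.
    recognizer.rotation = 0;
}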