CGPoint doesn't get the coordinates of my button - iPhone

Hello, this is my code. Is it wrong?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Inside touchesBegan");
    UITouch *touch = [[event touchesForView:self.view] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    NSLog(@"pointx: %f pointy:%f", location.x, location.y);
    NSLog(@"about to enter do");
    if (CGRectContainsPoint(b_do.frame, location))
    {
        NSLog(@"inside do");
        [self b_do];
        [b_do setHighlighted:YES];
    }
}
In the log I got this:
2010-10-15 21:00:42.555 phone[14280:207] Inside touchesBegan
2010-10-15 21:00:42.557 phone[14280:207] pointx: 0.000000 pointy:0.000000
2010-10-15 21:00:42.557 phone[14280:207] about to enter do
2010-10-15 21:00:42.558 phone[14280:207] about to exit touchesBegan

The line with self.view might be the problem. The touch methods usually belong to the subclassed UIView object itself (since it's a view, just use self), not to a view controller (where self.view would reference the controller's view).

My problem was solved by replacing this code:
UITouch *touch = [[event touchesForView:self.view] anyObject];
with this one:
UITouch *touch = [touches anyObject];
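For reference, a minimal sketch of the corrected handler (assuming, as in the question, that this lives in a view controller and b_do is a UIButton outlet; the [self b_do] call is left out here since b_do appears to be the button itself):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Take any touch from the set delivered to this responder.
    UITouch *touch = [touches anyObject];
    // Resolve the point in the coordinate space the button's frame uses.
    CGPoint location = [touch locationInView:self.view];
    if (CGRectContainsPoint(b_do.frame, location))
    {
        [b_do setHighlighted:YES];
    }
}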

Related

Moving player left and right (touchesBegan)

I have a toon (player.physicsBody) which I want to move left and right by pressing two buttons (like a ship in Space Invaders). I have coded this and it works OK; the problem comes when I run the app on a device: if the left button is pressed, pressing the right button does nothing, and vice versa.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
    if ([node.name isEqualToString:@"rightbutton"]) {
        player.physicsBody.velocity = CGVectorMake(200, player.physicsBody.velocity.dy);
    }
    if ([node.name isEqualToString:@"leftbutton"]) {
        player.physicsBody.velocity = CGVectorMake(-200, player.physicsBody.velocity.dy);
    }
}
Calling [event allTouches] returns an NSSet of UITouch objects; calling [[event allTouches] anyObject] returns only one member of that set. I modified your code to iterate over all touches in the set.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in [event allTouches]) {
        CGPoint location = [touch locationInNode:self];
        SKNode *node = [self nodeAtPoint:location];
        if ([node.name isEqualToString:@"rightbutton"]) {
            player.physicsBody.velocity = CGVectorMake(200, player.physicsBody.velocity.dy);
        }
        else if ([node.name isEqualToString:@"leftbutton"]) {
            player.physicsBody.velocity = CGVectorMake(-200, player.physicsBody.velocity.dy);
        }
    }
}
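One more thing worth checking (an assumption on my part, not part of the answer above): a view delivers only one touch at a time unless multiple touch is enabled, which would explain why the second button is ignored on the device while a finger is already down. Something like this in the scene setup may help:
- (void)didMoveToView:(SKView *)view {
    // Let the view report a second finger while the first is still down.
    view.multipleTouchEnabled = YES;
}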

Drag image from carousel to image view

I want to drag an image from a carousel of images and drop it into another image view. After the drag, the image should be deleted from the carousel. I have written code for this, and it does remove the image from the carousel.
I used touch events.
The problem is that after dragging one image, touching anywhere else on the screen selects the next image automatically. How can I stop that? I want to drag an image only when I touch it.
My code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    btnBackImg.center = location;
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint pointMoved = [touch locationInView:self.view];
    btnBackImg.frame = CGRectMake(pointMoved.x, pointMoved.y, btnBackImg.frame.size.width, btnBackImg.frame.size.height);
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"touch end");
    if (btnBackImg != nil)
    {
        UITouch *touch = [[event allTouches] anyObject];
        if (CGRectContainsPoint(basket_image.frame, [touch locationInView:self.view]))
        {
            basket_image.image = [UIImage imageNamed:[NSString stringWithFormat:@"wit_bas.png"]];
            [self.product_categories removeObjectAtIndex:imgIndex+1];
            [carousel reloadData];
        }
    }
}
Here btnBackImg should move into basket_image.
If anyone knows a solution, please help me; a useful link would also help.
Thanks in advance.
In your touchesBegan: you have used CGPoint location = [touch locationInView:touch.view];, where touch.view means the touch is resolved against whatever view was hit, which here is effectively the whole self.view. What you really need is to pass the image view you want to track instead of touch.view, so that only that image is selected at a time:
CGPoint location = [touch locationInView:imageName];
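As a rough sketch of that idea (btnBackImg comes from the question; the dragging flag is a hypothetical ivar added for illustration), start the drag only when the touch begins inside the image and ignore touches anywhere else:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    // Pick the image up only if the touch started inside it.
    dragging = CGRectContainsPoint(btnBackImg.frame, location);
    if (dragging)
    {
        btnBackImg.center = location;
    }
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (!dragging) return;   // ignore touches that did not start on the image
    UITouch *touch = [touches anyObject];
    btnBackImg.center = [touch locationInView:self.view];
}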

Get X and Y coordinates of a view

I have a UITableView in my view; its name is tblTask.
I am using the touchesBegan: method to get the x and y coordinates of a touch in my view.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self.view];
    NSLog(@"point........ %f .. %f", location.x, location.y);
}
But when the touch begins on tblTask, I do not get location.x and location.y, even though tblTask is a subview of the main view.
I also tried disabling the table's scroll view, but nothing happened.
When I set this:
tbltask.userInteractionEnabled = NO;
then it gives the coordinates, but I want the coordinates of the view even while interaction with tblTask is enabled.
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    // Only report the point when the touch landed on the main view itself,
    // not on a subview such as the table.
    if ([touch view] == self.view) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"point........ %f .. %f", location.x, location.y);
    }
}
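As an alternative sketch (not part of the answer above; the selector name is mine), a gesture recognizer with cancelsTouchesInView set to NO can report the point without disabling the table's interaction:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // A tap recognizer on the main view; cancelsTouchesInView = NO lets
    // the table keep handling its own touches.
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleTap:)];
    tap.cancelsTouchesInView = NO;
    [self.view addGestureRecognizer:tap];
}
- (void)handleTap:(UITapGestureRecognizer *)recognizer
{
    CGPoint location = [recognizer locationInView:self.view];
    NSLog(@"point........ %f .. %f", location.x, location.y);
}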

Have Multiple Moving UIImage Views With Different Touches

I am working on getting multiple UIImageViews to move when they are dragged. I was originally putting all the movement under touchesBegan. However, when I drag a single object, they all disappear and only the one I am dragging moves.
How can I have each image dragged on its own terms (I drag the first image and only the first image moves, etc.)?
I'm not sure if this is what you're looking for but I worked on this a while ago, so hopefully it's useful.
Just make sure you set up 7 UIImageViews in the XIB File, link them and you'll be good to go.
#synthesize img1, img2, img3, img4, img5, img6, magnet;
-(void) checkInsertion {
    if (CGRectContainsRect(img2.frame, img1.frame)) { img1.center = img2.center; }
    if (CGRectContainsRect(img4.frame, img3.frame)) { img3.center = img4.center; }
    if (CGRectContainsRect(img6.frame, img5.frame)) { img5.center = img6.center; }
    [UIView beginAnimations:nil context:nil];
    [UIView setAnimationDuration:0.2];
    if (CGRectIntersectsRect(magnet.frame, img1.frame)) { img1.center = magnet.center; }
    if (CGRectIntersectsRect(magnet.frame, img2.frame)) { img2.center = magnet.center; }
    if (CGRectIntersectsRect(magnet.frame, img3.frame)) { img3.center = magnet.center; }
    if (CGRectIntersectsRect(magnet.frame, img4.frame)) { img4.center = magnet.center; }
    if (CGRectIntersectsRect(magnet.frame, img5.frame)) { img5.center = magnet.center; }
    if (CGRectIntersectsRect(magnet.frame, img6.frame)) { img6.center = magnet.center; }
    [UIView commitAnimations];
}
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLoc = [touch locationInView:touch.view];
    if (CGRectContainsPoint(img1.frame, touchLoc)) { img1.center = touchLoc; }
    if (CGRectContainsPoint(img3.frame, touchLoc)) { img3.center = touchLoc; }
    if (CGRectContainsPoint(img5.frame, touchLoc)) { img5.center = touchLoc; }
    if (CGRectContainsPoint(magnet.frame, touchLoc)) { magnet.center = touchLoc; }
    [self checkInsertion];
}
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}
You should look at UIPanGestureRecognizer. In the view controller that's responsible for these multiple image views, add a pan gesture recognizer to each of the image views. The view controller should be responsible for responding to the pan gesture callbacks and move the frame of the image view accordingly.
Of note, you should also grab the offset of the touch within the image view from the center and store this. Then, when the view is panned, you need to factor in the offset into your calculations. This is so the view doesn't just jump to the user's finger if they start dragging somewhere distant from the center point.
See the WWDC session videos from 2010 and 2011 about UIScrollView. Especially from 2011. They go into examples of how to do some of this.
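A rough sketch of that approach (the image view names and the dragOffset property are illustrative, not from the answer above):
- (void)viewDidLoad
{
    [super viewDidLoad];
    // One pan recognizer per draggable image view.
    for (UIImageView *imageView in @[img1, img2, img3]) {
        imageView.userInteractionEnabled = YES;   // image views ignore touches by default
        UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
            initWithTarget:self action:@selector(handlePan:)];
        [imageView addGestureRecognizer:pan];
    }
}
- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    UIView *dragged = pan.view;   // only the view the recognizer is attached to moves
    CGPoint location = [pan locationInView:dragged.superview];
    if (pan.state == UIGestureRecognizerStateBegan) {
        // Remember how far from the view's center the finger went down,
        // so the view does not jump to the finger.
        self.dragOffset = CGPointMake(location.x - dragged.center.x,
                                      location.y - dragged.center.y);
    } else if (pan.state == UIGestureRecognizerStateChanged) {
        dragged.center = CGPointMake(location.x - self.dragOffset.x,
                                     location.y - self.dragOffset.y);
    }
}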
I have gotten it to work!
-(void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event touchesForView:self.view] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    if (CGRectContainsPoint(plusone.frame, location))
    {
        UITouch *touch = [[event allTouches] anyObject];
        CGPoint location = [touch locationInView:touch.view];
        plusone.center = location;
    }
    if (CGRectContainsPoint(plustwo.frame, location))
    {
        UITouch *touch = [[event allTouches] anyObject];
        CGPoint location = [touch locationInView:touch.view];
        plustwo.center = location;
    }
    if (CGRectContainsPoint(plusthree.frame, location))
    {
        UITouch *touch = [[event allTouches] anyObject];
        CGPoint location = [touch locationInView:touch.view];
        plusthree.center = location;
    }
    if (CGRectContainsPoint(minusone.frame, location))
    {
        UITouch *touch = [[event allTouches] anyObject];
        CGPoint location = [touch locationInView:touch.view];
        minusone.center = location;
    }
    if (CGRectContainsPoint(minustwo.frame, location))
    {
        UITouch *touch = [[event allTouches] anyObject];
        CGPoint location = [touch locationInView:touch.view];
        minustwo.center = location;
    }
    if (CGRectContainsPoint(minusthree.frame, location))
    {
        UITouch *touch = [[event allTouches] anyObject];
        CGPoint location = [touch locationInView:touch.view];
        minusthree.center = location;
    }
}
But when an image collides with another, it disappears!
Does anyone have a fix for this problem?

UITouch object behaving oddly

I wanted to get the point where the user touches the screen, so I wrote the following code, which fires when the user touches somewhere on the screen:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [[UITouch alloc] init];
    touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    if (CGRectContainsPoint([dob frame], point)) {
        [self showDatePickerforDOB:YES];
    }
}
But this code gives a runtime error. Upon debugging it was revealed that locationInView: is not recognized as a method of the touch object, even though it is documented in the iPhone class reference. When I changed the code to exclude the alloc, i.e.
UITouch *touch;
touch = [touches anyObject];
then locationInView: works perfectly fine. Any ideas why UITouch *touch = [[UITouch alloc] init]; gives a runtime error?
I think you should not alloc and init the touch pointer; the correct code should be:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    if (CGRectContainsPoint([dob frame], point)) {
        [self showDatePickerforDOB:YES];
    }
}
[touches anyObject] returns an autoreleased object, so you don't need to alloc and init the touch pointer.
I found the reason for the error. It was this line:
CGPoint point = [touch locationInView:self];
When I changed it to CGPoint point = [touch locationInView:self.view]; the runtime error went away. The reason is that locationInView: takes a UIView as its parameter, and I was passing it self, which here is the delegate (a view controller), not a view.
So self.view solved the issue.
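Putting both fixes together, a minimal sketch of the working handler in the view controller (dob and showDatePickerforDOB: are taken from the question):
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    // No alloc/init: just take a touch from the set the system hands in.
    UITouch *touch = [touches anyObject];
    // In a view controller, resolve the point against the controller's view.
    CGPoint point = [touch locationInView:self.view];
    if (CGRectContainsPoint([dob frame], point)) {
        [self showDatePickerforDOB:YES];
    }
}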