Values for touch locations fall outside the view's bounds - iPhone

I am trying to record the locations of touches. Below is my code. As far as I understand, a touch at the very far upper left corner should give me a location of (0, 0), and the very far lower right corner should be (768, 1024), assuming I'm holding the iPad in portrait. However, I'm getting values like (-6, -18) for the upper left corner and (761, 1003) for the lower right. It looks like the coordinates are shifted somehow. A trace of self.bounds does give me {{0, 0}, {768, 1024}}. Can someone explain this to me? I would like to get x and y values that fall within the bounds {{0, 0}, {768, 1024}}. Thank you very much in advance.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    NSLog(@"frame: %@", NSStringFromCGRect(bounds)); // this value was traced as frame: {{0, 0}, {768, 1024}}
    UITouch *touch = [[event touchesForView:self] anyObject];
    location = [touch locationInView:self];
    NSLog(@"Location: %@", NSStringFromCGPoint(location));
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    location = [touch locationInView:self];
    NSLog(@"Location: %@", NSStringFromCGPoint(location));
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    location = [touch locationInView:self];
    NSLog(@"Location: %@", NSStringFromCGPoint(location));
}

Since I cannot comment yet, I have to post an answer. Have you tried setting the background color of the view to a different color so that you can see for sure whether it actually sits in the top left corner?

Here's what I have for the view:
@implementation TouchesView

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        [self setUserInteractionEnabled:YES];
        [self setBackgroundColor:[UIColor colorWithRed:1.0f green:0.6f blue:0.6f alpha:1.0f]];
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touch = [[touches anyObject] locationInView:self];
    NSLog(@"(%.1f, %.1f)", touch.x, touch.y);
    NSLog(@"%@", NSStringFromCGPoint(touch));
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {}

@end
And this is how it's initialized and added in the view controller:
TouchesView *touchView = [[TouchesView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:touchView];
This works fine for me.

The -20 offset on the y values is fixed by setting "Status bar is initially hidden" to YES. The rest of the problem looks like a hardware issue, because I got a slightly different range of x and y values each time I ran the program on the iPad. I didn't see this problem in the simulator: there, x values range from 1 to 767 and y values from 1 to 1023, which is essentially correct.
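If slightly out-of-range readings near the edges remain after that, one option is to clamp the reported point to the view's bounds before using it. A minimal sketch (clampPoint:toBounds: is a hypothetical helper, not part of the original code):
- (CGPoint)clampPoint:(CGPoint)p toBounds:(CGRect)bounds
{
    // Pull x and y back inside the rectangle so edge touches never report
    // coordinates slightly outside {{0, 0}, {768, 1024}}.
    p.x = MAX(CGRectGetMinX(bounds), MIN(p.x, CGRectGetMaxX(bounds)));
    p.y = MAX(CGRectGetMinY(bounds), MIN(p.y, CGRectGetMaxY(bounds)));
    return p;
}
// Usage inside touchesBegan:/touchesMoved:/touchesEnded:
// location = [self clampPoint:[touch locationInView:self] toBounds:self.bounds];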

Related

Determine whether an iPhone screen tap is in a region or not?

I want to determine whether the tapped location is inside a region or not. I have 4 CGPoints, and I know this can be done using UITouch. I also get the tapped screen location with this method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *myTouch = [[touches allObjects] objectAtIndex:0];
    CGPoint currentPos = [myTouch locationInView:self.view];
}
And for example my 4 CGPoints are
self.firstPoint = CGPointMake(50.0f, 50.0f);
self.secondPoint = CGPointMake(200.0, 50.0);
self.thirdPoint = CGPointMake(200.0, 200.0);
self.fourthPoint = CGPointMake(50.0, 120.0);
Thanks in advance
You should use a CGRect to represent the rect instead of four CGPoints and then use CGRectContainsPoint() to check if the rect contains the point.
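For example, a minimal sketch along those lines (the rect below assumes the four points are meant to describe a roughly 150 x 150 rectangle starting at (50, 50)):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Axis-aligned region covering the corner points from the question.
    CGRect region = CGRectMake(50.0f, 50.0f, 150.0f, 150.0f);

    UITouch *myTouch = [[touches allObjects] objectAtIndex:0];
    CGPoint currentPos = [myTouch locationInView:self.view];

    if (CGRectContainsPoint(region, currentPos)) {
        NSLog(@"Tap is inside the region");
    } else {
        NSLog(@"Tap is outside the region");
    }
}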
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    image = [UIImage imageNamed:@"anyImage.gif"];
    newView = [[UIImageView alloc] initWithImage:image];
    if (location.y < 480 && location.y > 50) // presumably meant to test that y lies within the region
    {
        //write your code
    }
}

touchesMoved reaching outside my view's bounds

I have subclassed UIView. Initially the view is drawn in a default color, and on touch I need to fill a different color from x = 0 up to the point the user touched. The problem is that touchesMoved keeps reporting points even when I drag outside my view's bounds. How do I restrict it to my view's bounds only?
I googled and tried the snippet below, but with no luck:
if ([self pointInside:point withEvent:nil]) {
    [self fillColor];
}
My touchesMoved method is as below:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    endPoint = point;
    NSLog(@"moved x: %f, y: %f", point.x, point.y);
    if (CGRectContainsPoint([self frame], endPoint)) { // this also not working
        [self fillColor];
    }
}
Any help is appreciated in advance.
Just set a tag on the view in the viewDidLoad: method:
fillColorView.tag = 111;
Then use this logic in the touchesMoved: method, like below:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *tap = [touches anyObject];
    CGPoint pointToMove = [tap locationInView:fillColorView];
    if ([tap.view isKindOfClass:[UIView class]])
    {
        UIView *tempView = (UIView *)tap.view;
        if (tempView.tag == 111) {
            [self fillColor];
        }
    }
}
Hope this helps you.
In your touchesMoved method, in CGPoint point = [touch locationInView:self]; replace self with the view in which you want the touch to be tracked.
self covers the complete view; pass your drawing view there instead, so that it detects touches only on that view.
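A related note on the check the asker already tried: test the point against self.bounds rather than self.frame. frame is expressed in the superview's coordinate space, while locationInView:self returns a point in the view's own space, so the two only line up if the view sits at the superview's origin. A minimal sketch of touchesMoved with that change:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];

    // bounds, not frame: both the point and the rect are in self's coordinates.
    if (CGRectContainsPoint(self.bounds, point)) {
        endPoint = point;
        [self fillColor];
    }
}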

With touches - how to stop a UIImageView moving when reaching a certain Y point

I have an image at the bottom of the screen.
I want the user to move the image upwards until it reaches a certain Y point, as follows:
[door setCenter:CGPointMake(160,347)];
So far, as you drag the image (door) upwards it continues past my destination point but when you let go it snaps back to the correct position.
How do I stop the image from moving once it reaches a certain point, even if the user's finger is still swiping upwards? Would it go inside an if statement?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (startPoint.y < 347) {
        // something in here ?????
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *myTouch = [touches anyObject];
    startPoint = [myTouch locationInView:self.view];
    NSLog(@"position = %f and %f", startPoint.x, startPoint.y);
    [door setCenter:CGPointMake(160, startPoint.y)];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [door setCenter:CGPointMake(160, 347)];
}
How about setting it in the touchesMoved method? Something like:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *myTouch = [touches anyObject];
    startPoint = [myTouch locationInView:self.view];
    NSLog(@"position = %f and %f", startPoint.x, startPoint.y);
    if (startPoint.y < 347) { // or a suitable condition to verify your case
        [door setCenter:CGPointMake(160, startPoint.y)]; // then only set the center
    } else {
        [door setCenter:CGPointMake(160, 347)];
    }
}
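The same clamp can be written in one line with MIN, which is equivalent to the if/else above (swap in MAX if the stop should apply in the other direction):
[door setCenter:CGPointMake(160, MIN(startPoint.y, 347))];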

Rotating an image using Objective-C

I am trying to rotate an image, and this I am succeeding in doing. The problem is that when I click down and drag slightly, the image first rotates all the way around to the point where I clicked, and only then rotates slowly as I drag it clockwise. I don't understand why it rotates completely to the place I am dragging. I want the rotation to continue from the image's current position, NOT to jump to where my finger goes down and then start from there.
This is my code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    int len = [allTouches count] - 1;
    UITouch *touch = [[allTouches allObjects] objectAtIndex:len];
    CGPoint location = [touch locationInView:[self superview]];
    float theAngle = atan2(location.y - self.center.y, location.x - self.center.x);
    totalRadians = theAngle;
    [self rotateImage:theAngle];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}
- (void)rotateImage:(float)angleRadians {
    self.transform = CGAffineTransformMakeRotation(angleRadians);
    CATransform3D rotatedTransform = self.layer.transform;
    self.layer.transform = rotatedTransform;
}
Am I doing anything wrong?
Thanks!
Instead of creating the transformation from the angle that you're getting from the touch, try incrementing totalRadians by its difference from the new angle, and then create the transform from that.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSSet *allTouches = [event allTouches];
    int len = [allTouches count] - 1;
    UITouch *touch = [[allTouches allObjects] objectAtIndex:len];
    CGPoint location = [touch locationInView:[self superview]];
    float theAngle = atan2(location.y - self.center.y, location.x - self.center.x);
    totalRadians += fabs(theAngle - totalRadians);
    totalRadians = fmod(totalRadians, 2 * M_PI);
    [self rotateImage:totalRadians];
}
My first advice would be to switch to using a UIRotationGestureRecognizer if your app targets 4.0 and higher. It does the right thing and provides you with a rotation property.
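For reference, a minimal sketch of that approach (imageView and handleRotation: are assumed names, not from the original code):
// Somewhere during setup, e.g. in the view controller:
- (void)viewDidLoad
{
    [super viewDidLoad];
    UIRotationGestureRecognizer *rotation =
        [[UIRotationGestureRecognizer alloc] initWithTarget:self
                                                     action:@selector(handleRotation:)];
    self.imageView.userInteractionEnabled = YES; // UIImageView disables this by default
    [self.imageView addGestureRecognizer:rotation];
    [rotation release]; // omit under ARC
}

- (void)handleRotation:(UIRotationGestureRecognizer *)gesture
{
    // rotation is a delta since the gesture began, so the view turns from its
    // current orientation instead of snapping to the finger's angle.
    gesture.view.transform = CGAffineTransformRotate(gesture.view.transform, gesture.rotation);
    gesture.rotation = 0; // reset so the same delta is not applied twice
}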

Programmatically pass touches to UIScrollView to scroll

Basically I'm trying to make a UIScrollView scroll only at steeper angles. Right now, if you move your finger 10 degrees off horizontal, the scroll view will scroll; I'd like to push that threshold up to, say, 30 degrees.
After doing some reading, I decided the best way to do this would be to put a subclassed UIView on top of the scroll view. If the touch on the overlay moves at more than 30 degrees, pass it down to the scroll view; otherwise don't.
However, I can't figure out how to pass the touches down. Here's my code right now:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"glass touch began");
    UITouch *touch = [touches anyObject];
    beginning_touch_point = [touch locationInView:nil];
    [scroll_view touchesBegan:touches withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"glass touch ended");
    UITouch *touch = [touches anyObject];
    CGPoint previous_point = beginning_touch_point;
    CGPoint current_point = [touch locationInView:nil];
    float x_change = fabs(previous_point.x - current_point.x);
    float y_change = fabs(previous_point.y - current_point.y);
    if (x_change > y_change)
    {
        if (previous_point.x - current_point.x < 0)
        {
            [(MyScheduleViewController *)schedule_controller didFlickLeft];
        }
        else
        {
            [(MyScheduleViewController *)schedule_controller didFlickRight];
        }
        [scroll_view touchesCancelled:touches withEvent:event];
    }
    else
    {
        [scroll_view touchesEnded:touches withEvent:event];
    }
}
I know that right now it's checking for 45 degrees, but that's not the important thing. What is important is that the touches are indeed getting passed down correctly to my scroll_view. I have it doing an NSLog() in touchesBegan and touchesEnded, and both fire correctly. It's just not scrolling. I'm worried that touchesBegan and touchesEnded cannot cause a scroll. Does anyone know what can, or what I'm doing wrong?
I also tried to do the same thing but didn't succeed. It seems the scroll view doesn't scroll from touchesBegan/touchesMoved; it handles some other events to understand that the user wants to scroll the view.
For me the solution was to determine the shift value and set it explicitly as the scroll view's content offset. It looked like this (I don't have the exact source code now, so this code may work incorrectly):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint curr = [touch locationInView:self];
    CGPoint offset = [scrollView contentOffset];
    // Flip the sign of the delta if scrolling feels reversed in your setup.
    [scrollView setContentOffset:CGPointMake(offset.x + (curr.x - prev.x), offset.y) animated:YES];
    prev = curr; // remember the last point so the next delta is relative to it
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    prev = [[touches anyObject] locationInView:self];
}