setNeedsDisplayInRect bug in iOS5? - ios5

I'm trying to use setNeedsDisplayInRect: in iOS 5 to optimize some drawing code. The setup is simple: I have an array of CGRect 'hotspots' that function as buttons. When a touch is detected I find the CGRect it occurred in and call setNeedsDisplayInRect: on the view with that rect as the parameter. All the CGRects are valid for the view - it uses them for its initial drawing, and that comes out correctly.
What I am seeing (as the console dump below shows) is that after the first call to setNeedsDisplayInRect:, drawRect: receives the view's frame, not the rect I specified. Subsequent calls are correct.
Can anyone confirm this as a bug or see that I am doing something incorrectly here? All the code is below. -- Thanks!
WRONG -> drawRect with rect 0.000000 0.000000 70.000000 660.000000
drawRect with rect 15.000000 260.000000 40.000000 40.000000
drawRect with rect 15.000000 310.000000 40.000000 40.000000
drawRect with rect 15.000000 360.000000 40.000000 40.000000
drawRect with rect 15.000000 410.000000 40.000000 40.000000
drawRect with rect 15.000000 460.000000 40.000000 40.000000
drawRect with rect 15.000000 510.000000 40.000000 40.000000
drawRect with rect 15.000000 410.000000 40.000000 40.000000
drawRect with rect 15.000000 310.000000 40.000000 40.000000
drawRect with rect 15.000000 260.000000 40.000000 40.000000
drawRect with rect 15.000000 110.000000 40.000000 40.000000
drawRect with rect 15.000000 610.000000 40.000000 40.000000
drawRect with rect 15.000000 510.000000 40.000000 40.000000
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.view];
    CGRect rect;
    int cnt = self.barNoteArray.count;
    for (int i = 0; i < cnt; i++) {
        rect = [[self.barNoteArray objectAtIndex:i] cellRect];
        if (CGRectContainsPoint(rect, point)) {
            self.bar.highlightIndex = 1;
            [self.bar setNeedsDisplayInRect:rect];
            break;
        }
    }
}
- (void)drawRect:(CGRect)rect
{
    if (highlightIndex) {
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextSetFillColorWithColor(context, [UIColor greenColor].CGColor);
        CGContextAddRect(context, rect);
        CGContextFillPath(context);
        highlightIndex = 0;
        printf("drawRect with rect %f %f %f %f \n", rect.origin.x,
                                                    rect.origin.y,
                                                    rect.size.width,
                                                    rect.size.height);
    } else {
        // other drawing code
    }
}

I am having this problem as well. I have narrowed it down to the following situation:
It seems that this happens when I shift from one UIView to a second one. In my case, the user is selecting a painting tool in one UIView, then drawing on an underlying UIView (the canvas).
The initial stroke seems to change the rect received in drawRect: from the one I passed in to the full view size. Once those first one or two drawRect: calls have run, the rect matches what I passed in.
I can confirm that setNeedsDisplayInRect: is responsible, because if I comment the offending line out, drawRect: is no longer called. The rect logged just before the call shows the proper sub-rectangle to update; the rect received in drawRect: shows a different rectangle.
Again, this is only happening on the initial move from one UIView to another (it seems).
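One workaround that avoids relying on the exact dirty rect (a sketch only; highlightRect is an assumed CGRect property, not from either poster's code) is to record which cell should be highlighted when the touch happens and fill that stored rect in drawRect:, so an over-large rect on the first pass does no harm:
// Sketch of a workaround: `highlightRect` is an assumed CGRect property on the
// bar view, set in the touch handler right before invalidating that cell.
self.bar.highlightRect = rect;                 // remember which cell was hit
[self.bar setNeedsDisplayInRect:rect];

// In the bar view, fill the remembered cell instead of trusting the rect that
// drawRect: receives, which can be the full bounds on the first pass.
- (void)drawRect:(CGRect)rect
{
    if (!CGRectIsEmpty(self.highlightRect)) {
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextSetFillColorWithColor(context, [UIColor greenColor].CGColor);
        CGContextFillRect(context, self.highlightRect);
        self.highlightRect = CGRectZero;
    } else {
        // other drawing code
    }
}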

Related

UIImageView draw in rect with size of image and not image view

I want to draw on a UIImageView; the following is my drawing code. My image view's content mode is aspect fit. Because I am using drawInRect: and giving it the rect of the UIImageView, the image is scaled down to the size of the image view. I want to draw at the size of the image, not the image view. Since I am detecting the touch point on the image view, the point of the tap and the actual drawing on the image are different. Can anybody tell me how to draw on the image directly, and how to calculate this shift or difference between the touch point on the image and on the image view?
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    lastTouch = [touch locationInView:self];
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint currentTouch = [[touches anyObject] locationInView:self];
    UIGraphicsBeginImageContext(self.image.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [self.image drawInRect:CGRectMake(0, 0, self.image.size.width, self.bounds.size.height)];
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, _brushSize);
    CGContextBeginPath(context);
    CGContextMoveToPoint(context, lastTouch.x, lastTouch.y);
    CGContextAddLineToPoint(context, currentTouch.x, currentTouch.y);
    CGContextSetAlpha(context, 1);
    CGContextSetStrokeColorWithColor(context, [_brushColor CGColor]);
    CGContextStrokePath(context);
    self.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    lastTouch = currentTouch;
}
You shouldn't really be subclassing UIImageView. It isn't intended to be subclassed.
A better approach would be to use a UIView instead of the UIImageView.
From there, there are two possible approaches:
Draw the UIImage directly into the UIView inside drawRect:, but this will give you the same problem you currently have.
Or...
Put a UIImageView inside the UIView and resize it so that it is the correct size/shape for the UIImage (i.e. do the scaling yourself). Then, over the UIImageView, put a second UIView (and subclass this one); call it DrawingView or something. You can now do all your drawing inside it, and all your touch detection will be in there too, so you no longer need to convert the touch points into different drawing points.
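A rough sketch of that container setup (a sketch only; DrawingView, the method name, and the layout math are illustrative assumptions, not from the original answer):
// Illustrative container setup: a plain UIView holds a manually-sized
// UIImageView plus a transparent DrawingView on top that owns all the touch
// handling and drawing. Class and method names here are assumptions.
- (void)setupCanvasWithImage:(UIImage *)image
{
    // Scale the image view ourselves so it matches the image's aspect ratio.
    CGFloat scale = MIN(self.bounds.size.width  / image.size.width,
                        self.bounds.size.height / image.size.height);
    CGRect fittedFrame = CGRectMake(0, 0,
                                    image.size.width  * scale,
                                    image.size.height * scale);
    fittedFrame.origin.x = (self.bounds.size.width  - fittedFrame.size.width)  / 2.0;
    fittedFrame.origin.y = (self.bounds.size.height - fittedFrame.size.height) / 2.0;

    UIImageView *imageView = [[UIImageView alloc] initWithFrame:fittedFrame];
    imageView.image = image;
    [self addSubview:imageView];

    // The drawing view sits exactly on top of the image view, so touch points
    // and drawing points share the same coordinate space.
    DrawingView *drawingView = [[DrawingView alloc] initWithFrame:fittedFrame];
    drawingView.backgroundColor = [UIColor clearColor];
    [self addSubview:drawingView];
}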

Trying to fill a path with colour in CGContext without much luck

I currently have the following code, which tries to let the user draw a dotted path and make a custom shape. Once they have made this shape, I want it to be filled with colour automatically. That isn't happening.
At the moment I am getting this error for the code below:
<Error>: CGContextClosePath: no current point.
Here's the code I'm using:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint previous = [touch previousLocationInView:self];
    CGPoint current = [touch locationInView:self];

#define SQR(x) ((x)*(x))

    // Check for a minimal distance to avoid silly data
    if ((SQR(current.x - self.previousPoint2.x) + SQR(current.y - self.previousPoint2.y)) > SQR(10))
    {
        float dashPhase = 5.0;
        float dashLengths[] = {10, 10};
        CGContextSetLineDash(context, dashPhase, dashLengths, 2);
        CGContextSetFillColorWithColor(context, [[UIColor lightGrayColor] CGColor]);
        CGContextFillPath(context);
        CGContextSetLineWidth(context, 2);
        CGFloat gray[4] = {0.5f, 0.5f, 0.5f, 1.0f};
        CGContextSetStrokeColor(context, gray);

        self.brushSize = 5;
        self.brushColor = [UIColor lightGrayColor];
        self.previousPoint2 = self.previousPoint1;
        self.previousPoint1 = previous;
        self.currentPoint = current;

        // calculate mid point
        self.mid1 = [self pointBetween:self.previousPoint1 andPoint:self.previousPoint2];
        self.mid2 = [self pointBetween:self.currentPoint andPoint:self.previousPoint1];

        if (self.paths.count == 0)
        {
            UIBezierPath *newPath = [UIBezierPath bezierPath];
            CGContextBeginPath(context);
            [newPath moveToPoint:self.mid1];
            [newPath addLineToPoint:self.mid2];
            [self.paths addObject:newPath];
            CGContextClosePath(context);
        }
        else
        {
            UIBezierPath *lastPath = [self.paths lastObject];
            CGContextBeginPath(context);
            [lastPath addLineToPoint:self.mid2];
            [self.paths replaceObjectAtIndex:[self.paths indexOfObject:[self.paths lastObject]] withObject:lastPath];
            CGContextClosePath(context);
        }

        // Save
        [self.pathColors addObject:self.brushColor];
        self.needsToRedraw = YES;
        [self setNeedsDisplayInRect:[self dirtyRect]];
        //[self setNeedsDisplay];
    }
}
Why is this happening and why is the inside of the path not being filled with colour?
Your code has a few issues:
You should be doing your drawing in your view's drawRect: method, not the touch handler.
You never set the variable context to the current context. Do that with UIGraphicsGetCurrentContext(), again within your drawRect: method.
You go through the trouble of creating a UIBezierPath object, but you never use it. Do that by calling CGContextAddPath( context, newPath.CGPath ), changing the variable name as needed in the two places you appear to be using a UIBezierPath.
Keep the call to setNeedsDisplayInRect: in your touch handler method. That tells the system to do the work to update your view using the drawing that your (yet to be implemented) drawRect: method draws.
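Putting those pieces together, a drawRect: along these lines is one possible shape for the fix (a sketch only, assuming self.paths holds the UIBezierPath objects built up in your touch handler):
// Sketch only: assumes self.paths holds the UIBezierPath objects built up in
// the touch handler, as in the question's code.
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();   // the current context

    CGFloat dashLengths[] = {10, 10};
    CGContextSetLineDash(context, 5.0, dashLengths, 2);
    CGContextSetLineWidth(context, 2);
    CGContextSetStrokeColorWithColor(context, [[UIColor grayColor] CGColor]);
    CGContextSetFillColorWithColor(context, [[UIColor lightGrayColor] CGColor]);

    for (UIBezierPath *path in self.paths) {
        CGContextAddPath(context, path.CGPath);     // actually use the paths
    }
    CGContextClosePath(context);
    // Fill and stroke in one pass so the dotted outline and the fill both appear.
    CGContextDrawPath(context, kCGPathFillStroke);
}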

lines are not being drawn on overlay view

I am trying to draw a straight line between two points in an overlay view.
In my MKOverlayView subclass's drawMapRect:zoomScale:inContext: method, I think I am doing this correctly, but I don't understand why it's not drawing any lines...
Does anyone know why?
- (void)drawMapRect:(MKMapRect)mapRect
          zoomScale:(MKZoomScale)zoomScale
          inContext:(CGContextRef)context
{
    UIGraphicsPushContext(context);

    MKMapRect theMapRect = [[self overlay] boundingMapRect];
    CGRect theRect = [self rectForMapRect:theMapRect];

    // Clip the context to the bounding rectangle.
    CGContextAddRect(context, theRect);
    CGContextClip(context);

    CGPoint startP = {theMapRect.origin.x, theMapRect.origin.y};
    CGPoint endP = {theMapRect.origin.x + theMapRect.size.width,
                    theMapRect.origin.y + theMapRect.size.height};

    CGContextSetLineWidth(context, 3.0);
    CGContextSetStrokeColorWithColor(context, [UIColor blueColor].CGColor);
    CGContextBeginPath(context);
    CGContextMoveToPoint(context, startP.x, startP.y);
    CGContextAddLineToPoint(context, endP.x, endP.y);
    CGContextStrokePath(context);

    UIGraphicsPopContext();
}
Thank you for your help.
The line is being drawn using startP and endP, which are CGPoint values, but they are initialized from theMapRect, which contains MKMapPoint values.
Instead, initialize them using theRect which you are converting from theMapRect using rectForMapRect.
Also, for the line width, you may want to scale it using the MKRoadWidthAtZoomScale function. Otherwise, a fixed line width of 3.0 will not be visible unless you are zoomed in very close.
The changed code would look like this:
CGPoint startP = {theRect.origin.x, theRect.origin.y};
CGPoint endP = {theRect.origin.x + theRect.size.width,
theRect.origin.y + theRect.size.height};
CGContextSetLineWidth(context, 3.0 * MKRoadWidthAtZoomScale(zoomScale));
Finally, instead of a custom MKOverlayView, why not use an MKPolylineView with an MKPolyline overlay to avoid drawing the lines manually?
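For reference, the MKPolyline approach might look roughly like this (a sketch only; the coordinate values and the mapView reference are placeholders, not taken from the question):
// Sketch: build a two-point polyline overlay. The coordinates and the
// `mapView` reference are illustrative placeholders.
CLLocationCoordinate2D coords[2] = {
    CLLocationCoordinate2DMake(37.331, -122.031),
    CLLocationCoordinate2DMake(37.783, -122.417)
};
MKPolyline *line = [MKPolyline polylineWithCoordinates:coords count:2];
[mapView addOverlay:line];

// In the MKMapViewDelegate, return an MKPolylineView for that overlay;
// it handles scaling the line width across zoom levels for you.
- (MKOverlayView *)mapView:(MKMapView *)mapView viewForOverlay:(id <MKOverlay>)overlay
{
    if ([overlay isKindOfClass:[MKPolyline class]]) {
        MKPolylineView *lineView = [[MKPolylineView alloc] initWithPolyline:(MKPolyline *)overlay];
        lineView.strokeColor = [UIColor blueColor];
        lineView.lineWidth = 3.0;
        return lineView;
    }
    return nil;
}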

Serious issues with getting touch coordinates

I have a simple application that shows the user an image (in a UIImageView); they touch a part of the screen, and then on the next screen the same image is shown (in another UIImageView) with a "signature" overlaid on it. The issue I am having is that the x,y coordinates returned from the touches do not seem to map properly onto the UIImageView on the second screen.
For instance, when I touch the bottom of the screen I get: X: 157.000000 Y: 358.000000
but the bottom of the screen should be 480, since my screen dimensions are: SCREEN HEIGHT AND WIDTH AT X: 320.000000 Y: 480.000000. This causes the signature to be placed at a different spot than the user intended.
I use the following code to get my touch coordinates:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"TOUCH HAPPENED");
    UITouch *touch = [touches anyObject];
    CGPoint touchCoordinates = [touch locationInView:self.view];
    NSLog(@"X: %f Y: %f", touchCoordinates.x, touchCoordinates.y);
}
I am using the following code to place the "signature" with the values I get from the touch:
CGRect screenRect = [[UIScreen mainScreen] bounds];
UIGraphicsBeginImageContext(screenRect.size);
[pageImage drawInRect:CGRectMake(0, 0, screenRect.size.width, screenRect.size.height)]; // this is the original image
NSLog(@"SCREEN HEIGHT AND WIDTH AT X: %f Y: %f", screenRect.size.width, screenRect.size.height);
NSLog(@"PLACING IT AT TOUCH COORDINATES X: %f Y: %f", sigLocation.x, sigLocation.y);
UIImage *shieldLogo = [UIImage imageNamed:@"shield_bw.gif"];
[shieldLogo drawInRect:CGRectMake(sigLocation.x, sigLocation.y, shieldLogo.size.width/2, shieldLogo.size.height/2)];
[theSignature drawAtPoint:CGPointMake(sigLocation.x + 24.000000, sigLocation.y) withFont:[UIFont fontWithName:@"Arial" size:8.0]];
[theSignature2 drawAtPoint:CGPointMake(sigLocation.x + 24.000000, sigLocation.y + 8.000000) withFont:[UIFont fontWithName:@"Arial" size:8.0]];
UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[image setImage:resultingImage];
The issue might be with the view the user is touching. You might try the following (as seen in this link):
UITouch *touch = [touches anyObject];
CGPoint pos = [touch locationInView:[UIApplication sharedApplication].keyWindow];
NSLog(@"Position of touch: %.3f, %.3f", pos.x, pos.y);
This should give the location of the touch relative to the screen.
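If you later need the same point expressed in another view's coordinate space (for example, the image view that receives the signature), UIView's convertPoint:toView: can do the mapping. A small sketch, where targetView is a placeholder for whichever view you are mapping into:
// Sketch: convert a touch point from one view's coordinate space to another's.
// `targetView` is a placeholder; both views must be in the same window.
CGPoint pointInSelf = [touch locationInView:self.view];
CGPoint pointInTarget = [self.view convertPoint:pointInSelf toView:targetView];
NSLog(@"Converted point: %f %f", pointInTarget.x, pointInTarget.y);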

draw a line from a stationary point to a moving point on iphone

How can I draw a line from one point (the center of one UIView) to a point that moves (the touch location), so that the line's second endpoint follows the touch as it moves?
In your custom view:
In touchesMoved:withEvent:, store the current point in a variable and call [self setNeedsDisplay] so that the view redraws.
Implement the line drawing in drawRect:, using Core Graphics.
Let's say you store the touched point in a property self.touchedPoint; then the drawing might look like this:
@property (nonatomic, assign) CGPoint touchedPoint;

- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSaveGState(context);
    CGContextTranslateCTM(context, 0.0, rect.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextSetShouldAntialias(context, YES);
    CGContextSetLineWidth(context, 1.0f);
    CGContextSetRGBStrokeColor(context, 0.7, 0.7, 0.7, 1.0);
    CGContextMoveToPoint(context, rect.size.width/2, rect.size.height/2);
    CGContextAddLineToPoint(context, self.touchedPoint.x, self.touchedPoint.y);
    CGContextDrawPath(context, kCGPathStroke);
    CGContextRestoreGState(context);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.touchedPoint = [[touches anyObject] locationInView:self];
    [self setNeedsDisplay];
}
I voted Michal's answer up, but I would also suggest looking at the Touches sample project. It is easy to get running, which may be helpful if you are still putting your project together.