Drawing on the iPhone in Objective-C [duplicate]

Possible Duplicate:
Drawing on the iPhone in objective c
I have written this...
In the .h file:
#import <UIKit/UIKit.h>
@interface DrawView : UIView {
CGPoint gestureStartPoint,currentPosition;
CGContextRef c;
UIBezierPath *currentPath;
}
@property (nonatomic, retain) UIBezierPath *currentPath;
@end
And in the .m file:
#import "DrawView.h"
@implementation DrawView
@synthesize currentPath;
- (id)initWithFrame:(CGRect)frame {
[self drawRect:CGRectMake(0, 0, 320, 480)];
self = [super initWithFrame:frame];
if (self) {
currentPath = [[UIBezierPath alloc]init];
currentPath.lineWidth=3;
}
return self;
}
- (void)drawRect:(CGRect)rect {
[[UIColor redColor] set];
[currentPath strokeWithBlendMode:kCGBlendModeNormal alpha:1.0];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
currentPosition = [touch locationInView:self];
[currentPath addLineToPoint:(currentPosition)];
[self setNeedsDisplay];
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
gestureStartPoint = [touch locationInView:self];
[currentPath moveToPoint:(gestureStartPoint)];
}
- (void)dealloc {
[super dealloc];
}
@end
I want to let the user draw an image on the iPhone screen and then use that image for the game... but this doesn't draw anything...

This is because your currentPath is never allocated.
If the view is instantiated from a nib or storyboard, [UIView initWithFrame:] is never called.
Implement [UIView initWithCoder:] and allocate the path there as well.
- (void)commonInit
{
currentPath = [[UIBezierPath alloc] init];
currentPath.lineWidth=3;
}
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self)
{
[self commonInit];
}
return self;
}
- (id)initWithCoder:(NSCoder*)aDecoder
{
self = [super initWithCoder:aDecoder];
if (self)
{
[self commonInit];
}
return self;
}
Alternatively, you can create the path lazily so it is allocated the first time touchesBegan: needs it.
- (UIBezierPath*)currentPath
{
if (currentPath == nil)
{
currentPath = [[UIBezierPath alloc] init];
currentPath.lineWidth=3;
}
return currentPath;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
gestureStartPoint = [touch locationInView:self];
[[self currentPath] moveToPoint:(gestureStartPoint)];
}
However, there are some problems in the code.
You call [self drawRect:CGRectMake(0, 0, 320, 480)] before self = [super initWithFrame:frame]. You must not call any method on self before the superclass initializer has run; put your code only inside
if (self != nil)
{
// your code
}
currentPath leaks. Release it in dealloc.
Never call drawRect directly. Call [UIView setNeedsDisplay] instead.
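Putting those fixes together, a minimal sketch of the corrected initializer and dealloc (assuming manual reference counting, as in the question) could look like this:
- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self != nil) {
        // Do not call drawRect: here; the system calls it when the view needs drawing.
        currentPath = [[UIBezierPath alloc] init];
        currentPath.lineWidth = 3;
    }
    return self;
}
- (void)dealloc {
    [currentPath release]; // balances the alloc above, so currentPath no longer leaks
    [super dealloc];
}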

Related

Making a slate in Xcode

I am just beginning to develop with Xcode, and I am working through some exercises. I am trying to do a simple one, a drawing example on the iPhone: a slate with touch events. But I am doing something wrong, and I am a little bit lost.
I created my HDPizarra class and I am trying to store the strokes. I have an Erase button with an IBAction that should clear the slate and redraw the view; I draw the lines in red with a width of 2, and I record the touch locationInView: for the start and end points. But it doesn't work: the view appears, but when I touch the screen nothing happens. This is the code:
-(IBAction)borrar: (id)sender
{
[lineas removeAllObjects];
}
- (id)initWithFrame: (CGRect)frame
{
self = [super initWithFrame:frame];
if (self) {
}
return self;
}
-(NSMutableArray *)lineas{
if (_lineas == nil) {
_lineas = [NSMutableArray new];
}
return _lineas;
}
- (void)drawRect: (CGRect)rect
{
if (!lineas) {
lineas = [[NSMutableArray alloc] initWithCapacity:0];
}
CGContextRef contexto = UIGraphicsGetCurrentContext();
UIColor * rojo = [UIColor redColor];
CGContextSetStrokeColorWithColor(contexto, rojo.CGColor);
CGContextSetLineWidth(contexto, 2);
CGContextMoveToPoint(contexto, inicio.x, inicio.y);
CGContextAddLineToPoint(contexto, fin.x, fin.y);
CGContextStrokePath(contexto);
}
-(void)touchesBegan: (NSSet *)touches withEvent: (UIEvent *)event{
UITouch * touch = [touches anyObject];
inicio = [touch locationInView:self];
fin = [touch locationInView:self];
[self setNeedsDisplay];
}
-(void)touchesMoved: (NSSet *)touches withEvent: (UIEvent *)event{
UITouch * touch = [touches anyObject];
fin = [touch locationInView:self];
HDLinea * linea = [[HDLinea alloc] initWithRect:CGRectMake(inicio.x, inicio.y, fin.x, fin.y)];
[self.lineas addObject: linea];
inicio.x = fin.x;
inicio.y = fin.y;
[self setNeedsDisplay];
}
The strokes are HDLinea objects kept in an NSMutableArray; the .h code is:
#import <Foundation/Foundation.h>
@interface HDLinea : NSObject
@property (readwrite) CGRect puntos;
-(id) initWithRect: (CGRect) puntos;
@end
And the implementation is:
#import "HDLinea.h"
@implementation HDLinea
@synthesize puntos = _puntos;
- (id)initWithRect: (CGRect)puntos
{
self = [super init];
if (self) {
self.puntos = puntos;
}
return self;
}
@end
Any idea? Thanks a lot!!

How to add a magnifier to custom control?

How do I add a magnifier to a custom control? The control is a subclass of UIView. (It's a UIWebView, but the native magnification functionality doesn't work on some pages.)
UPDATED:
Maybe it's possible to force the magnifier to be drawn on a UIWebView?
1. Add the following files to your project:
MagnifierView.h:
//
// MagnifierView.h
// SimplerMaskTest
//
#import <UIKit/UIKit.h>
@interface MagnifierView : UIView {
UIView *viewToMagnify;
CGPoint touchPoint;
}
@property (nonatomic, retain) UIView *viewToMagnify;
@property (assign) CGPoint touchPoint;
@end
MagnifierView.m:
//
// MagnifierView.m
// SimplerMaskTest
//
#import "MagnifierView.h"
#import <QuartzCore/QuartzCore.h>
@implementation MagnifierView
@synthesize viewToMagnify;
@dynamic touchPoint;
- (id)initWithFrame:(CGRect)frame {
return [self initWithFrame:frame radius:118];
}
- (id)initWithFrame:(CGRect)frame radius:(int)r {
int radius = r;
if ((self = [super initWithFrame:CGRectMake(0, 0, radius, radius)])) {
//Make the layer circular.
self.layer.cornerRadius = radius / 2;
self.layer.masksToBounds = YES;
}
return self;
}
- (void)setTouchPoint:(CGPoint)pt {
touchPoint = pt;
// whenever touchPoint is set, update the position of the magnifier (to just above what's being magnified)
self.center = CGPointMake(pt.x, pt.y-66);
}
- (CGPoint)getTouchPoint {
return touchPoint;
}
- (void)drawRect:(CGRect)rect {
CGContextRef context = UIGraphicsGetCurrentContext();
CGRect bounds = self.bounds;
CGImageRef mask = [UIImage imageNamed: @"loupe-mask@2x.png"].CGImage;
UIImage *glass = [UIImage imageNamed: @"loupe-hi@2x.png"];
CGContextSaveGState(context);
CGContextClipToMask(context, bounds, mask);
CGContextFillRect(context, bounds);
CGContextScaleCTM(context, 1.2, 1.2);
//draw your subject view here
CGContextTranslateCTM(context,1*(self.frame.size.width*0.5),1*(self.frame.size.height*0.5));
//CGContextScaleCTM(context, 1.5, 1.5);
CGContextTranslateCTM(context,-1*(touchPoint.x),-1*(touchPoint.y));
[self.viewToMagnify.layer renderInContext:context];
CGContextRestoreGState(context);
[glass drawInRect: bounds];
}
- (void)dealloc {
[viewToMagnify release];
[super dealloc];
}
@end
TouchReader.h:
//
// TouchReader.h
// SimplerMaskTest
//
#import <UIKit/UIKit.h>
#import "MagnifierView.h"
@interface TouchReader : UIView {
NSTimer *touchTimer;
MagnifierView *loop;
}
@property (nonatomic, retain) NSTimer *touchTimer;
- (void)addLoop;
- (void)handleAction:(id)timerObj;
@end
TouchReader.m:
//
// TouchReader.m
// SimplerMaskTest
//
#import "TouchReader.h"
#import "MagnifierView.h"
@implementation TouchReader
@synthesize touchTimer;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
self.touchTimer = [NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(addLoop) userInfo:nil repeats:NO];
// just create one loop and re-use it.
if (loop == nil) {
loop = [[MagnifierView alloc] init];
loop.viewToMagnify = self;
}
UITouch *touch = [touches anyObject];
loop.touchPoint = [touch locationInView:self];
[loop setNeedsDisplay];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
[self handleAction:touches];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
[self.touchTimer invalidate];
self.touchTimer = nil;
[loop removeFromSuperview];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
[self.touchTimer invalidate];
self.touchTimer = nil;
[loop removeFromSuperview];
}
- (void)addLoop {
// add the loop to the superview. if we add it to the view it magnifies, it'll magnify itself!
[self.superview addSubview:loop];
// here, we could do some nice animation instead of just adding the subview...
}
- (void)handleAction:(id)timerObj {
NSSet *touches = timerObj;
UITouch *touch = [touches anyObject];
loop.touchPoint = [touch locationInView:self];
[loop setNeedsDisplay];
}
- (void)dealloc {
[loop release];
loop = nil;
[super dealloc];
}
@end
Based on: http://coffeeshopped.com/2010/03/a-simpler-magnifying-glass-loupe-view-for-the-iphone
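As an aside, the comment in addLoop hints at animating the loupe in instead of just adding it; a minimal sketch of that idea (the fade-in and its duration are my own assumptions, not part of the original code) could be:
- (void)addLoop {
    // Add the loop to the superview; adding it to the view it magnifies would magnify itself.
    loop.alpha = 0.0;
    [self.superview addSubview:loop];
    // Fade the loupe in rather than popping it onto the screen.
    [UIView animateWithDuration:0.15 animations:^{
        loop.alpha = 1.0;
    }];
}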
2. Add the following images:
Images used in the code:
loupe-hi@2x.png:
loupe-mask@2x.png:
Original but centered images with a shadow (not used at the moment):
loupe-shadow-hi@2x.png:
loupe-shadow-mask@2x.png:
3. Replace the main UIView in your xib file with TouchReader.
The magnifier will work automatically, except for controls that capture touch events themselves (for example, UIWebView). The code above also doesn't support the shadowed images. Please add a new answer to the question if you successfully fix this issue.
UPDATED:
Change the following code to add UIWebView support (for a plain UIView, the original version above should remain as it is).
@interface TouchReader : UILongPressGestureRecognizer
And add the gesture recognizer to the web view:
TouchReader* gestureMagnifier = [[[TouchReader alloc] initWithTarget:self action:@selector(handleMagnifier:)] autorelease];
gestureMagnifier.webView = editSource;
gestureMagnifier.delegate = self;
gestureMagnifier.minimumPressDuration = 0.5;
[webView addGestureRecognizer:gestureMagnifier];
TouchReader.h:
//- (void)handleAction:(id)timerObj;
-(void) handleGestureAction:(CGPoint)location;
TouchReader.m:
-(void)awakeFromNib
{
UILongPressGestureRecognizer * longPressGesture = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)];
[self addGestureRecognizer:longPressGesture];
}
-(void)handleGesture:(UILongPressGestureRecognizer *)longPressGesture
{
CGPoint location = [longPressGesture locationInView:self];
switch (longPressGesture.state) {
case UIGestureRecognizerStateBegan:
self.touchTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
target:self
selector:@selector(addLoop)
userInfo:nil
repeats:NO];
// just create one loop and re-use it.
if(loop == nil){
loop = [[MagnifierView alloc] init];
loop.viewToMagnify = self;
}
loop.touchPoint = location;
[loop setNeedsDisplay];
break;
case UIGestureRecognizerStateChanged:
[self handleGestureAction:location];
break;
case UIGestureRecognizerStateEnded:
[self.touchTimer invalidate];
self.touchTimer = nil;
[loop removeFromSuperview];
loop=nil;
break;
default:
break;
}
}
- (void)addLoop {
// add the loop to the superview. if we add it to the view it magnifies, it'll magnify itself!
[self.superview addSubview:loop];
}
-(void) handleGestureAction:(CGPoint)location
{
loop.touchPoint = location;
[loop setNeedsDisplay];
}
You don't need the touches... methods any more.

UIImage can not detect touch

I used the code below to detect touches on a UIImageView:
//
#import <UIKit/UIKit.h>
@interface KUIImageView : UIImageView
{
CGPoint startLocation;
}
@end
@implementation KUIImageView
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self) {
// Initialization code
}
return self;
}
- (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event {
// Retrieve the touch point
CGPoint pt = [[touches anyObject] locationInView:self];
startLocation = pt;
[[self superview] bringSubviewToFront:self];
}
- (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event {
// Move relative to the original touch point
CGPoint pt = [[touches anyObject] locationInView:self];
CGRect frame = [self frame];
frame.origin.x += pt.x - startLocation.x;
frame.origin.y += pt.y - startLocation.y;
[self setFrame:frame];
}
@end
I set a breakpoint at touchesBegan,
but it was never triggered.
The image view was added to a UIScrollView using
[aScrollView addSubview:aUIImageView];
Any comments are welcome.
- (id)initWithFrame:(CGRect)frame
{
self = [super initWithFrame:frame];
if (self) {
// here
self.userInteractionEnabled = YES;
}
return self;
}
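For completeness, a short usage sketch (the frame, image name, and variable names are placeholders, not from the original question); the key point is that UIImageView has userInteractionEnabled set to NO by default, so it must be switched on, whether in initWithFrame: as above or at the call site:
KUIImageView *imageView = [[KUIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
imageView.image = [UIImage imageNamed:@"example.png"]; // placeholder image name
imageView.userInteractionEnabled = YES; // redundant if already set in initWithFrame:, but harmless
[aScrollView addSubview:imageView];
[imageView release]; // assuming manual reference counting, as in the question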

Schedule Update from Another Layer

I have a sprite that I need to rotate with touch, but it is located in a different layer. Is it possible to update its position?
E.g. the sprite has its own layer, but its position needs to be updated from within the main game scene.
Here is what I have so far.
#import <Foundation/Foundation.h>
#import "cocos2d.h"
#import "GameScene.h"
@interface G : CCLayer {
CCSprite *g;
CGFloat gRotation;
}
@end
------------------------------------------
#import "G.h"
@implementation G
-(id) init
{
if ((self = [super init]))
{
CCLOG(@"%@: %@", NSStringFromSelector(_cmd), self);
g = [CCSprite spriteWithFile:@"g.png"];
[self addChild:g z:-1];
//[self scheduleUpdate];
if (g.rotation == 360)
{
[self unscheduleUpdate];
}
}
return self;
}
- (void)update:(ccTime)delta
{
g.rotation = gRotation;
}
- (void)ccTouchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
}
- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
CGPoint firstLocation = [touch previousLocationInView:[touch view]];
CGPoint location = [touch locationInView:[touch view]];
CGPoint touchingPoint = [[CCDirector sharedDirector] convertToGL:location];
CGPoint firstTouchingPoint = [[CCDirector sharedDirector] convertToGL:firstLocation];
CGPoint firstVector = ccpSub(firstTouchingPoint, g.position);
CGFloat firstRotateAngle = -ccpToAngle(firstVector);
CGFloat previousTouch = CC_RADIANS_TO_DEGREES(firstRotateAngle);
CGPoint vector = ccpSub(touchingPoint, g.position);
CGFloat rotateAngle = -ccpToAngle(vector);
CGFloat currentTouch = CC_RADIANS_TO_DEGREES(rotateAngle);
gRotation += currentTouch - previousTouch;
}
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
}
- (void) dealloc
{
CCLOG(@"%@: %@", NSStringFromSelector(_cmd), self);
[super dealloc];
}
@end
GameScene
#import "GameScene.h"
#import "MainMenu.h"
#import "G.h"
@implementation GameScene
+(CCScene *) scene
{
CCScene *scene = [CCScene node];
GameScene *layer = [GameScene node];
[scene addChild: layer];
return scene;
}
-(void) tapG: (id) sender
{
G *gView;
gView = [[G alloc] init];
gView.position = ccp(100, 100);
[self.parent addChild:gView z:1001];
[gView schedule:@selector(update:)];
[gView release];
}
-(id) init
{
if ((self = [super init]))
{
tG = [CCMenuItemImage itemFromNormalImage:@"tp.png" selectedImage:@"tp.png" disabledImage:@"tpaperd.png" target:self selector:@selector(tapG:)];
gt = [CCMenu menuWithItems:tG, nil];
gt.position = ccp(210, 80);
[gt alignItemsHorizontallyWithPadding:10];
[self addChild:gt z:0];
}
return self;
}
- (void) dealloc
{
CCLOG(@"%@: %@", NSStringFromSelector(_cmd), self);
[super dealloc];
}
@end
It brings up the sprite but the sprite does not rotate.
You can schedule updates for another layer.
Are you sure your G layer is set to be touch enabled? Try gView.isTouchEnabled = YES; after allocating gView. You may also need to conform to the protocol: add <CCStandardTouchDelegate> to your CCLayer interface: @interface G : CCLayer <CCStandardTouchDelegate> { ....
You can also schedule the -(void)update:(ccTime)delta method using [gView scheduleUpdate]; it gives the same result but is more convenient.
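Putting those suggestions together, a minimal sketch of tapG: (assuming the cocos2d 1.x API already used in the question) might look like this:
-(void) tapG:(id)sender
{
    G *gView = [[G alloc] init];
    gView.position = ccp(100, 100);
    gView.isTouchEnabled = YES;   // let the layer receive the ccTouches... callbacks
    [self.parent addChild:gView z:1001];
    [gView scheduleUpdate];       // calls the layer's -update: every frame
    [gView release];
}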

How to create a path using Core Graphics?

I want to create a path: something like touching the screen and drawing a line in the touchesMoved: event, and when the line intersects its starting point, filling that path with a colour.
In the image I've drawn a line. I just want to detect when the line intersects the start point again, and then fill that path with my own desired color. I am also using Core Graphics to draw the line, but it's very slow on a real device. Could you tell me a way to improve the speed?
Header:
#import <UIKit/UIKit.h>
@interface myView : UIView {
CGMutablePathRef path;
CGPathRef drawingPath;
CGRect start;
BOOL outsideStart;
}
@end
Implementation:
#import "myView.h"
@implementation myView
- (id) init {
if (self = [super init]) {
self.userInteractionEnabled = YES;
self.multipleTouchEnabled = NO;
}
return self;
}
- (void) finishPath {
if (drawingPath) {
CGPathRelease(drawingPath);
}
CGPathCloseSubpath(path);
drawingPath = CGPathCreateCopy(path);
CGPathRelease(path);
path = NULL;
[self setNeedsDisplay];
return;
}
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
path = CGPathCreateMutable();
UITouch *t = [touches anyObject];
CGPoint p = [t locationInView:self];
start = CGRectZero;
start.origin = p;
start = CGRectInset(start,-10, -10);
outsideStart = NO;
CGPathMoveToPoint(path, NULL, p.x, p.y);
}
- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
if (!path) {
return;
}
UITouch *t = [touches anyObject];
CGPoint p = [t locationInView:self];
if (CGRectContainsPoint(start,p)) {
if (outsideStart) {
[self finishPath];
}
} else {
outsideStart = YES;
}
CGPathAddLineToPoint(path,NULL,p.x,p.y);
[self setNeedsDisplay];
}
- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
if (!path) {
return;
}
[self finishPath];
}
- (void) touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
if (!path) {
return;
}
CGPathRelease(path);
path = NULL;
}
- (void) drawRect:(CGRect)rect {
CGContextRef g = UIGraphicsGetCurrentContext();
if (drawingPath) {
CGContextAddPath(g,drawingPath);
[[UIColor redColor] setFill];
[[UIColor blackColor] setStroke];
CGContextDrawPath(g,kCGPathFillStroke);
}
if (path) {
CGContextAddPath(g,path);
[[UIColor blackColor] setStroke];
CGContextDrawPath(g,kCGPathStroke);
}
}
- (void) dealloc {
if (drawingPath) {
CGPathRelease(drawingPath);
}
if (path) {
CGPathRelease(path);
}
[super dealloc];
}
@end
Note that you will probably want to do something so you aren't actually calling setNeedsDisplay every time the path changes. This can get very slow. Suggestions include having an NSTimer that fires every x milliseconds to check if it needs to redisplay and do so if it does, or only redrawing if the touch has moved a significant distance.
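For example, a minimal sketch of the distance-based throttle (the lastDrawnPoint ivar and the 4-point threshold are assumptions, not part of the original answer) could look like this:
// Additional ivar assumed in the class: CGPoint lastDrawnPoint;
- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!path) {
        return;
    }
    UITouch *t = [touches anyObject];
    CGPoint p = [t locationInView:self];
    if (CGRectContainsPoint(start, p)) {
        if (outsideStart) {
            [self finishPath];
        }
    } else {
        outsideStart = YES;
    }
    CGPathAddLineToPoint(path, NULL, p.x, p.y);
    // Only redraw when the touch has moved a few points since the last redraw.
    CGFloat dx = p.x - lastDrawnPoint.x;
    CGFloat dy = p.y - lastDrawnPoint.y;
    if ((dx * dx + dy * dy) > 16.0) { // roughly 4 points
        lastDrawnPoint = p;
        [self setNeedsDisplay];
    }
}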
This may also be useful to you: link text
You should definitely not be calling [self setNeedsDisplay] every time the path changes; that is probably why your code is so slow on the device. Drawing is slow. If you use OpenGL ES to draw the lines it will be much quicker, at the expense of being more difficult to understand.