I have a UIImage which is shown in a UIImageView. I also have another image in a UIImageView which lies above the first image. I want to be able to drag the second image only within the borders of the first image. To make my goal clearer, look at this image:
[screenshot: a map with a green draggable pin; the area outside the map is blue]
The green pin should be draggable, but it should not be possible to drag the pin into the blue (outside of the map).
At the moment the pin is draggable, but I don't know how to check whether the pin is outside of the map.
EDIT:
I used this method in my UIImageView subclass for the draggable pin:
- (UIColor *)colorAtPosition:(CGPoint)position {
    // Note: `position` must be in the map image's pixel coordinates; if the
    // image view scales the image, convert the view point first, otherwise
    // the sampled colors will look wrong.
    CGRect sourceRect = CGRectMake(position.x, position.y, 1.f, 1.f);
    CGImageRef imageRef = CGImageCreateWithImageInRect([[MapViewController sharedMapViewController] getImage].CGImage, sourceRect);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *buffer = calloc(4, 1); // zero-filled so a failed draw doesn't yield garbage
    CGBitmapInfo bitmapInfo = kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big;
    CGContextRef context = CGBitmapContextCreate(buffer, 1, 1, 8, 4, colorSpace, bitmapInfo);
    CGColorSpaceRelease(colorSpace);

    CGContextDrawImage(context, CGRectMake(0.f, 0.f, 1.f, 1.f), imageRef);
    CGImageRelease(imageRef);
    CGContextRelease(context);

    CGFloat r = buffer[0] / 255.f;
    CGFloat g = buffer[1] / 255.f;
    CGFloat b = buffer[2] / 255.f;
    CGFloat a = buffer[3] / 255.f;
    free(buffer);
    return [UIColor colorWithRed:r green:g blue:b alpha:a];
}
The MapViewController is the view controller that holds the UIImageView for the map, so I made this class a singleton to get the map image. But again, the values I get for the color are totally weird. I also updated the photo because my UI got slightly different.
Simply take the point where the drag currently is and determine the color at that point using this method.
Depending on your setup, you can do this e.g. in touchesMoved:.
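For illustration, here is a minimal sketch of that check, assuming it lives in the draggable pin's UIImageView subclass and that isBlueColor: stands for whatever blue test you use (one possible version appears further below). Note that colorAtPosition: samples the map image in its own pixel coordinates, so the point may need converting if the image view scales the image.

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self.superview];

    UIColor *color = [self colorAtPosition:point];
    if (![self isBlueColor:color]) {
        self.center = point; // only follow the finger while we stay on the map
    }
}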
Have you tried implementing the UIResponder's touch methods in your custom UIViewController and then referencing the 2 UIViews as follows?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch* t = [touches anyObject];
    // pt must be computed before it is used in the checks below
    CGPoint pt = [t locationInView:yourMapView];
    if (t.tapCount == 1 && [yourPinView pointInside:[t locationInView:yourPinView] withEvent:nil])
    {
        if ([self getColorOfPt:pt] != <blue>)
        {
            state = MOVING;
        }
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (MOVING == state)
    {
        UITouch* t = [touches anyObject];
        CGPoint pt = [t locationInView:yourMapView];
        // Move the pin only if the current point color is not blue.
        if ([self getColorOfPt:pt] != <blue>)
        {
            CGRect r = yourPinView.frame;
            r.origin.x = <new point based on diff>;
            r.origin.y = <new point based on diff>;
            yourPinView.frame = r;
        }
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch* t = [touches anyObject];
    if (t.tapCount == 1 && MOVING == state)
    {
        state = NOT_MOVING; // assignment, not comparison
    }
}
You could try an approach that picks the color at a CGPoint of the UIImage: iPhone Objective C: How to get a pixel's color of the touched point on an UIImageView? and detect whether it is 'a blue color'. If it is a blue color, don't (re)position the pin.
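For example, a hypothetical helper for the "is it blue?" test; the thresholds here are pure assumptions and would need tuning to the actual water color of the map:

// Hypothetical helper, not from the original post: treat a color as "blue"
// when its blue component clearly dominates red and green.
- (BOOL)isBlueColor:(UIColor *)color
{
    CGFloat r, g, b, a;
    if (![color getRed:&r green:&g blue:&b alpha:&a]) {
        return NO; // not an RGB-compatible color
    }
    return (b > 0.6f && r < 0.4f && g < 0.5f);
}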
You can use the answers to these questions to get the pixel color:
Get Pixel color of UIImage
How to get the RGB values for a pixel on an image on the iphone
I also found a nice tutorial: What Color is My Pixel? Image based color picker on iPhone
Then detect whether it is a blue color, and if it is, then as @basvk said, just don't (re)position the pin.
Hope you get something from this.
I'm just giving you an idea.
I hope the map is an SVG file; if it's a JPEG, it can easily be converted to SVG using Adobe Illustrator, and there are many other options to do it as well.
And here you can find how to convert an SVG to a path.
Here you can find how to convert EPS (Illustrator paths) to CGPath.
Then you can check whether the touch point lies within the path (mapPath below stands for the CGPath built from the map outline):
if (CGPathContainsPoint(mapPath, NULL, myPoint, false)) {
    // inside the map: allow the move
}
else {
    // outside the map: ignore it
}
Related
This is my first question, so please bear with me!
I'm trying to write a simple drawing app. I was using Core Graphics before, and the only problem was that it was too slow; when I drew with my finger it lagged, a hell of a lot!
So now I'm trying to use UIBezier paths to draw, which I understood to be a lot faster, and it is!
When I was using Core Graphics, to keep the drawing speed up I was drawing into a custom bitmap context I created, which was constantly being updated as I drew.
So I drew into my custom bitmap context, and then a CGImageRef was set to what was drawn in that context using -
cacheImage = CGBitmapContextCreateImage(imageContext);
and that was then drawn back into the bitmap context using -
CGContextDrawImage(imageContext, self.bounds, cacheImage);
I also did this so when I changed the colour of the line being drawn, the rest of the drawing stayed as it was previously drawn, if that makes sense.
Now the problem I've come across is this.
I'm trying to draw the UIBezier path into my image context using -
imageContext = UIGraphicsGetCurrentContext();
UIGraphicsPushContext(imageContext);
[path stroke];
if(imageContext != nil){
cacheImage = CGBitmapContextCreateImage(imageContext); //invalid context here so added to solve
}
CGContextScaleCTM(imageContext, 1, -1); // using this as UIBezier
CGContextDrawImage(imageContext, self.bounds, cacheImage); // draws the current context;
[path removeAllPoints];
CGImageRelease(cacheImage); // releases the image to solve memory errors.
with path being my UIBezierPath. All the path setup is done in touchesBegan and touchesMoved, and then [self setNeedsDisplay]; is called to trigger drawRect.
What's happening is that when I draw, either the CGImageRef isn't being drawn into the context properly, or it is, but when the cache image is captured it picks up a white background from somewhere instead of just the path. The entire image then gets pasted over with the last path drawn on top of a white fill, so you can't see the earlier paths that built the image up, even though the view's background color is clearColor.
I really hope I'm making sense; I've just spent too many hours on this and it's drained me completely. Here are the drawing methods I'm using.
This creates the image context -
-(CGContextRef) myCreateBitmapContext: (int) pixelsWide:(int) pixelsHigh{
imageContext = NULL;
CGColorSpaceRef colorSpace; // creating a colorspaceref
void * bitmapData; // creating bitmap data
int bitmapByteCount; // the bytes per count
int bitmapBytesPerRow; // number of bytes per row
bitmapBytesPerRow = (pixelsWide * 4); // calculating how many bytes per row the context needs
bitmapByteCount = (bitmapBytesPerRow * pixelsHigh); // how many bytes there are in total
colorSpace = CGColorSpaceCreateDeviceRGB(); // setting the colourspaceRef
bitmapData = malloc( bitmapByteCount ); // allocate the pixel buffer
if (bitmapData == NULL)
{
//NSLog(#"Memory not allocated!");
return NULL;
}
imageContext = CGBitmapContextCreate (bitmapData,
pixelsWide,
pixelsHigh,
8, // bits per component
bitmapBytesPerRow,
colorSpace,kCGImageAlphaPremultipliedLast);
if (imageContext== NULL)
{
free (bitmapData);
NSLog(#"context not created allocated!");
return NULL;
}
CGColorSpaceRelease( colorSpace ); //releasing the colorspace
CGContextSetRGBFillColor(imageContext, 1.0, 1.0, 1.0, 0.0); // fill the bitmap with fully transparent white (alpha 0.0)
CGContextFillRect(imageContext, self.bounds);
CGContextSetShouldAntialias(imageContext, YES);
return imageContext;
}
And here's my drawing -
-(void)drawRect:(CGRect)rect
{
DataClass *data = [DataClass sharedInstance];
[data.lineColor setStroke];
[path setLineWidth:data.lineWidth];
imageContext = UIGraphicsGetCurrentContext();
UIGraphicsPushContext(imageContext);
[path stroke];
if(imageContext != nil){
cacheImage = CGBitmapContextCreateImage(imageContext);
}
CGContextScaleCTM(imageContext, 1, -1); // this one
CGContextDrawImage(imageContext, self.bounds, cacheImage); // draws the current context;
[path removeAllPoints];
CGImageRelease(cacheImage); // releases the image to solve memory errors.
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{
DataClass *data = [DataClass sharedInstance];
CGContextSetStrokeColorWithColor(imageContext, [data.lineColor CGColor]);
ctr = 0;
UITouch *touch2 = [touches anyObject];
pts[0] = [touch2 locationInView:self];
}
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint p = [touch locationInView:self];
ctr++;
pts[ctr] = p;
if (ctr == 4)
{
pts[3] = CGPointMake((pts[2].x + pts[4].x)/2.0, (pts[2].y + pts[4].y)/2.0); // move the endpoint to the middle of the line joining the second control point of the first Bezier segment and the first control point of the second Bezier segment
[path moveToPoint:pts[0]];
[path addCurveToPoint:pts[3] controlPoint1:pts[1] controlPoint2:pts[2]]; // add a cubic Bezier from pt[0] to pt[3], with control points pt[1] and pt[2]
//[data.lineColor setStroke];
[self setNeedsDisplay];
// replace points and get ready to handle the next segment
pts[0] = pts[3];
pts[1] = pts[4];
ctr = 1;
}
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event{
[path removeAllPoints];
[self setNeedsDisplay];
ctr = 0;
}
'path' is my UIBezierPath
'cacheImage' is a CGImageRef
'imageContext' is a CGContextRef
Any help is much appreciated! And if you can think of a better way to do it, please let me know! I do, however, need the cache image to have a transparent background, so it's just the paths that are visible, as I'm going to apply something later on once I get this working!
EDIT: Also, I'm removing the points every time to keep the drawing speed up, just so you know!
Thanks in advance :)
Well, this is a big question. One potential lead would be to verify that you draw exactly what you need (not the whole image every time), and to separate the invariant bitmap from the regions/rects that actively mutate, possibly across multiple layers.
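For illustration, a minimal sketch of that caching idea, assuming a UIView subclass with a path property (UIBezierPath) and a cacheImage property (UIImage); it replaces the manual bitmap context with UIKit's image context, which keeps the background transparent:

- (void)drawRect:(CGRect)rect
{
    // Draw the frozen strokes first, then the stroke in progress.
    // Assumes the stroke color was already set via -[UIColor setStroke].
    [self.cacheImage drawInRect:self.bounds];
    [self.path stroke];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Flatten the finished stroke into the cache. Passing NO for the
    // opaque flag keeps the cached image's background transparent.
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
    [self.cacheImage drawInRect:self.bounds];
    [self.path stroke];
    self.cacheImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    [self.path removeAllPoints];
    [self setNeedsDisplay];
}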
I am trying to develop a game using OpenGL where I have used the GLTextureLoader class to load images; the sprites move from left to right with some calculated velocity, and I need to detect touches on these images.
Since your purpose is very simple, all you have to do is draw whatever object you have twice: the visible one, and another one filled with a flat identifying color in an invisible buffer. Then you check the location where the user pressed in the invisible buffer, see what color it is, and there you have your object.
http://www.lighthouse3d.com/opengl/picking/index.php3?color1
That is the basic theory.
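A hedged sketch of the read-back step of that technique (viewHeight is an assumed variable holding the GL view's height in pixels; glReadPixels uses a bottom-left origin, so y is flipped):

// After rendering the pick pass into the invisible buffer:
GLubyte pixel[4];
glReadPixels((GLint)touchPoint.x,
             (GLint)(viewHeight - touchPoint.y),
             1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);
// pixel[0..2] now hold the unique pick color that identifies the object.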
OpenGL is a rendering API. It only draws stuff.
Techniques like lighthouse3d do work, but glReadPixels is slow.
You should check this on the CPU; that is, for each drawn sprite, test if the touch position is inside.
I found out how to do it for my requirement. As I said, I am not an expert in OpenGL, but I managed to do it in a somewhat roundabout way.
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[touches allObjects] objectAtIndex:0];
CGPoint touchLocation = [touch locationInView:self.view];
touchLocation = CGPointMake(touchLocation.x, 320 - touchLocation.y); // flip y: GL screen coordinates start at the bottom-left
//self.player.moveVelocity = GLKVector2Make(50, 50);
//NSLog(@"Location in view %@", NSStringFromCGPoint(touchLocation));
//NSLog(@"bounding box is %@", NSStringFromCGRect([self.player boundingBox]));
GameSprite *temp;
for (GameSprite *tempSprite in self.children) {
if (CGRectContainsPoint([tempSprite boundingBox], touchLocation)) {
NSLog(#"touched the player");
temp =tempSprite;
}
}
[self.children removeObject:temp];
}
- (CGRect)boundingBox {
CGRect rect = CGRectMake(self.position.x, self.position.y, self.contentSize.width, self.contentSize.height);
return rect;
}
I have been making a paint application and have successfully implemented drawing with a paint brush. My problem is that I do not want color to be redrawn where it has already been painted. Have a look at the screenshot. I am using OpenGL ES. I tried to match the screen pixels' color with the current brush's color, but I could not get it right. Can anyone tell me where I am going wrong? Thanks.
Screenshot of the problem: http://imageshack.us/photo/my-images/842/screenshot20110726at601.png/
I am pasting my code here:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    // The original post never shows where `location` is set; presumably it
    // is the touch point flipped into GL coordinates, something like:
    CGPoint location = [touch locationInView:self];
    location.y = bounds.size.height - location.y;

    //glDisable(GL_BLEND);
    glEnable(GL_BLEND);
    Byte pixel[4];
    glReadPixels(location.x, location.y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);
    NSLog(@"%d %d %d %d", pixel[0], pixel[1], pixel[2], pixel[3]);

    r_comp = (float)pixel[0] / 255;
    g_comp = (float)pixel[1] / 255;
    b_comp = (float)pixel[2] / 255;

    artbrushAppDelegate *app = (artbrushAppDelegate *)[[UIApplication sharedApplication] delegate];
    // app.rg, app.gg, app.bg are the brush colors saved in global variables
    if (r_comp == app.rg && g_comp == app.gg && b_comp == app.bg)
    {
        glEnable(GL_BLEND);
    }
    else
    {
        glDisable(GL_BLEND);
    }
}
You can apply "layers" mechanism, where the paint is opaque but they layer as a whole is partly transparent.. I read it in forum somewhere..
I would like to develop an app where the user can draw lines, but I do not want to draw straight lines right away; I want to show the line as the user draws it. When the user gets from point A to B, I would like to straighten the line (if the user wants this).
To be able to do this I want to treat my view as a grid starting at 0,0 (top left) and ending at 320,480 (for iPhone) or 768,1024 (for iPad) (bottom right).
For this question I have point A at 10,10 and point B at 100,100.
My question:
- How do I create this grid?
- How do I create these points?
- How do I draw this line without straightening it?
- How do I draw the straightened line?
My problem is that I am familiar with creating "normal" UI apps, but I am not familiar with OpenGL etc.
I hope someone can help me with this.
Best regards,
Paul Peelen
You subclass your UIView and override the - (void)drawRect:(CGRect)rect method.
In there you grab a graphics context:
CGContextRef context = UIGraphicsGetCurrentContext();
And you use that to make Core Graphics calls, like:
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextBeginPath(context);
// s[] is an array of `count` CGPoints, taken pairwise as line segments
for (int k = 0; k + 1 < count; k += 2) {
    CGContextMoveToPoint(context, s[k].x, s[k].y);
    CGContextAddLineToPoint(context, s[k+1].x, s[k+1].y);
}
CGContextStrokePath(context);
Look up the Quartz 2D Programming Guide for all the details.
You can draw a straight line while the user drags: based on the starting and ending points, draw the line using a UIBezierPath and a CAShapeLayer:
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [[event allTouches] anyObject];
startingPoint = [touch locationInView:baseHolderView];
}
- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
UITouch *touch = [touches anyObject];
endingPoint = [touch locationInView:baseHolderView];
[self makeLineLayer:baseHolderView.layer lineFromPointA:startingPoint toPointB:endingPoint];
}
-(void)makeLineLayer:(CALayer *)layer lineFromPointA:(CGPoint)pointA toPointB:(CGPoint)pointB
{
CAShapeLayer *line = [CAShapeLayer layer];
UIBezierPath *linePath=[UIBezierPath bezierPath];
[linePath moveToPoint: pointA];
[linePath addLineToPoint:pointB];
line.path=linePath.CGPath;
line.fillColor = nil;
line.opacity = 1.0; // opacity is clamped to the 0..1 range, so 2.0 has no extra effect
line.strokeColor = [UIColor blackColor].CGColor;
[layer addSublayer:line];
}
Hope this helps you achieve your goal.
I am trying to implement an application in which I display a color image as the backgroundColor of a custom view. Once the user touches a particular area of the image (displayed as the background color of the view), the pixel where the touch occurred should be retrieved and the corresponding color should be created using UIColor.
I have code to access a particular pixel of an image, as follows:
CFDataRef CopyImagePixels(CGImageRef inImage)
{
return CGDataProviderCopyData(CGImageGetDataProvider(inImage));
}
From the above code, I got the raw pixel data of the image. Then I created a custom UIView, and the particular image is set as the background color of the custom view. Then I captured the touches in the touchesBegan: method and got the x and y coordinates from its parameters. The code is as follows:
Getting the pixel values:
NSData *rawData = (NSData *)CGDataProviderCopyData(CGImageGetDataProvider(image1.CGImage));
uint *x = malloc (rawData.length);
NSLog(#"Length of raw data :%d", rawData.length);
[rawData getBytes: x];
Getting the touches in the touchesBegan: method:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
for (UITouch *touch in touches)
{
// Sends to the dispatch method, which will make sure the appropriate subview is acted upon
CGPoint pt = [touch locationInView:upperImage];
indexValue = pt.x * pt.y;
int alpha = x[indexValue];
int red = x[indexValue + 1];
int green = x[indexValue + 2];
int blue = x[indexValue + 3];
self.view.backgroundColor = [ UIColor colorWithRed:red green:green blue:blue alpha:alpha];
// TODO : compare the values with raw image Bytes ...
}
}
The issue is that I can't create a color as above. Any help would be greatly appreciated.
Best Regards,
Mohammed Sadiq.
The alpha value may be at the beginning or the end depending on the format of the bitmap. The RGB values may also be premultiplied with the alpha value. You can use CGImageGetBitmapInfo to get more info on the format of the bitmap.
Print out the RGB values and see if they look right.
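To make the indexing issue concrete, here is a hedged sketch of how the byte offset is usually computed (assuming a plain RGBA layout, 8 bits per component, 4 bytes per pixel; the posted code indexes with pt.x * pt.y, which mixes rows and columns, and passes 0-255 values where UIColor expects 0.0-1.0):

// rawData is the pixel data copied from image1.CGImage above.
const UInt8 *bytes = rawData.bytes;
size_t bytesPerRow = CGImageGetBytesPerRow(image1.CGImage);
size_t offset = (size_t)pt.y * bytesPerRow + (size_t)pt.x * 4;

CGFloat red   = bytes[offset]     / 255.0; // component order may differ;
CGFloat green = bytes[offset + 1] / 255.0; // check CGImageGetBitmapInfo
CGFloat blue  = bytes[offset + 2] / 255.0;
CGFloat alpha = bytes[offset + 3] / 255.0;
self.view.backgroundColor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];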
Sounds like an issue similar to this question.